Quick start#
The bulk of the configuration of Platypush lives under the config.yaml file. An extensive config.yaml example is provided in the repo. All the sections are optional - the only one enabled by default is the HTTP server, backend.http, but that is optional too.
Let’s take an example where we want to control the following entities:
A Philips Hue bridge and its connected smart lights.
An on-device voice assistant (we’ll consider the Google Assistant in this example as it’s the easiest to configure, although Google deprecated the Assistant libraries long ago).
A compatible music player - we’ll consider MPD/Mopidy in this example as they are the ones best supported in Platypush, and Mopidy also offers plugins with basically any audio backend out there.
We’ll need the following plugins enabled in the config.yaml:
light.hue
assistant.google
music.mopidy or music.mpd (they expose the same API)
The documentation pages of these plugins already provide some comprehensive configuration snippets that you can use.
The most basic configuration would be something like this:
# Enable it if you want to enable the HTTP API and the Web interface
backend.http:

light.hue:
  # IP/hostname of the Hue bridge
  bridge: 192.168.1.10
  # Default groups that should be targeted by actions if none is specified
  # (default: all lights/groups)
  groups:
    - Living Room

# Check the plugin documentation on how to get the credentials
assistant.google:

music.mopidy: # Or music.mpd
  # IP/hostname of the MPD/Mopidy server
  host: 192.168.1.2
Now that we have our integrations configured, let’s build some automation routines.
Turn on the lights when I say so#
In this case we will have to create a hook that listens to a SpeechRecognizedEvent triggered by the assistant - for example, when we say “OK, Google” followed by “turn on the lights”.
We can declare the hook in YAML format directly in the config.yaml, or in one of the files included in it through the include: directive:
event.hook.turn_lights_on_voice_command:
  if:
    type: platypush.message.event.assistant.SpeechRecognizedEvent
    # Note that a minimal regex-like syntax is supported here.
    # This condition matches both a phrase that contains
    # "turn on the lights" and one that contains "turn on lights"
    phrase: "turn on (the)? lights"
  then:
    - action: light.hue.on
      args:
        groups:
          - Living Room
Or we can declare the hook in a Python script - you just have to create a .py file (e.g. lights.py) under a scripts directory located under the same folder as your config.yaml:
from platypush import run, when
from platypush.events.assistant import SpeechRecognizedEvent

@when(SpeechRecognizedEvent, phrase="turn on (the)? lights")
def lights_on_voice_command():  # Also accepts an optional `event` argument
    run('light.hue.on', groups=['Living Room'])
Or, using the get_plugin API:
from platypush import get_plugin, when
from platypush.events.assistant import SpeechRecognizedEvent

@when(SpeechRecognizedEvent, phrase="turn on (the)? lights")
def lights_on_voice_command():
    get_plugin('light.hue').on(groups=['Living Room'])
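As a side note, the minimal regex-like syntax of the phrase condition can be pictured with plain Python regular expressions. The sketch below is only an illustration of the idea under that assumption - it is not Platypush's actual matcher:

```python
import re

def phrase_matches(template: str, phrase: str) -> bool:
    # Illustration only: rewrite "(word)? " optional groups so that both
    # the variant with the word and the one without it can match
    pattern = re.sub(r'\((\w+)\)\?\s*', r'(\1 )?', template)
    return re.search(pattern, phrase) is not None

phrase_matches('turn on (the)? lights', 'turn on the lights')  # True
phrase_matches('turn on (the)? lights', 'turn on lights')      # True
```

Both "turn on the lights" and "turn on lights" satisfy the same condition, which is why a single hook covers both phrasings.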
Play the music when I say so#
The approach is similar for a “play the music” voice command. YAML:
event.hook.play_music_voice_command:
  if:
    type: platypush.message.event.assistant.SpeechRecognizedEvent
    phrase: "play (the)? music"
  then:
    - action: music.mopidy.play
Python:
from platypush import run, when
from platypush.events.assistant import SpeechRecognizedEvent

@when(SpeechRecognizedEvent, phrase="play (the)? music")
def play_music_voice_command():
    run('music.mopidy.play')
Turn on the lights when the sun goes down#
This example requires the sun
plugin configured:
sun:
  latitude: LAT
  longitude: LONG
You can then simply subscribe to SunsetEvent.
YAML:
event.hook.sunset_lights_on:
  if:
    type: platypush.message.event.sun.SunsetEvent
  then:
    - action: light.hue.on
Python:
from platypush import run, when
from platypush.events.sun import SunsetEvent

@when(SunsetEvent)
def sunset_lights_on():
    run('light.hue.on')
Event matching and token extraction through hook templates#
You can also extract tokens from event arguments, as long as their values are strings.
For example, you can use pattern matching and token extraction to create voice assistant hooks that match a template with parametrized fields, which are then passed as arguments to your event hook:
from platypush import run, when
from platypush.events.assistant import SpeechRecognizedEvent

@when(SpeechRecognizedEvent, phrase='play ${title} by ${artist}')
def on_music_play_command(event, title, artist):
    results = run(
        'music.mpd.search',
        filter={
            'artist': artist,
            'title': title,
        }
    )

    if results:
        run('music.mpd.play', results[0]['file'])
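To make the ${token} mechanics concrete, a template like the one above can be thought of as a regular expression with named capture groups. The following is a hypothetical sketch of the idea, not Platypush's actual implementation:

```python
import re

def extract_tokens(template, phrase):
    # Turn each ${name} placeholder into a named, non-greedy capture group
    pattern = re.sub(r'\$\{(\w+)\}', r'(?P<\1>.+?)', template)
    m = re.fullmatch(pattern, phrase)
    return m.groupdict() if m else None

extract_tokens('play ${title} by ${artist}',
               'play lazy eye by silversun pickups')
# → {'title': 'lazy eye', 'artist': 'silversun pickups'}
```

The extracted tokens are then passed to the hook as the `title` and `artist` keyword arguments.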
Complex hook conditions#
Your event hooks can include more complex filters too. Structured filters against partial event arguments are also possible, and relational operators are supported as well. For example:
from platypush import when
from platypush.events.sensor import SensorDataChangeEvent

@when(SensorDataChangeEvent, data=1)
def hook_1(event):
    """
    Triggered when event.data == 1
    """

@when(SensorDataChangeEvent, data={'state': 1})
def hook_2(event):
    """
    Triggered when event.data['state'] == 1
    """

@when(SensorDataChangeEvent, data={
    'temperature': {'$gt': 25},
    'humidity': {'$le': 15}
})
def hook_3(event):
    """
    Triggered when event.data['temperature'] > 25 and
    event.data['humidity'] <= 15.
    """
The supported relational fields are the same supported by ElasticSearch - $gt for greater than, $lt for less than, $ge for greater or equal, $ne for not equal, and so on.
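To pin down the semantics, here is a hedged sketch of how such a structured filter with relational operators could be evaluated against an event's data. This is an illustration of the matching rules described above, not Platypush's actual code:

```python
# Relational operators in the filter, applied as op(event_value, filter_value)
OPS = {
    '$gt': lambda a, b: a > b,
    '$ge': lambda a, b: a >= b,
    '$lt': lambda a, b: a < b,
    '$le': lambda a, b: a <= b,
    '$ne': lambda a, b: a != b,
}

def matches(flt, value):
    # A dict filter is a partial match: every filter key must be satisfied,
    # while extra keys in the event data are ignored
    if isinstance(flt, dict):
        for key, expected in flt.items():
            if key in OPS:
                if not OPS[key](value, expected):
                    return False
            elif not (isinstance(value, dict) and key in value
                      and matches(expected, value[key])):
                return False
        return True
    return flt == value

matches({'temperature': {'$gt': 25}, 'humidity': {'$le': 15}},
        {'temperature': 30, 'humidity': 10, 'pressure': 1013})  # True
```

Note how the `pressure` key in the event data doesn't affect the result - only the keys named in the filter are checked.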
Turn off the lights at 1 AM#
We can use a cron
for this case. YAML:
cron.lights_off_night:
  # Run this every day at 1 AM
  cron_expression: '0 1 * * *'
  actions:
    - action: light.hue.off
Python:
from platypush import cron, run

@cron('0 1 * * *')
def lights_off_night():
    run('light.hue.off')
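For reference, the five fields of a cron expression are minute, hour, day of month, month, and day of week. A minimal sketch of why '0 1 * * *' fires at 1:00 AM every day (plain values and wildcards only - real cron syntax also supports ranges, lists, and steps):

```python
from datetime import datetime

def cron_matches(expr, t):
    fields = expr.split()
    # minute, hour, day of month, month, day of week (0 = Sunday)
    values = [t.minute, t.hour, t.day, t.month, t.isoweekday() % 7]
    return all(f == '*' or int(f) == v for f, v in zip(fields, values))

cron_matches('0 1 * * *', datetime(2024, 6, 1, 1, 0))  # True
```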
Greet me with lights and music when I come home#
Let’s create an at_home procedure for this purpose. We can also use a text-to-speech plugin like the tts plugin (it requires no configuration as it relies on the Google Translate frontend API, but other, more sophisticated plugins are also available) to greet us with a warm voice when we come home. YAML:
# Make sure that the sound plugin is also enabled, for audio processing
sound:

procedure.at_home:
  - action: tts.say
    args:
      text: "Welcome home!"

  # Get luminosity data from a sensor - e.g. LTR559
  - action: gpio.sensor.ltr559.get_data

  # If it's lower than a certain threshold, turn on the lights.
  # Note that we can directly access attributes returned by the
  # previous request(s) as local context variables within the
  # procedure/hook/cron. In this case, `light` is an attribute returned
  # on the response of the previous command.
  # Otherwise, you can also use the special `output` variable to get only
  # the response of the latest action, e.g. `output['light']`
  # Also note the use of the special `if ${}` construct. It accepts
  # a snippet of Python code and it can access variables within the
  # current context.
  - if ${light is not None and light < 110}:
      - action: light.hue.on

  - action: music.mopidy.play
    args:
      resource: "uri:to:my:favourite:playlist"
Python:
from platypush import procedure, run

@procedure("at_home")
def at_home_proc():
    run('tts.say', text='Welcome home!')
    luminosity = run('gpio.sensor.ltr559.get_data').get('light', 0)
    if luminosity < 110:
        run('light.hue.on')

    run('music.mopidy.play', resource='uri:to:my:favourite:playlist')
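The `if ${}` construct embeds a snippet of Python code that is evaluated against the variables in the current context. A hedged sketch of how such a condition could be evaluated (illustration only, not Platypush's actual implementation):

```python
def eval_condition(expr, context):
    # expr is the Python snippet inside ${...}; the context variables
    # (e.g. `light` from the previous action's response) are exposed
    # to it as local names
    return bool(eval(expr, {}, dict(context)))

eval_condition('light is not None and light < 110', {'light': 90})  # True
```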
You can then call the procedure from a hook or another script:
from platypush import run
run('procedure.at_home')
Or, from YAML:
procedure.some_other_procedure:
  - action: procedure.at_home
Or using the API (see next section).