I have a multi-room speaker system from Denon called HEOS which I want to control using a Python script. To communicate with the multi-room system I have to telnet to port 1255 on the device and send commands like this:
heos://player/set_play_state?pid=player_id&state=play_state
The response back is in json:
{
    "heos": {
        "command": " player/set_play_state ",
        "result": "success",
        "message": "pid='player_id'&state='play_state'"
    }
}
I have successfully used Python's telnetlib to send simple commands like this:
command = "heos://player/set_play_state?pid=player_id&state=play_state"
telnet.write(command.encode('ASCII') + b'\r\n')
But what is the best way to get the response back in a usable format? Loop with telnet.read_until? I want to get the result and message fields back into clean variables.
Using telnet to communicate with an API feels a bit dirty. Is it possible to use something else, for example a plain socket?
Thanks in advance
The API/CLI is documented here: http://rn.dmglobal.com/euheos/HEOS_CLI_ProtocolSpecification.pdf
While it may be possible to use read_until() here, it would depend on exactly how the response JSON is formatted, and it would probably be unwise to rely on it.
If the remote device closes the connection after sending the response, the easy way would be a simple
response = json.loads(telnet.read_all().decode())
If it remains open for more commands, then you'll instead need to keep receiving until you have a complete JSON object. Here's a possibility that just keeps trying to parse the JSON until it succeeds:
response = ''
while True:
    response += telnet.read_some().decode()
    try:
        response = json.loads(response)
        break
    except ValueError:
        pass
Either way, your result and message are response['heos']['result'] and response['heos']['message'].
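Putting the pieces together, a minimal sketch of one request/response round trip might look like this (the device address and pid below are placeholders, not values from the question):
import json
import telnetlib

telnet = telnetlib.Telnet("192.168.1.50", 1255)  # placeholder HEOS device address

command = "heos://player/set_play_state?pid=player_id&state=play"
telnet.write(command.encode('ASCII') + b'\r\n')

raw = ''
while True:
    raw += telnet.read_some().decode()
    try:
        response = json.loads(raw)
        break
    except ValueError:
        pass  # not a complete JSON object yet, keep reading

result = response['heos']['result']
message = response['heos']['message']
print(result, message)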
FWIW, here is my GitHub repo (inspired by this repo) for controlling a HEOS speaker with Python. It uses a similar approach to the accepted answer, but additionally waits if the HEOS player is busy.
def telnet_request(self, command, wait = True):
    """Execute a `command` and return the response(s)."""
    command = self.heosurl + command
    logging.debug("telnet request {}".format(command))
    self.telnet.write(command.encode('ASCII') + b'\r\n')
    response = b''
    while True:
        response += self.telnet.read_some()
        try:
            response = json.loads(response)
            if not wait:
                logging.debug("I accept the first response: {}".format(response))
                break
            # sometimes, I get a response with the message "under
            # process". I might want to wait here
            message = response.get("heos", {}).get("message", "")
            if "command under process" not in message:
                logging.debug("I assume this is the final response: {}".format(response))
                break
            logging.debug("Wait for the final response")
            response = b''  # forget this message
        except ValueError:
            # response is not a complete JSON object
            pass
        except TypeError:
            # response is not a complete JSON object
            pass
    if response.get("result") == "fail":
        logging.warn(response)
        return None
    return response
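For context, a hypothetical wrapper class might set up the attributes the method relies on like this (the class name and host are illustrative, not taken from the repo):
import json
import logging
import telnetlib

class HeosPlayer:
    """Illustrative wrapper; telnet_request() above would be a method of this class."""

    def __init__(self, host):
        self.heosurl = "heos://"
        self.telnet = telnetlib.Telnet(host, 1255)  # HEOS CLI port from the question

# player = HeosPlayer("192.168.1.50")
# state = player.telnet_request("player/get_play_state?pid=player_id")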
I can easily send messages from my server to one connected client, as you can see in my code: writer.write(("you write: ").encode('utf8') + response.encode('utf8'))
But how do I send, for example, a list/array/matrix over asyncio sockets? In the example below, how do I send the hi list to one connected client? I tried just writer.write(hi.encode('utf8')) but it doesn't work!
import asyncio

hi = [1, 2, 3, 4]

@asyncio.coroutine
def handle_client(reader, writer):
    request = None
    response = ""
    word = ""
    while word != 'quit' and request != b'':
        request = (yield from reader.read(255))
        if str(b'\r\n') == str((request)[-2:]) and len(request) > 2:
            response = str((request.decode('utf8'))).rstrip()
            request = str(b'\r\n')
        if str(b'\r\n') == str(request):
            word = response
            response = response + str('\n\r')
            writer.write(("you write: ").encode('utf8') + response.encode('utf8'))
            response = ""
            addr = writer.get_extra_info('peername')
            print("I get: " + word)
            print(addr)
        else:
            response = response + str((request.decode('utf8')))
        yield from writer.drain()
    writer.close()

loop = asyncio.get_event_loop()
asyncio.set_event_loop(loop)
loop.create_task(asyncio.start_server(handle_client, '127.0.0.1', 4312))
loop.run_forever()
In the provided example, hi is a list object, and as such it does not have an encode() method. Also, writer is an asyncio.StreamWriter object, whose write() method expects a bytes object. What this means is that the object -> bytes encoding has to be handled by the caller of the write() method, and the receiver of the data has to do the bytes -> list decoding on their end.
To answer your immediate question, you'll need to serialize the data in a way both ends of the communication agree on and understand. A couple of options could be, for instance:
JSON:
import json
json.dumps(hi).encode("utf8")
This gives you bytes you can send over; the receiving end can then decode them with:
received = json.loads(response)
This seems perfectly adequate for your example and has the advantage that the data being passed around is also human-readable.
pickle:
import pickle
pickle.dumps(hi)
This also produces bytes you can de-serialize on the receiving end:
received = pickle.loads(response)
This has the advantage of being a more general method, which can also be used for types you could not serialize with JSON.
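As a minimal sketch of how the JSON option could slot into the handler from the question (the newline terminator is an assumption about the framing, not part of the original code):
import json

def send_list(writer, data):
    """Serialize a Python list as JSON and write it to an asyncio StreamWriter."""
    writer.write(json.dumps(data).encode("utf8") + b"\n")

# inside handle_client, instead of writer.write(hi.encode('utf8')):
#     send_list(writer, hi)
#
# and on the client side, after reading one line into `raw`:
#     received = json.loads(raw.decode("utf8"))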
I'm following this Route_Guide sample.
The sample in question fires off and reads messages without replying to a specific message. The latter is what I'm trying to achieve.
Here's what i have so far:
import grpc
...
channel = grpc.insecure_channel(conn_str)
try:
    grpc.channel_ready_future(channel).result(timeout=5)
except grpc.FutureTimeoutError:
    sys.exit('Error connecting to server')
else:
    stub = MyService_pb2_grpc.MyServiceStub(channel)
    print('Connected to gRPC server.')
    this_is_just_read_maybe(stub)
def this_is_just_read_maybe(stub):
    responses = stub.MyEventStream(stream())
    for response in responses:
        print(f'Received message: {response}')
        if response.something:
            # okay, now what? how do i send a message here?

def stream():
    yield my_start_stream_msg
    # this is fine, i receive this server-side
    # but i can't check for incoming messages here
I don't seem to have a read() or write() on the stub, everything seems to be implemented with iterators.
How do I send a message from this_is_just_read_maybe(stub)?
Is that even the right approach?
My Proto is a bidirectional stream:
service MyService {
  rpc MyEventStream (stream StreamingMessage) returns (stream StreamingMessage) {}
}
What you're trying to do is perfectly possible and will probably involve writing your own request iterator object that can be given responses as they arrive rather than using a simple generator as your request iterator. Perhaps something like
class MySmarterRequestIterator(object):

    def __init__(self):
        self._lock = threading.Lock()
        self._responses_so_far = []

    def __iter__(self):
        return self

    def _next(self):
        # some logic that depends upon what responses have been seen
        # before returning the next request message
        return <your message value>

    def __next__(self):  # Python 3
        return self._next()

    def next(self):  # Python 2
        return self._next()

    def add_response(self, response):
        with self._lock:
            self._responses_so_far.append(response)
that you then use like
my_smarter_request_iterator = MySmarterRequestIterator()
responses = stub.MyEventStream(my_smarter_request_iterator)
for response in responses:
    my_smarter_request_iterator.add_response(response)
There will probably be locking and blocking in your _next implementation to handle the situation of gRPC Python asking your object for the next request that it wants to send and your responding (in effect) "wait, hold on, I don't know what request I want to send until after I've seen how the next response turned out".
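As a hedged sketch of what that locking and blocking might look like (the Condition, the start-message handling, and the make_request() helper are illustrative assumptions, not part of the answer above):
import threading

class BlockingRequestIterator(object):
    """Sends a fixed start message first, then one request per response seen."""

    def __init__(self, start_message):
        self._condition = threading.Condition()
        self._responses_so_far = []
        self._start_message = start_message
        self._started = False

    def __iter__(self):
        return self

    def __next__(self):
        if not self._started:
            self._started = True
            return self._start_message
        with self._condition:
            # Block until add_response() has delivered a response to react to.
            while not self._responses_so_far:
                self._condition.wait()
            last_response = self._responses_so_far.pop(0)
        return make_request(last_response)  # hypothetical: build the next request

    def add_response(self, response):
        with self._condition:
            self._responses_so_far.append(response)
            self._condition.notify()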
Instead of writing a custom iterator, you can also use a blocking queue to implement send- and receive-like behaviour for the client stub:
import queue
...
send_queue = queue.SimpleQueue() # or Queue if using Python before 3.7
my_event_stream = stub.MyEventStream(iter(send_queue.get, None))
# send
send_queue.put(StreamingMessage())
# receive
response = next(my_event_stream) # type: StreamingMessage
This makes use of the sentinel form of iter, which converts a regular function into an iterator that stops when it reaches a sentinel value (in this case None).
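For illustration, the same sentinel pattern works with a plain queue outside of gRPC (a small standalone example, not from the answer above):
import queue

q = queue.SimpleQueue()
q.put("a")
q.put("b")
q.put(None)  # the sentinel: iteration stops when iter() yields this value

print(list(iter(q.get, None)))  # ['a', 'b']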
This shouldn't be that complicated, but it seems that both the Ruby and Python Telnet libs have awkward APIs. Can anyone show me how to write a command to a Telnet host and then read the response into a string for some processing?
In my case "SEND" with a newline retrieves some temperature data on a device.
With Python I tried:
tn.write(b"SEND" + b"\r")
str = tn.read_eager()
which returns nothing.
In Ruby I tried:
tn.puts("SEND")
which should return something as well. The only thing I've gotten to work is:
tn.cmd("SEND") { |c| print c }
but then you can't do much with c.
Am I missing something here? I was expecting something like the Socket library in Ruby with some code like:
s = TCPSocket.new 'localhost', 2000
while line = s.gets # Read lines from socket
  puts line         # and print them
end
I found out that if you don't supply a block to the cmd method, it will give you back the response (assuming the telnet server is not asking you for anything else). You can send the commands all at once (but then you get all of the responses bundled together) or do multiple calls, but then you have to use nested block callbacks (I was not able to do it otherwise).
require 'net/telnet'

class Client
  # Fetch weather forecast for NYC.
  #
  # @return [String]
  def response
    fetch_all_in_one_response
    # fetch_multiple_responses
  ensure
    disconnect
  end

  private

  # Do all the commands at once and return everything in one go.
  #
  # @return [String]
  def fetch_all_in_one_response
    client.cmd("\nNYC\nX\n")
  end

  # Do multiple calls to retrieve the final forecast.
  #
  # @return [String]
  def fetch_multiple_responses
    client.cmd("\r") do
      client.cmd("NYC\r") do
        client.cmd("X\r") do |forecast|
          return forecast
        end
      end
    end
  end

  # Connect to remote server.
  #
  # @return [Net::Telnet]
  def client
    @client ||= Net::Telnet.new(
      'Host'       => 'rainmaker.wunderground.com',
      'Timeout'    => false,
      'Output_log' => File.open('output.log', 'w')
    )
  end

  # Close connection to the remote server.
  def disconnect
    client.close
  end
end

forecast = Client.new.response
puts forecast
I have asked a few questions about this before, but still haven't solved my problem.
I am trying to allow Salesforce to remotely send commands to a Raspberry Pi via JSON (REST API). The Raspberry Pi controls the power of some RF Plugs via an RF Transmitter called a TellStick. This is all setup, and I can use Python to send these commands. All I need to do now is make the Pi accept JSON, then work out how to send the commands from Salesforce.
Someone kindly forked my repo on GitHub, and provided me with some code which should make it work. But unfortunately it still isn't working.
Here is the previous question: How to accept a JSON POST?
And here is the forked repo: https://github.com/bfagundez/RemotePiControl/blob/master/power.py
What do I need to do? I have sent test JSON messages in the Postman extension and with cURL but keep getting errors.
I just want to be able to send various variables, and let the script work the rest out.
I can currently post to a .py script I have with some URL variables, so /python.py?power=on&device=1&time=10&pass=whatever and it figures it out. Surely there's a simple way to send this in JSON?
Here is the power.py code:
# add flask here
from flask import Flask
app = Flask(__name__)
app.debug = True

# keep your code
import time
import cgi
from tellcore.telldus import TelldusCore

core = TelldusCore()
devices = core.devices()

# define a "power ON api endpoint"
@app.route("/API/v1.0/power-on/<deviceId>", methods=['POST'])
def powerOnDevice(deviceId):
    payload = {}
    # get the device by id somehow
    device = devices[deviceId]
    # get some extra parameters
    # let's say how long to stay on
    params = request.get_json()
    try:
        device.turn_on()
        payload['success'] = True
        return payload
    except:
        payload['success'] = False
        # add an exception description here
        return payload

# define a "power OFF api endpoint"
@app.route("/API/v1.0/power-off/<deviceId>", methods=['POST'])
def powerOffDevice(deviceId):
    payload = {}
    # get the device by id somehow
    device = devices[deviceId]
    try:
        device.turn_off()
        payload['success'] = True
        return payload
    except:
        payload['success'] = False
        # add an exception description here
        return payload

app.run()
Your deviceId variable is a string, not an integer; it contains the digit '1', but that's not yet an integer.
You can either convert it explicitly:
device = devices[int(deviceId)]
or tell Flask you wanted an integer parameter in the route:
@app.route("/API/v1.0/power-on/<int:deviceId>", methods=['POST'])
def powerOnDevice(deviceId):
where the int: part is a URL route converter.
Your views should return a response object, a string or a tuple instead of a dictionary (as you do now), see About Responses. If you wanted to return JSON, use the flask.json.jsonify() function:
# define a "power ON api endpoint"
@app.route("/API/v1.0/power-on/<int:deviceId>", methods=['POST'])
def powerOnDevice(deviceId):
    device = devices[deviceId]
    # get some extra parameters
    # let's say how long to stay on
    params = request.get_json()
    try:
        device.turn_on()
        return jsonify(success=True)
    except SomeSpecificException as exc:
        return jsonify(success=False, exception=str(exc))
where I also altered the exception handler to handle a specific exception only; try to avoid Pokemon exception handling; do not try to catch them all!
To retrieve the JSON POST values you must use request.json:
if request.json and 'email' in request.json:
    request.json['email']
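To test such an endpoint, a hedged sketch using the requests library might look like this (the local URL, port, and payload are assumptions based on the routes above):
import requests

# Post a JSON body so Flask's request.get_json() / request.json can read it;
# the json= keyword sets the Content-Type: application/json header for us.
r = requests.post(
    "http://127.0.0.1:5000/API/v1.0/power-on/1",
    json={"time": 10},
)
print(r.status_code, r.json())  # e.g. 200 {"success": true}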
I'm writing an HTTP server in Python 3.3, just to learn how to do this sort of thing. In my function that parses a request, I want to use fcntl.ioctl to get the number of bytes that I can read from the socket, and I only do this when I see a kevent in the result of checking a kqueue that says there is stuff to read on the socket. But whenever I try to call fcntl.ioctl, I get OSError: [Errno 14] Bad address. What am I doing wrong? Also, this seems to be happening on the first call. Here is the relevant code:
def client_thread(kq, client_socket, methods):
    while True:
        events = kq.control([], 2, POLLTIME)  # we pass an empty list of changes, because we don't have any changes to make to the events we are interested in.
        # we want a list that is at most two long. We listen for POLLTIME seconds.
        for event in events:
            if event != KILL_KEV:  # there are only two events in our kqueue
                handle_client(client_socket, methods)
            else:  # KILL_SOCK has a connection
                break
    client_socket.close()
    client_socket.shutdown()

def handle_client(client_socket, methods):
    request = parse_request(client_socket)  # parse the request data in the client socket
    handlers = methods[request["request"]["method"]]  # retrieve the appropriate list of handlers from the methods dict
    for path_match_pred, handler_func in handlers:
        if path_match_pred(path):  # if the path matches whatever path predicate you've created...
            break
    response = handler_func(request)  # ... then call the appropriate handler function to handle the request
    send_response(client_socket, response)  # and finally, send the response.

def parse_request(client_socket):
    """Returns the request data, parsed into a dictionary like this:
    {
        "request": {
            "method": method,
            "path": path,
            "version": HTTP version
        },
        "headers": header dictionary,
        "body": body data as a string
    }
    This should only be called if the client socket is ready for reading!
    """
    client_fd = client_socket.fileno()  # get the file descriptor for the socket
    bytes_in_socket = 0
    fcntl.ioctl(client_fd, termios.FIONREAD, bytes_in_socket)  # count the bytes in it
    # ^^^^^^^^^ THIS IS WHERE IT BREAKS
    print(bytes_in_socket, "bytes in socket")
    msg = bytearray()  # make empty byte array
    while bytes_in_socket:
        msg.extend(client_socket.recv(bytes_in_socket))  # read the bytes we counted earlier
        fcntl.ioctl(client_fd, termios.FIONREAD, bytes_in_socket)  # check for more bytes
        print(bytes_in_socket, "bytes left to read")
Note that fcntl.ioctl's documentation mentions the acceptable types for arg (the argument to the ioctl operation). In some cases (like this one) you need to pass a buffer, or an object that supports its interface, in particular when you want to receive a value back. You're just passing an integer.
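A minimal sketch of one way to pass a mutable buffer that ioctl can write the result into (the helper name is illustrative; the socket setup from the question is assumed):
import array
import fcntl
import termios

def bytes_available(sock):
    """Return the number of bytes waiting to be read on a socket."""
    buf = array.array('i', [0])  # mutable buffer for the result
    fcntl.ioctl(sock.fileno(), termios.FIONREAD, buf)
    return buf[0]                # ioctl wrote the count into the buffer in place

# inside parse_request, instead of passing a plain integer:
#     bytes_in_socket = bytes_available(client_socket)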