Telemetry data through a Python socket, without stopping execution of the program

I'm building motorized photovoltaic solar trackers. They're controlled by Raspberry Pis running a Python script. The RPis are connected to my public OpenVPN server for remote control and continuous software development, and that's working fine. Recently a passionate customer asked me for some sort of telemetry data for his tracker - say, its current orientation, measured wind speed, etc. Being new to Python, I'm really struggling with this part.
I've decided to use the socket approach from guides like this. The Python script listens on a socket, and my OpenVPN server, which is also a web server, connects to it using PHP's fsockopen. Python sends the telemetry data, and PHP makes it user-friendly and displays it on the web. Everything so far works; however, I don't know how to design my Python script around it.
The problem is that my script has to run continuously, and socket.accept() halts its execution while waiting for a connection. I didn't find any obvious solution on the web. Would multithreading work for this? It sounds a bit like overkill.
Is there a way to run the socket listening asynchronously? Like, for example, the pigpio callbacks I'm using abundantly?
Or alternatively, is there a better way to accomplish my goal?
I tried remotely accessing a status file that my script maintains, but that proved to be extremely involved to set up and prone to errors while the file was being written.
I also tried running a second script. The problem is that it then has no access to the relevant data, or it needs to read the aforementioned status file, which leads to the same problems as above.
The relevant bit of code is literally only this:
# Main loop
try:
    while True:
        # Telemetry
        conn, addr = S.accept()
        conn.send(data.encode())
        conn.close()
Best regards.

For a simple case like this I would probably just wrap the socket code into a separate thread.
With multithreading in Python, the Global Interpreter Lock (GIL) means that only one thread executes Python bytecode at a time, so you don't really need to add any further locks around the data if you're just reading the values and don't care whether they're also being updated at the same moment.
Your code would essentially read something like:
from threading import Thread

def handle_telemetry_requests():
    # Main loop
    try:
        while True:
            # Telemetry
            conn, addr = S.accept()
            conn.send(data.encode())
            conn.close()
    except:
        # Error handling here (this will cause the thread to exit if any error occurs)
        pass

socket_thread = Thread(target=handle_telemetry_requests)
socket_thread.daemon = True
socket_thread.start()
Setting the daemon flag means that when the main application ends, the thread will also be terminated.
Python does provide the asyncio module - which may provide the callbacks you're looking for (though I don't have any experience with this).
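If you do go down that road, here is a rough, untested sketch of what an asyncio version of the telemetry listener could look like; the get_telemetry() helper, port number and payload are placeholders, not part of your existing script:
import asyncio

def get_telemetry():
    # Hypothetical stand-in: build the same `data` string your main loop maintains.
    return "orientation=42.0;wind=3.1"

async def handle_request(reader, writer):
    # Send the current telemetry to whoever connects, then hang up.
    writer.write(get_telemetry().encode())
    await writer.drain()
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle_request, "0.0.0.0", 5000)
    async with server:
        await server.serve_forever()

asyncio.run(main())
The catch is that the rest of the tracker loop would also need to be written as coroutines (or pushed into an executor); otherwise it would still block the event loop.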
Other options are to run a Flask server in the Python app, which will handle the sockets for you, so you can just code endpoints that serve the data. Or think about using an MQTT broker - the current data can be published to it, and other apps can subscribe to updates.
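For the Flask route, a minimal sketch (assuming a shared telemetry dict that your main loop keeps up to date; the port and route name are arbitrary) could be:
from threading import Thread
from flask import Flask, jsonify

app = Flask(__name__)
telemetry = {"orientation": 0.0, "wind_speed": 0.0}  # updated by your main loop

@app.route("/telemetry")
def telemetry_endpoint():
    # Return whatever the main loop last wrote into the dict.
    return jsonify(telemetry)

# Run the web server in a daemon thread so the tracker loop keeps running.
Thread(target=lambda: app.run(host="0.0.0.0", port=5000), daemon=True).start()
Your PHP side could then fetch http://<tracker-vpn-ip>:5000/telemetry instead of using fsockopen.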

Related

Passing arguments to a running python script

I have a script running on my Raspberry Pi; the script is started by a command from a PHP page. I have multiple if statements, and now I would like to pass new arguments to the script without stopping it. I found lots of information about passing arguments to a Python script, but not about whether it's possible to pass new arguments while the script is already running. Thanks in advance!
The best option in my view is to use a configuration file as the input to your script.
Some simple YAML will do. Then, in a separate thread, you observe the hash of the file; if it changes, that means somebody has updated your file and you must re-read and adjust your inputs.
Basically you have that observer running all the time, as in the sketch below.
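A minimal sketch of such an observer thread (the config.yaml path and the reload_settings() callback are placeholders you'd fill in yourself):
import hashlib
import time
from threading import Thread

CONFIG_PATH = "config.yaml"  # hypothetical location of your YAML config

def reload_settings(path):
    # Hypothetical: re-parse the YAML here (e.g. with yaml.safe_load) and
    # update the variables your main loop reads.
    print("config changed:", path)

def watch_config(interval=1.0):
    last_hash = None
    while True:
        with open(CONFIG_PATH, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        if digest != last_hash:
            last_hash = digest
            reload_settings(CONFIG_PATH)
        time.sleep(interval)

Thread(target=watch_config, daemon=True).start()
Your PHP page then only has to rewrite the YAML file; the running script picks the change up within a second or so.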
You need some sort of IPC mechanism, really. As you are executing/updating the script from a PHP application, I'd suggest you look into something like ZeroMQ, which supports both Python and PHP and will allow you to do a quick and dirty Pub/Sub implementation.
The basic idea is to treat your Python script as a subscriber to messages coming from the PHP application, which publishes them as and when needed. To achieve this, you'll want to start your Python "script" once and leave it running in the background, listening for messages on ZeroMQ. Something like this should get you going:
import zmq

context = zmq.Context()
socket = context.socket(zmq.REP)
socket.bind("tcp://*:5555")

while True:
    # Wait for the next message from your PHP application
    message = socket.recv()
    print("Received a message: %s" % message)
    # Here you should do the work you need to do in your script
    # Once you are done, tell the PHP application you are done
    socket.send(b"Done and dusted")
Then, in your PHP application, you can use something like the following to send a message to your Python service
$context = new ZMQContext();

// Socket to talk to the server
$requester = new ZMQSocket($context, ZMQ::SOCKET_REQ);
$requester->connect("tcp://localhost:5555");

$requester->send("ALL THE PARAMS TO SEND TO YOUR PYTHON SCRIPT");
$reply = $requester->recv();
Note: I found the above examples with a quick Google search (and amended them slightly for educational purposes), but they aren't tested and are purely meant to get you started. For more information, visit ZeroMQ and php-zmq.
Have fun.

Running an infinite loop and getting commands from "outside" (e.g. a shell or other scripts)

I am working on my Raspberry Pi, which is handling some WS2812B RGB LEDs. I can control the lights and everything with the neopixel library and Python. So far, so good.
I want this Python script to run an infinite loop that only deals with light management: dimming LEDs, changing colors and lots more. But I want to be able to get commands from other scripts. Let's say I type a shell command that should change the color; my infinite Python script (the LED handler) should recognize this command and softly change the color or the light mode to the desired one.
One idea is to constantly check a text file for new commands, with my shell script appending command lines to that file.
But can you tell me if there is a better way of doing it?
Many thanks in advance.
One method would be to expose a TCP server and then communicate with the Python process over TCP. A simple example of how to create a TCP server is here, showcasing both the server script (running the LEDs) and the command scripts: example
I suggest opening a port with your Python script and making it receive commands from that port (network programming). Although this makes your project more complicated, it is a very robust implementation.
You can use ZeroMQ and host it locally. It provides bindings for Python. Here is an example script (sender and receiver):
from threading import Thread
import zmq

class Sender(Thread):
    def run(self):
        context = zmq.Context()
        socket = context.socket(zmq.PUB)
        socket.connect('tcp://127.0.0.1:8000')
        while True:
            socket.send_string(input('Enter command: '))

class Receiver(Thread):
    def run(self):
        context = zmq.Context()
        socket = context.socket(zmq.SUB)
        socket.bind('tcp://127.0.0.1:8000')
        socket.setsockopt(zmq.SUBSCRIBE, b'')
        while True:
            data = socket.recv().decode('ascii')
            print(data)  # Do stuff with data.
The receiver would be the instance that controls the lights on the RPi, and the sender is the command-line script that lets you input the various commands. An advantage is that ZeroMQ provides bindings for various programming languages, and you can also send/receive commands over a network.
Another solution is to allow commands from a network connection. The script with the "infinite loop" will read input from a socket and perform the commands.
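A rough sketch of that idea (untested; the port number and the "color ..." command format are placeholders): a daemon thread accepts connections and stores the latest command, and the LED loop checks it on every pass:
import socket
import threading

latest_command = None  # the LED loop reads this on every iteration
lock = threading.Lock()

def command_server(host="0.0.0.0", port=5555):
    global latest_command
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(1)
    while True:
        conn, _ = srv.accept()
        data = conn.recv(1024).decode().strip()  # e.g. "color 255 0 0"
        with lock:
            latest_command = data
        conn.close()

threading.Thread(target=command_server, daemon=True).start()
You could then send a command from the shell with something like echo 'color 255 0 0' | nc <pi-address> 5555.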

Kill a tcp connection in a separate thread in Python

So here's the problem: I have a small server script in Python that is supposed to accept multiple clients and, based on the message they send, return a certain command to them. It's a simple concept and it works like I want it to, with one really big problem: I handle each connection in a separate thread, and when a connected user sends EXIT I want to close the connection. That works, but the thread is kept alive, there is no way to kill it, and that really bothers me.
from socket import socket
from thread import start_new_thread  # _thread in Python 3

# host and port are defined elsewhere in the script
sock = socket()
sock.bind((host, port))
sock.listen(50)

def clientthread(conn):
    while True:
        data = conn.recv(1024).strip()
        if data == "HELO":
            conn.send("HELO")
        elif data == "EXIT":
            conn.close()
            break
    return

while True:
    conn, addr = sock.accept()
    start_new_thread(clientthread, (conn,))

conn.close()
sock.close()
I searched for a way to terminate a thread but just couldn't find one. .join() is not working here since it detects the thread as "dummy" and does not recognize __stop(), and after a couple of Google searches on this topic I'm really out of options. Any ideas? I'd be really grateful, thanks.
AFAIK, you can't kill a thread from another - you have to arrange for the thread-to-be-killed to notice some flag has changed, and terminate itself.
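For example, a cooperative version of your handler, using a shared threading.Event as the flag and a socket timeout so the thread wakes up regularly to check it (sketch only, written with Python 3 bytes literals):
import socket
import threading

stop_event = threading.Event()

def clientthread(conn):
    conn.settimeout(1.0)  # wake up regularly to check the flag
    while not stop_event.is_set():
        try:
            data = conn.recv(1024).strip()
        except socket.timeout:
            continue
        if not data or data == b"EXIT":
            break
        if data == b"HELO":
            conn.send(b"HELO")
    conn.close()

# Whenever you want all client threads to wind down:
# stop_event.set()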
BTW, your socket code looks a little off - you need a loop around your sends and recvs unless you use something like Twisted or bufsock. IMO, bufsock is much easier and less error-prone than Twisted, but I may be biased because I wrote bufsock: http://stromberg.dnsalias.org/~strombrg/bufsock.html
The problem with what I'm seeing is that TCP reserves the right to split or aggregate transmission units. Usually it won't, but under high load, with a changing path MTU, or even just with Nagle's algorithm, it probably will.
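A common way to handle that is to frame your messages yourself, e.g. with a length prefix, and loop until the whole frame has arrived (a sketch only, not the asker's protocol):
import struct

def send_msg(conn, payload):
    # Prefix each message with its length so the receiver knows where it ends.
    conn.sendall(struct.pack("!I", len(payload)) + payload)

def recv_exactly(conn, n):
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed the connection")
        buf += chunk
    return buf

def recv_msg(conn):
    (length,) = struct.unpack("!I", recv_exactly(conn, 4))
    return recv_exactly(conn, length)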
Assuming you're using Python 2.4+, you should be using the newer threading module. Check out a tutorial on it here - it explains the low-level thread module you're using now, and how and why you should switch to the newer threading module.

Windows socket programming with Python

I'm trying to implement Windows sockets using Python.
Mostly, everything so far has been solved using ctypes.windll.ws2_32 and the pywin32 lib.
However, I haven't been able to find out how to translate the following C++ code into Python, and I wonder if anyone is kind enough to help:
LRESULT WINAPI AsyncSocketProc(
    __in HWND hwnd,
    __in UINT uMsg,
    __in WPARAM wParam,
    __in LPARAM lParam
)
{
    switch (uMsg) {
        case WM_CREATE:
            // ...
        case WM_SOCKET: {  // this is basically an int constant
            switch (WSAGETSELECTEVENT(lParam)) {
                case FD_ACCEPT:
                    // accepting new conn
                case FD_READ:
                    // receiving data
            }
        }
    }
}
In the above code, I couldn't find Python's equivalent for WSAGETSELECTEVENT.
For FD_ACCEPT and FD_READ, I could find them inside the win32file package (of the pywin32 lib).
Lastly, the reason I'm trying to implement this Windows socket programming is that the C++ version of the Windows socket server (above) is non-blocking for an application of mine, but Python's built-in select.select is blocking. So I'm trying to see if I can port the C++ version to Python and whether it works.
EDITED:
I would like to clarify that the socket server works as a 'plug in' to an existing C++ program, which doesn't support threading.
The socket server needs to wait (indefinitely) for clients to connect so it needs to continuously listen.
So using a normal Python socket or select.select would entail a while loop (otherwise how can it act as a server continuously listening for events? Please correct me if I'm wrong), which would block the main program.
Somehow, using the Windows socket server callback above, the main program is not blocked, and this is the main reason why I'm trying to port it to Python.
The socket server is preferably in Python because many related libs the server needs are written in Python.
Thanks a lot.
Have a look at the socket module instead. It already contains all the code you need to work with sockets without using the win32 API.
[EDIT] You can write multi-threaded code that handles several connections. Just accept the connection, then start a new thread, give it the connection, and let it read the data in a while 1: data = conn.recv(1024) ... kind of loop.
That said, Python also has a module for just that: SocketServer
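A minimal sketch with it (the module is named socketserver in Python 3; the port and reply format are arbitrary), handling each connection in its own thread:
import socketserver  # "SocketServer" in Python 2

class CommandHandler(socketserver.StreamRequestHandler):
    def handle(self):
        line = self.rfile.readline().strip()
        # Hypothetical: dispatch the received command here.
        self.wfile.write(b"OK: " + line + b"\n")

server = socketserver.ThreadingTCPServer(("0.0.0.0", 5555), CommandHandler)
server.serve_forever()
Note that serve_forever() blocks, so in the plug-in scenario you would again run it in a background thread.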
[EDIT2] You say
the socket server works as a 'plug in' to an existing program, which doesn't support threading.
It's a bit hard to help with so little information but think about it this way:
You can run the socket server loop in a new thread. This code is isolated from the rest of your app, so it doesn't matter whether the other code uses/supports threads. This solves your "endless loop" problem.
Now this socket server loop will get connections from clients. My guess is that the clients will call methods from the rest of the app and here, things get hairy.
You need a way to synchronize these calls. In other places (like all UI frameworks), there is a single thread which runs any UI calls (drawing something, creating the UI, responding to user input).
But if I understand you correctly, then you can in fact modify the "main loop" of the existing app and let it do more things (like listening to new connections). If you can do this, then there is a way out:
Create a new thread for the socket server as described above. When the server gets a connection, spawn a new thread that talks to the client. When the client sends commands, create "work objects" (see command pattern) and put them into a queue.
In the main loop, you can look into the queue. If something is in there, pop the work objects and call their run() method.
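A small sketch of that queue idea (the names are made up; the point is only the shape of it):
import queue

work_queue = queue.Queue()

class SetColor:
    # A minimal "work object": it carries its arguments and knows how to run itself.
    def __init__(self, r, g, b):
        self.rgb = (r, g, b)
    def run(self):
        print("would apply color", self.rgb)  # hypothetical main-loop action

# In the thread talking to a client:
# work_queue.put(SetColor(255, 0, 0))

# In the existing main loop, once per iteration:
def drain_queue():
    while True:
        try:
            job = work_queue.get_nowait()
        except queue.Empty:
            break
        job.run()
Only the main loop ever touches the rest of the app; the socket threads just enqueue work, which sidesteps the synchronization problem.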
You don't need or want to port this code. This code is specific to how the WIN32 API notifies native code that a socket operation has completed. It doesn't apply in Python.
The equivalent in Python would be, roughly, to paste the "accepting new conn" code in wherever your Python code accepts a new connection, and paste the "receiving data" code wherever your Python code receives data.
You can also use select; just keep in mind that the semantics are a bit of the reverse of async sockets. With async sockets, you start an operation whenever you want and you get a callback when it completes. With select, it tells you when to perform an operation such that it completes immediately.

Only one python program running (like Firefox)?

When I open Firefox, then run the command:
firefox http://somewebsite
the URL opens in a new tab of Firefox (the same thing happens with Chromium as well). Is there some way to replicate this behavior in Python? For example, calling:
processStuff.py file/url
then calling:
processStuff.py anotherfile
should not start two different processes, but send a message to the currently running program. For example, you could have info in one tabbed dialog box instead of 10 single windows.
Adding bounty for anyone who can describe how Firefox/Chromium do this in a cross-platform way.
The way Firefox does it is this: the first instance creates a socket file (or a named pipe on Windows). This serves both as a way for later instances of Firefox to detect and communicate with the first instance, forwarding it the URL before dying. Since a socket file or named pipe is only accessible from processes running on the local system (as files are), no network client can access it. As they are files, firewalls will not block them either (it's like writing to a file).
Here is a naive implementation to illustrate my point. On first launch, the socket file lock.sock is created. Further launches of the script will detect the lock and send the URL to it:
import socket
import os

SOCKET_FILENAME = 'lock.sock'

def server():
    print 'I\'m the server, creating the socket'
    s = socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM)
    s.bind(SOCKET_FILENAME)
    try:
        while True:
            print 'Got a URL: %s' % s.recv(65536)
    except KeyboardInterrupt, exc:
        print 'Quitting, removing the socket file'
        s.close()
        os.remove(SOCKET_FILENAME)

def client():
    print 'I\'m the client, opening the socket'
    s = socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM)
    s.connect(SOCKET_FILENAME)
    s.send('http://stackoverflow.com')
    s.close()

def main():
    if os.path.exists(SOCKET_FILENAME):
        try:
            client()
        except (socket.error):
            print "Bad socket file, program closed unexpectedly?"
            os.remove(SOCKET_FILENAME)
            server()
    else:
        server()

main()
You should implement a proper protocol (for instance, send proper datagrams instead of hardcoding the length), maybe using SocketServer, but that is beyond the scope of this question. The Python Socket Programming HOWTO might also help you. I have no Windows machine available, so I cannot confirm that it works on that platform.
You could create a data directory where you create a "lock file" once your program is running, after having checked that the file doesn't exist yet.
If it exists, you should try to communicate with the existing process, which creates a socket or a pipe or something like this and communicates its address or its path in an appropriate way.
There are many different ways to do so, depending on which platform the program runs.
While I doubt this is how Firefox/Chrome does it, it would be possible to achieve your goal without sockets, relying solely on the file system. I found it difficult to put into text, so see below for a rough flow chart of how it could be done. I would consider this approach similar to a cookie :). One last thought: with this it could also be possible to store workspaces or tabs across multiple sessions.
EDIT
Per a comment, environment variables are not shared between processes. All of my work thus far has been a single process calling multiple modules. Sorry for any confusion.
I think you could use multiprocessing connections with a subprocess to accomplish this. Your script would just have to try to connect to the "remote" connection on localhost and if it's not available then it could start it.
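A rough sketch of that with multiprocessing.connection (the address and authkey are arbitrary placeholders): the script first tries to act as a client, and if nothing is listening it becomes the single running instance:
from multiprocessing.connection import Listener, Client

ADDRESS = ("localhost", 6000)   # placeholder
AUTHKEY = b"change-me"          # placeholder shared secret

def serve():
    # First instance: accept URLs from later instances.
    with Listener(ADDRESS, authkey=AUTHKEY) as listener:
        while True:
            with listener.accept() as conn:
                print("got:", conn.recv())  # e.g. a URL from a later launch

def send(url):
    # Later instance: hand the URL to the running one and exit.
    with Client(ADDRESS, authkey=AUTHKEY) as conn:
        conn.send(url)

try:
    send("http://somewebsite")
except ConnectionRefusedError:
    serve()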
The most basic approach is to use sockets.
http://wiki.python.org/moin/ParallelProcessing
Use threading: http://www.valuedlessons.com/2008/06/message-passing-conccurrency-actor.html
Example of socket programming: http://code.activestate.com/recipes/52218-message-passing-with-socket-datagrams/
