Execute tasks at periodic intervals without blocking in Python - python

I've already read a million posts about this, but I can't find a solution for my case.
I have a Python process that reads data from an Arduino serial port. These reads happen inside a while True loop.
Every second, I want to send the latest read result to a database. There are many suggestions involving threading and timers, but the data is read inside the while loop, and I can't see how to pass it to a function running in another thread. How can I do this?
Any example? Thanks a lot! Below is a sketch of my attempts:
from threading import Timer

def send_db(data):
    # send the data to the database
    t = Timer(1, send_db, [data])
    t.start()

t = Timer(1, send_db, [data])
t.start()

while True:  # main loop
    # read data from serial
    # how do I pass the data to the other function?
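One way to sketch a solution: the while loop only updates a shared, lock-guarded variable, and a Timer that reschedules itself sends the latest value every second. This is a hypothetical minimal example, with the serial read stood in by a counter and the database by a list:

```python
import threading
import time

latest = None      # most recent serial reading, shared between the threads
lock = threading.Lock()
sent = []          # stands in for the database in this sketch

def send_db():
    """Runs in a Timer thread: send the latest reading, then reschedule."""
    with lock:
        data = latest
    if data is not None:
        sent.append(data)                 # replace with the real database insert
    t = threading.Timer(1.0, send_db)     # fire again in one second
    t.daemon = True                       # don't keep the process alive on exit
    t.start()

send_db()  # kick off the periodic sender

for reading in range(5):  # stands in for the while-True serial read loop
    with lock:
        latest = reading  # the loop only updates the shared variable
    time.sleep(0.3)
```

The loop never blocks on the database: it just overwrites `latest`, and whichever value is current when the timer fires is what gets sent.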

Related

Pyserial in_waiting CPU usage

I have a large python script with a thread that listens to a serial port and puts new data to a queue whenever it's received. I've been trying to improve the performance of the script, as right now even when nothing is happening it's using ~ 12% of my Ryzen 3600 CPU. That seems very excessive.
Here's the serial listener thread:
def listen(self):
    """
    Main listener
    """
    while self.doListen:
        # wait for a message
        if self.bus.ser.in_waiting:
            # Read rest of message
            msg = self.bus.read(self.bus.ser.in_waiting)
            # Put into queue
            self.msgQueue.put_nowait(msg)
I profiled the script using yappi, and the serial.in_waiting call seems to be hogging the majority of the cycles (seen in a KCachegrind view of the profile).
I tried the trick suggested in this question, doing a blocking read(1) call to wait for data. However read(1) just continuously returns empty data and never actually blocks (and yes, I've made sure my pyserial timeout is set to None)
Is there a more elegant and CPU-friendly way to wait for data on the serial bus? My messages are of variable length, so doing a blocking read(n) wouldn't work. They also don't end in newlines or any specific terminators, so readline() wouldn't work either.
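One CPU-friendly pattern is to block for the first byte and then drain whatever else has already arrived. This is a sketch under the assumption that the port was opened with timeout=None so read(1) really blocks; the serial object only needs to offer read() and in_waiting:

```python
def read_message(ser):
    """Read one variable-length message without busy-waiting:
    block until the first byte arrives, then drain whatever else
    is already buffered. Assumes the port was opened with
    timeout=None so read(1) really blocks."""
    first = ser.read(1)        # blocks: no polling loop, no sleep
    rest = ser.read(ser.in_waiting) if ser.in_waiting else b""
    return first + rest
```

The listener loop then just calls read_message() and puts the result on the queue. Note this has the same framing caveat as the polling version: only the bytes buffered by the time in_waiting is checked are returned.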
Aaron's suggestion was great. A simple time.sleep(0.01) in my serial thread dramatically cut down on CPU usage. So far it looks like I'm not missing any messages either, which was my big fear with adding in sleeps.
The new listener thread:
import time

def listen(self):
    """
    Main listener
    """
    while self.doListen:
        # wait for a message
        if self.bus.ser.in_waiting:
            # Read rest of message
            msg = self.bus.read(self.bus.ser.in_waiting)
            # Put into queue
            self.msgQueue.put_nowait(msg)
        # give the CPU a break
        time.sleep(0.01)

How to execute processing task and socket communication concurrently in python 3?

I have some trouble trying to understand how to use the threading module
in python 3.
Origin: I wrote a Python script that does some image processing on every frame of a camera stream in a for loop.
For this I wrote some functions that are used inside the main script. The main script/loop isn't encapsulated inside a function.
Aim: I want the main loop to run the whole time. The result of processing the latest frame has to be sent to a socket client, but only when the client sends a request to the server socket.
My idea was to use two threads. One for the image processing and one for
the server socket which listens for a request, takes the latest image
processing result and sends it to the client socket.
I saw different tutorials how to use threading and understand the
workflow in general, but not how to use it to cope with this particular
case. So I hope for your help.
Below there is the rough structure of the origin script:
import cv2
import numpy
import json
import socket
from threading import Thread

def crop(image, coords):
    ...

def cont(image):
    ...

# load parameters
a = json_data["..."]

# init cam
camera = PiCamers()

# main loop
for frame in camera.capture_continuous(...):
    #######
    # some image processing
    #######
    result = (x, y, z)
Thank you in advance for your ideas!
Greetings
Basically you have to create a so-called thread pool.
To the pool you can submit the functions you want executed in a thread, together with their parameters, and then start it.
https://www.codementor.io/lance/simple-parallelism-in-python-du107klle
There a thread pool with .map is used. There are more advanced functions that do the job. You can read the documentation of thread pools or search for other tutorials.
Hope it helped
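The two-thread idea can also be sketched without a pool: the processing loop updates a lock-guarded shared result, and a server thread replies to each request with the latest value. A minimal runnable example (hypothetical details: the camera loop is replaced by a plain range loop, and the OS picks a free port):

```python
import json
import socket
import threading

latest = {"result": None}             # most recent processing result
lock = threading.Lock()

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))            # port 0: let the OS pick a free port
srv.listen(1)
PORT = srv.getsockname()[1]

def serve():
    """Reply to each client request with the latest result as JSON."""
    while True:
        conn, _ = srv.accept()
        with conn:
            conn.recv(1024)           # the request content is ignored here
            with lock:
                payload = json.dumps(latest).encode()
            conn.sendall(payload)

threading.Thread(target=serve, daemon=True).start()

# stands in for the camera loop: update the shared result every frame
for frame in range(3):
    with lock:
        latest["result"] = (frame, frame + 1, frame + 2)
```

The main loop never waits on the network; the server thread only ever reads the shared result, so a single lock around it is enough.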

Python - Multithreads for calling the same function to run in parallel and independently

I'm new to Python and I'm struggling to solve a problem. I have three programs running: one sends data, another receives it, and the third sits transparently in the middle. The difficulty is with this third one, which I'm calling delay_loss.py.
It has to emulate packet delay before delivering the packets to the receiving program.
After a lot of searching I found a solution (multithreading), although I'm not sure it's the best one. Since delay_loss.py can receive many packets "at once" and has to pick a random delay for each one to emulate the network, I have to be able to send each packet to the receiving program after its own randomly chosen delay, independently of the others.
I'm trying to use multithreading for this, and I think I'm not using it correctly, because all the packets are sent at the same time after some delay. The threads don't seem to be running send_up() independently.
Part of the code of delay_loss.py is shown below:
import random
import threading
import time
from multiprocessing.dummy import Pool as ThreadPool
...
pool = ThreadPool(window_size)

def send_up(pkt, time_delay, id_pkt):
    time.sleep(time_delay)
    sock_server.sendto(pkt, (C_IP, C_PORT))

def delay_pkt(pkt_recv_raw, rtt, average_delay, id_pkt):
    x = random.expovariate(1/average_delay)
    time_delay = rtt/(2+x)
    pool.apply_async(send_up, [pkt_recv_raw, time_delay, id_pkt])
...
delay_pkt(pkt_recv_raw, rtt, average_delay, id_pkt_recv)
id_pkt_recv += 1
If anyone has some idea of what am I doing wrong. Or just to say don't take this approach of multithreads for doing this task, it would be of much help!
Thanks in advance :)
I have found a solution to my problem. I was using the pool unnecessarily.
It is much simpler to just use threading.Timer, as shown below:
t = threading.Timer(time_delay, send_up, [pkt_recv_raw, id_pkt])
t.start()
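A runnable sketch of that Timer approach, with each packet scheduled independently (hypothetical stand-ins: the socket send is replaced by appending to a list, and the rtt/average_delay values are made up for the demo):

```python
import random
import threading

delivered = []                 # stands in for the receiving program
delivered_lock = threading.Lock()

def send_up(pkt, id_pkt):
    # replace this append with sock_server.sendto(pkt, (C_IP, C_PORT))
    with delivered_lock:
        delivered.append((id_pkt, pkt))

def delay_pkt(pkt, rtt, average_delay, id_pkt):
    """Schedule one packet independently; every Timer gets its own thread."""
    time_delay = rtt / 2 + random.expovariate(1 / average_delay)
    t = threading.Timer(time_delay, send_up, [pkt, id_pkt])
    t.start()
    return t

# demo values; each packet gets its own independent random delay
timers = [delay_pkt(b"payload%d" % i, rtt=0.05, average_delay=0.02, id_pkt=i)
          for i in range(5)]
for t in timers:
    t.join()    # only for the demo; delay_loss.py would not join
```

Because every Timer runs in its own thread, a long delay on one packet never holds up the others, which is exactly the independence the pool version was missing.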

A Process to check if Infinite Loop is still running in Python3

I am unable to work out how to handle the following scenario with general programming concepts:
Note: all data transmission in this scenario is done via UDP packets using the socket module of Python 3.
I have a server which sends a certain amount of data, say 300 packets, over a WiFi channel.
At the other end, I have a receiver which runs a certain decoding process. This decoding process is a kind of infinite loop which returns a boolean value (true or false) at every iteration, depending on certain aspects that can be neglected for now.
A rough code snippet (Python 3) is as follows:
incomingPacket = next(bringNextFromBuffer)
if decoder.consume_data(incomingPacket):
    # this if condition is inside an infinite loop;
    # unless the condition becomes True, keep
    # consuming data in a forever loop
    print("Data has been received")
Everything works at the moment, since the server and client are in close proximity and the data can be decoded. But in practical scenarios I want to check on the loop mentioned above. For instance, after a certain amount of time, if the loop is still in its forever (infinite) state, I would like to send something back to the server to restart the data transmission.
I am not much clear with multithreading concept, but can I use a thread over here in this scenario?
For Example:
Run a monitoring thread for a certain amount of time, keep checking the decoder.consume_data() function, and if the time expires while the output is still False, send feedback to the server using struct.pack() over sockets.
Of course the networking logic need not be addressed for now. But is Python capable of monitoring this infinite loop via a parallel thread, or some other programming concept?
Caveats
Unfortunately the Receiver in question is a dumb receiver i.e. No user control is specified. Only thing Receiver can do is decode the data and perhaps send a Feedback to the Server stating whether the data is received or not and that is possible only when the above mentioned LOOP is completed.
What is a possible solution here?
(Would be happy to share more information on request)
Yes you can do this. Roughly it'll look like this:
from threading import Thread
from time import sleep

state = 'running'

def monitor():
    while True:
        if state == 'running':
            tell_client()
        sleep(1)  # to prevent too much happening here

Thread(target=monitor).start()

while state == 'running':
    receive_data()
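Another way to monitor the loop is a one-shot watchdog built on threading.Event: the decode loop sets the event when it succeeds, and the watchdog sends feedback only if the event is still unset after the timeout. A sketch, with the UDP feedback stood in by a list append:

```python
import threading
import time

done = threading.Event()   # the decode loop sets this when it succeeds
feedback = []              # stands in for the UDP feedback to the server

def watchdog(timeout):
    """Ask the server to resend if decoding hasn't finished in time."""
    if not done.wait(timeout):
        # replace with sock.sendto(struct.pack(...), server_address)
        feedback.append("resend")

threading.Thread(target=watchdog, args=(0.2,), daemon=True).start()

# stands in for a decode loop that never succeeds in this demo;
# a successful decode would call done.set() instead
time.sleep(0.4)
```

Event.wait(timeout) blocks the watchdog thread without polling, so unlike the sleep-in-a-loop version it wakes immediately if the decoder finishes early.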

Pause one thread from other in python

I was looking at how to do multithreading (2 threads) in Python.
I want one of them to read the serial port constantly, reading every frame and saving it to a database. I have already written a script that does this.
For the second one, I want it to listen on a socket port. When it receives something from that port, I want it to pause the first thread, write something to the serial port and write to the socket. After that, it should unpause the first thread and go back to listening on the socket port.
I think the best idea is to pause one thread from the other so that the serial port can be read at that moment, because if I read the answer from the serial port in the first thread, I have to pass the value to the second one, which is more complicated, isn't it?
I already have the part that writes to the serial port, and I have checked some tutorials for the socket part, so I have no problems with that. But I haven't found anything about pausing one thread from another, and I am starting to think it is not possible.
What should I do in this case?
EDIT: About shared variables: so I can declare a global variable and do something like:
global1
global2

Thread 1:
    while global1 == 0:
        do whatever
        global2 = 1

Thread 2:
    wait socket
    if dataReceived: global1 = 1
    if global2 == 1: do whatever on serial port
    global2 = 0
    when finished: global1 = 0
With the two globals I can tell thread 1 to stop before its next iteration, and with global2 the second thread knows when the serial port is not being used...
How do I declare a shared variable in Python? Or is it just another variable?
I'm not sure you can share objects directly between processes, but since every process can share objects with the main process, you can use the main process to pass them back and forth:
http://docs.python.org/2/library/multiprocessing.html#exchanging-objects-between-processes
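With threads (rather than processes), the pausing itself can be done with threading.Event: the reader blocks in wait() while the other thread has claimed the serial port. A minimal sketch, with the serial read and the socket handling stood in by placeholders:

```python
import threading
import time

serial_free = threading.Event()   # set -> the reader may use the serial port
serial_free.set()
stop = threading.Event()
readings = []

def reader():
    """Thread 1: read the serial port unless thread 2 has claimed it."""
    i = 0
    while not stop.is_set():
        serial_free.wait()        # blocks (pauses) while the event is cleared
        readings.append(i)        # stands in for a serial read + database save
        i += 1
        time.sleep(0.01)

def handle_request():
    """Thread 2: on a socket request, claim the port, use it, release it."""
    serial_free.clear()           # pause the reader at its next wait()
    # ... write to the serial port and answer on the socket here ...
    serial_free.set()             # resume the reader

t = threading.Thread(target=reader)
t.start()
time.sleep(0.05)
handle_request()                  # stands in for "data arrived on the socket"
time.sleep(0.05)
stop.set()
t.join()
```

This replaces the two hand-rolled global flags with one Event per purpose, and the reader sleeps properly while paused instead of spinning on a variable.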
