Python multiprocessing second process doesn't start

Hello, I am trying to run two functions at the same time in Python. Both read data from two separate meters over USB and they are not dependent on each other. I have tried multiprocessing, but the second meter never starts.
def readMeter1():
    while True:
        # read Meter1

def readMeter2():
    while True:
        # read Meter2

if __name__ == "__main__":
    Process(target = readMeter1()).start()
    Process(target = readMeter2()).start()

Parameter target must be something callable (a function, in your case). You don't need to call that function yourself; start() will do it after launching a new process:
Process(target=readMeter1).start() # fork a new process, call readMeter1
Process(target=readMeter2).start() # fork a new process, call readMeter2
Because you call readMeter1() yourself, its infinite loop runs in the current process and blocks everything else, so the second Process is never even created.
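Putting it together, a minimal runnable sketch of the corrected pattern could look like the following; the loop bodies are placeholders, since the actual USB-reading code isn't shown in the question.

from multiprocessing import Process
import time

def readMeter1():
    while True:
        # placeholder for the real USB read of meter 1
        print("reading meter 1")
        time.sleep(1)

def readMeter2():
    while True:
        # placeholder for the real USB read of meter 2
        print("reading meter 2")
        time.sleep(1)

if __name__ == "__main__":
    p1 = Process(target=readMeter1)  # pass the function itself, no parentheses
    p2 = Process(target=readMeter2)
    p1.start()
    p2.start()
    p1.join()  # keep the parent alive while both readers run
    p2.join()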

Related

Can I run two functions simultaneously no matter what these functions run?

What I'd like to do is the following program to print out:
Running Main
Running Second
Running Main
Running Second
[...]
Code:
from multiprocessing import Process
import time

def main():
    while True:
        print('Running Main')
        time.sleep(1)

def second():
    while True:
        print('Running Second')
        time.sleep(1)

p1 = Process(main())
p2 = Process(second())
p1.start()
p2.start()
But it doesn't have the desired behavior. Instead it just prints out:
Running Main
Running Main
[...]
I suspect my program doesn't work because of the while statement?
Is there any way I can overcome this problem and have my program print out what I mentioned no matter what I execute in my function?
The issue is in how you create the Process objects. The reason only the first function runs is the syntax: instead of handing the Process a function to execute, you are calling the function yourself, so main() runs its infinite loop in the current process and the second process is never started.
When you create a Process object, you want to avoid writing
p1 = Process(target=main())
and instead write
p1 = Process(target=main)
That also means that if you want to pass any input to the function, you have to write
p1 = Process(target=main, args=('hi',))
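As a small self-contained illustration of the args form, here is a sketch; the greet function and its message are invented for the example.

from multiprocessing import Process

def greet(message):
    # runs in the child process with the argument passed via args
    print('Running with argument:', message)

if __name__ == '__main__':
    p1 = Process(target=greet, args=('hi',))  # args must be a tuple
    p1.start()
    p1.join()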

Kill a python process when busy in a loop

I'm running a script on my Raspberry Pi. Sometimes the program freezes, so I have to close the terminal and re-run the .py file.
So I wanted to "multiprocess" this program. I made two functions: the first one does the work, the second one checks the time and kills the first function's process if a condition is true.
However, I tried to do it like so:
def AntiFreeze():
    print("AntiFreeze partito\n")
    global stop
    global endtime
    global freq
    proc_SPN = multiprocessing.Process(target=SPN(), args=())
    proc_SPN.start()
    time.sleep(2)
    proc_SPN.terminate()
    proc_SPN.join()

if __name__ == '__main__':
    proc_AF = multiprocessing.Process(target=AntiFreeze(), args=())
    proc_AF.start()
The main block starts the AntiFreeze function in a process, and that process creates another process to run the function that does the job I want.
THE PROBLEM (I think):
The function "SPN()" (that is the one that does the job) is busy in a very long while loop that calls function in another .py file.
So when I use proc_SPN.terminate() or proc_SPN.kill() nothing happens... why?
There is another way to force a process to kill? maybe I've to do two different programs?
Thanks in advance for help
You are calling your function at process creation, so most likely the process is never correctly spawned. Your code should be changed into:
def AntiFreeze():
    print("AntiFreeze partito\n")
    global stop
    global endtime
    global freq
    proc_SPN = multiprocessing.Process(target=SPN, args=())
    proc_SPN.start()
    time.sleep(2)
    proc_SPN.terminate()
    proc_SPN.join()

if __name__ == '__main__':
    proc_AF = multiprocessing.Process(target=AntiFreeze, args=())
    proc_AF.start()
Furthermore, you shouldn't use globals (unless strictly necessary). You could pass the needed arguments to the AntiFreeze function instead.
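For example, a sketch along those lines could pass the timeout in through args instead of relying on globals; the timeout parameter and the dummy SPN body below are assumptions for illustration, not the asker's real code.

import multiprocessing
import time

def SPN():
    # dummy stand-in for the real long-running job
    while True:
        time.sleep(0.5)

def AntiFreeze(timeout):
    print("AntiFreeze partito\n")
    proc_SPN = multiprocessing.Process(target=SPN)
    proc_SPN.start()
    time.sleep(timeout)    # wait, then stop the worker
    proc_SPN.terminate()
    proc_SPN.join()

if __name__ == '__main__':
    proc_AF = multiprocessing.Process(target=AntiFreeze, args=(2,))
    proc_AF.start()
    proc_AF.join()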

How to correctly use queues in python?

I am a beginner when it comes to python threading and multiprocessing so please bear with me.
I want to make a system that consists of three Python scripts. The first one creates some data and sends it to the second script continuously. The second script takes the data and saves it to a file until the file exceeds a defined memory limit. When that happens, the third script sends the data to an external device and gets rid of this "cache". I need all of this to happen concurrently. The pseudocode below sums up what I am trying to do.
def main_1():
    data = [1, 2, 3]
    send_to_second_script(data)

def main_2():
    rec_data = receive_from_first_script()
    save_to_file(rec_data)
    if file > limit:
        signal_third_script()

def main_3():
    if signal is true:
        send_data_to_external_device()
        remove_data_from_disk()
I understand that I can use queues to make this happen but I am not sure how.
Also, to do this I have so far tried a different approach, where I created one Python script and used threading to spawn a thread for each part of the process. Is this correct, or is using queues better?
Firstly, for Python you need to be aware of what multithreading/multiprocessing actually buys you. IMO you should be considering multiprocessing instead of multithreading: threading in Python is not truly concurrent because of the GIL, and there are many explanations out there on which one to use. The easiest way to choose is to check whether your program is IO-bound or CPU-bound. Anyway, on to the Queue, which is a simple way to pass data between multiple processes in Python.
Using your pseudocode as an example, here is how you would use a Queue.
import multiprocessing

def main_1(output_queue):
    test = 0
    while test <= 10:  # simple limit so it does not run forever
        data = [1, 2, 3]
        print("Process 1: Sending data")
        output_queue.put(data)  # puts data in the queue (FIFO)
        test += 1
    output_queue.put("EXIT")  # triggers the exit clause

def main_2(input_queue, output_queue):
    file = 0  # dummy pseudo variables
    limit = 1
    while True:
        rec_data = input_queue.get()  # get the oldest data from the queue; blocks if empty
        if rec_data == "EXIT":  # the exit clause is a way to cleanly shut down your processes
            output_queue.put("EXIT")
            print("Process 2: exiting")
            break
        print("Process 2: saving to file:", rec_data, "count = ", file)
        file += 1
        # save_to_file(rec_data)
        if file > limit:
            file = 0
            output_queue.put(True)

def main_3(input_queue):
    while True:
        signal = input_queue.get()
        if signal is True:
            print("Process 3: Data sent and removed")
            # send_data_to_external_device()
            # remove_data_from_disk()
        elif signal == "EXIT":
            print("Process 3: Exiting")
            break

if __name__ == '__main__':
    q1 = multiprocessing.Queue()  # initializing the queues and the processes
    q2 = multiprocessing.Queue()
    p1 = multiprocessing.Process(target=main_1, args=(q1,))
    p2 = multiprocessing.Process(target=main_2, args=(q1, q2,))
    p3 = multiprocessing.Process(target=main_3, args=(q2,))
    p = [p1, p2, p3]
    for i in p:  # start all processes
        i.start()
    for i in p:  # ensure all processes are finished
        i.join()
The prints may be a little off because I did not bother to lock stdout, but using a queue ensures that data moves from one process to another.
EDIT: Do be aware that you should also have a look at multiprocessing locks to ensure that your file handling is safe when performing the move/delete. The pseudocode above only demonstrates how to use a Queue.
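A minimal sketch of that idea, assuming a single multiprocessing.Lock shared by the process that writes the file and the process that moves/deletes it (the function and file names are made up for illustration):

import multiprocessing

def writer(lock, path):
    with lock:  # only one process touches the file at a time
        with open(path, "a") as f:
            f.write("some data\n")

def mover(lock, path):
    with lock:  # safe to move/delete while holding the lock
        print("would move or delete", path)

if __name__ == '__main__':
    lock = multiprocessing.Lock()
    path = "cache.txt"  # hypothetical file name
    p1 = multiprocessing.Process(target=writer, args=(lock, path))
    p2 = multiprocessing.Process(target=mover, args=(lock, path))
    p1.start()
    p2.start()
    p1.join()
    p2.join()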

Can't access global variable in python

I'm using the multiprocessing library in Python in the code below:
from multiprocessing import Process
import os
from time import sleep as delay

test = "First"

def f():
    global test
    print('hello')
    print("before: " + test)
    test = "Second"

if __name__ == '__main__':
    p = Process(target=f, args=())
    p.start()
    p.join()
    delay(1)
    print("after: " + test)
It's supposed to change the value of test, so at the end the value of test should be Second, but the value doesn't change and remains First.
Here is the output:
hello
before: First
after: First
The behavior you're seeing is because p is a new process, not a new thread. When you spawn a new process, it copies your initial process's state completely and then starts executing in parallel. When you spawn a thread, it shares memory with your initial thread.
Since processes have memory isolation, they won't create race-condition errors caused by reading and writing to shared memory. However, to get data from your child process back into the parent, you'll need to use some form of inter-process communication like a pipe, and because they fork memory, they are more expensive to spawn. As always in computer science, you have to make a tradeoff.
For more information, see:
https://en.wikipedia.org/wiki/Process_(computing)
https://en.wikipedia.org/wiki/Thread_(computing)
https://en.wikipedia.org/wiki/Inter-process_communication
Based on what you're actually trying to accomplish, consider using threads instead.
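If you do need the child's result back in the parent, a minimal sketch of the pipe approach mentioned above could look like this, adapted to the question's test variable (the function and connection names are illustrative):

from multiprocessing import Process, Pipe

def f(conn):
    conn.send("Second")  # send the new value back to the parent
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=f, args=(child_conn,))
    p.start()
    test = parent_conn.recv()  # receives "Second" from the child
    p.join()
    print("after: " + test)    # prints: after: Second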
Global state is not shared, so the changes made by child processes have no effect.
Here is why:
Actually it does change the global variable, but only in the spawned process. If you access it from within that process you can see the change. Since it is a separate process, its global variable environment is initialized from yours, but any modification you make is limited to that process and is not visible to the rest of the program.
Try this; it explains what's happening:
from multiprocessing import Process
import os
from time import sleep as delay

test = "First"

def f2():
    print("f2: " + test)

def f():
    global test
    print('hello')
    print("before: " + test)
    test = "Second"
    f2()

if __name__ == '__main__':
    p = Process(target=f, args=())
    p.start()
    p.join()
    delay(1)
    print("after: " + test)
If you really need to modify the variable from the child process, there is another way of doing it; read this doc or post, it might help you.
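One such way is the shared-state machinery from the multiprocessing module; here is a sketch using a Manager dict (the key name is just for illustration):

from multiprocessing import Process, Manager

def f(shared):
    shared["test"] = "Second"  # the change goes through the Manager proxy, so the parent sees it

if __name__ == '__main__':
    with Manager() as manager:
        shared = manager.dict({"test": "First"})
        p = Process(target=f, args=(shared,))
        p.start()
        p.join()
        print("after: " + shared["test"])  # prints: after: Second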

multi-process Queue - High RAM consuming

I set up a multiprocessing.Queue to run two functions in parallel. The first function parses and writes data to a text file, and the second function pulls data from the same text file and shows a live graph. The second function must kick off once the first function has created the text file. The code works well.
However:
It takes almost all my RAM (ca. 6 GB). Is that because it is multi-process? In Task Manager I see 3 python.exe processes of 2 GB each running at the same time, whereas when I run only the first function (the most RAM-consuming one) I see only 1 python.exe of 2 GB.
Once the code has parsed all the text and the graph has stopped, the processes keep running until I terminate the code manually with the Eclipse console's red button. Is that normal?
I have a small script that runs before and outside the multi-process functions. It provides a value I need to define in order to run the multi-process functions. It runs OK, but once the multi-process functions are called, it runs again!! I don't know why, because it's definitely outside the multi-process functions.
Part of this was resolved in another Stack Overflow question.
# define newlista
# define key
# def get_all(myjson, kind, type)
# def on_data(data)

# small script that runs before and outside the multi-process functions

def parsing(q):
    keep_running = True
    numentries = 0
    for text in get_all(newlista, "sentence", "text"):
        if 80 < len(text) < 500:
            firstword = key.lower().split(None, 1)[0]
            if text.lower().startswith(firstword):
                pass
            else:
                on_data(text)
                numentries += 1
                q.put(keep_running)
    keep_running = False
    q.put(keep_running)

def live_graph(q):
    keep_running = True
    while keep_running:
        keep_running = q.get()
        # do the graph updates here

if __name__ == '__main__':
    q = Queue()
    p1 = Process(target=parsing, args=(q,))
    p1.start()
    p2 = Process(target=live_graph, args=(q,))
    p2.start()
UPDATE
The graph function is the one that generates the two extra python.exe processes, and once the first function has terminated, the second function keeps running.
