multiprocessing - blanket process termination - python

I'm building a GUI to control a robot in Python.
The GUI has a few buttons, each one executes a function that loops indefinitely.
It's like a roomba, where the "clean kitchen" function makes it continually clean until interrupted.
In order to keep the GUI interactive, I'm executing the function in a separate process using multiprocessing.
I've got a stop function that returns the robot to home and kills the child process (otherwise the child process would reach its next line, and the robot would start turning around after it had left the kitchen and gone home). The stop function runs in the main/parent process, as it doesn't loop.
I've got a GUI button that calls Stop, and I also call it whenever I start a new process.
My processes are started like this:
from file1 import kitchen
from file2 import bedroom

if event == "Clean kitchen":
    stoptour()
    p = multiprocessing.Process(target=kitchen, args=(s,), daemon=True)
    p.start()
if event == "Clean bedroom":
    stoptour()
    p = multiprocessing.Process(target=bedroom, args=(s,), daemon=True)
    p.start()
The argument being passed is just the socket that the script is using to connect to the robot.
My stop function is:
def stoptour():
    p.terminate()
    p.kill()
    s.send(bytes.fromhex(XXXX))  # command to send the stop signal to the robot
    p.join()
This all runs without error and the robot stops, but then starts up again (because the child process is still running). I confirmed this by adding to the stop function:
if p.is_alive:
    print('Error, still not dead')
else:
    print('Success, its dead')
Every time it prints "Error, still not dead"...
Why are p.kill and p.terminate not working? Is something spawning more child processes?
Is there a way to write my stoptour() function so that it kills any and all child processes completely indiscriminately?
Edited to show the code:
import socket
import PySimpleGUI as sg
import multiprocessing
import time
from file1 import room1
from file2 import room2
from file3 import room3

# Define the GUI window
layout = [[sg.Button("Room 1")],
          [sg.Text("Start cleaning of room1")],
          [sg.Button("Room 2")],
          [sg.Text("Start cleaning of room2")],
          [sg.Button("Room 3")],
          [sg.Text("Start cleaning room3")],
          [sg.Button("Stop")],
          [sg.Text("Stop what you're doing")]]

# Create the window
window = sg.Window("Robot Utility", layout)

# Setup TCP connection & connect
TCP_IP = '192.168.1.100'  # IP
TCP_port = 2222  # Port
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # Setup TCP connection
s.connect((TCP_IP, TCP_port))  # Connect

# Define p so I can define the stop function
if __name__ == '__main__':
    p = multiprocessing.Process(target=room1, args=(s,), daemon=True)
    p.start()
    p.terminate()
    p.kill()
    p.join()

# Define stop function
def stoptour():
    s.send(bytes.fromhex('longhexkey'))
    p.terminate()
    p.kill()
    p.join()
    s.send(bytes.fromhex('longhexkey'))  # No harm stopping twice...
    if p.is_alive:
        print('Error, still not dead')
    else:
        print('Success, its dead')

stoptour()

# GUI event loop
while True:
    event, values = window.read()
    if event == "Room 1":
        if __name__ == '__main__':
            stoptour()
            p = multiprocessing.Process(target=room1, args=(s,), daemon=True)
            p.start()
    if event == "Room 2":
        if __name__ == '__main__':
            stoptour()
            p = multiprocessing.Process(target=room2, args=(s,), daemon=True)
            p.start()
    if event == "Room 3":
        if __name__ == '__main__':
            stoptour()
            p = multiprocessing.Process(target=room3, args=(s,), daemon=True)
            p.start()
    if event == "Stop":
        stoptour()
        sg.popup("Tour stopped")
    if event == sg.WIN_CLOSED:
        stoptour()
        s.close()
        print('Closing Program')
        break

window.close()
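For the "kill everything indiscriminately" part: multiprocessing.active_children() returns every live child of the current process, so a blanket stop can be written against it. A minimal sketch (assuming every child was started via multiprocessing; the helper name is mine, not from the post):

import multiprocessing

def stop_all_children():
    # Snapshot the live children, terminate them all, then reap them.
    children = multiprocessing.active_children()
    for child in children:
        child.terminate()
    for child in children:
        child.join()

Note also that Process.is_alive is a method: if p.is_alive: tests the method object itself, which is always truthy, so that check prints 'Error, still not dead' even after the child has died; call p.is_alive() instead.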

Related

PySimpleGUI doesn't reroute cprint of multiprocessing to its Multiline element

I am using the multiprocessing module along with PySimpleGUI.
Instead of going to the Multiline element, everything gets printed in my IDE console. This problem only happens when I use multiprocessing. Other functions with sg.cprint statements, and even plain print statements, print their results to the Multiline as they should.
Here's my code that shows the problem:
import PySimpleGUI as sg
import multiprocessing

def func(link: str):
    sg.cprint(link)
    sg.cprint('string')
    print(link)
    print('string')

def make_window():
    layout = [[sg.Multiline(key='-Multiline-', reroute_stdout=True)],
              [[sg.Button('Start', key='-Start-')]]]
    window = sg.Window('Forum', layout, finalize=True)
    sg.cprint_set_output_destination(window, '-Multiline-')
    while True:
        event, values = window.read()
        if event == sg.WIN_CLOSED:
            break
        elif event == '-Start-':
            sg.cprint('Process has started')
            process = multiprocessing.Process(target=func,
                                              args=('https://stackoverflow.com/questions/ask',),
                                              daemon=True)
            process.start()
    window.close()

if __name__ == '__main__':
    make_window()
I have tried to reroute everything to the Multiline with reroute_stdout=True, but it doesn't work.
According to the so-called "Cookbook" of PySimpleGUI, it's possible to reroute print like this:
window['-Multiline-'].print('Testing 1 2 3')
It doesn't work if I put something like that in my function (I assume that is because my function is above the GUI code).
In conclusion, the issue doesn't appear when I use the threading module. But multiprocessing solves another problem: it allows me to terminate a process with .terminate(). I couldn't do that with threading as easily.
My application uses OOP and is a bit more complicated than the code I provided.
When running a subprocess, you get an entirely new environment; you can't share global variables, for example.
Python essentially starts a completely new interpreter instance whenever you start() a multiprocessing.Process instance. Since different processes get separate memory, they don't share internal state: Python pickles the passed arguments, sends them to the child process, and unpickles them there, creating the illusion that both processes (the parent and the child) have the same data.
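A minimal sketch of that behaviour (hypothetical names, just to demonstrate the point): the child mutates its own unpickled copy of a global, while the parent's copy stays untouched.

import multiprocessing

counter = 0  # module-level global

def worker(data):
    global counter
    counter += 1  # changes only the child's copy
    print('child sees counter =', counter, 'and data =', data)

if __name__ == '__main__':
    p = multiprocessing.Process(target=worker, args=({'key': 'value'},))
    p.start()
    p.join()
    print('parent still sees counter =', counter)  # prints 0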
I don't know the multiprocessing library well; the following code works, but it may not be perfect.
from time import sleep
import threading
import multiprocessing
import PySimpleGUI as sg

def func(queue: multiprocessing.Queue, link: str):
    for i in range(10):
        queue.put(f'Message #{i}')
        sleep(1)

def thread(window, url):
    queue = multiprocessing.Queue()
    process = multiprocessing.Process(target=func, args=(queue, url), daemon=True)
    process.start()
    while process.is_alive():
        if not queue.empty():
            line = queue.get()
            window.write_event_value("Message", line)
    while not queue.empty():
        line = queue.get()
        window.write_event_value("Message", line)
    window.write_event_value("Message", 'Process has stopped')

def make_window():
    layout = [[sg.Multiline(key='-Multiline-', size=(40, 10), reroute_stdout=True)],
              [[sg.Button('Start', key='-Start-')]]]
    window = sg.Window('Forum', layout, finalize=True)
    sg.cprint_set_output_destination(window, '-Multiline-')
    while True:
        event, values = window.read()
        if event == sg.WIN_CLOSED:
            break
        elif event == '-Start-':
            sg.cprint('Process has started')
            url = 'https://stackoverflow.com/questions/ask'
            threading.Thread(target=thread, args=(window, url), daemon=True).start()
        elif event == 'Message':
            message = values[event]
            print(message)
    window.close()

if __name__ == '__main__':
    make_window()

How do I get an input function to work whilst another code is running (using multiprocessing)?

I can't get this code to run an input whilst another block of code is running. I want to know if there are any workarounds; my code is as follows.
import multiprocessing

def test1():
    input('hello')

def test2():
    a = True
    while a == True:
        b = 5

if __name__ == "__main__":
    p1 = multiprocessing.Process(target=test1)
    p2 = multiprocessing.Process(target=test2)
    p1.start()
    p2.start()
    p1.join()
    p2.join()
When the code is run I get an EOF error which apparently happens when the input function is interrupted.
I would have the main process create a daemon thread responsible for doing the input, used in conjunction with the greatly under-utilized full-duplex Pipe, which provides two two-way Connection instances. For simplicity, the following demo creates a single Process instance that loops, issuing input requests and echoing the responses, until the user enters 'quit':
import multiprocessing
import threading

def test1(conn):
    while True:
        conn.send('Please enter a value: ')
        s = conn.recv()
        if s == 'quit':
            break
        print(f'You entered: "{s}"')

def inputter(conn):
    while True:
        # The contents of the request is the prompt to be used:
        prompt = conn.recv()
        conn.send(input(prompt))

if __name__ == "__main__":
    conn1, conn2 = multiprocessing.Pipe(duplex=True)
    t = threading.Thread(target=inputter, args=(conn1,), daemon=True)
    p = multiprocessing.Process(target=test1, args=(conn2,))
    t.start()
    p.start()
    p.join()
That's not all of your code, because it doesn't show the multiprocessing. However, the issue is that only the main process can interact with the console. The other processes do not have a stdin. You can use a Queue to communicate with the main process if you need to, but in general you want the secondary processes to be pretty much standalone.
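A minimal sketch of that Queue approach (the worker here is hypothetical, just to illustrate): the child sends its results back, and the main process, which owns the console, does the printing.

import multiprocessing

def worker(q):
    # The child has no usable stdin, so it reports back via the queue.
    q.put('result from child')

if __name__ == '__main__':
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=worker, args=(q,))
    p.start()
    print(q.get())  # the main process does all console I/O
    p.join()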

What is the best way to control the flow of the child processes in Python?

I am trying to run, pause and terminate the child processes in Python from the parent process. I have tried to use multiprocessing.Value but for some reason the parent process never finishes completely although I terminate and join all the processes. My use case is something like:
def child(flow_flag):
    while True:
        with flow_flag.get_lock():
            flag_value = flow_flag.value
        if flag_value == 0:
            print("This is doing some work")
        elif flag_value == 1:
            print("This is waiting for some time to check back later")
            time.sleep(5)
        else:
            print("Time to exit")
            break

def main():
    flow_flag = Value('i', 0)
    processes = [Process(target=child, args=(flow_flag,)) for i in range(10)]
    [p.start() for p in processes]

    print("Waiting for some work")
    with flow_flag.get_lock():
        flow_flag.value = 1

    print("Do something else")
    with flow_flag.get_lock():
        flow_flag.value = 0

    print("Waiting for more work")
    with flow_flag.get_lock():
        flow_flag.value = 2

    print("Exiting")
    for p in processes:
        p.terminate()
        p.join()
This never finishes properly and I have to Ctrl+C eventually. Then I see this message:
Traceback (most recent call last):
  File "/home/abcde/anaconda3/lib/python3.7/threading.py", line 1308, in _shutdown
    lock.acquire()
KeyboardInterrupt
What is a better way? FYI, while waiting for something else, I am spawning some other processes. I also had them not terminating properly, and I was using Value with them too. It got fixed when I switched to using Queue for them. However, Queue does not seem to be appropriate for the case above.
P.S. : I am ssh'ing into Ubuntu 18.04.
EDIT: After a lot of debugging, not exiting turned out to be because of a library I am using that I did not suspect to cause this. My apologies for false alarm. Thanks for the suggestions on the better way of controlling the child processes.
Your program works for me, but let me chime in on "is there another way". Instead of polling at 5 second intervals you could create a shared event object that lets the child processes know when they can do their work. Instead of polling for Value 1, wait for the event.
from multiprocessing import *
import time
import os

def child(event, times_up):
    while True:
        event.wait()
        if times_up.value:
            print(os.getpid(), "time to exit")
            return
        print(os.getpid(), "doing work")
        time.sleep(.5)

def main():
    manager = Manager()
    event = manager.Event()
    times_up = manager.Value(bool, False)
    processes = [Process(target=child, args=(event, times_up)) for i in range(10)]
    [p.start() for p in processes]

    print("Let processes work")
    event.set()
    time.sleep(2)

    print("Make them stop")
    event.clear()
    time.sleep(4)

    print("Make them go away")
    times_up.value = True
    event.set()

    print("Exiting")
    for p in processes:
        p.join()

if __name__ == "__main__":
    main()
With Python 3.7.7 running on FreeBSD 12.1 (64-bit) I cannot reproduce your problem.
After fixing the indentation and adding the necessary imports, the changed program runs fine, AFAICT.
BTW, you might want to import sys and add
sys.stdout.reconfigure(line_buffering=True)
to the beginning of your main().

exiting python with a hung thread

When you import and use a package, that package can start non-daemon threads. Until those threads finish, Python cannot exit properly (e.g. with sys.exit(0)). For example, imagine that thread t below comes from some package. When an unhandled exception occurs in the main thread, you want to terminate, but the program won't exit immediately; it will wait 60s until the thread terminates.
import sys, time, threading

def main():
    t = threading.Thread(target=time.sleep, args=(60,))
    t.start()
    a = 5 / 0

if __name__ == '__main__':
    try:
        main()
    except:
        sys.exit(1)
So I came up with two options: replace sys.exit(1) with os._exit(1), or enumerate all threads and make them daemonic. Both of them seem to work, but which do you think is better? os._exit won't flush stdio buffers, but setting the daemon attribute on threads seems like a hack, and maybe it's not guaranteed to work all the time.
import sys, time, threading

def main():
    t = threading.Thread(target=time.sleep, args=(60,))
    t.start()
    a = 5 / 0

if __name__ == '__main__':
    try:
        main()
    except:
        for t in threading.enumerate():
            if not t.daemon and t.name != "MainThread":
                t._daemonic = True
        sys.exit(1)
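For comparison, the os._exit variant mentioned above could look like the sketch below; since os._exit skips the interpreter's normal shutdown, it flushes stdio by hand first (a minimal sketch, assuming flushing is the only cleanup needed):

import os, sys, time, threading

def main():
    t = threading.Thread(target=time.sleep, args=(60,))
    t.start()
    a = 5 / 0

if __name__ == '__main__':
    try:
        main()
    except:
        # os._exit terminates immediately without flushing buffers,
        # so flush stdout/stderr manually before exiting.
        sys.stdout.flush()
        sys.stderr.flush()
        os._exit(1)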

python master/child looping unintentionally

Problem: I expect the child to time out and be done, but instead it times out and begins to run again.
Can anyone tell me why this program runs forever? I expect it to run one time and exit...
Here is a working program. The master threads a function to spawn a child. It works great, except it ends up looping.
Here is the master:
# master.py
import multiprocessing, subprocess, sys, time
def f():
p = subprocess.Popen(["C:\\Python32\\python.exe", "child.py"])
# wait until child ends and check exit code
while p.poll() == None:
time.sleep(2)
if p.poll() != 0:
print("something went wrong with child.py")
# multithread a function process to launch and monitor a child
p1 = multiprocessing.Process(target = f())
p1.start()
and the child:
# child.py
import socket, sys

def main(args):
    try:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.settimeout(10)
        sock.bind(('', 54324))
        data, addr = sock.recvfrom(1024)  # buffer size is 1024 bytes
        print(data)
        sock.close()
        return 0
    except KeyboardInterrupt as e:
        try:
            sock.close()
            return 0
        except:
            return 0

if __name__ == "__main__":
    sys.exit(main(sys.argv))
The problem is that your master.py doesn't have an if __name__ == '__main__' guard. On Windows, multiprocessing has to be able to reimport the main module in the child process, and if you don't use this if guard, you will re-execute the multiprocessing.Process in the child (resulting in an accidental forkbomb).
To fix, simply put all of the commands in master.py in the if guard:
if __name__ == '__main__':
    # multithread a function process to launch and monitor a child
    p1 = multiprocessing.Process(target=f)  # pass the function itself, not f()
    p1.start()
