PySimpleGUI doesn't reroute cprint of multiprocessing to its Multiline element - python

I am using the multiprocessing module along with PySimpleGUI.
Instead of going to the Multiline element, everything gets printed in my IDE console. This problem only happens when I use multiprocessing. Other functions with sg.cprint calls, and even plain print statements, print their output in the Multiline as they should.
Here's my code that shows the problem:
import PySimpleGUI as sg
import multiprocessing


def func(link: str):
    sg.cprint(link)
    sg.cprint('string')
    print(link)
    print('string')


def make_window():
    layout = [[sg.Multiline(key='-Multiline-', reroute_stdout=True)],
              [sg.Button('Start', key='-Start-')]]
    window = sg.Window('Forum', layout, finalize=True)
    sg.cprint_set_output_destination(window, '-Multiline-')
    while True:
        event, values = window.read()
        if event == sg.WIN_CLOSED:
            break
        elif event == '-Start-':
            sg.cprint('Process has started')
            process = multiprocessing.Process(target=func,
                                              args=('https://stackoverflow.com/questions/ask',),
                                              daemon=True)
            process.start()
    window.close()


if __name__ == '__main__':
    make_window()
I have tried to reroute everything to the Multiline with reroute_stdout=True, but it doesn't work.
According to the PySimpleGUI "Cookbook", it's possible to reroute print like this:
window['-Multiline-'].print('Testing 1 2 3')
It doesn't work if I put something like that in my function (I assume that is because my function is defined above the GUI code).
In conclusion: the issue doesn't appear when I use the threading module. But multiprocessing solves another problem for me: it allows me to terminate a process with .terminate(), which I couldn't do as easily with threading.
My application uses OOP and is a bit more complicated than the code I provided.

When you run a subprocess, you get an entirely new environment; you can't share global variables, for example.
Python essentially starts a completely new interpreter instance whenever you start() a multiprocessing.Process. Since different processes get separate address spaces, they don't share memory; Python pickles the passed arguments, sends them to the child process, and unpickles them there, creating the illusion that both processes (the parent and the child) have the same data.
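As a minimal standalone illustration of this (my own demo, not code from the question), a global changed in the child stays unchanged in the parent, because the child only ever touches its own copy:

import multiprocessing

counter = 0  # global in the parent


def child():
    global counter
    counter += 100                                    # modifies the child's own copy only
    print('child sees counter =', counter)            # 100


if __name__ == '__main__':
    p = multiprocessing.Process(target=child)
    p.start()
    p.join()
    print('parent still sees counter =', counter)     # still 0

The same separation applies to PySimpleGUI's rerouting: reroute_stdout and cprint_set_output_destination only patch the parent process, so print and sg.cprint calls made inside the child never pass through them and end up on the child's own console instead of the Multiline.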
I'm not very familiar with the multiprocessing library, but the following code works, though it may not be perfect.
from time import sleep
import threading
import multiprocessing
import PySimpleGUI as sg


def func(queue: multiprocessing.Queue, link: str):
    for i in range(10):
        queue.put(f'Message #{i}')
        sleep(1)


def thread(window, url):
    queue = multiprocessing.Queue()
    process = multiprocessing.Process(target=func, args=(queue, url), daemon=True)
    process.start()
    while process.is_alive():
        if not queue.empty():
            line = queue.get()
            window.write_event_value("Message", line)
    while not queue.empty():
        line = queue.get()
        window.write_event_value("Message", line)
    window.write_event_value("Message", 'Process has stopped')


def make_window():
    layout = [[sg.Multiline(key='-Multiline-', size=(40, 10), reroute_stdout=True)],
              [sg.Button('Start', key='-Start-')]]
    window = sg.Window('Forum', layout, finalize=True)
    sg.cprint_set_output_destination(window, '-Multiline-')
    while True:
        event, values = window.read()
        if event == sg.WIN_CLOSED:
            break
        elif event == '-Start-':
            sg.cprint('Process has started')
            url = 'https://stackoverflow.com/questions/ask'
            threading.Thread(target=thread, args=(window, url), daemon=True).start()
        elif event == 'Message':
            message = values[event]
            print(message)
    window.close()


if __name__ == '__main__':
    make_window()
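If the reason for choosing multiprocessing is being able to call .terminate(), the same pattern can also expose the Process handle to the GUI loop. The following is a hedged sketch of that extension; the '-Stop-' key and the running dict are my additions and not part of the code above:

import multiprocessing
import threading
from time import sleep

import PySimpleGUI as sg


def func(queue: multiprocessing.Queue):
    # Stand-in worker: emits a message every second.
    for i in range(100):
        queue.put(f'Message #{i}')
        sleep(1)


def launcher(window, running):
    queue = multiprocessing.Queue()
    process = multiprocessing.Process(target=func, args=(queue,), daemon=True)
    running['proc'] = process                      # expose the handle to the GUI loop
    process.start()
    while process.is_alive() or not queue.empty():
        if not queue.empty():
            window.write_event_value('Message', queue.get())
        sleep(0.1)
    window.write_event_value('Message', 'Process finished or was terminated')


if __name__ == '__main__':
    running = {}
    layout = [[sg.Multiline(key='-Multiline-', size=(40, 10), reroute_stdout=True)],
              [sg.Button('Start', key='-Start-'), sg.Button('Stop', key='-Stop-')]]
    window = sg.Window('Forum', layout, finalize=True)
    while True:
        event, values = window.read()
        if event == sg.WIN_CLOSED:
            break
        if event == '-Start-':
            threading.Thread(target=launcher, args=(window, running), daemon=True).start()
        elif event == '-Stop-':
            proc = running.get('proc')
            if proc is not None and proc.is_alive():
                proc.terminate()                   # hard-stop the child process
        elif event == 'Message':
            print(values[event])                   # lands in the Multiline via reroute_stdout
    window.close()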

Related

If two multiprocessing processes can request input on the terminal, is there a way to pause one of them until the answer is given?

As can be seen in the code below, two processes run together, but each has a moment where it can ask for an input() in the terminal. Is there any way to pause the other process until the answer is given in the terminal?
File Code_One (a crude, simple example to speed up the explanation):
from time import sleep


def main():
    sleep(1)
    print('run')
    sleep(1)
    print('run')
    sleep(1)
    input('Please, give the number:')
File Code_Two (a crude, simple example to speed up the explanation):
from time import sleep


def main():
    sleep(2)
    input('Please, give the number:')
    sleep(1)
    print('run 2')
    sleep(1)
    print('run 2')
    sleep(1)
    print('run 2')
    sleep(1)
    print('run 2')
    sleep(1)
    print('run 2')
File Main_Code:
import Code_One
import Code_Two
import multiprocessing
from time import sleep


def main():
    while True:
        pression = multiprocessing.Process(target=Code_One.main)
        xgoals = multiprocessing.Process(target=Code_Two.main)
        pression.start()
        xgoals.start()
        pression.join()
        xgoals.join()
        print('Done')
        sleep(5)


if __name__ == '__main__':
    main()
How should I proceed in this situation?
In this example, since one process doesn't pause the other, whenever input() is called this error happens:
input('Please, give the number:')
EOFError: EOF when reading a line
Sure, this is possible. To do it you will need some sort of interprocess communication (IPC) mechanism so the two processes can coordinate. Relying on time.sleep is not a good tool for this, though; there are far more efficient mechanisms made specifically for the problem.
Probably the most efficient way is to use a multiprocessing.Event, like this:
import multiprocessing
import sys
import os


def Code_One(event, fno):
    proc_name = multiprocessing.current_process().name
    print(f'running {proc_name}')
    sys.stdin = os.fdopen(fno)
    val = input('give proc 1 input: ')
    print(f'proc 1 got input: {val}')
    event.set()


def Code_Two(event, fno):
    proc_name = multiprocessing.current_process().name
    print(f'running {proc_name} and waiting...')
    event.wait()
    sys.stdin = os.fdopen(fno)
    val = input('give proc 2 input: ')
    print(f'proc 2 got input {val}')


if __name__ == '__main__':
    event = multiprocessing.Event()
    pression = multiprocessing.Process(name='code_one', target=Code_One, args=(event, sys.stdin.fileno()))
    xgoals = multiprocessing.Process(name='code_two', target=Code_Two, args=(event, sys.stdin.fileno()))
    xgoals.start()
    pression.start()
    xgoals.join()
    pression.join()
This creates the event object, and the two subprocesses. Event objects have an internal flag that starts out False, and can then be toggled True by any process calling event.set(). If a process calls event.wait() while the flag is False, that process will block until another process calls event.set().
The event is created in the parent process and passed to each subprocess as an argument. Code_Two begins and calls event.wait(), which blocks until the event's internal flag is set to True. Code_One runs immediately, reads its input, and then calls event.set(), which sets the flag to True and allows Code_Two to proceed. Once both processes have returned, the parent's join calls complete and the program ends.
This is a little hacky because it is also passing the stdin file number from the parent to the child processes. That is necessary because when subprocesses are forked, those file descriptors are closed, so for a child process to read stdin using input it first needs to open the correct input stream (that is what sys.stdin = os.fdopen(fno) is doing). It won't work to just send sys.stdin to the child as another argument, because of the mechanics that Python uses to set up the environment for forked processes (sys.stdin is an IO wrapper object and is not picklable).

Stop Thread without closing GUI window

I am learning python on my own and my level is probably a poor excuse for a "script kiddie" as I kinda understand and mostly end up borrowing and mashing together different scripts till it does what I want. However this is the first time I'm trying to create a GUI for one of the scripts I have. I'm using PySimpleGUI and I've been able to understand it surprisingly well. All but one thing is working the way I want it to.
The issue is that I want to stop a running daemon thread without exiting the GUI. If I get the Stop button to stop the thread, the GUI closes too; if the GUI stays open, the thread doesn't stop. The issue is between lines '64-68'. I have tried a few things and just put a placeholder on line '65' to remind myself that I was trying to keep the GUI ("Main Thread" in my head-speak) running. The script will run in this state but the 'Stop' button does not work.
Note: I put a lot of comments in my scripts so I remember what each part is, what it does and what I need to clean up. I don't know if this is a good practice if I plan on sharing a script. Also, if it matters, I use Visual Studio Code.
#!/usr/local/bin/python3
import PySimpleGUI as sg
import pyautogui
import queue
import threading
import time
import sys
from datetime import datetime
from idlelib import window

pyautogui.FAILSAFE = False
numMin = None

# ------------------ Thread ---------------------
def move_cursor(gui_queue):
    if ((len(sys.argv) < 2) or sys.argv[1].isalpha() or int(sys.argv[1]) < 1):
        numMin = 3
    else:
        numMin = int(sys.argv[1])
    while (True):
        x = 0
        while (x < numMin):
            time.sleep(5)  # Set short for debugging (will set to '60' later)
            x += 1
        for i in range(0, 50):
            pyautogui.moveTo(0, i * 4)
        pyautogui.moveTo(1, 1)
        for i in range(0, 3):
            pyautogui.press("shift")
        print("Movement made at {}".format(datetime.now().time()))

# --------------------- GUI ---------------------
def the_gui():
    sg.theme('LightGrey1')        # Add a touch of color
    gui_queue = queue.Queue()     # Used to communicate between GUI and thread
    layout = [[sg.Text('Execution Log')],
              [sg.Output(size=(30, 6))],
              [sg.Button('Start'), sg.Button('Stop'), sg.Button('Click Me'), sg.Button('Close')]]
    window = sg.Window('Stay Available', layout)
    # -------------- EVENT LOOP ---------------------
    # Event Loop to process "events"
    while True:
        event, values = window.read(timeout=100)
        if event in (None, 'Close'):
            break
        elif event.startswith('Start'):      # Start button event
            try:
                print('Starting "Stay Available" app')
                threading.Thread(target=move_cursor,
                                 args=(gui_queue,), daemon=True).start()
            except queue.Empty:
                print('App did not run')
        elif event.startswith('Stop'):       # Stop button event
            try:
                print('Stopping "Stay Available" app')
                threading.main_thread        # To remind me I want to go back to the original state
            except queue.Empty:
                print('App did not stop')
        elif event == 'Click Me':            # To see if GUI is responding (will be removed later)
            print('Your GUI is alive and well')
    window.close(); del window


if __name__ == '__main__':
    gui_queue = queue.Queue()     # Not sure if it goes here or where it is above
    the_gui()
    print('Exiting Program')
From this answer: create the Stoppable_Thread class.
Then store the threads in a global variable:
# [...]
# store the threads in a global variable or somewhere
all_threads = []

# Create the function that will send events to the UI loop
def start_reading(window, sudo_password=""):
    while True:
        window.write_event_value('-THREAD-', 'event')
        time.sleep(.5)

# Create the start and stop thread functions
def start_thread(window):
    t1 = Stoppable_Thread(target=start_reading, args=(window,), daemon=True)
    t1.start()
    all_threads.append(t1)

def stop_all_threads():
    for thread in all_threads:
        thread.terminate()
Finally, in the main window loop, handle the events that start or stop the thread, or receive information from it.
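A hedged sketch of how those pieces could be wired together follows. The Stoppable_Thread class itself comes from the linked answer and is not shown here, so the minimal event-flag version below is my assumption, as are the '-START-'/'-STOP-' keys:

import threading
import time

import PySimpleGUI as sg


class Stoppable_Thread(threading.Thread):
    """Assumed minimal version: terminate() sets a flag the target can poll."""
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._stop_event = threading.Event()

    def terminate(self):
        self._stop_event.set()

    def stopped(self):
        return self._stop_event.is_set()


all_threads = []


def start_reading(window):
    t = threading.current_thread()
    while not t.stopped():                      # poll the flag instead of a bare `while True`
        window.write_event_value('-THREAD-', 'event')
        time.sleep(.5)


def start_thread(window):
    t1 = Stoppable_Thread(target=start_reading, args=(window,), daemon=True)
    t1.start()
    all_threads.append(t1)


def stop_all_threads():
    for thread in all_threads:
        thread.terminate()


if __name__ == '__main__':
    layout = [[sg.Output(size=(40, 8))],
              [sg.Button('Start', key='-START-'), sg.Button('Stop', key='-STOP-')]]
    window = sg.Window('Stoppable thread demo', layout)
    while True:
        event, values = window.read()
        if event == sg.WIN_CLOSED:
            break
        if event == '-START-':
            start_thread(window)
        elif event == '-STOP-':
            stop_all_threads()                  # the GUI stays open; only the threads stop
        elif event == '-THREAD-':
            print(values[event])                # shows up in the Output element
    window.close()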

Control running Python Process (multiprocessing)

I have yet another question about Python multiprocessing.
I have a module that creates a Process and just runs in a while True loop.
This module is meant to be enabled/disabled from another Python module.
That other module will import the first one once and is also run as a process.
How can I implement this better?
So, for reference:
# foo.py
from multiprocessing import Process


def foo():
    while True:
        if enabled:
            # do something
            pass


p = Process(target=foo)
p.start()
and imagine the second module to be something like this:
# bar.py
from multiprocessing import Process
import foo, time


def bar():
    while True:
        foo.enable()
        time.sleep(10)
        foo.disable()


Process(target=bar).start()
Constantly running a process that checks a condition inside a loop seems like a waste, but I would gladly accept a solution that just lets me set the enabled value from outside.
Ideally I would prefer to be able to terminate and restart the process, again from outside this module.
From my understanding, I would use a Queue to pass commands to the Process. If it is indeed just that, can someone show me how to set it up in a way that lets me add something to the queue from a different module?
Can this even be done easily in Python, or is it time to abandon hope and switch to something like C or Java?
I proposed two different approaches in the comments:
using a shared variable from multiprocessing.Value
pausing / resuming the process with signals
Control by sharing a variable
from multiprocessing import Process, Value
import time


def target_process_1(run_statement):
    while True:
        if run_statement.value:
            print("I'm running !")
        time.sleep(1)


def target_process_2(run_statement):
    time.sleep(3)
    print("Stopping")
    run_statement.value = False
    time.sleep(3)
    print("Resuming")
    run_statement.value = True


if __name__ == "__main__":
    run_statement = Value("i", 1)
    process_1 = Process(target=target_process_1, args=(run_statement,))
    process_2 = Process(target=target_process_2, args=(run_statement,))
    process_1.start()
    process_2.start()
    time.sleep(8)
    process_1.terminate()
    process_2.terminate()
Control by sending a signal
from multiprocessing import Process
import time
import os, signal


def target_process_1():
    while True:
        print("Running !")
        time.sleep(1)


def target_process_2(target_pid):
    time.sleep(3)
    os.kill(target_pid, signal.SIGSTOP)   # pause the other process (Unix only)
    time.sleep(3)
    os.kill(target_pid, signal.SIGCONT)   # resume it


if __name__ == "__main__":
    process_1 = Process(target=target_process_1)
    process_1.start()
    process_2 = Process(target=target_process_2, args=(process_1.pid,))
    process_2.start()
    time.sleep(8)
    process_1.terminate()
    process_2.terminate()
Side note: if possible, do not run a bare while True loop.
EDIT: if you want to manage your process from two different files, supposing you want to use control by sharing a variable, this is one way to do it.
# file foo.py
from multiprocessing import Value, Process
import time

__all__ = ['start', 'stop', 'enable', 'disable']

_statement = None
_process = None


def _target(run_statement):
    """Target of foo's process."""
    while True:
        if run_statement.value:
            print("I'm running !")
        time.sleep(1)


def start():
    global _process, _statement
    _statement = Value("i", 1)
    _process = Process(target=_target, args=(_statement,))
    _process.start()


def stop():
    global _process, _statement
    _process.terminate()
    _statement, _process = None, None


def enable():
    _statement.value = True


def disable():
    _statement.value = False
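The answer stops at foo.py; a hedged sketch of the second file it alludes to (call it bar.py; the timings are arbitrary and my own) could look like this:

# file bar.py
import time

import foo

if __name__ == '__main__':
    foo.start()      # spawn the worker process
    time.sleep(3)
    foo.disable()    # the loop keeps running but stops printing
    time.sleep(3)
    foo.enable()     # printing resumes
    time.sleep(3)
    foo.stop()       # terminate the worker process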

Python multiprocessing won't work

Here's the relevant part of the code:
x = None


def pp():
    global x
    x = MyClass()
    x.start()


def main():
    global x
    p = Process(target=pp)
    p.start()
    while x == None:
        print("Not yet...")
    while 1:
        print(x.getoutput(), end="")
    p.join()


if __name__ == '__main__':
    main()
The x.start() method opens a Tkinter window, so it runs forever (or at least until the user closes the window). I'm trying to run another process that would get information from that window, but it doesn't work.
How can I make it work?
I feel like the first thing to point out here is that each child process will import the main script and have its own local copy. It's not possible to use global variables the way you're trying to here, because the child processes don't share the parent's namespace. If you're set on using multiprocessing for this, you need to use a communication pipe of some description, as described in the following documentation:
http://pymotw.com/2/multiprocessing/communication.html#multiprocessing-queues
I am a little curious what your ultimate objective is here for using multiprocessing. Still, if you really want to do it, it's possible:
import multiprocessing
import tkinter
import time


def worker(q):
    # q is a queue for communication.
    # Set up tkinter:
    root = tkinter.Tk()
    localvar = tkinter.StringVar()
    wind = tkinter.Entry(root, textvariable=localvar)
    wind.grid()

    # This callback puts the contents of the entry into the queue
    # when the entry widget is modified
    def clbk(name, index, mode, q=q):
        q.put(localvar.get())
    localvar.trace_variable('w', clbk)

    # Some window dressing that will signal to the main process that
    # the window has been closed
    def close(root=root, q=q):
        q.put('EXIT')
        root.destroy()
        root.quit()
    root.protocol("WM_DELETE_WINDOW", close)

    root.mainloop()


def main():
    # Make a queue to facilitate communication:
    queue = multiprocessing.Queue()
    p = multiprocessing.Process(target=worker, args=(queue,))
    p.start()

    # Wait for input...
    while True:
        ret = queue.get()
        print(ret)
        if ret == 'EXIT':
            break
        time.sleep(0.1)

    # Finally, join the process back in.
    p.join()


if __name__ == '__main__':
    main()
Ignoring the window dressing, that will now print text entered into your tkinter entry widget, and exits when the window is closed.

How do real programmers write a server loop?

Every time I run this program, I hear my CPU fan spin up. I suspect the busy-waiting while loops in the code are the cause. How would a real programmer optimize this?
from multiprocessing import Process, Queue
import threading


class PThread(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self)
        # viewer leaving will set this event
        self.event = threading.Event()

    def run(self):
        while 1:
            if not self.event.is_set():
                print('run')
            else:
                break


def server_control(queue):
    while True:
        try:
            event = queue.get(False)
        except:
            event = None
        if event == 'DETECTED':
            print('DETECTED')
            t = PThread()
            t.start()
        elif event == 'LEAVE':
            print('Viewer_left')
            t.event.set()
            t.join()
        elif event == 'QUIT':
            break


q = Queue()
p = Process(target=server_control, args=(q,))
p.start()
p.join()
If a thread needs to wait for an event, it should sleep until the event occurs, rather than busy-waiting. Your event object has a wait() method that can be used to accomplish that. Call it, and it won't return until some other thread has called set() on the event (or the timeout elapses, if you specify one). In the meantime, the thread uses no CPU.
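A minimal sketch of what that looks like for the PThread class from the question (the one-second timeout is my choice): the run loop blocks in wait() instead of spinning, so it uses no CPU while idle and still exits promptly once the event is set.

import threading


class PThread(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self)
        self.event = threading.Event()

    def run(self):
        # wait() returns False on timeout and True once the event has been set
        while not self.event.wait(timeout=1.0):
            print('run')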
The multiprocessing module has a clone of threading's Event object:
from multiprocessing import Process, Event
Instead of using a Queue, declare the events of interest in your main process and pass them to the other process.
In your case:
detected = Event()
leave = Event()
exit = Event()
Process(target=server_control, args=(detected, leave, exit))
and finally, in your loop, check whether each event has been set, or wait on it.
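A hedged sketch of what that could look like end to end; the wait() timeouts, the quit_evt name, and the trimmed PThread are my assumptions rather than code from this answer:

from multiprocessing import Process, Event
import threading


class PThread(threading.Thread):
    """The question's thread, trimmed: runs until its event is set."""
    def __init__(self):
        threading.Thread.__init__(self)
        self.event = threading.Event()

    def run(self):
        while not self.event.wait(timeout=1.0):   # sleeps instead of spinning
            print('run')


def server_control(detected, leave, quit_evt):
    t = None
    while not quit_evt.is_set():
        if detected.wait(timeout=0.5):            # blocks here instead of busy-polling a queue
            detected.clear()
            print('DETECTED')
            t = PThread()
            t.start()
        if leave.is_set():
            leave.clear()
            print('Viewer_left')
            if t is not None:
                t.event.set()
                t.join()
                t = None


if __name__ == '__main__':
    from time import sleep
    detected, leave, quit_evt = Event(), Event(), Event()
    p = Process(target=server_control, args=(detected, leave, quit_evt))
    p.start()
    detected.set()      # simulate a viewer being detected
    sleep(3)
    leave.set()         # simulate the viewer leaving
    sleep(1)
    quit_evt.set()      # shut the control process down
    p.join()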
You might make the loop a bit less tight by adding a time.sleep(0) in the loop to pass the remainder of the quantum to another thread.
See also: How does a threading.Thread yield the rest of its quantum in Python?
