Python process doesn't work although it is alive

I have this code:
from multiprocessing import Process

def f():
    print('hello')

if __name__ == '__main__':
    p = Process(target=f)
    p.start()
    print(p.is_alive())
    p.join()
Although it prints that the process is alive, the f() function never runs. This is a simple example that I hope will help me understand why it doesn't work.
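For what it's worth, one way to verify whether f() actually ran is to inspect the child's exit code after join(); a minimal diagnostic sketch (the flush call is an addition for illustration, not part of the original code):

from multiprocessing import Process
import sys

def f():
    print('hello')
    sys.stdout.flush()  # make sure the output is not stuck in a buffer

if __name__ == '__main__':
    p = Process(target=f)
    p.start()
    p.join()
    # exitcode 0 means f() ran to completion, even if its output
    # went somewhere you cannot see (e.g. under a GUI like IDLE)
    print('exitcode:', p.exitcode)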

Why can't the child process finish, even though the code has exited the run() function?

I have code like this. It is clear that 'finished' gets printed out, but join() still blocks. Why does this happen?
from multiprocessing import Process

class MyProcess(Process):
    def run(self):
        ## do something
        print 'finished'

processes = []
for i in range(3):
    p = MyProcess()
    p.start()
    processes.append(p)

for p in processes:
    p.join()
You should add the line if __name__ == '__main__': for things to work properly.
Explanation:
Your main script is imported again by multiprocessing when it creates the child process, so your top-level script lines execute twice: once during that import and once from your original script execution.
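You can make the re-import visible with a small sketch (a minimal illustration, assuming the spawn start method that Windows uses): the module-level print below fires once in the parent and once in the child.

from multiprocessing import Process

print('module-level code running')  # executes in BOTH processes under spawn

def f():
    print('child work')

if __name__ == '__main__':
    p = Process(target=f)
    p.start()
    p.join()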
Here is the runtime error we get if we don't include if __name__ == '__main__':
RuntimeError:
An attempt has been made to start a new process before the
current process has finished its bootstrapping phase.
This probably means that you are not using fork to start your
child processes and you have forgotten to use the proper idiom
in the main module:
if __name__ == '__main__':
freeze_support()
...
The "freeze_support()" line can be omitted if the program
is not going to be frozen to produce an executable.
Your working code in Python 3.6 is:
from multiprocessing import Process

class MyProcess(Process):
    def run(self):
        ## do something
        print('finished')

processes = []

if __name__ == '__main__':
    for i in range(3):
        p = MyProcess()
        p.start()
        processes.append(p)

    for p in processes:
        p.join()

    print('we are done here .......')
output:
finished
finished
finished
we are done here .......
join() would not block if the task is finished. Also, your original program is invalid:
for i in 3:        # wrong: an integer is not iterable
for i in range(3): # should be like this
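A quick way to convince yourself that join() returns promptly once run() has finished is to time it; a minimal sketch:

from multiprocessing import Process
import time

class MyProcess(Process):
    def run(self):
        print('finished')

if __name__ == '__main__':
    p = MyProcess()
    p.start()
    t0 = time.time()
    p.join()
    # join() returns as soon as run() has completed
    print('join took %.3f seconds' % (time.time() - t0))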

Basic multiprocessing with an infinite loop and a queue

import random
import queue as Queue
import _thread as Thread

a = Queue.Queue()

def af():
    while True:
        a.put(random.randint(0, 1000))

def bf():
    while True:
        if not a.empty():
            print(a.get())

def main():
    Thread.start_new_thread(af, ())
    Thread.start_new_thread(bf, ())
    return

if __name__ == "__main__":
    main()
The above code works, but with extremely high CPU usage. I tried to use multiprocessing instead, to no avail. I have tried:
def main():
    multiprocessing.Process(target=af).run()
    multiprocessing.Process(target=bf).run()
and
def main():
    manager = multiprocessing.Manager()
    a = manager.Queue()
    pool = multiprocessing.Pool()
    pool.apply_async(af)
    pool.apply_async(bf)
Neither works. Can anyone please help me? Thanks a bunch ^_^
def main():
    multiprocessing.Process(target=af).run()  # will not return
    multiprocessing.Process(target=bf).run()
The above code does not work because run() executes the target in the calling process, and af never returns, so bf never gets a chance to run. You need to replace the run() calls with start() (followed by join()) so that both can run in parallel, and make them share a manager.Queue.
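The difference between run() and start() is easy to see by printing process IDs; a minimal sketch:

from multiprocessing import Process
import os

def show_pid():
    print('running in pid %d' % os.getpid())

if __name__ == '__main__':
    print('parent pid %d' % os.getpid())
    # run() just calls the target in the current process: same pid,
    # and it blocks until the target returns
    Process(target=show_pid).run()
    # start() spawns a new process: different pid
    p = Process(target=show_pid)
    p.start()
    p.join()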
To make the second version work, you need to pass a (the manager.Queue object) to the functions. Otherwise they will use the global Queue.Queue object, which is not shared between processes. Modify af and bf to accept a, and main to pass it:
def af(a):
    while True:
        a.put(random.randint(0, 1000))

def bf(a):
    while True:
        print(a.get())

def main():
    manager = multiprocessing.Manager()
    a = manager.Queue()
    pool = multiprocessing.Pool()
    proc1 = pool.apply_async(af, [a])
    proc2 = pool.apply_async(bf, [a])
    # Wait until the processes end. Uncomment the following lines
    # if there is no other code keeping the program alive.
    # proc1.get()
    # proc2.get()
In the first alternative main you use Process, but the method you should call to start the activity is not run(), as one might think, but start(). You will want to follow that up with appropriate join() calls. Following the information in the multiprocessing docs (https://docs.python.org/2/library/multiprocessing.html), here is a working sample:
import random
from multiprocessing import Process, Queue

def af(q):
    while True:
        q.put(random.randint(0, 1000))

def bf(q):
    while True:
        if not q.empty():
            print(q.get())

def main():
    a = Queue()
    p = Process(target=af, args=(a,))
    c = Process(target=bf, args=(a,))
    p.start()
    c.start()
    p.join()
    c.join()

if __name__ == "__main__":
    main()
To add to the accepted answer: in the original code,

while True:
    if not q.empty():
        print(q.get())

q.empty() is being called on every iteration, which is unnecessary, since q.get() will block until something is available if the queue is empty (see the documentation). A similar answer has been given elsewhere on this site.
I assume this could affect performance, since calling .empty() on every iteration consumes extra resources (it should be more noticeable if Thread were used instead of Process, because of Python's Global Interpreter Lock (GIL)).
I know it's an old question, but I hope this helps!
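For illustration, a consumer loop without the empty() check might look like the following sketch (the 5-second timeout is an arbitrary choice, not something from the answer above):

import queue  # only needed for the Empty exception

def bf(q):
    while True:
        try:
            # blocks until an item arrives, so there is no busy-waiting
            item = q.get(timeout=5)
        except queue.Empty:
            break  # give up if nothing arrives for 5 seconds
        print(item)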

Multiprocessing module not functioning as expected: not outputting

I ran the following code, but it didn't output anything. I even tried passing the argument [i], but still nothing.
import multiprocessing

def worker():
    """worker function"""
    print 'Worker'
    return

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=worker)
        jobs.append(p)
        p.start()
The expected output is:
worker,
worker,
worker,
worker,
worker
Thanks for your suggestions.
I can't reproduce it; it works as expected for me. It may be because the script runs too fast: before the child processes get a chance to write to stdout, the program reaches the end, therefore you can't see the results.
Wait for the jobs to finish:

for i in jobs:
    i.join()
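Putting it together, the corrected script might look like this sketch:

import multiprocessing

def worker():
    """worker function"""
    print('Worker')

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=worker)
        jobs.append(p)
        p.start()
    # block until every child has written its output and exited
    for i in jobs:
        i.join()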
I just found a solution. The items have to be put in a queue, and then you get them from the queue one after the other. See the solution below; there may be better solutions, but this one works.
from multiprocessing import Process, Queue

def worker(q):
    """worker function"""
    q.put('Worker')

if __name__ == '__main__':
    # create the queue under the main guard so it is not re-created
    # on import in the children
    q = Queue()
    for i in range(5):
        p = Process(target=worker, args=(q,))
        p.start()
        print q.get()
        p.join()
I hope to find other solutions.
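One possible variation is to start all the workers first and drain the results afterwards; a sketch (an alternative, not tested against the exact setup above):

from multiprocessing import Process, Queue

def worker(q):
    """worker function"""
    q.put('Worker')

if __name__ == '__main__':
    q = Queue()
    jobs = [Process(target=worker, args=(q,)) for _ in range(5)]
    for p in jobs:
        p.start()
    # one result per worker, in whatever order they finish
    for _ in jobs:
        print(q.get())
    for p in jobs:
        p.join()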

Python multiprocessing example not working

I am trying to learn how to use multiprocessing, but I can't get it to work. Here is the code, right out of the documentation:
from multiprocessing import Process

def f(name):
    print 'hello', name

if __name__ == '__main__':
    p = Process(target=f, args=('bob',))
    p.start()
    p.join()
It should output:
>>> 'hello bob'
but instead I get:
>>>
No errors or other messages; it just sits there. It is running in IDLE from a saved .py file on a Windows 7 machine with the 32-bit version of Python 2.7.
My guess is that you are using IDLE to try to run this script. Unfortunately, this example will not run correctly in IDLE. Note the comment at the beginning of the docs:
Note Functionality within this package requires that the main
module be importable by the children. This is covered in Programming
guidelines however it is worth pointing out here. This means that some
examples, such as the multiprocessing.Pool examples will not work in
the interactive interpreter.
The __main__ module is not importable by children in IDLE, even if you run the script as a file with IDLE (which is commonly done with F5).
The problem is not IDLE. The problem is trying to print to sys.stdout in a process that has no sys.stdout. That is why Spyder has the same problem. Any GUI program on Windows is likely to have the same problem.
On Windows, at least, GUI programs are usually run in a process without stdin, stdout, or stderr streams. Windows expects GUI programs to interact with users through widgets that paint pixels on the screen (the G in Graphical) and receive key and mouse events from Windows event system. That is what the IDLE GUI does, using the tkinter wrapper of the tcl tk GUI framework.
When IDLE runs user code in a subprocess, idlelib.run runs first, and it replaces the standard streams (which are None) with objects that interact with IDLE itself through a socket. It then exec()s the user code. When the user code runs multiprocessing, multiprocessing starts further processes that have no std streams, and they never get them.
The solution is to start IDLE in a console: python -m idlelib.idle (the .idle is not needed on 3.x). Processes started in a console get std streams connected to the console, and so do further subprocesses. The real stdout (as opposed to sys.stdout) of all the processes is the console. If one runs the third example in the doc,
from multiprocessing import Process
import os

def info(title):
    print(title)
    print('module name:', __name__)
    print('parent process:', os.getppid())
    print('process id:', os.getpid())

def f(name):
    info('function f')
    print('hello', name)

if __name__ == '__main__':
    info('main line')
    p = Process(target=f, args=('bob',))
    p.start()
    p.join()
then the 'main line' block goes to the IDLE shell and the 'function f' block goes to the console.
This result shows that Justin Barber's claim that the user file run by IDLE cannot be imported into processes started by multiprocessing is not correct.
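If starting IDLE from a console is not convenient, one workaround sketch (an alternative, not what this answer proposes) is to ship the child's output back to the parent over a Queue and let the parent print it:

from multiprocessing import Process, Queue

def f(name, q):
    # put the message on a queue instead of printing: this works even
    # when the child process has no usable stdout
    q.put('hello %s' % name)

if __name__ == '__main__':
    q = Queue()
    p = Process(target=f, args=('bob', q))
    p.start()
    print(q.get())  # printed by the parent, which IDLE can display
    p.join()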
EDIT: Python saves the original stdout of a process in sys.__stdout__. Here is the result in IDLE's shell when IDLE is started normally on Windows, as a pure GUI process.
>>> sys.__stdout__
>>>
Here is the result when IDLE is started from CommandPrompt.
>>> import sys
>>> sys.__stdout__
<_io.TextIOWrapper name='<stdout>' mode='w' encoding='utf-8'>
>>> sys.__stdout__.fileno()
1
The standard file numbers for stdin, stdout, and stderr are 0, 1, 2. Run a file with
from multiprocessing import Process
import sys

def f(name):
    print('hello', name)
    print(sys.__stdout__)
    print(sys.__stdout__.fileno())

if __name__ == '__main__':
    p = Process(target=f, args=('bob',))
    p.start()
    p.join()
in IDLE started from the console, and the output is the same. It works.
I've marked the changes needed to make your sample run using comments:

from multiprocessing import Process

def f(name):
    print 'hello', name  # indent this line

if __name__ == '__main__':
    p = Process(target=f, args=('bob',))
    p.start()
    p.join()`  # remove the ` (grave accent)
result:

from multiprocessing import Process

def f(name):
    print 'hello', name

if __name__ == '__main__':
    p = Process(target=f, args=('bob',))
    p.start()
    p.join()
Output from my laptop after saving it as ex1.py:

reuts@reuts-K53SD:~/python_examples$ cat ex1.py
#!/usr/bin/env python
from multiprocessing import Process

def f(name):
    print 'hello', name

if __name__ == '__main__':
    p = Process(target=f, args=('bob',))
    p.start()
    p.join()
reuts@reuts-K53SD:~/python_examples$ python ex1.py
hello bob
I had the issue that multiprocessing did not work in Spyder, and I always landed here. I solved it by using threading instead of multiprocessing, as described here: https://pymotw.com/2/threading/
import threading

def worker(num):
    """thread worker function"""
    print 'Worker: %s' % num
    return

threads = []
for i in range(5):
    t = threading.Thread(target=worker, args=(i,))
    threads.append(t)
    t.start()
Most likely your main process exits before stdout is flushed. Try this:
from multiprocessing import Process
import sys

def f(name):
    print 'hello', name

if __name__ == '__main__':
    p = Process(target=f, args=('bob',))
    p.start()
    p.join()
    # make sure all output has been processed before we exit
    sys.stdout.flush()
If this doesn't work, try adding time.sleep(1) as the last statement.
Try using this code (from the standard manual). It works for me on Windows. The other one did not work for me either :)
import multiprocessing as mp

def foo(q):
    q.put('hello')

if __name__ == '__main__':
    mp.set_start_method('spawn')
    q = mp.Queue()
    p = mp.Process(target=foo, args=(q,))
    p.start()
    print(q.get())
    p.join()

Terminate Python Process in a Limited Time

Take a look at this simple python code with Process:
from multiprocessing import Process
import time

def f(name):
    time.sleep(100)
    print 'hello', name

if __name__ == '__main__':
    p = Process(target=f, args=('bob',))
    p.start()  # has to be terminated in 5 seconds
    #p.join()
    print "This Needs to be Printed Immediately"
I guess I am looking for a function like p.start(timeout). I want to terminate the process p if it has not finished on its own within 5 seconds. How can I do that? There seems to be no such function.
If p.join() is uncommented, the following print line has to wait 100 seconds and cannot be printed immediately. But I want it printed immediately, so p.join() has to be commented out.
Use a separate thread to start the process, wait 5 seconds, then terminate the process. Meanwhile the main thread can do the work you want to happen immediately:
from multiprocessing import Process
import time
import threading

def f(name):
    time.sleep(100)
    print 'hello', name

def run_process_with_timeout(timeout, target, args):
    p = Process(target=target, args=args)
    p.start()
    time.sleep(timeout)
    p.terminate()

if __name__ == '__main__':
    t = threading.Thread(target=run_process_with_timeout, args=(5, f, ('bob',)))
    t.start()
    print "This Needs to be Printed Immediately"
You might want to take a look at that SO thread; basically, the solution there is to use the timeout capability of the threading module by running the process in a separate thread.
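A sketch of that idea, assuming a watchdog thread that uses Process.join() with a timeout:

from multiprocessing import Process
import threading
import time

def f(name):
    time.sleep(100)
    print('hello %s' % name)

def watchdog(p, timeout):
    # wait at most `timeout` seconds, then kill the process if needed
    p.join(timeout)
    if p.is_alive():
        p.terminate()

if __name__ == '__main__':
    p = Process(target=f, args=('bob',))
    p.start()
    threading.Thread(target=watchdog, args=(p, 5)).start()
    print('This Needs to be Printed Immediately')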
You are right, there is no such function in Python 2.x in the subprocess library.
However, with Python 3.3 you can use:

p = subprocess.Popen(...)
try:
    p.wait(timeout=5)
except subprocess.TimeoutExpired:
    p.kill()

With older Python versions, you would have to write a loop that calls p.poll() and checks the returncode, e.g. once per second.
This is (like polling in general) not optimal from a performance point of view, but it always depends on what you expect.
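Such a polling loop might look like the following sketch (slow_script.py is a hypothetical placeholder):

import subprocess
import time

p = subprocess.Popen(['python', 'slow_script.py'])  # hypothetical script
deadline = time.time() + 5
while time.time() < deadline:
    if p.poll() is not None:  # returncode is set once the child exits
        break
    time.sleep(1)             # check roughly once per second
else:
    # still running after 5 seconds
    p.kill()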
Try something like this:

import time
from multiprocessing import Process

def run_process_with_timeout(timeout, target, args):
    p = Process(target=target, args=args)
    p.start()
    deadline = time.time() + timeout
    # spin until the deadline passes; time.time() avoids the wrap-around
    # problem of comparing time.strftime("%S") values across a minute
    while time.time() < deadline:
        pass
    p.terminate()
Basically this just uses the time module to let a loop run for five seconds and then move on; it assumes the timeout is given in seconds.
Though I'd point out that if this were used with the code the OP originally posted, it would work, as the print was in a second function, separate from the loop, and would be carried out immediately after calling this function.
Why not use the timeout option of Process.join(), as in:

import sys
...
if __name__ == '__main__':
    p = Process(target=f, args=('bob',))
    p.start()  # has to be terminated in 5 seconds
    # print immediately and flush output
    print "This Needs to be Printed Immediately"
    sys.stdout.flush()
    p.join(5)
    if p.is_alive():
        p.terminate()
