What I'd like is for the following program to print out:
Running Main
Running Second
Running Main
Running Second
[...]
Code:
from multiprocessing import Process
import time

def main():
    while True:
        print('Running Main')
        time.sleep(1)

def second():
    while True:
        print('Running Second')
        time.sleep(1)

p1 = Process(main())
p2 = Process(second())
p1.start()
p2.start()
But it doesn't have the desired behavior. Instead it just prints out:
Running Main
Running Main
[...]
I suspect my program doesn't work because of the while statements.
Is there any way I can overcome this problem and have my program print the output above, no matter what I execute in my functions?
The issue is in how you create the Process objects. The reason the program only ever runs the first function is the syntax: instead of handing the function to Process so it can be run in a new process, you are calling the function yourself at the moment the Process is created.
When you want to create a Process object, avoid writing
p1 = Process(target=main())
and instead write
p1 = Process(target=main)
That also means that if you want to pass any arguments to the function, you have to supply them separately:
p1 = Process(target=main, args=('hi',))
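Putting the answer together, a minimal corrected sketch of the program from the question might look like this (the __main__ guard and the join() calls are not from the original question; they are one way to keep the parent process around and to support platforms that spawn new interpreters, such as Windows):

from multiprocessing import Process
import time

def main():
    while True:
        print('Running Main')
        time.sleep(1)

def second():
    while True:
        print('Running Second')
        time.sleep(1)

if __name__ == '__main__':
    # Pass the function objects; start() calls them in the new processes.
    p1 = Process(target=main)
    p2 = Process(target=second)
    p1.start()
    p2.start()
    p1.join()
    p2.join()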
I'm running a script on my Raspberry Pi. Sometimes the program freezes, so I have to close the terminal and re-run the .py.
So I wanted to "multiprocess" this program. I made two functions: the first one does the work, the second one's job is to check the time and kill the first function's process if a condition is true.
However, I tried to do it like so:
def AntiFreeze():
    print("AntiFreeze partito\n")
    global stop
    global endtime
    global freq

    proc_SPN = multiprocessing.Process(target=SPN(), args=())
    proc_SPN.start()
    time.sleep(2)
    proc_SPN.terminate()
    proc_SPN.join()

if __name__ == '__main__':
    proc_AF = multiprocessing.Process(target=AntiFreeze(), args=())
    proc_AF.start()
The main function starts the AntiFreeze function in a process, which in turn creates another process to run the function that does the job I want.
THE PROBLEM (I think):
The function SPN() (the one that does the job) is busy in a very long while loop that calls functions in another .py file.
So when I use proc_SPN.terminate() or proc_SPN.kill() nothing happens... why?
Is there another way to force a process to be killed? Maybe I have to write two different programs?
Thanks in advance for the help.
You are calling your function at process creation, so most likely the process is never correctly spawned. Your code should be changed to:
def AntiFreeze():
    print("AntiFreeze partito\n")
    global stop
    global endtime
    global freq

    proc_SPN = multiprocessing.Process(target=SPN, args=())
    proc_SPN.start()
    time.sleep(2)
    proc_SPN.terminate()
    proc_SPN.join()

if __name__ == '__main__':
    proc_AF = multiprocessing.Process(target=AntiFreeze, args=())
    proc_AF.start()
Furthermore, you shouldn't use globals (unless strictly necessary). You could pass the needed arguments to the AntiFreeze function instead.
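As a rough sketch of that last point (the SPN stub and the example values here are placeholders, not from the original question), the values could be passed through args instead of globals:

import multiprocessing
import time

def SPN():
    # Stand-in for the real worker function from the question.
    while True:
        time.sleep(0.1)

def AntiFreeze(endtime, freq):
    # endtime and freq arrive as ordinary parameters instead of globals.
    print("AntiFreeze started, endtime=%s freq=%s" % (endtime, freq))
    proc_SPN = multiprocessing.Process(target=SPN)
    proc_SPN.start()
    time.sleep(2)
    proc_SPN.terminate()
    proc_SPN.join()

if __name__ == '__main__':
    proc_AF = multiprocessing.Process(target=AntiFreeze, args=(3600, 0.5))
    proc_AF.start()
    proc_AF.join()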
I set up some multiprocessing.Queue code to run 2 functions in parallel. The 1st function parses data and writes it to a text file, the 2nd function pulls data from the same text file and shows a live graph. The 2nd function must kick off once the 1st function has created the text file. The code works well.
However:
It takes almost all my RAM (ca. 6 GB). Is that because it is multi-process? In the task manager I see 3 python.exe processes of 2 GB each running at the same time, while when I run only the 1st function (the most RAM-consuming one) I see only 1 python.exe of 2 GB.
Once the code has parsed all the text and the graph has stopped, the processes keep running until I terminate the code manually with the Eclipse console's red button. Is that normal?
I have a small script that runs before and outside of the multiprocessing functions. It provides a value I need to define in order to run the multiprocessing functions. It runs OK, but once the multiprocessing functions are called, it runs again! I don't know why, because it's definitely outside of the multiprocessing functions.
Part of this was resolved in another Stack Overflow question.
# define newlista
# define key
# def get_all(myjson, kind, type)
# def on_data(data)
# small script that runs before/outside of the multiprocessing functions
from multiprocessing import Process, Queue

def parsing(q):
    keep_running = True
    numentries = 0
    for text in get_all(newlista, "sentence", "text"):
        if 80 < len(text) < 500:
            firstword = key.lower().split(None, 1)[0]
            if text.lower().startswith(firstword):
                pass
            else:
                on_data(text)
                numentries += 1
        q.put(keep_running)
    keep_running = False
    q.put(keep_running)

def live_graph(q):
    keep_running = True
    while keep_running:
        keep_running = q.get()
        # do the graph updates here

if __name__ == '__main__':
    q = Queue()
    p1 = Process(target=parsing, args=(q,))
    p1.start()
    p2 = Process(target=live_graph, args=(q,))
    p2.start()
UPDATE
The graph function is the one that generates the two extra Python processes, and once the 1st function has terminated, the second function keeps running.
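A common pattern for the second point (a sketch only, not from the original code) is to have the producer put an explicit stop sentinel on the queue when it finishes, and have the consumer leave its loop when it sees it, so neither process has to be killed manually:

from multiprocessing import Process, Queue
import time

STOP = None  # sentinel marking the end of the data stream

def parsing(q):
    for i in range(5):       # stand-in for the real parsing loop
        q.put(True)          # tell the graph there is new data
        time.sleep(0.2)
    q.put(STOP)              # signal the consumer that we are done

def live_graph(q):
    while True:
        msg = q.get()
        if msg is STOP:      # producer finished: leave the loop and exit
            break
        # do the graph updates here

if __name__ == '__main__':
    q = Queue()
    p1 = Process(target=parsing, args=(q,))
    p2 = Process(target=live_graph, args=(q,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()                # both processes now exit on their own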
I'm trying to play around with multi-threading so I can get better at it, but for some weird reason my code doesn't want to follow the commands. It's supposed to go into a while loop and print, but it doesn't, and it's also not raising any errors, so which line is the mistake on?
#!/usr/bin/env python
import random
import thread
import time
import sys
import os

def DisplayA(name, wait):
    while True:
        print 'Display: 1'
        time.sleep(wait)

def DisplayB(name, wait):
    while True:
        print 'Display: 2'
        time.sleep(wait)

def DisplayC(name, wait):
    while True:
        print 'Display: 3'
        time.sleep(wait)

thread.start_new_thread(DisplayA, ('Display1', 2))
thread.start_new_thread(DisplayB, ('Display2', 3))
thread.start_new_thread(DisplayC, ('Display3', 5))
Add this to the bottom:
while True:
    pass
The problem is that you're running off the bottom of your main program. This terminates the entire execution session.
Quick and short solution:
while True:
    time.sleep(1)
Do not use pass in the while loop, because it eats CPU; it is an expensive way of doing nothing.
If you want a more general solution, then you can import Thread from threading and use join:
from threading import Thread
...
p1 = Thread(name="A", target=DisplayA, args=('Display1',2))
p2 = Thread(name="B", target=DisplayB, args=('Display2',3))
p3 = Thread(name="C", target=DisplayC, args=('Display3',5))
p1.start()
p2.start()
p3.start()
p1.join()
p2.join()
p3.join()
This solution also works if the threads do not run endlessly, and your program can continue after the threads have finished.
You can either do what Prune suggested here, or you can suspend the main thread after starting DisplayA, DisplayB, and DisplayC.
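As a side note (not part of either answer): the thread module used above is Python 2 only; in Python 3 it was renamed to _thread, and print is a function. A minimal Python 3 sketch of the same idea with threading.Thread and daemon threads might look like this:

import time
from threading import Thread

def display(label, wait):
    while True:
        print('Display:', label)
        time.sleep(wait)

threads = [
    Thread(target=display, args=('1', 2), daemon=True),
    Thread(target=display, args=('2', 3), daemon=True),
    Thread(target=display, args=('3', 5), daemon=True),
]
for t in threads:
    t.start()

# Keep the main thread alive; the daemon threads stop when it exits.
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    pass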
For some reason the code does not get to the main function. I am using Cloud9 to run the code, so that might be the issue.
from multiprocessing import Process, Value
import time

def main():
    print "main function"

def market_price_thread():
    while True:
        market_price()
        time.sleep(5)

def market_price():
    # do something
    print "end"

def start_threads():
    thread = Process(target=market_price_thread())
    thread.start()
    time.sleep(5)

if __name__ == '__main__':
    start_threads()
    main()  # does not seem to get to this
You've asked Python to call market_price_thread:
thread = Process(target=market_price_thread())
and then use whatever it returns as the target value. So, before calling Process, we have to wait for market_price_thread to return. What value does it return, and when?
(Compare with Process(target=market_price_thread), which does not call market_price_thread yet, but rather, passes the function to Process so that Process can call it.)
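For completeness, a corrected sketch of the code above (print written as a function call so the snippet runs on both Python 2 and 3):

from multiprocessing import Process
import time

def market_price():
    # do something
    print("end")

def market_price_thread():
    while True:
        market_price()
        time.sleep(5)

def start_threads():
    # Pass the function itself, not the result of calling it.
    thread = Process(target=market_price_thread)
    thread.start()
    time.sleep(5)

def main():
    print("main function")

if __name__ == '__main__':
    start_threads()
    main()  # now reached, because start_threads() returns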
Hello, I am trying to run 2 functions at the same time in Python. Both read data from 2 separate meters over USB and they are not dependent on each other. I have tried multiprocessing but the second meter never starts.
def readMeter1():
    while True:
        pass  # read Meter1

def readMeter2():
    while True:
        pass  # read Meter2

if __name__ == "__main__":
    Process(target=readMeter1()).start()
    Process(target=readMeter2()).start()
The target parameter must be something callable (a function, in your case). You don't need to call that function yourself; start() will do it after launching a new process:
Process(target=readMeter1).start() # fork a new process, call readMeter1
Process(target=readMeter2).start() # fork a new process, call readMeter2
Because you call readMeter1 yourself, it starts its infinite loop in the current process and blocks everything else, so readMeter2 is never reached.
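A minimal runnable sketch of the fix (the meter-reading bodies are replaced by placeholder prints, since the real USB code isn't shown):

from multiprocessing import Process
import time

def readMeter1():
    while True:
        print('reading meter 1')  # placeholder for the real USB read
        time.sleep(1)

def readMeter2():
    while True:
        print('reading meter 2')  # placeholder for the real USB read
        time.sleep(1)

if __name__ == "__main__":
    p1 = Process(target=readMeter1)  # no parentheses: pass the function itself
    p2 = Process(target=readMeter2)
    p1.start()
    p2.start()
    p1.join()
    p2.join()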