I have the following code
import pyautogui
from pyautogui import press

def leftdoor():
    press('a')
    pyautogui.sleep(1)
    press('a')

def rightdoor():
    press('d')
    pyautogui.sleep(1)
    press('d')

leftdoor()
rightdoor()
and when I run the code, what happens is that the letter A is pressed, one second passes, and then it is pressed again. Then the same happens for the D key. However, is there a way for me to press them both down, and to express that in code by calling both functions without having to wait for the .sleep of the previous function?
There are two ways to run your code concurrently:
Combine the functions (might not be possible for large functions)
In the case of your code, it would look like this:
from pyautogui import press, sleep

def door():
    press('a')
    press('d')
    sleep(1)
    press('a')
    press('d')

door()
If this isn't what you're looking for, use threading.
Threading
The threading module is well documented and has many tutorials; the code for your case is below.
from threading import Thread  # module import

# Create two Thread objects around the functions from your question
rdt = Thread(target=rightdoor)
ldt = Thread(target=leftdoor)

# Start both threads, then join them (wait for both to finish)
rdt.start()
ldt.start()
rdt.join()
ldt.join()

print("Finished execution")  # done!
Note that using this does not absolutely guarantee that a and d will be pressed at the same instant (I got a ~10 millisecond delay at most, and even that might have come from the program I used to time it), but it should work for all practical purposes.
Maybe it's a very simple question, but I'm new to concurrency. I want to write a Python script that runs foo.py 10 times simultaneously, with a time limit of 60 seconds before automatically aborting. The script is a non-deterministic algorithm, so every execution takes a different time and one will finish before the others. Once the first one ends, I would like to save its execution time and the algorithm's output, and then kill the rest of the processes.
I have seen this question, run multiple instances of python script simultaneously, and it looks very similar, but how can I add a time limit and the possibility of killing the rest of the processes when the first one finishes?
Thank you in advance.
I'd suggest using the threading lib, because with it you can make the threads daemon threads, so that if the main thread exits for whatever reason, the other threads are killed with it. Here's a small example:
# Import the libs...
import threading, time

# Global variables... (list of results)
results = []

# The subprocess you want to run several times simultaneously...
def run():
    # We declare results as a global variable.
    global results
    # Do stuff...
    results.append("Hello World! These are my results!")

n = int(input("Welcome user, how many times should I execute run()? "))

# We run the thread n times.
for _ in range(n):
    # Define the thread.
    t = threading.Thread(target=run)
    # Set the thread to daemon: if the main process exits, the threads are killed.
    t.daemon = True
    # Start the thread.
    t.start()

# Once the threads have started we can execute the main code.
# We set a timer...
startTime = time.time()
while True:
    # If the timer reaches 60 s we exit from the program.
    if time.time() - startTime >= 60:
        print("[ERROR] The script took too long to run!")
        exit()
    # Do stuff on your main thread; once the stuff is complete
    # you can break from the while loop as well.
    results.append("Main result.")
    break

# When we break from the while loop we print the output.
print("Here are the results:")
for i in results:
    print(f"-{i}")
This example should solve your problem, but if you want to use blocking commands on the main thread, the timer will fail, so you'd need to tweak this code a bit. To do that, move the code from the main thread's loop into a new function (for example def main():), start the rest of the threads from inside it, and run main() itself on its own thread while the actual main thread keeps the timer. This example may help you:
import threading

def run():
    pass

# Secondary "main" thread.
def main():
    # Start the rest of the threads (in this case I just start one).
    localT = threading.Thread(target=run)
    localT.daemon = True
    localT.start()
    # Do stuff.
    pass

# Actual main thread...
t = threading.Thread(target=main)
t.daemon = True
t.start()

# Set up a timer and fetch the results you need, with a global list
# or any other method...
pass
Now, you should avoid global variables at all costs, as they can sometimes be a bit buggy, but for some reason the threading lib doesn't let you return values from threads, at least not by any method I know of. I think there are other multiprocessing libs out there that do let you return values, but I don't know anything about them, so I can't explain them to you. Anyway, I hope this works for you.
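(For what it's worth, the standard library's concurrent.futures module does hand return values back from threads. A minimal sketch of that idea, not part of the original suggestion:

from concurrent.futures import ThreadPoolExecutor, as_completed

def run(i):
    # Return a result instead of appending to a global list.
    return f"Hello World! These are the results of run #{i}!"

with ThreadPoolExecutor() as pool:
    futures = [pool.submit(run, i) for i in range(5)]
    for future in as_completed(futures):
        # result() returns whatever run() returned.
        print(future.result())

Here as_completed also yields whichever run finishes first, which matches the "save the first result" part of the question.)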
Update: OK, I was busy writing the code and didn't read the comments on the post, sorry. You can still use this method, but instead of writing code inside the threads, execute another script. You could either import it as a module or actually run it as a script; here's a question that may help you with that:
How to run one python file in another file?
I'm trying to combine two Python 3 scripts that I'm currently running separately. Both run in an infinite loop. I found different ways of achieving what I want, but I'm a beginner still learning and trying to do it the right way.
One script is a reddit bot that replies to certain comments and uploads videos, while saving links in newly created .txt files. The other one iterates through those .txt files, reads them and sometimes deletes them.
This variant seems the most intuitive to me:
from threading import Thread

def runA():
    while True:
        print('A\n')

def runB():
    while True:
        print('B\n')

if __name__ == "__main__":
    t1 = Thread(target=runA)
    t2 = Thread(target=runB)
    t1.daemon = True
    t2.daemon = True
    t1.start()
    t2.start()
    while True:
        pass
Is this the preferred way of running threads? And why do I need

while True:
    pass

at the end?
In general, that is a good way to start two threads, but there are details to think about.
Note that in that code, there are actually 3 threads: main thread, t1 and t2.
Since the comments say that one thread downloads and the other reads the downloaded files, and since the main thread does nothing in your case, I'd say you need just this much:
from threading import Thread
from time import sleep

def download_forever():
    while True:
        download_stuff()

def process_new_downloads():
    do_something_with_new_downloads_here()

def main():
    download_thread = Thread(target=download_forever)
    download_thread.start()
    while True:
        process_new_downloads()
        sleep(1)  # let go of the CPU for a while, there's nothing to do anyway
Setting the threads as daemon does not modify how they live, only how they die. And here it is not clear how the whole thing ends, so I'm not sure you need that. You might want to implement some way to stop the threads politely, and you might also define some way to end the whole thing.
Additionally, you could implement a way for one thread to wake up the other exactly when there is something new to do. You can do that e.g. with a threading.Event.
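A rough sketch of that idea, reusing the placeholder names from the snippet above:

from threading import Thread, Event

new_download = Event()

def download_forever():
    while True:
        download_stuff()         # placeholder, as above
        new_download.set()       # wake up the processing thread

def main():
    Thread(target=download_forever).start()
    while True:
        new_download.wait()      # sleeps until the downloader signals
        new_download.clear()
        process_new_downloads()  # placeholder, as above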
BTW, the while True which was in the main thread of the original code was needed exactly because all the other threads were daemons, so ending the main thread (i.e. not making it run forever) would have killed the whole application.
I have 2 separate scripts working with the same variables.
To be more precise, one script edits the variables and the other one uses them (it would be nice if it could edit them too, but that's not absolutely necessary).
This is what I am currently doing:
When code 1 edits a variable, it dumps it into a JSON file.
Code 2 repeatedly opens the JSON file to get the variables.
This method is really not elegant, and the while loop is really slow.
How can I share variables across scripts?
My first script gets data from a MIDI controller and sends web requests.
My second script is for LED strips (those run thanks to the same MIDI controller). Both scripts run in a "while True" loop.
I can't simply put them in the same script, since every web request would slow the LEDs down. I am currently just sharing the variables via a JSON file.
If enough people ask for it I will post the whole code, but I have been told not to.
Considering the information you provided, meaning...
Both scripts run in a "while True" loop.
I can't simply put them in the same script, since every web request would slow the LEDs down.
To me, you have 2 choices:
Use a client/server model. You have 2 machines: one acts as the server, and the second as the client. The server runs a script with an infinite loop that consistently updates the data, plus an API that simply reads and exposes the current state of your file/database to the client. The client, on the other machine, requests the current data and processes it.
Make a single multiprocessing script. Each script would run in a separate process and manage its own memory. Since you also want to share variables between your two programs, you could pass them a shared object as an argument; the multiprocessing documentation on sharing state between processes covers this.
Note that there are more solutions than these. For instance, you're using a JSON file that you keep opening and closing (that is probably what takes the most time in your program). You could instead use a real database, which is opened once and can be read and updated many times.
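As a sketch of that last option using the standard library's sqlite3 (the table layout and the fader_1 name are made up for the example):

import sqlite3

# Script 1 (writer): connect once, then update on every MIDI event.
con = sqlite3.connect("shared.db")
con.execute("CREATE TABLE IF NOT EXISTS state (name TEXT PRIMARY KEY, value REAL)")
con.execute("INSERT OR REPLACE INTO state VALUES ('fader_1', 0.75)")
con.commit()

# Script 2 (reader): connect once, then poll as often as the LEDs need.
con = sqlite3.connect("shared.db")
row = con.execute("SELECT value FROM state WHERE name = 'fader_1'").fetchone()
print(row[0] if row else None)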
A Manager from multiprocessing lets you do this sort of thing pretty easily.
First, I simplify your "midi controller and sends web-requests" code down to something that just sleeps for random amounts of time and updates a variable in a managed dictionary:
from time import sleep
from random import random

def slow_fn(d):
    i = 0
    while True:
        sleep(random() ** 2)
        i += 1
        d['value'] = i
Next, we simplify the "LED strip" control down to something that just prints to the screen:
from time import perf_counter, sleep

def fast_fn(d):
    last = perf_counter()
    while True:
        sleep(0.05)
        value = d.get('value')
        now = perf_counter()
        print(f'fast {value} {(now - last) * 1000:.2f}ms')
        last = now
You can then run these functions in separate processes:
import multiprocessing as mp

if __name__ == '__main__':  # guard needed where processes are spawned
    with mp.Manager() as manager:
        d = manager.dict()
        procs = []
        for fn in [slow_fn, fast_fn]:
            p = mp.Process(target=fn, args=[d])
            procs.append(p)
            p.start()
        for p in procs:
            p.join()
the "fast" output happens regularly with no obvious visual pauses
I've been trying to read up on threading and multiprocessing, but all the examples are too intricate and advanced for my level of Python/programming knowledge. I want to run a function which consists of a while loop, and while that loop runs I want to continue with the program and eventually change the condition for the while loop and end that process. This is the code:
import time

class Example():
    def __init__(self):
        self.condition = False

    def func1(self):
        self.condition = True
        while self.condition:
            print("Still looping")
            time.sleep(1)
        print("Finished loop")

    def end_loop(self):
        self.condition = False
Then I make the following function calls:
ex = Example()
ex.func1()
time.sleep(5)
ex.end_loop()
What I want is for func1 to run for 5 s before end_loop() is called, changing the condition and ending the loop, and thus also the function. I.e., I want one process to start and "go into" func1, and at the same time I want time.sleep(5) to be called, so the processes "split" when arriving at func1: one process enters the function while the other continues down the program and starts with the time.sleep(5) execution.
This must be the most basic example of multiprocessing, yet I've had trouble finding a simple way to do it!
Thank you
EDIT 1: Regarding do_something: in my real problem, do_something is replaced by code that communicates with another program via a socket and receives packets with coordinates every 0.02 s, storing them in member variables of the class. I want this constant updating of the coordinates to start, and then to be able to read the coordinates via other functions at the same time.
However, that is not so relevant. What if do_something is replaced by:
time.sleep(1)
print("Still looping")
How do I solve my problem then?
EDIT 2: I have tried multiprocessing like this:
from multiprocessing import Process
ex = Example()
p1 = Process(target=ex.func1())
p2 = Process(target=ex.end_loop())
p1.start()
time.sleep(5)
p2.start()
When I ran this, I never got to p2.start(), so that did not help. Even if it had, this is not really what I'm looking for either: what I want is just to start the process p1, and then continue with time.sleep and ex.end_loop().
The first problem with your code is this pair of calls:
p1 = Process(target=ex.func1())
p2 = Process(target=ex.end_loop())
With ex.func1() you're calling the function and passing its return value as the target parameter. Since the function doesn't return anything, you're effectively calling
p1 = Process(target=None)
p2 = Process(target=None)
which makes, of course, no sense.
After fixing that, the next problem will be shared data: when using the multiprocessing package, you implement concurrency using multiple processes which, by default, cannot simply share data, AFAIK. Have a look at Sharing state between processes in the package's documentation to read about this. Especially take the first sentence into account: "when doing concurrent programming it is usually best to avoid using shared state as far as possible"!
So you might also want to have a look at Exchanging objects between processes to read about how to send/receive data between two different processes. Instead of simply setting a flag to stop the loop, it might be better to send a message signalling that the loop should be terminated.
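A rough sketch of that signalling idea, using a multiprocessing.Event in place of the plain attribute (an illustration, not the asker's original class):

import time
from multiprocessing import Process, Event

def func1(stop_requested):
    while not stop_requested.is_set():
        print('Still looping')
        time.sleep(1)
    print('Finished loop')

if __name__ == '__main__':
    stop_requested = Event()
    p1 = Process(target=func1, args=(stop_requested,))  # no parentheses after func1
    p1.start()
    time.sleep(5)
    stop_requested.set()  # the "please stop" message
    p1.join()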
Also note that processes are a heavyweight form of multiprocessing: they spawn multiple OS processes, which comes with relatively big overhead. multiprocessing's main purpose is to avoid the problems imposed by Python's Global Interpreter Lock (google this to read more...). If your problem isn't much more complex than what you've told us, you might want to use the threading package instead: threads come with less overhead than processes and also allow access to the same data (although you really should read about synchronization when doing this...).
I'm afraid multiprocessing is an inherently complex subject, so I think you will need to advance your programming/Python skills to use it successfully. But I'm sure you'll manage; the Python documentation on the subject is comprehensive, and there are a lot of other resources about it.
To tackle your EDIT2 problem, you could try using the shared memory map Value.
import time
from multiprocessing import Process, Value

class Example():
    def func1(self, cond):
        while cond.value == 1:
            print('do something')
            time.sleep(1)
        return

if __name__ == '__main__':
    ex = Example()
    cond = Value('i', 1)
    proc = Process(target=ex.func1, args=(cond,))
    proc.start()
    time.sleep(5)
    cond.value = 0
    proc.join()
(Note the target=ex.func1 without the parentheses and the comma after cond in args=(cond,).)
But look at the answer provided by MartinStettner for a good solution.
I've been working on a project I was assigned: a sort of parking lot where the cars that enter are generated automatically (done). I've put them into a waiting list (because I have to represent them with a GUI module later), to later be assigned a spot in the parking lot, and then they must leave the parking lot (also randomly).
The problem arises because I created a function that constantly creates cars at random, and now I can't call any other function while the first one is looping.
The question is: is there a way to call several looping functions at the same time?
Thanks
the question is, is there a way to call several looping functions at the same time?
This is a great question and there are several ways to do it.
Threading can let your functions run concurrently. The data flow between the threads should be managed using the Queue module:
import threading
from queue import Queue  # the "Queue" module in Python 2

# Inter-thread communication
wait_to_park = Queue()
wait_to_exit = Queue()

# Start the simulation (generate_cars, park_cars and unpark_cars
# are your three looping functions)
tg = threading.Thread(target=generate_cars)
tp = threading.Thread(target=park_cars)
tu = threading.Thread(target=unpark_cars)
tg.start(); tp.start(); tu.start()

# Wait for simulation to finish
tg.join()
wait_to_park.join()
tp.join()
wait_to_exit.join()
tu.join()
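The worker functions themselves aren't shown in the answer; roughly, they would pass cars through the queues like this (a sketch with made-up details, assuming a fixed number of cars):

from queue import Queue

NUM_CARS = 10
wait_to_park = Queue()

def generate_cars():
    for i in range(NUM_CARS):
        wait_to_park.put(f"car-{i}")  # hand each car to the parking thread

def park_cars():
    for _ in range(NUM_CARS):
        car = wait_to_park.get()      # blocks until a car is waiting
        print(f"{car} parked")
        wait_to_park.task_done()      # lets wait_to_park.join() return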
Alternatively, you can use an event loop such as the sched module to coordinate the events. Generators may help with this; they work like functions that can be suspended and restarted.
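For the sched alternative, a minimal sketch (the delays and event names are invented for the example; it loops forever, Ctrl-C to stop):

import sched, time

s = sched.scheduler(time.time, time.sleep)

def car_arrives():
    print("car arrives")
    s.enter(2, 1, car_arrives)  # schedule the next arrival in 2 s

def car_leaves():
    print("car leaves")
    s.enter(3, 1, car_leaves)   # schedule the next departure in 3 s

s.enter(0, 1, car_arrives)
s.enter(1, 1, car_leaves)
s.run()  # runs both recurring event chains on a single thread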
Maybe import random and then set up ranges within which you want certain events to happen?
import random

def mainLoop():
    while True:
        x = random.randrange(1, 100)
        if 0 < x <= 10: doSomething()
        elif 10 < x <= 60: doSomethingMoreFrequently()
        elif 60 < x <= 61: doSomethingRarely()
        # etcetera
If you LITERALLY want to call several looping functions at the same time, be prepared to learn about threading. Threading is difficult, and I never do it unless 100% necessary; this should be simple enough to achieve without it.
Don't have both loop infinitely; have each do work if needed and then return (or possibly yield). Then have your main event loop call both. Something like this:
from time import sleep

def car_arrival():
    if need_to_generate_car:
        # do car generation stuff
        return

def car_departure():
    if time_for_car_to_leave:
        # do car leaving stuff
        return

def event_loop():
    while sim_running:
        car_arrival()
        car_departure()
        sleep(0.5)