I need to start another process to run in parallel with the while loop:

while True:
    # bunch of stuff happening
    if something_happens:  # placeholder condition
        # do something (here I have something that takes time, and the
        # while loop will 'pause' until it finishes; I need the while loop
        # to somehow continue looping in parallel with this process)
        ...

I tried something like this:

while True:
    # bunch of stuff happening
    if something_happens:  # placeholder condition
        # here I tried to call another script, but the while loop won't
        # continue; it just runs this script and finishes, but I need this
        # second script to run in parallel with the while loop looping
        exec(open("filename.py").read())
You could use multiprocessing for this. Check the docs here.
Here's a minimalistic example; hope this helps you.

import multiprocessing

number_of_processes = 5

def exec_process(filename):
    # your exec code goes here
    ...

p = multiprocessing.Pool(processes=number_of_processes)
while True:
    if something_happens:  # placeholder condition
        p.apply_async(exec_process, (filename,))
# once you break out of the loop:
p.close()
p.join()
Additionally, it is also good to use a callback, which acts like a master for your processes and in which you can define your terminating conditions.
Your definition could look like:

def exec_process(filename):
    try:
        # do what it does
        return True
    except Exception:
        return False

def callback(result):
    if not result:
        # do what you want to do in case of failure,
        # e.g. something like p.terminate(), or
        # indicate failure via global variables
        pass

# Now the apply call becomes:
p.apply_async(exec_process, (filename,), callback=callback)
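Putting those pieces together, here is a self-contained sketch of the pattern; the trigger loop, filename, and sleep are placeholders made up for illustration:

import multiprocessing
import time

def exec_process(filename):
    try:
        # stand-in for the real work, e.g. exec(open(filename).read())
        time.sleep(1)
        print(f"processed {filename}")
        return True
    except Exception:
        return False

def callback(result):
    if not result:
        print("a worker reported failure")

if __name__ == "__main__":
    p = multiprocessing.Pool(processes=5)
    for _ in range(3):  # stands in for "while True: if something happens"
        p.apply_async(exec_process, ("filename.py",), callback=callback)
    p.close()  # no more tasks will be submitted
    p.join()   # wait for the workers to finish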
You can use asyncio to do that. Here's a fully working example of a basic producer/consumer:
import asyncio
import random
from datetime import datetime

from pydantic import BaseModel

class Measurement(BaseModel):
    data: float
    time: datetime

async def measure(queue: asyncio.Queue):
    while True:
        # Replicate blocking call to receive data
        await asyncio.sleep(1)
        print("Measurement complete!")
        for i in range(3):
            data = Measurement(
                data=random.random(),
                time=datetime.utcnow()
            )
            await queue.put(data)
        await queue.put(None)

async def process(queue: asyncio.Queue):
    while True:
        data = await queue.get()
        print(f"Got measurement! {data}")
        # Replicate pause for http request
        await asyncio.sleep(0.3)
        print("Sent data to server")

loop = asyncio.get_event_loop()
# Note: asyncio.Queue no longer accepts a loop argument (removed in Python 3.10)
queue = asyncio.Queue()
measurement = measure(queue)
processor = process(queue)
loop.run_until_complete(asyncio.gather(processor, measurement))
loop.close()
Can you help me see what I have misunderstood here, please? I have two functions, and I would like the second one to run regardless of the status of the first one (whether it is finished or not). Hence I was thinking of making the first function asynchronous. This is what I have done:
import os
import asyncio
from datetime import datetime

async def do_some_iterations():
    for i in range(10):
        print(datetime.now().time())
        await asyncio.sleep(1)
    print('... Cool!')

async def main():
    task = asyncio.create_task(do_some_iterations())
    await task

def do_something():
    print('something...')

if __name__ == "__main__":
    asyncio.run(main())
    do_something()
The output is:
00:46:00.145024
00:46:01.148533
00:46:02.159751
00:46:03.169868
00:46:04.179915
00:46:05.187242
00:46:06.196356
00:46:07.207614
00:46:08.215997
00:46:09.225066
... Cool!
something...
which looks like the traditional way, where one function has to finish before moving to the next call.
I was hoping instead to execute do_something() before the asynchronous function started generating the print statements (or at least near the very top of those statements).
What am I doing wrong, please? How should I edit the script?
They both need to be part of the event loop that you created. asyncio.run() itself is not async, which means it will run until the loop ends. One easy way to do this is to use gather():
import asyncio
from datetime import datetime

async def do_some_iterations():
    for i in range(10):
        print(datetime.now().time())
        await asyncio.sleep(1)
    print('... Cool!')

async def do_something():
    print('something...')

async def main():
    await asyncio.gather(
        do_some_iterations(),
        do_something()
    )

if __name__ == "__main__":
    asyncio.run(main())
    print("done")
This will print:
16:08:38.921879
something...
16:08:39.922565
16:08:40.923709
16:08:41.924823
16:08:42.926004
16:08:43.927044
16:08:44.927877
16:08:45.928724
16:08:46.929589
16:08:47.930453
... Cool!
done
You can also simply add another task:

async def main():
    task = asyncio.create_task(do_some_iterations())
    task2 = asyncio.create_task(do_something())
    # await both tasks so main() doesn't return (and asyncio.run()
    # doesn't tear the loop down) before they have finished
    await task
    await task2

In both cases the functions need to be awaitable.
I am trying to run a multi-threaded program with asyncio, but I am failing at the part of adding threads. The program runs OK with asyncio as it is:

async def main(var1, var2):
    tasks = list()
    for z in var1:
        for x in range(5):
            tasks.append(get_ip(z, var2))
    return await asyncio.gather(*tasks)

loop = asyncio.get_event_loop()
start_time = time.time()
for x in list:
    result = loop.run_until_complete(main(x, list2))
    loop.run_until_complete(release_main(result))
loop.close()

I want to have the for x in list: run in threads; I have 8 CPUs, so I would want to run it with 8 threads, for example using with concurrent.futures.ThreadPoolExecutor(max_workers=8) as executor:.
I have been reading posts and everything, but I either mess up the result in result, or break something so it doesn't work. Help/tips needed.
await loop.run_in_executor() doesn't work if I don't have that in a function. Is it really needed? When I add the above code in a function and call it, it breaks everything.
Asyncio loops are not thread-safe and are supposed to be used in a single thread, except for a few thread-safe functions, e.g. run_coroutine_threadsafe. So one thing you can do is to spawn multiple threads and then create a separate loop in each thread:
import asyncio
import multiprocessing
import threading
import queue

# Spawn threads and run a new loop in each thread
loops = queue.Queue()

def thread_job():
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    loops.put(loop)
    loop.run_forever()

for _ in range(multiprocessing.cpu_count()):
    t = threading.Thread(target=thread_job)
    t.start()

# Our main
async def main(x, ev):
    try:
        await asyncio.sleep(1)
        print(x, threading.current_thread().ident)
    finally:
        ev.set()  # events are used to signal that we are done

# Our job, which only runs main
def job(x, ev):
    loop = asyncio.get_event_loop()
    loop.create_task(main(x, ev))

# Main thread that synchronizes everything
lst = [i for i in range(3)]
events = []
for x in lst:
    ev = threading.Event()
    events.append(ev)
    # by putting the loop back into the queue
    # we effectively rotate the queue and
    # distribute the load equally
    loop = loops.get()
    loops.put(loop)
    loop.call_soon_threadsafe(job, x, ev)
for ev in events:
    ev.wait()

# Once all events are set, i.e. we are done, clean up
while not loops.empty():
    loop = loops.get()
    loop.call_soon_threadsafe(loop.stop)
I use a FIFO queue for the loops here, in order to distribute the load equally between threads.
I was experimenting with asyncio in Python and wondered what would happen if two asyncio functions running concurrently called a non-async function. So I did this:

def calc(number):
    while True:
        return number * number

async def one():
    while True:
        a = calc(5)
        print(a)
        await asyncio.sleep(0)

async def two():
    while True:
        a = calc(2)
        print(a)
        await asyncio.sleep(0)

if __name__ == '__main__':
    import os
    import uvloop
    import asyncio

    loop = uvloop.new_event_loop()
    asyncio.set_event_loop(loop)
    loop.create_task(one())
    loop.create_task(two())
    loop.run_forever()

I thought the program would freeze in the calc function (the while loop), but the program prints results concurrently. Could anyone explain why this is not getting stuck in the while loop? Thanks.
I'm playing with Python's new(ish) asyncio stuff, trying to combine its event loop with traditional threading. I have written a class that runs the event loop in its own thread, to isolate it, and then provides a (synchronous) method that runs a coroutine on that loop and returns the result. (I realise this makes it a somewhat pointless example, because it necessarily serialises everything, but it's just a proof-of-concept.)
import asyncio
import aiohttp
from threading import Thread

class Fetcher(object):
    def __init__(self):
        self._loop = asyncio.new_event_loop()
        # FIXME Do I need this? It works either way...
        # asyncio.set_event_loop(self._loop)

        self._session = aiohttp.ClientSession(loop=self._loop)

        self._thread = Thread(target=self._loop.run_forever)
        self._thread.start()

    def __enter__(self):
        return self

    def __exit__(self, *e):
        self._session.close()
        self._loop.call_soon_threadsafe(self._loop.stop)
        self._thread.join()
        self._loop.close()

    def __call__(self, url:str) -> str:
        # FIXME Can I not get a future from some method of the loop?
        future = asyncio.run_coroutine_threadsafe(self._get_response(url), self._loop)
        return future.result()

    async def _get_response(self, url:str) -> str:
        async with self._session.get(url) as response:
            assert response.status == 200
            return await response.text()

if __name__ == "__main__":
    with Fetcher() as fetcher:
        while True:
            x = input("> ")
            if x.lower() == "exit":
                break
            try:
                print(fetcher(x))
            except Exception as e:
                print(f"WTF? {e.__class__.__name__}")
To avoid this sounding too much like a "Code Review" question: what is the purpose of asyncio.set_event_loop, and do I need it in the above? It works fine with and without. Moreover, is there a loop-level method to invoke a coroutine and return a future? It seems a bit odd to do this with a module-level function.
You would need to use set_event_loop if you called get_event_loop anywhere and wanted it to return the loop created when you called new_event_loop.
From the docs:
If there’s need to set this loop as the event loop for the current context, set_event_loop() must be called explicitly.
Since you do not call get_event_loop anywhere in your example, you can omit the call to set_event_loop.
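For illustration, a minimal sketch of that interaction (this is my example, not code from the question, and it assumes it runs in the main thread):

import asyncio

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)

# Code that calls get_event_loop() from synchronous context
# now receives the loop we created above
assert asyncio.get_event_loop() is loop

loop.close()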
I might be misinterpreting, but I think the comment by @dirn in the marked answer is incorrect in stating that get_event_loop works from a thread. See the following example:
import asyncio
import threading

async def hello():
    print('started hello')
    await asyncio.sleep(5)
    print('finished hello')

def threaded_func():
    el = asyncio.get_event_loop()
    el.run_until_complete(hello())

thread = threading.Thread(target=threaded_func)
thread.start()
This produces the following error:
RuntimeError: There is no current event loop in thread 'Thread-1'.
It can be fixed by:
- el = asyncio.get_event_loop()
+ el = asyncio.new_event_loop()
The documentation also specifies that this trick (creating an event loop by calling get_event_loop) only works in the main thread:

If there is no current event loop set in the current OS thread, the OS thread is main, and set_event_loop() has not yet been called, asyncio will create a new event loop and set it as the current one.

Finally, the docs also recommend using get_running_loop instead of get_event_loop if you're on version 3.7 or higher.
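For example, inside a coroutine (a minimal sketch of the recommended call):

import asyncio

async def main():
    # get_running_loop() must be called from a coroutine or callback;
    # it raises RuntimeError if no event loop is running
    loop = asyncio.get_running_loop()
    print(loop)

asyncio.run(main())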
Let's assume I have the following code (the coroutine is named async_func here because async is a reserved word in modern Python):

import asyncio
import threading

queue = asyncio.Queue()

def threaded():
    import time
    while True:
        time.sleep(2)
        queue.put_nowait(time.time())
        print(queue.qsize())

@asyncio.coroutine
def async_func():
    while True:
        time = yield from queue.get()
        print(time)

loop = asyncio.get_event_loop()
asyncio.Task(async_func())
threading.Thread(target=threaded).start()
loop.run_forever()

The problem with this code is that the loop inside the async_func coroutine never finishes its first iteration, while the queue size keeps increasing.
Why is this happening, and what can I do to fix it?
I can't get rid of the separate thread, because in my real code I use a separate thread to communicate with a serial device, and I haven't found a way to do that using asyncio.
asyncio.Queue is not thread-safe, so you can't use it directly from more than one thread. Instead, you can use janus, which is a third-party library that provides a thread-aware asyncio queue.
import asyncio
import threading
import janus

def threaded(squeue):
    import time
    while True:
        time.sleep(2)
        squeue.put_nowait(time.time())
        print(squeue.qsize())

@asyncio.coroutine
def async_func(aqueue):
    while True:
        time = yield from aqueue.get()
        print(time)

loop = asyncio.get_event_loop()
queue = janus.Queue(loop=loop)
loop.create_task(async_func(queue.async_q))
threading.Thread(target=threaded, args=(queue.sync_q,)).start()
loop.run_forever()
There is also aioprocessing (full-disclosure: I wrote it), which provides process-safe (and as a side-effect, thread-safe) queues as well, but that's overkill if you're not trying to use multiprocessing.
Edit
As pointed out in other answers, for simple use-cases you can use loop.call_soon_threadsafe to add to the queue as well.
If you do not want to use another library you can schedule a coroutine from the thread. Replacing the queue.put_nowait with the following works fine.
asyncio.run_coroutine_threadsafe(queue.put(time.time()), loop)
The variable loop represents the event loop in the main thread.
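A self-contained sketch of this approach (the consumer name, timing, and daemon flag are my additions for illustration):

import asyncio
import threading
import time

queue = asyncio.Queue()
loop = asyncio.new_event_loop()

def threaded():
    while True:
        time.sleep(2)
        # schedule queue.put() on the main thread's loop; this call is thread-safe
        asyncio.run_coroutine_threadsafe(queue.put(time.time()), loop)

async def consumer():
    while True:
        print(await queue.get())

asyncio.set_event_loop(loop)
loop.create_task(consumer())
threading.Thread(target=threaded, daemon=True).start()
loop.run_forever()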
EDIT:

The reason why your async coroutine is not doing anything is that the event loop never gives it a chance to do so. The queue object is not threadsafe, and if you dig through the CPython code you'll find that this means put_nowait wakes up consumers of the queue through the use of a future with the call_soon method of the event loop. If we could make it use call_soon_threadsafe, it should work. The major difference between call_soon and call_soon_threadsafe, however, is that call_soon_threadsafe wakes up the event loop by calling loop._write_to_self(). So let's call it ourselves:
import asyncio
import threading

queue = asyncio.Queue()

def threaded():
    import time
    while True:
        time.sleep(2)
        queue.put_nowait(time.time())
        queue._loop._write_to_self()
        print(queue.qsize())

@asyncio.coroutine
def async_func():
    while True:
        time = yield from queue.get()
        print(time)

loop = asyncio.get_event_loop()
asyncio.Task(async_func())
threading.Thread(target=threaded).start()
loop.run_forever()

Then, everything works as expected.
As for the threadsafe aspect of accessing shared objects, asyncio.Queue uses collections.deque under the hood, which has threadsafe append and popleft. Maybe checking that the queue is not empty and then calling popleft is not atomic, but if you consume the queue in only one thread (the one of the event loop) it could be fine.
The other proposed solutions, loop.call_soon_threadsafe from Huazuo Gao's answer and my asyncio.run_coroutine_threadsafe, are just doing this: waking up the event loop.
BaseEventLoop.call_soon_threadsafe is at hand. See the asyncio docs for details.
Simply change your threaded() like this:
def threaded():
    import time
    while True:
        time.sleep(1)
        loop.call_soon_threadsafe(queue.put_nowait, time.time())
        loop.call_soon_threadsafe(lambda: print(queue.qsize()))
Here's a sample output:
0
1443857763.3355968
0
1443857764.3368602
0
1443857765.338082
0
1443857766.3392274
0
1443857767.3403943
What about just using threading.Lock with asyncio.Queue?
import asyncio
import queue
import threading

class ThreadSafeAsyncFuture(asyncio.Future):
    """ asyncio.Future is not thread-safe
    https://stackoverflow.com/questions/33000200/asyncio-wait-for-event-from-other-thread
    """
    def set_result(self, result):
        func = super().set_result
        call = lambda: func(result)
        self._loop.call_soon_threadsafe(call)  # Warning: self._loop is undocumented

class ThreadSafeAsyncQueue(queue.Queue):
    """ asyncio.Queue is not thread-safe, threading.Queue is not awaitable
    works only with one putter to an unlimited-size queue and with several getters
    TODO: add maxsize limits
    TODO: make put a coroutine
    """
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.lock = threading.Lock()
        self.loop = asyncio.get_event_loop()
        self.waiters = []

    def put(self, item):
        with self.lock:
            if self.waiters:
                self.waiters.pop(0).set_result(item)
            else:
                super().put(item)

    async def get(self):
        with self.lock:
            if not self.empty():
                return super().get()
            else:
                fut = ThreadSafeAsyncFuture()
                self.waiters.append(fut)
        # await outside the lock, otherwise a putter thread could never acquire it
        result = await fut
        return result
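A hypothetical usage sketch (the producer function and timings are mine, assuming the class definitions above are in scope): create the queue on the loop thread, put from a worker thread, and await get from a coroutine:

import time

async def main():
    q = ThreadSafeAsyncQueue()  # created on the loop thread, so it picks up this loop

    def producer():
        for i in range(3):
            time.sleep(0.5)  # blocking work in the worker thread
            q.put(i)         # thread-safe put

    threading.Thread(target=producer).start()
    for _ in range(3):
        print("got", await q.get())  # awaitable get on the loop side

asyncio.run(main())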
See also - asyncio: Wait for event from other thread