I am trying to understand asyncio and port my understanding of threading. I will take the example of two threads running indefinitely and a non-threaded loop (all of them printing to the console).
The threading version is
import threading
import time

def a():
    while True:
        time.sleep(1)
        print('a')

def b():
    while True:
        time.sleep(2)
        print('b')

threading.Thread(target=a).start()
threading.Thread(target=b).start()

while True:
    time.sleep(3)
    print('c')
I now tried to port this to asyncio based on the documentation.
Problem 1: I do not understand how to add the non-threaded task, as all the examples I have seen end with an ongoing loop that governs the asyncio tasks.
I then wanted to have at least the first two threads (a and b) running in parallel (and, worst case, add the third, c, as a task as well, abandoning the idea of mixing threaded and non-threaded operations):
import asyncio
import time

async def a():
    while True:
        await asyncio.sleep(1)
        print('a')

async def b():
    while True:
        await asyncio.sleep(2)
        print('b')

async def mainloop():
    await a()
    await b()

loop = asyncio.get_event_loop()
loop.run_until_complete(mainloop())
loop.close()
Problem 2: The output is a sequence of a's, suggesting that the b() coroutine is never called. Isn't await supposed to start a() and come back to the execution (and then start b())?
await suspends execution at that point: you do await a(), and a() contains an infinite loop, so it is logical that b() never gets called. Think of it as if a() were inserted into mainloop().
Consider this example:
async def main():
    while True:
        await asyncio.sleep(1)
        print('in')
    print('out (never gets printed)')
To achieve what you want you need to create a future that manages multiple coroutines; asyncio.gather is for that.
import asyncio

async def a():
    while True:
        await asyncio.sleep(1)
        print('a')

async def b():
    while True:
        await asyncio.sleep(2)
        print('b')

async def main():
    await asyncio.gather(a(), b())

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
loop.close()
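Problem 1 (the non-threaded c loop) can also be solved by turning c into a third coroutine and gathering all three; since Python 3.7 asyncio.run() replaces the get_event_loop()/run_until_complete()/close() dance. A minimal sketch (names and shortened delays are mine, and the demo is bounded with wait_for only so it terminates):

```python
import asyncio
from contextlib import suppress

async def ticker(name, delay):
    # one coroutine replaces each of the threads a, b and the main c loop
    while True:
        await asyncio.sleep(delay)
        print(name)

async def main():
    # stop the otherwise-infinite demo after half a second
    with suppress(asyncio.TimeoutError):
        await asyncio.wait_for(
            asyncio.gather(ticker('a', 0.1), ticker('b', 0.2), ticker('c', 0.3)),
            timeout=0.5)

asyncio.run(main())
```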
Related
Is there a way to create a secondary asyncio loop (or prioritize an await) such that, when awaited, it does not pass control back to the main event loop but instead waits for those 'sub' functions? I.e.:
import asyncio

async def priority1():
    print("p1 before sleep")
    await asyncio.sleep(11)
    print("p1 after sleep")

async def priority2():
    print("p2 before sleep")
    await asyncio.sleep(11)
    print("p2 after sleep")

async def foo():
    while True:
        print("foo before sleep")
        # do not pass control back to main event loop here but run these 2 async
        await asyncio.gather(priority1(), priority2())
        print("foo after sleep")

async def bar():
    while True:
        print("bar before sleep")
        await asyncio.sleep(5)
        print("bar after sleep")

async def main():
    await asyncio.gather(foo(), bar())

asyncio.run(main())
I would like foo to wait for priority1/2 to finish before passing control back to the main event loop.
Right now it would go:
foo before sleep
bar before sleep
p1 before sleep
p2 before sleep
bar after sleep
I would like to see:
foo before sleep
bar before sleep
p1 before sleep
p2 before sleep
p1 after sleep
p2 after sleep
bar after sleep
Is this possible?
Thanks
It is not possible to run two event loops in the same thread. The code in asyncio, and the specs, even look as though they were designed to permit that - but the API has since bent in a way that no longer allows it (for example, the explicit loop parameter for several of the relevant calls has been deprecated and removed).
Along the same line, there is no way to prioritize a subset of tasks in the running loop. I answered a question along this line a couple of weeks ago, and managed to get to a synchronization primitive to be used in place of asyncio.sleep which can take priority into account - but it requires all participating tasks to call it, so it would not be much different from a lock (I will link to it below - the idea is: your code calls await custom.sleep() at certain points, and it only returns when there are no other higher-priority tasks also calling that custom.sleep()) - check it here: Execute asyncio task as soon as possible
When writing that code, I realized that it is possible to write an event loop which takes a "priority" attribute on tasks into account. But making that production-grade requires some non-hobby work; by using such a loop, you could get what you are asking for, with no changes needed in the task code.
However, I think running a secondary loop in another thread, and then waiting synchronously on that thread to complete, is a way to get your things accomplished:
import asyncio
import threading

def priority(func):
    async def wrapper(*args, **kwargs):
        result = None
        def runner(*args, **kw):
            nonlocal result
            result = asyncio.run(func(*args, **kw))
        t = threading.Thread(target=runner, args=args, kwargs=kwargs)
        await asyncio.sleep(0)
        t.start()
        # if one wants to perform limited yields to the main loop, it should be done here
        t.join()
        return result
    return wrapper

async def priority1():
    print("p1 before sleep")
    await asyncio.sleep(.11)
    print("p1 after sleep")

async def priority2():
    print("p2 before sleep")
    await asyncio.sleep(.11)
    print("p2 after sleep")

@priority
async def foo():
    print("foo before sleep")
    # do not pass control back to main event loop here but run these 2 async
    await asyncio.gather(priority1(), priority2())
    print("foo after sleep")

async def bar():
    print("bar before sleep")
    await asyncio.sleep(.05)
    print("bar after sleep")

async def main():
    await asyncio.gather(foo(), bar())

asyncio.run(main())
Any reason why I can't use an asyncio.Condition within a Task?
c = asyncio.Condition()

async def a():
    print("A ..")
    # await asyncio.sleep(0.2) # This works
    async with c:
        # RuntimeError: Task <Task pending coro=<a() running at ..this file..:13>> got Future <Future pending> attached to a different loop
        await c.wait()

async def main():
    asyncio.get_event_loop().create_task(a())
    await asyncio.sleep(2)
Says: "Got Future attached to a different loop"
I don't think I created a new loop.
Full example here:
import asyncio

c = asyncio.Condition()

async def a():
    print("A ..")
    # await asyncio.sleep(0.2) # This works
    async with c:
        # RuntimeError: Task <Task pending coro=<a() running at ..this file..:13>> got Future <Future pending> attached to a different loop
        await c.wait()
    print("A done")

async def b():
    await asyncio.sleep(2)
    print("B ..")
    async with c:
        c.notify_all()
    print("B done")
    await asyncio.sleep(1)

async def main():
    asyncio.get_event_loop().create_task(a())
    await b()

asyncio.run(main())
I see the same error using Python 3.7, 3.8, and 3.9.
The docs for asyncio.Condition.notify_all() state:
The lock must be acquired before this method is called and released shortly after. If called with an unlocked lock a RuntimeError error is raised.
The lock gets released inside a() when it calls c.wait(); therefore the Lock inside c is unlocked when you call c.notify_all().
You need to hold the lock before calling notify_all(). Using
async with c:
    c.notify_all()
makes your example work as expected.
UPDATE
I tested this only on Python 3.10.1, where it worked as described. In fact it fails when I run it on Python 3.8.5. But the problem here stems from the use of the Condition as a global variable: in your example the Condition is created before the event loop is created, so I assume it is not correctly attached to the event loop that is created later.
I updated your example so that the Condition is created while a loop is running. This makes the example work again in Python 3.8.5:
import asyncio

async def a(c):
    print("A ..")
    async with c:
        await c.wait()
    print("A done")

async def b(c):
    await asyncio.sleep(2)
    print("B ..")
    async with c:
        c.notify_all()
    print("B done")
    await asyncio.sleep(1)

async def main():
    c = asyncio.Condition()  # loop already running
    asyncio.create_task(a(c))  # get_event_loop() also works but its use is discouraged in the docs
    await b(c)

asyncio.run(main())
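The same caveat applies to the other asyncio synchronization primitives (Lock, Event, Semaphore, Queue) on Python < 3.10: they bind to a loop at construction time, so create them after the loop is running and pass them to the tasks that need them. A minimal sketch with an Event (my own example, not from the answer above):

```python
import asyncio

async def waiter(event):
    await event.wait()   # suspends until event.set() is called
    print("released")

async def main():
    event = asyncio.Event()  # created while the loop is already running
    task = asyncio.create_task(waiter(event))
    await asyncio.sleep(0.1)
    event.set()              # wake the waiting task
    await task

asyncio.run(main())
```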
As @JanWilamowski points out, this seems to be a bug in Python < 3.10. Works fine in 3.10.
Booo!
It also seems that you can't work around this bug by switching from Condition to Queue (see below) - I guess Queue uses a Condition internally.
Similarly ... this also failed:
async def main():
    loop.create_task(a())
    ...

loop = asyncio.new_event_loop()
loop.run_until_complete(main())
# asyncio.run(main())
However, for reasons unclear, this does work:
async def main():
    loop.create_task(a())
    ...

loop = asyncio.get_event_loop()  # <- the difference
loop.run_until_complete(main())
Full working example below:
import asyncio
import time

USE_QUEUE = False

if not USE_QUEUE:
    c = asyncio.Condition()
else:
    q = asyncio.Queue()

async def a():
    print("A ..")
    # await asyncio.sleep(0.2) # This works
    if not USE_QUEUE:
        async with c:
            await c.wait()
    else:
        result = await q.get()
        q.task_done()
        print("result", result)
    print("A done", time.time())

async def b():
    await asyncio.sleep(1)
    print("B ..", time.time())
    if not USE_QUEUE:
        async with c:
            c.notify_all()
    else:
        await q.put(123)  # put() returns None, nothing to capture
    await asyncio.sleep(1)
    print("B done")

async def main():
    loop.create_task(a())
    # asyncio.get_event_loop().create_task(a())
    await b()

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
I'm trying to understand how Python's asyncio works, and there is one particular situation that confuses me: what happens if you await a coroutine that is just a normal routine in all but name? E.g.:
async def A():
    time.wait(1)

async def B():
    await A()
Will this just run like a normal synchronous program, making the async and awaits superfluous? Or will A() be split off into a separate thread to run concurrently with other awaited functions?
As python_user says, there is only one thread; asyncio multiplexes functions inside a single event loop. For this to work, slow (IO-bound but not CPU-intensive) functions must hand execution control back to the event loop, and await does this. Note that time.wait() does not exist; use time.sleep(), or the awaitable asyncio.sleep().
import asyncio
import time

async def A():
    time.sleep(1.0)  # blocks, stops B() if called first

async def B():
    await asyncio.sleep(1.0)  # hands back execution, A() can run while timing

async def main():
    t0 = time.time()
    await asyncio.gather(A(), B())  # takes 2 seconds
    print(time.time() - t0)
    t0 = time.time()
    await asyncio.gather(B(), A())  # takes 1 second
    print(time.time() - t0)

if __name__ == '__main__':
    asyncio.run(main())
I want to make a timer which is started in a normal function, but inside the timer function it should be able to call an async function.
I want to do something like this:
startTimer()

while True:
    print("e")

def startTimer(waitForSeconds: int):
    # Wait for `waitForSeconds`
    await myAsyncFunc()

async def myAsyncFunc():
    print("in my async func")
Where the while True loop should do its stuff, and after waitForSeconds the timer should execute another async function; the waiting shouldn't block any other actions and doesn't need to be awaited.
If something isn't understandable, I'm sorry; I'll try to explain it then.
Thanks
If you want to run your synchronous and asynchronous code in parallel, you will need to run one of them in a separate thread. For example:
import asyncio

def sync_code():
    while True:
        print("e")

async def async_func():
    print("in my async func")

async def start_timer(secs):
    await asyncio.sleep(secs)
    await async_func()

async def main():
    asyncio.create_task(start_timer(1))
    loop = asyncio.get_event_loop()
    # use run_in_executor to run sync code in a separate thread
    # while this thread runs the event loop
    await loop.run_in_executor(None, sync_code)

asyncio.run(main())
If the above is not acceptable for you (e.g. because it turns the whole program into an asyncio program), you can also run the event loop in a background thread and submit tasks to it using asyncio.run_coroutine_threadsafe. That approach would allow startTimer to have the signature (and interface) you wanted:
import asyncio
import threading

def startTimer(waitForSeconds):
    loop = asyncio.new_event_loop()
    threading.Thread(daemon=True, target=loop.run_forever).start()
    async def sleep_and_run():
        await asyncio.sleep(waitForSeconds)
        await myAsyncFunc()
    asyncio.run_coroutine_threadsafe(sleep_and_run(), loop)

async def myAsyncFunc():
    print("in my async func")

startTimer(1)
while True:
    print("e")
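Worth noting (an addition of mine, not in the answer above): asyncio.run_coroutine_threadsafe returns a concurrent.futures.Future, so the synchronous caller can later block on .result() if it does want the coroutine's value. A small sketch:

```python
import asyncio
import threading

async def add(a, b):
    await asyncio.sleep(0.05)
    return a + b

# start an event loop in a daemon thread, as in the answer above
loop = asyncio.new_event_loop()
threading.Thread(target=loop.run_forever, daemon=True).start()

# schedule the coroutine on that loop from synchronous code;
# .result() blocks at most 1 s waiting for the answer
future = asyncio.run_coroutine_threadsafe(add(2, 3), loop)
print(future.result(timeout=1))
```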
I'm pretty sure you are familiar with concurrent processing, but you didn't show exactly what you want. If I understand you correctly, you want two tasks: the first only does the while True loop, and the second is the timer that waits (e.g. 5 s) and then calls the async task. I assume you are using asyncio, according to the tags:
import asyncio

async def myAsyncFunc():
    print("in my async func")

async def call_after(delay):
    await asyncio.sleep(delay)
    await myAsyncFunc()

async def while_true():
    while True:
        await asyncio.sleep(1)  # sleep here to avoid too large an output
        print("e")

async def main():
    task1 = asyncio.create_task(while_true())
    task2 = asyncio.create_task(call_after(5))
    # while_true() runs forever, so awaiting task1 never returns;
    # call_after(5) fires after about 5 seconds
    await task1
    await task2

asyncio.run(main())
Sometimes there is some non-critical asynchronous operation that needs to happen, but I don't want to wait for it to complete. In Tornado's coroutine implementation you can "fire & forget" an asynchronous function by simply omitting the yield keyword.
I've been trying to figure out how to "fire & forget" with the new async/await syntax released in Python 3.5. E.g., a simplified code snippet:
async def async_foo():
    print("Do some stuff asynchronously here...")

def bar():
    async_foo()  # fire and forget "async_foo()"

bar()
What happens though is that async_foo() never executes, and instead we get a runtime warning:
RuntimeWarning: coroutine 'async_foo' was never awaited
  async_foo() # fire and forget "async_foo()"
Upd:
Replace asyncio.ensure_future with asyncio.create_task everywhere if you're using Python >= 3.7; it's a newer, nicer way to spawn tasks.
asyncio.Task to "fire and forget"
According to the Python docs for asyncio.Task, it is possible to start a coroutine executing "in the background". The task created by asyncio.ensure_future won't block the execution (therefore the function will return immediately!). This looks like a way to "fire and forget" as you requested.
import asyncio

async def async_foo():
    print("async_foo started")
    await asyncio.sleep(1)
    print("async_foo done")

async def main():
    asyncio.ensure_future(async_foo())  # fire and forget async_foo()
    # btw, you can also create tasks inside non-async funcs
    print('Do some actions 1')
    await asyncio.sleep(1)
    print('Do some actions 2')
    await asyncio.sleep(1)
    print('Do some actions 3')

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
Output:
Do some actions 1
async_foo started
Do some actions 2
async_foo done
Do some actions 3
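For completeness, here is the same example with the Python 3.7+ API (asyncio.run plus asyncio.create_task, per the update above) and shortened sleeps; the interleaving is the same:

```python
import asyncio

async def async_foo():
    print("async_foo started")
    await asyncio.sleep(0.1)
    print("async_foo done")

async def main():
    asyncio.create_task(async_foo())  # fire and forget (3.7+)
    print("Do some actions 1")
    await asyncio.sleep(0.2)          # long enough for async_foo to finish
    print("Do some actions 2")

asyncio.run(main())
```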
What if tasks are executing after the event loop has completed?
Note that asyncio expects tasks to be completed at the moment the event loop completes. So if you change main() to:
async def main():
    asyncio.ensure_future(async_foo())  # fire and forget
    print('Do some actions 1')
    await asyncio.sleep(0.1)
    print('Do some actions 2')
You'll get this warning after the program finishes:
Task was destroyed but it is pending!
task: <Task pending coro=<async_foo() running at [...]
To prevent that you can just await all pending tasks after the event loop has completed:
async def main():
    asyncio.ensure_future(async_foo())  # fire and forget
    print('Do some actions 1')
    await asyncio.sleep(0.1)
    print('Do some actions 2')

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
    # Let's also finish all running tasks:
    pending = asyncio.Task.all_tasks()  # asyncio.all_tasks() in Python 3.7+
    loop.run_until_complete(asyncio.gather(*pending))
Kill tasks instead of awaiting them
Sometimes you don't want to wait for tasks to finish (for example, some tasks may be created to run forever). In that case, you can just cancel() them instead of awaiting them:
import asyncio
from contextlib import suppress

async def echo_forever():
    while True:
        print("echo")
        await asyncio.sleep(1)

async def main():
    asyncio.ensure_future(echo_forever())  # fire and forget
    print('Do some actions 1')
    await asyncio.sleep(1)
    print('Do some actions 2')
    await asyncio.sleep(1)
    print('Do some actions 3')

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
    # Let's also cancel all running tasks:
    pending = asyncio.Task.all_tasks()
    for task in pending:
        task.cancel()
        # Now we should await the task to execute its cancellation.
        # A cancelled task raises asyncio.CancelledError, which we can suppress:
        with suppress(asyncio.CancelledError):
            loop.run_until_complete(task)
Output:
Do some actions 1
echo
Do some actions 2
echo
Do some actions 3
echo
Output:
>>> Hello
>>> foo() started
>>> I didn't wait for foo()
>>> foo() completed
Here is a simple decorator function which pushes the execution to the background; control moves on to the next line of code.
The primary advantage is that you don't have to declare the function as async, or await it.
import asyncio
import functools
import time

def fire_and_forget(f):
    def wrapped(*args, **kwargs):
        # run_in_executor only forwards positional args, so bind kwargs with partial
        return asyncio.get_event_loop().run_in_executor(
            None, functools.partial(f, *args, **kwargs))
    return wrapped

@fire_and_forget
def foo():
    print("foo() started")
    time.sleep(1)
    print("foo() completed")

print("Hello")
foo()
print("I didn't wait for foo()")
Note: Check my other answer which does the same using plain thread without asyncio.
This is not entirely asynchronous execution, but maybe run_in_executor() is suitable for you.
import asyncio
import functools

def fire_and_forget(task, *args, **kwargs):
    loop = asyncio.get_event_loop()
    if callable(task):
        # run_in_executor takes no keyword arguments, so bind them with partial
        return loop.run_in_executor(None, functools.partial(task, *args, **kwargs))
    else:
        raise TypeError('Task must be a callable')

def foo():
    pass  # asynchronous stuff here

fire_and_forget(foo)
If for some reason you are unable to use asyncio, here is the implementation using plain threads. Check my other answers and Sergey's answer too.
import threading, time

def fire_and_forget(f):
    def wrapped():
        threading.Thread(target=f).start()
    return wrapped

@fire_and_forget
def foo():
    print("foo() started")
    time.sleep(1)
    print("foo() completed")

print("Hello")
foo()
print("I didn't wait for foo()")
produces
>>> Hello
>>> foo() started
>>> I didn't wait for foo()
>>> foo() completed
import functools
import threading

def fire_and_forget(f):
    def wrapped(*args, **kwargs):
        threading.Thread(target=functools.partial(f, *args, **kwargs)).start()
    return wrapped

is the better version of the above -- it forwards arguments and does not use asyncio
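A quick usage sketch of that final variant (the names are mine; returning the Thread and joining it are only for the demo, so the side effect can be observed before the program exits):

```python
import functools
import threading
import time

def fire_and_forget(f):
    def wrapped(*args, **kwargs):
        thread = threading.Thread(target=functools.partial(f, *args, **kwargs))
        thread.start()
        return thread  # returned only so the demo can join; callers may ignore it
    return wrapped

results = []

@fire_and_forget
def slow_append(value):
    time.sleep(0.1)
    results.append(value)

t = slow_append(42)             # returns immediately
print("not blocked:", results)  # the background thread hasn't appended yet
t.join()                        # demo only: wait so we can see the effect
print("after join:", results)
```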