Asyncio.sleep causes script to End Immediately - python

In my simple asyncio Python program below, bar_loop is supposed to run continuously with a 1 second delay between loops.
Things run as expected when we simply have:
async def bar_loop(self):
    while True:
        print('bar')
However, when we add an asyncio.sleep(1), the loop ends instead of looping:
async def bar_loop(self):
    while True:
        print('bar')
        await asyncio.sleep(1)
Why does asyncio.sleep() cause bar_loop to exit immediately? How can we let it loop with a 1 sec delay?
Full Example:
import asyncio
from typing import Optional

class Foo:
    def __init__(self):
        self.bar_loop_task: Optional[asyncio.Task] = None

    async def start(self):
        self.bar_loop_task = asyncio.create_task(self.bar_loop())

    async def stop(self):
        if self.bar_loop_task is not None:
            self.bar_loop_task.cancel()

    async def bar_loop(self):
        while True:
            print('bar')
            await asyncio.sleep(1)

if __name__ == '__main__':
    try:
        foo = Foo()
        asyncio.run(foo.start())
    except KeyboardInterrupt:
        asyncio.run(foo.stop())
Using Python 3.9.5 on Ubuntu 20.04.

This behavior has nothing to do with calling asyncio.sleep, but with the expected behavior of creating a task and doing nothing else.
Tasks run in parallel in the asyncio loop, while other code that uses just coroutines and await expressions can be thought of as running in a linear pattern. However, because tasks are "out of the way" of that - let's call it the "visible path of execution" - they also won't keep that flow from finishing.
In this case, your program simply reaches the end of the start method with nothing left being "awaited", so the asyncio loop simply finishes its execution.
If you have no explicit code to run in parallel with bar_loop, just await the task. Change your start method to read:
async def start(self):
    self.bar_loop_task = asyncio.create_task(self.bar_loop())
    try:
        await self.bar_loop_task
    except XXX:
        # handle exceptions that might have taken place inside the task
        pass
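Put together with the rest of the example, a minimal sketch of the fixed program could look like this (catching asyncio.CancelledError is just one reasonable choice for the placeholder exception above, not necessarily what you want in production):

import asyncio
from typing import Optional

class Foo:
    def __init__(self):
        self.bar_loop_task: Optional[asyncio.Task] = None

    async def start(self):
        self.bar_loop_task = asyncio.create_task(self.bar_loop())
        try:
            # Awaiting the task keeps the event loop alive until the task finishes or is cancelled.
            await self.bar_loop_task
        except asyncio.CancelledError:
            pass

    async def stop(self):
        if self.bar_loop_task is not None:
            self.bar_loop_task.cancel()

    async def bar_loop(self):
        while True:
            print('bar')
            await asyncio.sleep(1)

if __name__ == '__main__':
    try:
        asyncio.run(Foo().start())
    except KeyboardInterrupt:
        pass

With this change, 'bar' is printed once a second until the program is interrupted with Ctrl-C.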

Related

async function in Python, basic example

Can you help me see what I have misunderstood here, please? I have two functions, and I would like the second one to run regardless of the status of the first one (whether it has finished or not). Hence I was thinking of making the first function asynchronous. This is what I have done:
import os
import asyncio
from datetime import datetime
async def do_some_iterations():
    for i in range(10):
        print(datetime.now().time())
        await asyncio.sleep(1)
    print('... Cool!')

async def main():
    task = asyncio.create_task(do_some_iterations())
    await task

def do_something():
    print('something...')

if __name__ == "__main__":
    asyncio.run(main())
    do_something()
The output is:
00:46:00.145024
00:46:01.148533
00:46:02.159751
00:46:03.169868
00:46:04.179915
00:46:05.187242
00:46:06.196356
00:46:07.207614
00:46:08.215997
00:46:09.225066
Cool!
something...
which looks like the traditional way where one function has to finish and then move to the next call.
I was hoping instead to execute do_something() before the asynchronous function started generating the print statements (or at least near the very top of those statements).
What am I doing wrong, please? How should I edit the script?
They both need to be part of the event loop that you created. asyncio.run() itself is not async, which means it will run until the loop ends. One easy way to do this is to use gather():
import asyncio
from datetime import datetime
async def do_some_iterations():
    for i in range(10):
        print(datetime.now().time())
        await asyncio.sleep(1)
    print('... Cool!')

async def do_something():
    print('something...')

async def main():
    await asyncio.gather(
        do_some_iterations(),
        do_something()
    )

if __name__ == "__main__":
    asyncio.run(main())
    print("done")
This will print:
16:08:38.921879
something...
16:08:39.922565
16:08:40.923709
16:08:41.924823
16:08:42.926004
16:08:43.927044
16:08:44.927877
16:08:45.928724
16:08:46.929589
16:08:47.930453
... Cool!
done
You can also simply add another task:
async def main():
    task = asyncio.create_task(do_some_iterations())
    task2 = asyncio.create_task(do_something())
In both cases the function needs to be awaitable.
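As written, though, that main() returns as soon as the two tasks are created, and asyncio.run() then cancels whatever is still pending, so the iterations never finish. A minimal sketch that lets both tasks complete simply awaits them afterwards:

async def main():
    task = asyncio.create_task(do_some_iterations())
    task2 = asyncio.create_task(do_something())
    # Await both tasks so main() does not return (and the loop does not shut down) before they finish.
    await task
    await task2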

Python async function is never awaited

I implemented an async function in Python's asynchronous framework FastAPI.
The function looks like:
async def func2(num):
    time.sleep(3)
    return num

async def func1():
    text = await func2(5)
    print(text)
    print('inside func1')

async def my_async_func():
    print('start')
    await func1()
    print('finish')
Here, when I execute my_async_func, I'm expecting async behavior and the values to be printed as:
start
finish
inside func1
5
But it prints synchronously as
start
5
inside func1
finish
How do I handle concurrent operations and implement these coroutines so they run asynchronously?
Use asyncio.create_task. (Note: you can't guarantee that 5 is printed after inside func1, given the order of the print calls.)
Try the code below:
import asyncio
import time
async def func2(num):
    time.sleep(3)
    return num

async def func1():
    text = await func2(5)
    print(text)
    print('inside func1')

async def my_async_func():
    print('start')
    asyncio.create_task(func1())
    print('finish')

asyncio.run(my_async_func())
Result:
start
finish
5
inside func1
Also notice that time.sleep blocks the whole thread while it sleeps.
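If the 3-second delay itself is what you want, a minimal non-blocking sketch of func2 would use asyncio.sleep instead:

async def func2(num):
    # asyncio.sleep yields control back to the event loop instead of blocking the whole thread.
    await asyncio.sleep(3)
    return num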
Asynchronous behavior shows up when several independent(ish) tasks take turns executing in an event loop, but here you only run the single task my_async_func. my_async_func then calls func1, which then calls func2; your program is executing in exactly the order you wrote it.
Calling this chain of function calls synchronous or asynchronous isn't really meaningful, because there is only one independent task. You can actually see asynchronous behavior if you queue up two such tasks.
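Here is a minimal sketch of that idea (the worker function and delays are just illustrative; the point is that each task awaits something, so the other gets a turn):

import asyncio

async def worker(name):
    for i in range(3):
        print(f'{name}: step {i}')
        await asyncio.sleep(1)  # give the other task a chance to run

async def main():
    # Two independent tasks take turns in the event loop, so their output interleaves.
    await asyncio.gather(worker('task-1'), worker('task-2'))

asyncio.run(main())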

await does not return the value of a future with the value set in a thread

This code should print "hi" 3 times, but it doesn't always print.
I made a gif that shows the code being executed:
from asyncio import get_event_loop, wait_for, new_event_loop
from threading import Thread
class A:
    def __init__(self):
        self.fut = None

    def start(self):
        """
        Expects a future to be created and puts "hi" as a result
        """
        async def foo():
            while True:
                if self.fut:
                    self.fut.set_result('hi')
                    self.fut = None
        new_event_loop().run_until_complete(foo())

    async def make(self):
        """
        Create a future and print your result when it runs out
        """
        future = get_event_loop().create_future()
        self.fut = future
        print(await future)

a = A()
Thread(target=a.start).start()

for _ in range(3):
    get_event_loop().run_until_complete(a.make())
This is caused by await future, because when I change
print(await future)
to
while not future.done():
    pass
print(future.result())
the code always prints "hi" 3 times.
Is there anything in my code that causes this problem in await future?
Asyncio functions are not thread-safe, except where explicitly noted. For set_result to work from another thread, you'd need to call it through call_soon_threadsafe.
But in your case this wouldn't work because A.start creates a different event loop than the one the main thread executes. This creates issues because the futures created in one loop cannot be awaited in another one. Because of this, and also because there is just no need to create multiple event loops, you should pass the event loop instance to A.start and use it for your async needs.
But - when using the event loop from the main thread, A.start cannot call run_until_complete() because that would try to run an already running event loop. Instead, it must call asyncio.run_coroutine_threadsafe to submit the coroutine to the event loop running in the main thread. This will return a concurrent.futures.Future (not to be confused with an asyncio Future) whose result() method can be used to wait for it to execute and propagate the result or exception, just like run_until_complete() would have done. Since foo will now run in the same thread as the event loop, it can just call set_result without call_soon_threadsafe.
One final problem is that foo contains an infinite loop that doesn't await anything, which blocks the event loop. (Remember that asyncio is based on cooperative multitasking, and a coroutine that spins without awaiting doesn't cooperate.) To fix that, you can have foo monitor an event that gets triggered when a new future is available.
Applying the above to your code can look like this, which prints "hi" three times as desired:
import asyncio
from asyncio import get_event_loop
from threading import Thread
class A:
    def __init__(self):
        self.fut = None
        self.have_fut = asyncio.Event()

    def start(self, loop):
        async def foo():
            while True:
                await self.have_fut.wait()
                self.have_fut.clear()
                if self.fut:
                    self.fut.set_result('hi')
                    self.fut = None
        asyncio.run_coroutine_threadsafe(foo(), loop).result()

    async def make(self):
        future = get_event_loop().create_future()
        self.fut = future
        self.have_fut.set()
        print(await future)

a = A()
Thread(target=a.start, args=(get_event_loop(),), daemon=True).start()

for _ in range(3):
    get_event_loop().run_until_complete(a.make())

Running an event loop within its own thread

I'm playing with Python's new(ish) asyncio stuff, trying to combine its event loop with traditional threading. I have written a class that runs the event loop in its own thread, to isolate it, and then provide a (synchronous) method that runs a coroutine on that loop and returns the result. (I realise this makes it a somewhat pointless example, because it necessarily serialises everything, but it's just as a proof-of-concept).
import asyncio
import aiohttp
from threading import Thread
class Fetcher(object):
    def __init__(self):
        self._loop = asyncio.new_event_loop()
        # FIXME Do I need this? It works either way...
        #asyncio.set_event_loop(self._loop)

        self._session = aiohttp.ClientSession(loop=self._loop)

        self._thread = Thread(target=self._loop.run_forever)
        self._thread.start()

    def __enter__(self):
        return self

    def __exit__(self, *e):
        self._session.close()
        self._loop.call_soon_threadsafe(self._loop.stop)
        self._thread.join()
        self._loop.close()

    def __call__(self, url:str) -> str:
        # FIXME Can I not get a future from some method of the loop?
        future = asyncio.run_coroutine_threadsafe(self._get_response(url), self._loop)
        return future.result()

    async def _get_response(self, url:str) -> str:
        async with self._session.get(url) as response:
            assert response.status == 200
            return await response.text()

if __name__ == "__main__":
    with Fetcher() as fetcher:
        while True:
            x = input("> ")
            if x.lower() == "exit":
                break

            try:
                print(fetcher(x))
            except Exception as e:
                print(f"WTF? {e.__class__.__name__}")
To avoid this sounding too much like a "Code Review" question: what is the purpose of asyncio.set_event_loop and do I need it in the above? It works fine with and without it. Moreover, is there a loop-level method to invoke a coroutine and return a future? It seems a bit odd to do this with a module-level function.
You would need to use set_event_loop if you called get_event_loop anywhere and wanted it to return the loop created when you called new_event_loop.
From the docs
If there’s need to set this loop as the event loop for the current context, set_event_loop() must be called explicitly.
Since you do not call get_event_loop anywhere in your example, you can omit the call to set_event_loop.
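As a small illustrative sketch of that relationship (single-threaded; the names are just examples):

import asyncio

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)  # register the loop for the current thread
# get_event_loop() now returns the loop created above.
assert asyncio.get_event_loop() is loop
loop.close()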
I might be misinterpreting, but I think the comment by @dirn in the accepted answer is incorrect in stating that get_event_loop works from a thread. See the following example:
import asyncio
import threading
async def hello():
    print('started hello')
    await asyncio.sleep(5)
    print('finished hello')

def threaded_func():
    el = asyncio.get_event_loop()
    el.run_until_complete(hello())

thread = threading.Thread(target=threaded_func)
thread.start()
This produces the following error:
RuntimeError: There is no current event loop in thread 'Thread-1'.
It can be fixed by:
- el = asyncio.get_event_loop()
+ el = asyncio.new_event_loop()
The documentation also specifies that this trick (creating an event loop by calling get_event_loop) only works on the main thread:
If there is no current event loop set in the current OS thread, the OS thread is main, and set_event_loop() has not yet been called, asyncio will create a new event loop and set it as the current one.
Finally, the docs also recommend using get_running_loop instead of get_event_loop if you're on Python 3.7 or higher.
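For example, from inside a running coroutine you would do something like:

import asyncio

async def main():
    # get_running_loop() must be called while a loop is running; it raises RuntimeError otherwise.
    loop = asyncio.get_running_loop()
    print(loop)

asyncio.run(main())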

Python threading.Thread.join() is blocking

I'm working with asynchronous programming and wrote a small wrapper class for thread-safe execution of coroutines, based on some ideas from this thread: python asyncio, how to create and cancel tasks from another thread. After some debugging, I found that it hangs when calling the Thread class's join() function (I overrode it only for testing). Thinking I had made a mistake, I basically copied the code that the OP said he used and tested it, only to find the same issue.
His mildly altered code:
import threading
import asyncio
from concurrent.futures import Future
import functools
class EventLoopOwner(threading.Thread):
    class __Properties:
        def __init__(self, loop, thread, evt_start):
            self.loop = loop
            self.thread = thread
            self.evt_start = evt_start

    def __init__(self):
        threading.Thread.__init__(self)
        self.__elo = self.__Properties(None, None, threading.Event())

    def run(self):
        self.__elo.loop = asyncio.new_event_loop()
        asyncio.set_event_loop(self.__elo.loop)
        self.__elo.thread = threading.current_thread()
        self.__elo.loop.call_soon_threadsafe(self.__elo.evt_start.set)
        self.__elo.loop.run_forever()

    def stop(self):
        self.__elo.loop.call_soon_threadsafe(self.__elo.loop.stop)

    def _add_task(self, future, coro):
        task = self.__elo.loop.create_task(coro)
        future.set_result(task)

    def add_task(self, coro):
        self.__elo.evt_start.wait()
        future = Future()
        p = functools.partial(self._add_task, future, coro)
        self.__elo.loop.call_soon_threadsafe(p)
        return future.result()  # block until result is available

    def cancel(self, task):
        self.__elo.loop.call_soon_threadsafe(task.cancel)

async def foo(i):
    return 2 * i

async def main():
    elo = EventLoopOwner()
    elo.start()
    task = elo.add_task(foo(10))
    x = await task
    print(x)
    elo.stop(); print("Stopped")
    elo.join(); print("Joined")  # note: giving it a timeout does not fix it

if __name__ == "__main__":
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    assert isinstance(loop, asyncio.AbstractEventLoop)
    try:
        loop.run_until_complete(main())
    finally:
        loop.close()
About 50% of the time when I run it, it simply stalls and says "Stopped" but not "Joined". I've done some debugging and found that it is correlated with the Task itself raising an exception. This doesn't happen every time, but since it occurs when I'm calling threading.Thread.join(), I have to assume it is related to the destruction of the loop. What could possibly be causing this?
The exception is simply: "cannot join current thread" which tells me that the .join() is sometimes being run on the thread from which I called it and sometimes from the ELO thread.
What is happening and how can I fix it?
I'm using Python 3.5.1 for this.
Note: This is not replicated on IDE One: http://ideone.com/0LO2D9