I know that asgiref.sync.sync_to_async is for running sync code in an async context and cannot magically convert it to async code. I have also seen this question and the examples in the answer. But I came up with an unexpected case that I can't wrap my head around: it seems that sync_to_async is converting time.sleep into an async sleep. Here is the example:
I have two jobs. The first job loops three times with a 1-second wait between iterations. The second job just waits 3 seconds using time.sleep(). I expect to get the same results no matter how I run the sync wait function (directly or via sync_to_async), but that's not happening. Why?
import asyncio
import time

from asgiref.sync import sync_to_async


async def first_job():
    for i in range(3):
        print(time.time(), f"first start ({i})")
        await asyncio.sleep(1)
        print(time.time(), f"first finish ({i})")


def sync_wait():
    print(time.time(), "wait start")
    time.sleep(3)
    print(time.time(), "wait finish")


async def second_job_sync_wait():
    print(time.time(), "second start")
    sync_wait()
    print(time.time(), "second finish")


async def second_job_async_wait():
    print(time.time(), "second start")
    await sync_to_async(sync_wait)()
    print(time.time(), "second finish")


async def main():
    print("========== sync wait:")
    await asyncio.gather(second_job_sync_wait(), first_job())
    print("========== async wait:")
    await asyncio.gather(second_job_async_wait(), first_job())

asyncio.run(main())
Output:
========== sync wait:
1676477101.0311968 second start
1676477101.031214 wait start
1676477104.036779 wait finish
1676477104.0369549 second finish
1676477104.037318 first start (0)
1676477105.0388129 first finish (0)
1676477105.03891 first start (1)
1676477106.040628 first finish (1)
1676477106.041095 first start (2)
1676477107.043103 first finish (2)
========== async wait:
1676477107.043814 second start
1676477107.045408 wait start
1676477107.045689 first start (0)
1676477108.047393 first finish (0)
1676477108.047503 first start (1)
1676477109.048685 first finish (1)
1676477109.048785 first start (2)
1676477110.04971 first finish (2)
1676477110.050215 wait finish
1676477110.0516758 second finish
It just runs the synchronous function in another thread. The time.sleep doesn't block your async code with sync_to_async because it's running in a different thread from the rest of your code.
sync_to_async does more than that; it also does extra work to provide some unusual guarantees (such as thread-sensitive execution) for code that cares about them, but none of that is relevant here. You can read more in the asgiref docs if you want.
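The same behaviour can be reproduced with the standard library alone. Below is a minimal sketch using asyncio.to_thread (Python 3.9+), which plays the same role here as sync_to_async: the sync function runs in a worker thread, so its time.sleep never blocks the event loop. The names and timings are mine, not from the question:

```python
import asyncio
import time

def blocking_wait():
    # Plain synchronous sleep; calling this directly would freeze the event loop.
    time.sleep(1)

async def ticker():
    # Three quick iterations that keep running while the blocking
    # wait sits in a worker thread.
    for _ in range(3):
        await asyncio.sleep(0.1)

async def main():
    start = time.monotonic()
    # asyncio.to_thread runs the sync function in a thread and awaits the
    # result, just like sync_to_async does in the question.
    await asyncio.gather(asyncio.to_thread(blocking_wait), ticker())
    return time.monotonic() - start

elapsed = asyncio.run(main())
print(f"total: {elapsed:.2f}s")  # roughly 1s, because the two overlap
```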
I'm a Python beginner and, after learning from https://www.youtube.com/watch?v=iG6fr81xHKA&t=269s about the power of asyncio, I tried to repurpose the example shown to execute 10 times. Here's a code snippet:
def main(x):
    print("Hello")
    time.sleep(3)
    print("World!")
So I tried to do it in an asyncio fashion, but it doesn't execute asynchronously.
Here's what I've tried so far. What am I doing wrong?
import time
import asyncio


async def main(x):
    print(f"Starting Task {x}")
    await asyncio.sleep(3)
    print(f"Finished Task {x}")


async def async_io():
    for i in range(10):
        await main(i)


if __name__ == "__main__":
    start_time = time.perf_counter()
    asyncio.run(async_io())
    print(f"Took {time.perf_counter() - start_time} secs")
I've also tried to use queue_task in asyncio.
Using await, by definition, waits for the task main to finish, so your code as written is no different from the synchronous version you posted above. If you want to run the tasks concurrently while still waiting for the results, use asyncio.gather or asyncio.wait instead.
async def async_io():
    tasks = []
    for i in range(10):
        tasks += [main(i)]
    await asyncio.gather(*tasks)
If you don't care to wait for all of the main() calls to finish, you can also just use asyncio.create_task(main(i)), which creates a Task object and schedules its execution in the background. Note that create_task must be called while the event loop is running, so async_io() still needs to be an async function.
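A sketch of that variant (my own, with a shortened sleep so it finishes quickly); the tasks are still gathered at the end so none are left unfinished:

```python
import asyncio
import time

async def main(x):
    print(f"Starting Task {x}")
    await asyncio.sleep(0.1)
    print(f"Finished Task {x}")
    return x

async def async_io():
    # create_task schedules each coroutine on the running loop immediately...
    tasks = [asyncio.create_task(main(i)) for i in range(10)]
    # ...and gather waits for all of them, so no task is left dangling.
    return await asyncio.gather(*tasks)

start = time.perf_counter()
results = asyncio.run(async_io())
elapsed = time.perf_counter() - start
print(f"Took {elapsed:.2f} secs")  # all ten sleeps overlap
```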
import asyncio
import time

start = time.time()


class DMconvo:
    async def feature():
        print('hi')

    async def two():
        await asyncio.sleep(5)
        print('hi again')


async def test():
    await DMconvo.feature()
    await DMconvo.two()  # should run in background and wait 5 seconds
    time.sleep(10)  # should run while the previous script waits 5 seconds

asyncio.run(test())
print("--- %s seconds ---" % (time.time() - start))
I think the code is running synchronously instead of asynchronously, but I'm not sure why.
Code after an await still runs sequentially: within test(), DMconvo.two() waits 5 seconds and then time.sleep(10) waits another 10 seconds, totaling 15 seconds. You can use asyncio.gather to run coroutines concurrently, e.g.:
import asyncio
import time

start = time.time()


class DMconvo:
    async def feature():
        print('hi')

    async def two():
        await asyncio.sleep(5)
        print('hi again')


async def test():
    await asyncio.gather(
        DMconvo.feature(),
        DMconvo.two(),
        asyncio.sleep(10),
    )

asyncio.run(test())
print("--- %s seconds ---" % (time.time() - start))
await DMconvo.two()
time.sleep(10) # should run while the previous script waits 5 seconds
The assumption in this comment is incorrect: time.sleep(10) is not executed until DMconvo.two() has finished.
Compared to synchronous execution, the advantage of using await is that the event loop can do other things while a task is waiting for I/O. But running a task in the background must still be requested explicitly, for example by using create_task.
Furthermore, time.sleep blocks the thread in which the asyncio event loop is running. To sleep asynchronously, you need to use asyncio.sleep.
task = asyncio.create_task(DMconvo.two())
await asyncio.sleep(10)
await task
A more abstract and convenient approach to create and run multiple tasks concurrently is to use asyncio.gather, as shown in the answer by ThisIsHowItIs.
Using await means "the program will wait here until this asynchronous function completes."
So,
await DMconvo.two()
will wait for 5 secs.
Note that simply removing the await does NOT run the function in the background: calling a coroutine function only creates a coroutine object, which never executes (and triggers a "coroutine was never awaited" warning). To run it in the background, schedule it as a task with asyncio.create_task(DMconvo.two()).
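A small sketch (mine, with a shortened sleep) showing the difference between merely calling a coroutine function and actually scheduling it:

```python
import asyncio

async def two():
    await asyncio.sleep(0.1)
    return "hi again"

async def demo():
    # Calling two() only creates a coroutine object; nothing runs yet.
    coro = two()
    assert asyncio.iscoroutine(coro)
    coro.close()  # close it to avoid a "never awaited" warning

    # create_task actually schedules it on the running event loop.
    task = asyncio.create_task(two())
    # ... other work can happen here while two() runs ...
    return await task  # collect the result when it is needed

result = asyncio.run(demo())
print(result)  # hi again
```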
I'm new to Python and have code similar to the following:
import time
import asyncio


async def my_async_function(i):
    print("My function {}".format(i))


async def start():
    requests = []
    # Create multiple requests
    for i in range(5):
        print("Creating request #{}".format(i))
        requests.append(my_async_function(i))

    # Do some additional work here
    print("Begin sleep")
    time.sleep(10)
    print("End sleep")

    # Wait for all requests to finish
    return await asyncio.gather(*requests)

asyncio.run(start())
No matter how long the "additional work" takes, the requests seem to only run after "End sleep". I'm guessing asyncio.gather is what actually begins to execute them. How can I have the requests (aka my_async_function()) start immediately, do additional work, and then wait for all to complete at the end?
Edit:
Per Krumelur's comments and my own findings, the following results in what I'm looking for:
import time
import asyncio
import random


async def my_async_function(i):
    print("Begin function {}".format(i))
    await asyncio.sleep(int(random.random() * 10))
    print("End function {}".format(i))


async def start():
    requests = []
    # Create multiple requests
    for i in range(10):
        print("Creating request #{}".format(i))
        requests.append(asyncio.create_task(my_async_function(i)))

    # Do some additional work here
    print("Begin sleep")
    await asyncio.sleep(5)
    print("End sleep")

    # Wait for all requests to finish
    return await asyncio.gather(*requests)

asyncio.run(start())
This only works if my_async_function and the "additional work" are both awaitable, so that the event loop can give each of them execution time. You need create_task (if you know it's a coroutine) or ensure_future (if it could be either a coroutine or a future) to allow the requests to start immediately; otherwise they still only run when you gather.
time.sleep() is a synchronous operation. You'll want to use the asynchronous sleep and await it, e.g.:
await asyncio.sleep(10)
Other async code will only run when the current task yields (i.e., typically when awaiting something).
Using async code means you have to keep using it everywhere. Async operations are meant for I/O-bound applications; if the "additional work" is mainly CPU-bound, you are better off using threads (but beware the global interpreter lock!).
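For the blocking-work case, one common pattern (a sketch under my own names, not code from the question) is loop.run_in_executor, which moves the blocking calls into a pool while the event loop keeps running; swap in a ProcessPoolExecutor for genuinely CPU-bound work to sidestep the GIL:

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

def blocking_work(n):
    # Stand-in for a blocking call; time.sleep would freeze the
    # event loop if run directly inside a coroutine.
    time.sleep(0.2)
    return n * n

async def start():
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor() as pool:
        # Each call runs in the pool; the event loop stays responsive.
        futures = [loop.run_in_executor(pool, blocking_work, i) for i in range(5)]
        return await asyncio.gather(*futures)

results = asyncio.run(start())
print(results)  # [0, 1, 4, 9, 16]
```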
I have a queue stored in Redis lists and I'm trying to create an async consumer for it. But I can't get the async function to run concurrently inside the loop; it behaves like a sync function when I call it.
import asyncio


async def worker():
    print("starting sleep")
    await asyncio.sleep(2)
    print("slept")


async def main():
    while True:
        await worker()

asyncio.run(main())
Here is a short and simple example of my implementation. I expect to see "starting sleep" messages keep appearing until the first "slept" message, i.e. for the first 2 seconds.
main is literally awaiting the completion of worker; until worker is done, main won't progress. Async tasks don't run in the background the way threads do in multithreading.
What you want is to keep launching new workers without awaiting each one of them. However, if you just keep doing this in a loop like this:
while True:
    worker()
then you will never see any output of those workers, since this is an endless loop which never gives anything else the chance to run. You'd need to "break" this loop in some way to allow workers to progress. Here's an example of that:
import asyncio


async def worker():
    print("starting sleep")
    await asyncio.sleep(2)
    print("slept")


async def main():
    while True:
        asyncio.ensure_future(worker())
        await asyncio.sleep(0.5)

asyncio.run(main())
This will produce the expected outcome:
starting sleep
starting sleep
starting sleep
starting sleep
slept
starting sleep
slept
...
The await inside main transfers control back to the event loop, which now has the chance to run the piled-up worker tasks. When those worker tasks await, they in turn transfer control back to the event loop, which hands it to either main or a worker as their awaited sleeps complete.
Note that this is only for illustration purposes; if and when you interrupt this program, you'll see warnings about pending tasks that haven't completed. You should keep track of your tasks and await them all to completion somewhere at the end.
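A sketch of that bookkeeping (mine, with shortened sleeps and a finite loop so it terminates): keep a reference to every task you launch and gather them before the loop shuts down:

```python
import asyncio

async def worker(i):
    await asyncio.sleep(0.1)
    return i

async def main():
    tasks = []
    for i in range(4):
        # Keep a reference to every task we launch...
        tasks.append(asyncio.create_task(worker(i)))
        await asyncio.sleep(0.02)
    # ...so we can await them all to completion before asyncio.run returns.
    return await asyncio.gather(*tasks)

results = asyncio.run(main())
print(results)  # [0, 1, 2, 3]
```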
Here is an example using asyncio.wait:
import asyncio


async def worker():
    print("starting sleep")
    await asyncio.sleep(2)
    print("slept")


async def main():
    # asyncio.wait needs Task objects (bare coroutines are rejected on 3.11+)
    tasks = [asyncio.create_task(worker()) for _ in range(10)]
    await asyncio.wait(tasks)

asyncio.run(main())
It spawns all the workers together.
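If the results matter, a variant (mine, with a shorter sleep) that also retrieves them from the (done, pending) sets asyncio.wait returns:

```python
import asyncio

async def worker(i):
    await asyncio.sleep(0.1)
    return i

async def main():
    # Wrap each coroutine in a Task, as asyncio.wait requires on Python 3.11+.
    tasks = [asyncio.create_task(worker(i)) for i in range(10)]
    done, pending = await asyncio.wait(tasks)  # waits for ALL by default
    assert not pending
    return sorted(t.result() for t in done)

results = asyncio.run(main())
print(results)
```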
As I read more, I feel more stupid about async in Python. So I decided to ask for a direct answer: how can I change the following code (using async or a similar approach) to achieve the desired result? Additionally, how can I do it in Flask or Sanic?
import time


def long_job():
    print('long job started')
    time.sleep(5)
    print('long job ended')


def main_job():
    long_job()
    time.sleep(1)
    print('main job returned')

main_job()

# expected result:
# 'long job started'
# 'main job returned'
# 'long job ended'
Basically, I do NOT want to wait for long_job to end before main_job returns. Thank you in advance. :)
Await asyncio's sleep() to yield time to other jobs (if you don't need to await something else).
Use create_task() instead of await to start a job without blocking.
Finally, you have to start the main job using the event loop.
# Written in Python 3.7
import asyncio
async def long_job():
print('long job started')
await asyncio.sleep(5)
print('long job ended')
async def main_job():
asyncio.create_task(long_job())
await asyncio.sleep(1)
print('main job returned')
Your framework should start the event loop, you don't have to start it yourself. You can await or call create_task on main_job() from an async def function called by your framework, depending on if you want to block or not.
If you want to test this without a framework, you'll have to start the loop yourself using asyncio.run(). This will stop immediately after its task completes, even if other tasks haven't finished yet. But this is easy enough to work around:
async def loop_job():
    asyncio.create_task(main_job())
    while len(asyncio.all_tasks()) > 1:  # Any task besides loop_job()?
        await asyncio.sleep(0.2)

asyncio.run(loop_job())
If you're implementing a framework yourself, you can use the more primitive loop.run_forever(), but you'd have to stop() it yourself.
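A simpler alternative to polling the task list (my own sketch, with shortened sleeps and events recorded in a list instead of printed) is to have main_job hand the background task back to its caller, which awaits it last so the loop stays alive until the long job finishes:

```python
import asyncio

events = []

async def long_job():
    events.append('long job started')
    await asyncio.sleep(0.2)
    events.append('long job ended')

async def main_job():
    task = asyncio.create_task(long_job())  # starts running in the background
    await asyncio.sleep(0.05)
    events.append('main job returned')
    return task

async def runner():
    task = await main_job()
    await task  # keep the loop alive until the background job finishes

asyncio.run(runner())
print(events)  # ['long job started', 'main job returned', 'long job ended']
```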