I want to use async in Python the way I would in C# or JavaScript. For example, I want my program not to block the following code (or buttons in an app) while sending a request. I wrote some code to try this, but I don't know whether this usage is correct. I can't quite understand asyncio.
import asyncio
import threading
import time

async def waitForMe(name):
    for i in range(5):
        await asyncio.sleep(1)
        print(name)

async def main():
    task1 = asyncio.create_task(waitForMe("task1"))
    task2 = asyncio.create_task(waitForMe("task2"))
    task3 = asyncio.create_task(waitForMe("task3"))
    await task1
    await task2
    await task3

def mfunction():
    asyncio.run(main())

t1 = threading.Thread(target=mfunction)
t1.start()

for i in range(3):
    time.sleep(1)
    print("main")
I'd really recommend this excellent asyncio walkthrough, which should answer most of your questions.
A quote from the mentioned article in the light of your code:
[...] async IO is a single-threaded, single-process design: it uses cooperative multitasking, a term that [...] gives a feeling of concurrency despite using a single thread in a single process.
If you don't want your program to block while processing (IO) requests (as stated in your question), concurrency is sufficient (and you don't need (multi)threading)!
Concurrency [...] suggests that multiple tasks have the ability to run in an overlapping manner.
I'll repeat the exact example from the mentioned article, which has a structure similar to your example:
#!/usr/bin/env python3
# countasync.py

import asyncio

async def count():
    print("One")
    await asyncio.sleep(1)
    print("Two")

async def main():
    await asyncio.gather(count(), count(), count())

if __name__ == "__main__":
    import time
    s = time.perf_counter()
    asyncio.run(main())
    elapsed = time.perf_counter() - s
    print(f"{__file__} executed in {elapsed:0.2f} seconds.")
This runs as follows:
$ python3 countasync.py
One
One
One
Two
Two
Two
countasync.py executed in 1.01 seconds.
Note that this example uses asyncio.gather to kick off the three count() coroutines in a non-blocking manner. Putting three await count() statements one after another would run them sequentially (taking about 3 seconds) instead of concurrently.
As far as I can see, this is exactly what you are looking for. As demonstrated, you don't need threading to achieve this.
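Applied to the code in your question, a minimal sketch without the extra thread could look like this (same waitForMe coroutine, but scheduled with asyncio.gather):

import asyncio

async def waitForMe(name):
    for i in range(5):
        await asyncio.sleep(1)
        print(name)

async def main():
    # gather schedules all three coroutines concurrently and waits for them all.
    await asyncio.gather(
        waitForMe("task1"),
        waitForMe("task2"),
        waitForMe("task3"),
    )

asyncio.run(main())

All three tasks finish after roughly 5 seconds instead of 15, and no threading is involved.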
I have rather complex system running an asynchronous task called "automation". Meanwhile I would like to inspect where the task is currently waiting. Something like a callstack for async-await.
The following example creates such an automation task, stepping into do_something which in turn calls sleep. While this task is running, its stack is printed. I'd like to see something like "automation → do_something → sleep". But print_stack only points to the line await do_something() in the top-level coroutine automation, and nothing more.
#!/usr/bin/env python3

import asyncio

async def sleep():
    await asyncio.sleep(0.1)

async def do_something():
    print('...')
    await sleep()

async def automation():
    for _ in range(10):
        await do_something()

async def main():
    task = asyncio.create_task(automation(), name='automation')
    while not task.done():
        task.print_stack()
        await asyncio.sleep(0.1)

asyncio.run(main())
I thought about using _scheduled from asyncio.BaseEventLoop, but it always seems to be [] in my example. And since my production code runs uvloop, I looked into https://github.com/MagicStack/uvloop/issues/135,
https://github.com/MagicStack/uvloop/issues/163 and
https://github.com/MagicStack/uvloop/pull/171, all of which are stale for about 4 years.
Is there something else I could try?
I found something: when running asyncio with debug=True and using a different interval of 0.11s (to avoid both tasks being "in sync"), we can access asyncio.get_running_loop()._scheduled. This contains a list of asyncio.TimerHandle objects. In debug mode each TimerHandle has a proper _source_traceback containing a list of traceback.FrameSummary objects with information like filename, lineno, name and line.
...

async def main():
    task = asyncio.create_task(automation(), name='automation')
    while not task.done():
        for timer_handle in asyncio.get_running_loop()._scheduled:
            for frame_summary in timer_handle._source_traceback:
                print(f'{frame_summary.filename}:{frame_summary.lineno} {frame_summary.name}')
        await asyncio.sleep(0.11)

asyncio.run(main(), debug=True)
The output looks something like this:
/opt/homebrew/Cellar/python@3.10/3.10.6_1/Frameworks/Python.framework/Versions/3.10/lib/python3.10/asyncio/base_events.py:600 run_forever
/opt/homebrew/Cellar/python@3.10/3.10.6_1/Frameworks/Python.framework/Versions/3.10/lib/python3.10/asyncio/base_events.py:1888 _run_once
/opt/homebrew/Cellar/python@3.10/3.10.6_1/Frameworks/Python.framework/Versions/3.10/lib/python3.10/asyncio/events.py:80 _run
/Users/falko/./test.py:17 automation
/Users/falko/./test.py:12 do_something
/Users/falko/./test.py:7 sleep
/opt/homebrew/Cellar/python@3.10/3.10.6_1/Frameworks/Python.framework/Versions/3.10/lib/python3.10/asyncio/tasks.py:601 sleep
The main drawback for my application is the fact that uvloop doesn't come with the field _scheduled. That's why I'm still looking for an alternative approach.
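One loop-agnostic direction that might be worth trying (a rough sketch based on the standard coroutine introspection attributes cr_await/cr_code rather than on _scheduled, so it should not depend on the event loop implementation) is to walk the chain of coroutines the task is currently suspended in:

import asyncio

async def sleep():
    await asyncio.sleep(0.1)

async def do_something():
    await sleep()

async def automation():
    for _ in range(10):
        await do_something()

def coro_chain(task):
    # Follow the chain of awaited coroutines and collect their function names.
    names = []
    coro = task.get_coro()
    while coro is not None:
        if hasattr(coro, 'cr_code'):        # native coroutine
            names.append(coro.cr_code.co_name)
            coro = coro.cr_await
        elif hasattr(coro, 'gi_code'):      # generator-based coroutine / __await__
            names.append(coro.gi_code.co_name)
            coro = coro.gi_yieldfrom
        else:
            break
    return ' → '.join(names)

async def main():
    task = asyncio.create_task(automation(), name='automation')
    while not task.done():
        print(coro_chain(task))   # e.g. "automation → do_something → sleep → sleep"
        await asyncio.sleep(0.1)

asyncio.run(main())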
I'm a Python beginner. Learning about the power of asyncio from https://www.youtube.com/watch?v=iG6fr81xHKA&t=269s, I tried to take the example shown and repurpose it to execute 10 times. Here's a code snippet:
def main(x):
    print("Hello")
    time.sleep(3)
    print("World!")
So I tried to do it in an asyncio fashion, but it doesn't execute asynchronously. Here's what I've tried so far. What am I doing wrong?
import time
import asyncio

async def main(x):
    print(f"Starting Task {x}")
    await asyncio.sleep(3)
    print(f"Finished Task {x}")

async def async_io():
    for i in range(10):
        await main(i)

if __name__ == "__main__":
    start_time = time.perf_counter()
    asyncio.run(async_io())
    print(f"Took {time.perf_counter() - start_time} secs")
I've also tried to use queue_task in asyncio.
Using await, by definition, waits for main to finish before moving on, so your code as written is no different from the synchronous code you posted above. If you want to run the calls concurrently while still waiting for all the results, use asyncio.gather or asyncio.wait instead.
async def async_io():
    tasks = []
    for i in range(10):
        tasks += [main(i)]
    await asyncio.gather(*tasks)
If you don't care exactly when each main() call finishes, you can also use asyncio.create_task(main(i)), which creates a Task object and schedules its execution in the background; see the sketch below. Keep in mind that create_task still requires a running event loop, and tasks that are never awaited may be cancelled when the enclosing coroutine (and asyncio.run) returns.
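A minimal sketch of that variant (still awaiting the tasks at the end so the program doesn't exit before they finish) might be:

async def async_io():
    tasks = [asyncio.create_task(main(i)) for i in range(10)]
    # The tasks start running as soon as they are created; any other work
    # could happen here while they make progress.
    await asyncio.gather(*tasks)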
I need to run two python functions simultaneously. How can I achieve that? Example:
import time

def short_task():
    time.sleep(2)

def long_task():
    time.sleep(4)

short_task()
long_task()
I expect the whole code to finish in 4 seconds (instead of 6).
It depends on what the functions are doing.
If it is something that involves waiting (such as time.sleep() or waiting for a response) use a Thread.
import time
from threading import Thread

def short_task():
    time.sleep(2)

def long_task():
    time.sleep(4)

Thread(target=short_task).start()
Thread(target=long_task).start()
If you have a long task that requires a lot of computing, use a Process:
import time
from multiprocessing import Process

def short_task():
    time.sleep(2)

def long_task():
    time.sleep(4)

if __name__ == '__main__':
    Process(target=short_task).start()
    Process(target=long_task).start()
One of your options is to use the asyncio module.
First of all, import the asyncio module with import asyncio. Replace your def functions with async def functions.
Second of all, replace time.sleep() with await asyncio.sleep(). You no longer need the time module because of this step.
Thirdly, create a new function, normally called main(). You can take the following code snippet for reference:
async def main():
    task1 = asyncio.create_task(short_task())
    task2 = asyncio.create_task(long_task())
    await task1
    await task2
Finally, run the whole main() code with asyncio.run(main()). Your final code should look like this:
import asyncio

async def short_task():
    await asyncio.sleep(2)

async def long_task():
    await asyncio.sleep(4)

async def main():
    task1 = asyncio.create_task(short_task())
    task2 = asyncio.create_task(long_task())
    await task1
    await task2

asyncio.run(main())
You may use a simple code snippet to prove that the whole process takes about 4 seconds.
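For instance, a small timing sketch around asyncio.run (wrapping the final code above) would show roughly 4 seconds:

import asyncio
import time

async def short_task():
    await asyncio.sleep(2)

async def long_task():
    await asyncio.sleep(4)

async def main():
    task1 = asyncio.create_task(short_task())
    task2 = asyncio.create_task(long_task())
    await task1
    await task2

start = time.perf_counter()
asyncio.run(main())
# The two sleeps overlap, so this prints roughly 4 seconds rather than 6.
print(f"Finished in {time.perf_counter() - start:.2f} seconds")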
I'm new to Python and have code similar to the following:
import time
import asyncio

async def my_async_function(i):
    print("My function {}".format(i))

async def start():
    requests = []

    # Create multiple requests
    for i in range(5):
        print("Creating request #{}".format(i))
        requests.append(my_async_function(i))

    # Do some additional work here
    print("Begin sleep")
    time.sleep(10)
    print("End sleep")

    # Wait for all requests to finish
    return await asyncio.gather(*requests)

asyncio.run(start())
No matter how long the "additional work" takes, the requests seem to only run after "End sleep". I'm guessing asyncio.gather is what actually begins to execute them. How can I have the requests (aka my_async_function()) start immediately, do additional work, and then wait for all to complete at the end?
Edit:
Per Krumelur's comments and my own findings, the following results in what I'm looking for:
import time
import asyncio
import random

async def my_async_function(i):
    print("Begin function {}".format(i))
    await asyncio.sleep(int(random.random() * 10))
    print("End function {}".format(i))

async def start():
    requests = []

    # Create multiple requests
    for i in range(10):
        print("Creating request #{}".format(i))
        requests.append(asyncio.create_task(my_async_function(i)))

    # Do some additional work here
    print("Begin sleep")
    await asyncio.sleep(5)
    print("End sleep")

    # Wait for all requests to finish
    return await asyncio.gather(*requests)

asyncio.run(start())
This only works if my_async_function and the "additional work" are both awaitable, so that the event loop can give each of them execution time. You need create_task (if you know it's a coroutine) or ensure_future (if it could be a coroutine or a future) to allow the requests to start running immediately; otherwise they still end up running only when you gather.
time.sleep() is a synchronous operation
You'll want to use the asynchronous sleep and await it, e.g.

await asyncio.sleep(10)

Other async code will only run when the current task yields (i.e. typically when awaiting something).
Using async code means you have to keep using async everywhere. Async operations are meant for I/O-bound applications. If the "additional work" is mainly CPU-bound, you are better off using threads (but beware the global interpreter lock!).
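To illustrate that last point with the code from the question, a minimal sketch (assuming Python 3.9+ for asyncio.to_thread) that pushes the blocking work onto a worker thread so the requests can proceed might look like this:

import asyncio
import time

def additional_work():
    # Blocking (or CPU-heavy) work that would otherwise stall the event loop.
    time.sleep(10)

async def my_async_function(i):
    print("My function {}".format(i))

async def start():
    requests = [asyncio.create_task(my_async_function(i)) for i in range(5)]

    # Run the blocking work in a worker thread; the tasks above keep running.
    await asyncio.to_thread(additional_work)

    # Wait for all requests to finish
    return await asyncio.gather(*requests)

asyncio.run(start())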
I've read tons of articles and tutorials about Python 3.5's async/await. I have to say I'm pretty confused, because some use get_event_loop() and run_until_complete(), some use ensure_future(), some use asyncio.wait(), and some use call_soon().
It seems like I have a lot of choices, but I have no idea if they are completely identical, or whether there are cases where you use loops and cases where you use wait().
But the thing is, all the examples work with asyncio.sleep() as a simulation of a real slow operation which returns an awaitable object. Once I try to swap this line for some real code, the whole thing fails. What the heck are the differences between the approaches written above, and how should I run a third-party library which is not ready for async/await? I use the Quandl service to fetch some stock data.
import asyncio
import quandl

async def slow_operation(n):
    # await asyncio.sleep(1)   # Works because it's await ready.
    await quandl.Dataset(n)    # Doesn't work because it's not await ready.

async def main():
    await asyncio.wait([
        slow_operation("SIX/US9884981013EUR4"),
        slow_operation("SIX/US88160R1014EUR4"),
    ])

# You don't have to use any code for 50 requests/day.
quandl.ApiConfig.api_key = "MY_SECRET_CODE"

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
I hope you get the point of how lost I feel and how simple a thing I would like to have running in parallel.
If a third-party library is not compatible with async/await then obviously you can't use it easily. There are two cases:
Let's say that the function in the library is asynchronous and it gives you a callback, e.g.
def fn(..., clb):
    ...
So you can do:
def on_result(...):
    ...

fn(..., on_result)
In that case you can wrap such functions into the asyncio protocol like this:
from asyncio import Future

def wrapper(...):
    future = Future()

    def my_clb(...):
        future.set_result(xyz)

    fn(..., my_clb)
    return future
(use future.set_exception(exc) on exception)
Then you can simply call that wrapper in some async function with await:
value = await wrapper(...)
Note that await works with any Future object. You don't have to declare wrapper as async.
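To make the wrapper pattern concrete, here is a self-contained sketch with a hypothetical callback-based function fetch_data standing in for the library's fn(..., clb). It uses call_soon_threadsafe because, in this example, the callback fires on another thread:

import asyncio
import threading

# Hypothetical callback-based API, standing in for the library's fn(..., clb).
def fetch_data(query, clb):
    # Simulate an operation that invokes the callback later, from another thread.
    threading.Timer(1.0, lambda: clb("result for " + query)).start()

def fetch_data_async(query):
    loop = asyncio.get_running_loop()
    future = loop.create_future()

    def on_result(value):
        # The callback runs in another thread, so hand the result back
        # to the event loop thread safely.
        loop.call_soon_threadsafe(future.set_result, value)

    fetch_data(query, on_result)
    return future

async def main():
    value = await fetch_data_async("SIX/US9884981013EUR4")
    print(value)

asyncio.run(main())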
If the function in the library is synchronous then you can run it in a separate thread (probably you would use some thread pool for that). The whole code may look like this:
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

# Initialize 10 threads
THREAD_POOL = ThreadPoolExecutor(10)

def synchronous_handler(param1, ...):
    # Do something synchronous
    time.sleep(2)
    return "foo"

# Somewhere else
async def main():
    loop = asyncio.get_event_loop()
    futures = [
        loop.run_in_executor(THREAD_POOL, synchronous_handler, param1, ...),
        loop.run_in_executor(THREAD_POOL, synchronous_handler, param1, ...),
        loop.run_in_executor(THREAD_POOL, synchronous_handler, param1, ...),
    ]
    await asyncio.wait(futures)
    for future in futures:
        print(future.result())

with THREAD_POOL:
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
If you can't use threads for whatever reason, then using such a library simply makes the entire asynchronous code pointless.
Note, however, that using a synchronous library with async is probably a bad idea. You won't gain much, yet you complicate the code a lot.
You can take a look at the following simple working example from here. By the way it returns a string worth reading :-)
import aiohttp
import asyncio

async def fetch(client):
    async with client.get('https://docs.aiohttp.org/en/stable/client_reference.html') as resp:
        assert resp.status == 200
        return await resp.text()

async def main():
    async with aiohttp.ClientSession() as client:
        html = await fetch(client)
        print(html)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())