Python: await the generator end

Current versions of Python (Dec 2022) still allow using the @coroutine decorator, so a generator can be awaited like this:
import asyncio

asyncify = asyncio.coroutine
data_ready = False  # Status of a pipe, just to test

def gen():
    global data_ready
    while not data_ready:
        print("not ready")
        data_ready = True  # Just to test
        yield
    return "done"

async def main():
    result = await asyncify(gen)()
    print(result)

loop = asyncio.new_event_loop()
loop.create_task(main())
loop.run_forever()
However, Python 3.8+ deprecates the @coroutine decorator (behind the asyncify alias above). How can I wait for (await) a generator to end, as above?
I tried using async def, as the deprecation warning suggests, but it does not work:
import asyncio

asyncify = asyncio.coroutine
data_ready = False  # Just to test

async def gen():
    global data_ready
    while not data_ready:
        print("not ready")
        data_ready = True  # Just to test
        yield
    yield "done"
    return

async def main():
    # this has error: TypeError: object async_generator can't be used in 'await' expression
    result = await gen()
    print(result)

loop = asyncio.new_event_loop()
loop.create_task(main())
loop.run_forever()

Asynchronous generators implement the asynchronous iterator protocol and are meant for asynchronous iteration. You cannot await them directly the way you await regular coroutines.
With that in mind, returning to your experimental case and your question "how to wait for (await) a generator to end?": to get the final yielded value, perform asynchronous iteration:
import asyncio

data_ready = False  # Just to test

async def gen():
    global data_ready
    while not data_ready:
        print("not ready")
        data_ready = True  # Just to test
        yield "processing"
    yield "done"
    return

async def main():
    a_gen = gen()
    async for result in a_gen:  # result is rebound on each async iteration
        pass
    print('result:', result)

asyncio.run(main())
Prints:
not ready
result: done
Naturally, you can also advance the async generator in steps with the anext() builtin (available since Python 3.10):
a_gen = gen()
val_1 = await anext(a_gen)
Summing up, follow the guidelines in PEP 525 – Asynchronous Generators and try not to mix old, deprecated constructs with current ones.
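If you specifically want something you can await directly, the iteration above can be wrapped in a small helper coroutine. A minimal sketch (drain is a hypothetical helper, not a stdlib function):

async def drain(agen):
    """Exhaust an async generator and return its last yielded value."""
    result = None
    async for result in agen:
        pass
    return result

# Usage inside a coroutine:
#   result = await drain(gen())  # 'done'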

Related

Automatic conversion of standard function into asynchronous function in Python

In most of the asynchronous coroutines I write, I only need to replace the function definition def func() -> async def func() and the sleep time.sleep(s) -> await asyncio.sleep(s).
Is it possible to convert a standard python function into an async function where all the time.sleep(s) is converted to await asyncio.sleep(s)?
Example
Performance during task
Measure performance during a task
import asyncio
import random

async def performance_during_task(task):
    stop_event = asyncio.Event()
    target_task = asyncio.create_task(task(stop_event))
    perf_task = asyncio.create_task(measure_performance(stop_event))
    await target_task
    await perf_task

async def measure_performance(event):
    while not event.is_set():
        print('Performance: ', random.random())
        await asyncio.sleep(.2)

if __name__ == "__main__":
    asyncio.run(
        performance_during_task(task)
    )
Task
The task has to be defined with async def and await asyncio.sleep(s):

async def task(event):
    for i in range(10):
        print('Step: ', i)
        await asyncio.sleep(.2)
    event.set()
into ->
Easy task definition
So that others don't have to worry about async etc., I want them to be able to define a task normally, e.g. with a decorator:

@as_async
def easy_task(event):
    for i in range(10):
        print('Step: ', i)
        time.sleep(.2)
    event.set()
So that it can be used as an async function with e.g. performance_during_task()
I think I found a solution similar to the interesting GitHub example mentioned in the comments and a similar post here.
We can write a decorator like
import asyncio
from functools import wraps, partial

def to_async(func):
    @wraps(func)  # Preserve the wrapped function's metadata (func.__name__ etc.)
    async def run(*args, loop=None, executor=None, **kwargs):
        if loop is None:
            loop = asyncio.get_event_loop()  # Fall back to the current event loop if none was passed
        pfunc = partial(func, *args, **kwargs)  # Bind the arguments (e.g. event) up front
        return await loop.run_in_executor(executor, pfunc)
    return run
Such that easy_task becomes:

@to_async
def easy_task(event):
    for i in range(10):
        print('Step: ', i)
        time.sleep(.2)
    event.set()
Here wraps makes sure we can still access attributes of the original function (explained here), and partial fills in the arguments, as explained here.
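Tying the pieces together, a hedged end-to-end sketch, assuming the to_async decorator and the performance_during_task coroutine defined earlier are in scope:

import asyncio
import time

@to_async
def easy_task(event):
    for i in range(10):
        print('Step: ', i)
        time.sleep(.2)
    event.set()

if __name__ == "__main__":
    # easy_task now returns a coroutine, so it can be scheduled like any async task
    asyncio.run(performance_during_task(easy_task))

One caveat: the wrapped function runs in an executor thread, and asyncio.Event is not thread-safe, so a more robust version would hand event.set() back to the loop with loop.call_soon_threadsafe(event.set).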

Is there a special syntax for suspending a coroutine until a condition is met?

I need to suspend a coroutine until a condition is met. Currently, I have:
class Awaiter:
    def __init__(self):
        self.ready = False

    def __await__(self):
        while not self.ready:
            yield
And the caller code:
await awaiter
This works, but it requires boilerplate code. Is it necessary boilerplate or is there a special syntax to await on a predicate, such as:
await condition
which would yield as long as the condition is false?
The asyncio package has a builtin Condition object that you can use.
An asyncio condition primitive can be used by a task to wait for some event to happen and then get exclusive access to a shared resource.
How to use the condition (from the same source):
cond = asyncio.Condition()

# The preferred way to use a Condition is an async with statement:
async with cond:
    await cond.wait()

# It can also be used as follows:
await cond.acquire()
try:
    await cond.wait()
finally:
    cond.release()
A code example:
import asyncio

cond = asyncio.Condition()

async def func1():
    async with cond:
        print('Looks like I will need to wait')
        await cond.wait()
        print('Now it\'s my turn')

async def func2():
    async with cond:
        print('Notifying....')
        cond.notify()
        print('Let me finish first')

# Main function
async def main(loop):
    t1 = loop.create_task(func1())
    t2 = loop.create_task(func2())
    await asyncio.wait([t1, t2])

if __name__ == '__main__':
    l = asyncio.get_event_loop()
    l.run_until_complete(main(l))
    l.close()
This results in:
Looks like I will need to wait
Notifying....
Let me finish first
Now it's my turn
An alternative way is to use the asyncio.Event.
import asyncio

event = asyncio.Event()

async def func1():
    print('Looks like I will need to wait')
    await event.wait()
    print('Now it\'s my turn')

async def func2():
    print('Notifying....')
    event.set()
    print('Let me finish first')
It will have the same results as the Condition code example.
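For completeness, a minimal runner for the Event version (a sketch; assumes the two coroutines above are defined):

async def main():
    # func1 suspends on event.wait() until func2 calls event.set()
    await asyncio.gather(func1(), func2())

asyncio.run(main())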

Python parallelising "async for"

I have the following method in my Tornado handler:
async def get(self):
    url = 'url here'
    try:
        async for batch in downloader.fetch(url):
            self.write(batch)
            await self.flush()
    except Exception as e:
        logger.warning(e)
This is the code for downloader.fetch():
async def fetch(url, **kwargs):
    timeout = kwargs.get('timeout', aiohttp.ClientTimeout(total=12))
    response_validator = kwargs.get('response_validator', json_response_validator)
    extractor = kwargs.get('extractor', json_extractor)
    try:
        async with aiohttp.ClientSession(timeout=timeout) as session:
            async with session.get(url) as resp:
                response_validator(resp)
                async for batch in extractor(resp):
                    yield batch
    except aiohttp.client_exceptions.ClientConnectorError:
        logger.warning("bad request")
        raise
    except asyncio.TimeoutError:
        logger.warning("server timeout")
        raise
I would like to yield the "batch" objects from multiple downloaders in parallel.
I want the first available batch from whichever downloader produces it first, and so on until all downloaders have finished. Something like this (this is not working code):
async for batch in [downloader.fetch(url1), downloader.fetch(url2)]:
    ....
Is this possible? How can I modify what I am doing in order to be able to yield from multiple coroutines in parallel?
How can I modify what I am doing in order to be able to yield from multiple coroutines in parallel?
You need a function that merges two async sequences into one, iterating over both in parallel and yielding elements from one or the other, as they become available. While such a function is not included in the current standard library, you can find one in the aiostream package.
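For illustration, a sketch of what that can look like with aiostream's merge operator (consume_merged is a hypothetical wrapper, and downloader comes from the question; check the library's documentation for the exact API):

from aiostream import stream

async def consume_merged(url1, url2):
    # stream.merge interleaves batches from both fetchers as they arrive
    merged = stream.merge(downloader.fetch(url1), downloader.fetch(url2))
    async with merged.stream() as streamer:
        async for batch in streamer:
            print(batch)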
You can also write your own merge function, as shown in this answer:
async def merge(*iterables):
    iter_next = {it.__aiter__(): None for it in iterables}
    while iter_next:
        for it, it_next in iter_next.items():
            if it_next is None:
                fut = asyncio.ensure_future(it.__anext__())
                fut._orig_iter = it
                iter_next[it] = fut
        done, _ = await asyncio.wait(iter_next.values(),
                                     return_when=asyncio.FIRST_COMPLETED)
        for fut in done:
            iter_next[fut._orig_iter] = None
            try:
                ret = fut.result()
            except StopAsyncIteration:
                del iter_next[fut._orig_iter]
                continue
            yield ret
Using that function, the loop would look like this:
async for batch in merge(downloader.fetch(url1), downloader.fetch(url2)):
    ....
Edit:
As mentioned in the comments, the method below does not execute the given coroutines in parallel.
Check out the aitertools library.
import asyncio
import aitertools

async def f1():
    await asyncio.sleep(5)
    yield 1

async def f2():
    await asyncio.sleep(6)
    yield 2

async def iter_funcs():
    async for x in aitertools.chain(f2(), f1()):
        print(x)

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(iter_funcs())
It seems that the functions being iterated over must be coroutines.

run async while loop independently

Is it possible to run an async while loop independently of another one?
Instead of the actual code, I have isolated the issue into the following example:
import asyncio, time

class Time:
    def __init__(self):
        self.start_time = 0

    async def dates(self):
        while True:
            t = time.time()
            if self.start_time == 0:
                self.start_time = t
            yield t
            await asyncio.sleep(1)

    async def printer(self):
        while True:
            print('looping')  # always called
            await asyncio.sleep(self.interval)

    async def init(self):
        async for i in self.dates():
            if i == self.start_time:
                self.interval = 3
                await self.printer()
            print(i)  # Never called

loop = asyncio.get_event_loop()
t = Time()
loop.run_until_complete(t.init())
Is there a way to have the print function run independently, so that print(i) gets called each time?
What it should do is print(i) every second, and every 3 seconds call self.printer().
Essentially self.printer is a separate task that does not need to be called very often, only every x seconds (in this case 3).
In JavaScript the solution would be something like:
setInterval(printer, 3000);
EDIT: Ideally, self.printer could also be cancelled/stopped if a condition is met or a stopping function is called.
The asyncio equivalent of JavaScript's setTimeout would be asyncio.ensure_future:
import asyncio

async def looper():
    for i in range(1_000_000_000):
        print(f'Printing {i}')
        await asyncio.sleep(0.5)

async def main():
    print('Starting')
    future = asyncio.ensure_future(looper())
    print('Waiting for a few seconds')
    await asyncio.sleep(4)
    print('Cancelling')
    future.cancel()
    print('Waiting again for a few seconds')
    await asyncio.sleep(2)
    print('Done')

if __name__ == '__main__':
    asyncio.get_event_loop().run_until_complete(main())
You'd want to register your self.printer() coroutine as a separate task; pass it to asyncio.ensure_future() rather than awaiting it directly:
asyncio.ensure_future(self.printer())
By passing the coroutine to asyncio.ensure_future(), you add it to the set of tasks the loop switches between as each one awaits further work.
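Concretely, only the line that schedules printer changes in the init() method (a sketch against the example class above):

async def init(self):
    async for i in self.dates():
        if i == self.start_time:
            self.interval = 3
            asyncio.ensure_future(self.printer())  # schedule instead of await
        print(i)  # now reached on every iteration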
With that change, your test code outputs:
1516819094.278697
looping
1516819095.283424
1516819096.283742
looping
1516819097.284152
# ... etc.
Tasks are the asyncio equivalent of threads in a multithreading scenario.
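Regarding the edit about cancelling: a task created this way can be cancelled at any time, so a setInterval-style helper is easy to sketch (set_interval is a hypothetical name, not part of asyncio):

import asyncio

def set_interval(coro_func, seconds):
    """Run coro_func every `seconds` seconds until the returned task is cancelled."""
    async def _tick():
        while True:
            await asyncio.sleep(seconds)
            await coro_func()
    return asyncio.ensure_future(_tick())

# Usage inside a coroutine:
#   task = set_interval(some_coroutine_function, 3)
#   ...
#   task.cancel()  # the equivalent of clearInterval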

cannot 'yield from' a coroutine object in a non-coroutine generator

I am trying to run a coroutine. First, here is a correct demo:
import asyncio

async def outer():
    print('in outer')
    print('waiting for result1')
    result1 = await phase1()
    print('waiting for result2')
    result2 = await phase2(result1)
    return (result1, result2)

async def phase1():
    print('in phase1')
    return 'result1'

async def phase2(arg):
    print('in phase2')
    return 'result2 derived from {}'.format(arg)

event_loop = asyncio.get_event_loop()
try:
    return_value = event_loop.run_until_complete(outer())
    print('return value: {!r}'.format(return_value))
finally:
    event_loop.close()
I wanted to know what happens if the outer function is not a coroutine, so I removed async and replaced await with yield from:
import asyncio

def outer():
    print('in outer')
    print('waiting for result1')
    result1 = yield from phase1()
    print('waiting for result2')
    result2 = yield from phase2(result1)
    return (result1, result2)

async def phase1():
    print('in phase1')
    return 'result1'

async def phase2(arg):
    print('in phase2')
    return 'result2 derived from {}'.format(arg)

event_loop = asyncio.get_event_loop()
try:
    return_value = event_loop.run_until_complete(outer())
    print('return value: {!r}'.format(return_value))
finally:
    event_loop.close()
When I run this event loop, I get the error in the title. How can this error be explained? Is it not allowed to drive a coroutine from an ordinary function? I used to think yield from could be used in an ordinary function, but here it clearly cannot. Can someone tell me the reason?
Python 3.5+ requires coroutines to be defined with async def or @asyncio.coroutine for many operations (for example, await x) and will throw exceptions if you try to pass anything that does not conform to this.
See inspect.isawaitable(object) in the Python 3.5 docs and source code:
def isawaitable(object):
    """Return true if object can be passed to an ``await`` expression."""
    return (isinstance(object, types.CoroutineType) or
            isinstance(object, types.GeneratorType) and
            bool(object.gi_code.co_flags & CO_ITERABLE_COROUTINE) or
            isinstance(object, collections.abc.Awaitable))
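Note the CO_ITERABLE_COROUTINE flag in the middle clause: a plain generator does not carry it, which is exactly why your yield from version of outer is rejected. A sketch of the legacy fix, assuming a Python version that still ships asyncio.coroutine (deprecated in 3.8, removed in 3.11):

import asyncio

@asyncio.coroutine  # sets CO_ITERABLE_COROUTINE, making the generator awaitable
def outer():
    print('in outer')
    result1 = yield from phase1()
    result2 = yield from phase2(result1)
    return (result1, result2)

In modern code, simply keep async def outer() and use await, as in the first demo.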
