This question already has answers here:
How can I call an async function without await?
(3 answers)
Closed 1 year ago.
I've been using Python for years now, and recently I've discovered asynchronous programming.
However, I'm having a rough time trying to implement my ideas...
Let's say I have a regular (synchronous) function g() that returns a value it computes. Inside this function, I want to call another function, f(), which is asynchronous and runs in a loop. The return value of g() is computed before f() is even called, so, obviously, I want g() to return that value and keep f() running "in the background".
import asyncio

def g(a, b):
    y = a + b
    asyncio.run(f())  # This is blocking!
    return y

async def f():
    for _ in range(3):
        await asyncio.sleep(1)
        print("I'm still alive!")

print(g(3, 5))
# async code continues here...
# Output:
# I'm still alive!
# I'm still alive!
# I'm still alive!
# 8
# Expected / wanted:
# 8
# I'm still alive!
# I'm still alive!
# I'm still alive!
Is something like this even possible? Or am I missing something?
Thanks in advance!
There are a few misunderstandings here, I believe. First of all, you cannot have anything truly simultaneous without using a thread or a process. async is syntactic sugar for a callback-based, event-driven architecture.
In short, everything still runs in the same thread, and as you know, one thread can do only one thing at a time. If you want a background task running and printing "I'm still alive!", then async alone is not what you are looking for.
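If you really do want g() to return while f() keeps running in the background, a thread is the direct tool. A minimal sketch, using a plain synchronous f() (no coroutine needed here):

```python
import threading
import time

def f():
    # plain synchronous work; a daemon thread does the "background" part
    for _ in range(3):
        time.sleep(1)
        print("I'm still alive!")

def g(a, b):
    y = a + b
    # start f in a daemon thread so g can return immediately;
    # daemon threads are killed when the main program exits
    threading.Thread(target=f, daemon=True).start()
    return y

print(g(3, 5))  # prints 8 right away; f keeps printing afterwards
```

Note that because the thread is a daemon, it stops if the main program exits before f finishes; drop daemon=True (or join the thread) if f must run to completion.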
Also, aren't you curious where the event loop is? The loop is created and managed by asyncio.run, which is roughly equivalent to:
loop = asyncio.get_event_loop()
loop.run_until_complete(f())
So you see, you need to run/trigger the loop, and doing so is blocking.
Basically, asynchronous programming doesn't work the way you thought (I guess). There is no magic inside; it is just a normal blocking event loop. We add multiple tasks to it, and the tasks run one after another. Some tasks have callback functions, which add further tasks to the loop. That's it.
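That said, if the caller itself can be async, the output you expected is achievable: instead of blocking on f(), schedule it as a task on the running loop. A sketch of that inversion:

```python
import asyncio

async def f():
    for _ in range(3):
        await asyncio.sleep(1)
        print("I'm still alive!")

async def main():
    task = asyncio.create_task(f())  # schedule f; it hasn't run yet
    y = 3 + 5
    print(y)    # runs immediately, before f's first print
    await task  # keep the loop alive until f finishes

asyncio.run(main())
```

This prints 8 first and the three "I'm still alive!" lines afterwards, but only because everything from g() upwards was rewritten to be async.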
This question already has answers here:
How to get the return value from a thread?
(26 answers)
Closed 3 years ago.
I am currently writing a program that is required to be as fast as possible.
Currently, one of the functions looks like this:
def function():
    value = get_value()
    # Multiple lines of code
    if value == "1":
        print("success")
I want to know if there is a way of calling get_value() at the start of the function, immediately running the multiple lines of code, and then, whenever get_value() finishes and returns, having the value variable updated and ready for the if statement.
Thanks!
This is what futures are for. With the concurrent.futures module, you'd do something like:
import concurrent.futures

# Properly, this would be created once, in a main method, using a with statement.
# Use ProcessPoolExecutor instead if the functions involved are CPU bound, rather
# than I/O bound, to work around the GIL.
executor = concurrent.futures.ThreadPoolExecutor()

def function():
    value = executor.submit(get_value)
    # Multiple lines of code
    if value.result() == "1":
        print("success")
This creates a pool of workers that you can submit tasks to, receiving futures, which can be waited for when you actually need the result. I'd recommend looking at the examples in the documentation for more full-featured usage.
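Putting it together as a runnable sketch (get_value here is a stand-in that just sleeps, since the original isn't shown):

```python
import concurrent.futures
import time

def get_value():
    time.sleep(0.1)  # stand-in for the real, slow I/O-bound work
    return "1"

def function(executor):
    future = executor.submit(get_value)  # starts running in a worker thread
    # Multiple lines of code run here, concurrently with get_value
    if future.result() == "1":  # blocks only if get_value hasn't finished yet
        print("success")

with concurrent.futures.ThreadPoolExecutor() as executor:
    function(executor)  # prints: success
```

future.result() is the synchronization point: everything between submit() and result() overlaps with get_value().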
The other approach here, for largely I/O bound cases based on sockets, subprocesses, etc., is using asyncio with async/await, but that requires a complete rewrite of your code, and is out of scope for a short answer.
Extracted from the Python 3.6.8 documentation:
coroutine asyncio.sleep(delay, result=None, *, loop=None)
Create a coroutine that completes after a given time (in seconds). If result is provided, it is produced to the caller when the coroutine completes.
Question 1: What does the 2nd sentence mean, i.e. "If result is provided, ..."? I don't understand how to use the result argument. Can an example be provided to illustrate its use?
Question 2: When should the loop argument be used? Can an example also be given to illustrate its use?
I don't understand how to use the result argument.
result is simply the value that will be returned by asyncio.sleep once the specified time elapses. This is useful if you replace something that returns actual data with sleep(), e.g. for testing purposes, you can immediately specify a return value. For example:
data = await read_from_database()
...

if mocking:
    read_from_database = functools.partial(
        asyncio.sleep, 0.1, result='no data')
else:
    async def read_from_database():
        ...  # real implementation
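A self-contained sketch of the result argument, without the mocking scaffolding:

```python
import asyncio

async def main():
    # after 0.1 s, asyncio.sleep returns the value passed as result
    value = await asyncio.sleep(0.1, result='no data')
    print(value)  # prints: no data

asyncio.run(main())
```

Without result, the await expression evaluates to None, so result simply lets the sleep stand in for something that produces a value.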
When should the loop argument be used?
The loop argument is, as of Python 3.7, deprecated and scheduled for removal. It was useful in Python 3.5 and earlier, when the return value of asyncio.get_event_loop() wasn't guaranteed to be the currently running event loop, only an event loop associated with the thread. Since one can run multiple event loops during the lifetime of a thread, correct code had to propagate an explicit loop everywhere: if you were running in a non-default event loop, you had to pass it to asyncio.sleep and to most other asyncio functions and constructors. This style is often encountered in old tutorials and is nowadays actively discouraged.
I'd just like to understand the async/await syntax, so I am looking for some 'hello world' app that doesn't use asyncio at all.
So how do I create the simplest event loop using only Python syntax itself? The simplest code (from Start async function without importing the asyncio package; the code there goes well beyond hello world, which is why I'm asking) looks like this:
async def cr():
    while True:
        print(1)

cr().send(None)
It prints 1 infinitely, which is not so good.
So the first question is: how do I yield from the coroutine back to the main flow? The yield keyword would turn the coroutine into an async generator, which is not what we want.
I would also appreciate a simple application, like this: a coroutine which prints 1, then yields to the event loop, then prints 2, then exits returning 3, plus a simple event loop that drives the coroutine until it returns and consumes the result.
How about this?
import types

@types.coroutine
def simple_coroutine():
    print(1)
    yield
    print(2)
    return 3

future = simple_coroutine()
while True:
    try:
        future.send(None)
    except StopIteration as returned:
        print('It has returned', returned.value)
        break
I think your biggest problem is that you're mixing concepts. An async function is not the same thing as a coroutine. It is more appropriate to think of it as a way of combining coroutines, just as ordinary def functions are a way of combining statements into functions. Yes, Python is a highly reflective language, so def is also a statement, and what you get from your async function is also a coroutine; but you need something at the bottom, something to start with. (At the bottom, yielding is just yielding. At some intermediate level, it is awaiting, of something else, of course.) That is given to you through the types.coroutine decorator in the standard library.
If you have any more questions, feel free to ask.
Is is possible to start a function like this
async def foo():
    while True:
        print("Hello!")
without importing the asyncio package (and getting the event loop)?
I am looking for a principle similar to Go's goroutines, where one can launch a coroutine with only go statement.
Edit: The reason why I'm not importing the asyncio package is simply that I think it should be possible to launch a coroutine without an (explicit) event loop. I don't understand why async def and similar statements are part of the core language (even part of the syntax), while the only way to launch the coroutines they create is through a package.
Of course it is possible to start an async function without explicitly using asyncio. After all, asyncio is written in Python, so all it does, you can do too (though sometimes you might need other modules like selectors or threading if you intend to wait for external events concurrently, or execute some other code in parallel).
In this case, since your function has no await points inside, it just needs a single push to get going. You push a coroutine by sending None into it.
>>> foo().send(None)
Hello!
Hello!
...
Of course, if your function (coroutine) had yield expressions inside, it would suspend execution at each yield point, and you would need to push additional values into it (via coro.send(value) or next(gen)); but you already know that if you know how generators work.
import types

@types.coroutine
def bar():
    to_print = yield 'What should I print?'
    print('Result is', to_print)
    to_return = yield 'And what should I return?'
    return to_return
>>> b = bar()
>>> next(b)
'What should I print?'
>>> b.send('Whatever you want')
Result is Whatever you want
'And what should I return?'
>>> b.send(85)
Traceback...
StopIteration: 85
Now, if your function had await expressions inside, it would suspend at evaluating each of them.
async def baz():
    first_bar, second_bar = bar(), bar()
    print('Sum of two bars is', await first_bar + await second_bar)
    return 'nothing important'
>>> t = baz()
>>> t.send(None)
'What should I print?'
>>> t.send('something')
Result is something
'And what should I return?'
>>> t.send(35)
'What should I print?'
>>> t.send('something else')
Result is something else
'And what should I return?'
>>> t.send(21)
Sum of two bars is 56
Traceback...
StopIteration: nothing important
Now, all these .sends are starting to get tedious. It would be nice to have them semiautomatically generated.
import random, string

def run_until_complete(t):
    prompt = t.send(None)
    try:
        while True:
            if prompt == 'What should I print?':
                prompt = t.send(random.choice(string.ascii_uppercase))
            elif prompt == 'And what should I return?':
                prompt = t.send(random.randint(10, 50))
            else:
                raise ValueError(prompt)
    except StopIteration as exc:
        print(t.__name__, 'returned', exc.value)
        t.close()
>>> run_until_complete(baz())
Result is B
Result is M
Sum of two bars is 56
baz returned nothing important
Congratulations, you just wrote your first event loop! (Didn't expect it to happen, did you?;) Of course, it is horribly primitive: it only knows how to handle two types of prompts, it doesn't enable t to spawn additional coroutines that run concurrently with it, and it fakes events by a random generator.
(In fact, if you want to get philosophical: what we did manually above could also be called an event loop: the Python REPL was printing prompts to a console window, and it was relying on you to provide events by typing t.send(whatever) into it. :)
asyncio is just an immensely generalized variant of the above: prompts are replaced by Futures, multiple coroutines are kept in queues so each of them eventually gets its turn, and events are much richer, including network/socket communication, filesystem reads/writes, signal handling, thread/process side-execution, and so on. But the basic idea is still the same: you grab some coroutines and juggle them in the air, routing the Futures from one to another, until they all raise StopIteration. When all coroutines have nothing to do, you go to the external world and grab some additional events for them to chew on, and continue.
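To make the "queues of coroutines" part concrete, here is a naive round-robin scheduler in the same spirit, with a types.coroutine-based pause() as the only yield point (pause, worker, and round_robin are my names, not asyncio's):

```python
import types
from collections import deque

@types.coroutine
def pause():
    yield  # hand control back to the scheduler

async def worker(name, steps):
    for i in range(steps):
        print(name, 'step', i)
        await pause()

def round_robin(*coros):
    queue = deque(coros)
    while queue:
        coro = queue.popleft()
        try:
            coro.send(None)   # run until the next pause()
        except StopIteration:
            continue          # finished; drop it
        queue.append(coro)    # still alive; requeue at the back

round_robin(worker('A', 2), worker('B', 2))
# A step 0 / B step 0 / A step 1 / B step 1
```

Each send() advances one coroutine by one step, and requeuing it at the back is exactly the "each of them eventually gets its turn" behavior described above.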
I hope it's all much clearer now. :-)
Python coroutines are syntactic sugar for generators, with some added restrictions on their behavior (so that their purposes are explicitly different and don't mix). You can't do:
next(foo())
TypeError: 'coroutine' object is not an iterator
because it's disabled explicitly. However you can do:
foo().send(None)
Hello
Hello
Hello
...
Which is equivalent to next() for a generator.
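The parallel can be shown side by side, with a throwaway generator and an empty coroutine (both defined here just for the demonstration):

```python
def gen():
    yield 'hello'

async def coro():
    pass  # no awaits: finishes after a single push

g = gen()
print(next(g))    # generators advance with next(); prints: hello

c = coro()
try:
    c.send(None)  # coroutines advance with send(None); next(c) is a TypeError
except StopIteration:
    print('coroutine finished')
```

Both operations resume the underlying frame until the next suspension point (or until it returns, raising StopIteration); only the spelling differs.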
Coroutines should be able to:
- run
- yield control to the caller (optionally producing some intermediate results)
- get some information from the caller and resume
So, here is a small demo of async functions (aka native coroutines) which does this without asyncio or any other module/framework that provides an event loop. At least Python 3.5 is required. See the comments inside the code.
#!/usr/bin/env python
import types

# two simple async functions
async def outer_af(x):
    print("- start outer_af({})".format(x))
    val = await inner_af(x)  # Normal way to call a native coroutine.
                             # Without the `await` keyword it wouldn't
                             # actually start.
    print("- inner_af result: {}".format(val))
    return "outer_af_result"

async def inner_af(x):
    print("-- start inner_af({})".format(x))
    val = await receiver()  # 'await' can be used not only with native
                            # coroutines, but also with 'generator-based'
                            # coroutines!
    print("-- received val {}".format(val))
    return "inner_af_result"

# To yield execution control to the caller it's necessary to use a
# 'generator-based' coroutine: one created with the types.coroutine
# decorator
@types.coroutine
def receiver():
    print("--- start receiver")
    # suspend execution / yield control / communicate with the caller
    r = yield "value request"
    print("--- receiver received {}".format(r))
    return r

def main():
    # We want to call the 'outer_af' async function (aka native coroutine).
    # The 'await' keyword can't be used here!
    # It can only be used inside another async function.
    print("*** test started")
    c = outer_af(42)  # just prepare the coroutine object; it's not running yet
    print("*** c is {}".format(c))
    # To start coroutine execution, call the 'send' method.
    w = c.send(None)  # The first call must have the argument None.
    # Execution of the coroutine is now suspended. The execution point is on
    # the 'yield' statement inside the 'receiver' coroutine.
    # It is waiting for another 'send' call to continue.
    # The yielded value can give us a hint about what exactly the coroutine
    # expects to receive from us.
    print("*** w = {}".format(w))
    # After the next 'send' the coroutine's execution will finish.
    # Even though the native coroutine object is not iterable, it will
    # throw a StopIteration exception on exit!
    try:
        w = c.send(25)
        # w would not get any value here; this line is unreachable.
    except StopIteration as e:
        print("*** outer_af finished. It returned: {}".format(e.value))

if __name__ == '__main__':
    main()
Output looks like:
*** test started
*** c is <coroutine object outer_af at 0x7f4879188620>
- start outer_af(42)
-- start inner_af(42)
--- start receiver
*** w = value request
--- receiver received 25
-- received val 25
- inner_af result: inner_af_result
*** outer_af finished. It returned: outer_af_result
One additional comment: it looks like it's not possible to yield control from inside a native coroutine, since yield is not permitted inside async functions! So it is necessary to import types and use the coroutine decorator, which does some black magic. Frankly speaking, I do not understand why yield is prohibited there, such that a mixture of native and generator-based coroutines is required.
No, that is not possible. You need an event loop. Take a look at what happens if you just call foo():
>>> f = foo()
>>> print(f)
<coroutine object foo at 0x7f6e13edac50>
So you get a coroutine object; nothing gets executed right now! Only by passing it to an event loop does it get executed. You can use asyncio or another event loop like Curio.