I understand that task.cancel() arranges for an exception to be thrown inside the task's coroutine. Does that happen synchronously? (I don't await task.cancel().) Can code that follows the line task.cancel() assume that the task will no longer run?
A simple example:
import asyncio

async def task1():
    await asyncio.sleep(3)
    print("after sleep")

async def task2():
    t = loop.create_task(task1())
    await asyncio.sleep(1)
    t.cancel()
    # can the following code lines assume that task1 is no longer running?

loop = asyncio.get_event_loop()
loop.create_task(task2())
loop.run_forever()
Can code that follows the line task.cancel() assume that the task will
no longer run?
No. task.cancel() only requests that the task be cancelled; the CancelledError is actually raised inside the task the next time the event loop runs it. To be sure the task is finished, explicitly await the task afterwards and catch CancelledError.
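A minimal sketch of that pattern (my own illustration, reusing the task1 coroutine from the question):

import asyncio

async def task1():
    await asyncio.sleep(3)
    print("after sleep")

async def main():
    t = asyncio.ensure_future(task1())
    await asyncio.sleep(1)
    t.cancel()
    # t may still be "running" here; await it to know it is really finished
    try:
        await t
    except asyncio.CancelledError:
        print("task1 is now cancelled for sure")

asyncio.run(main())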
Related
I am reading a book and came across this code snippet, which doesn't make sense to me. Can someone clarify it for me?
import asyncio
import time

async def main():
    print(f'{time.ctime()} Hello!')
    await asyncio.sleep(1.0)
    print(f'{time.ctime()} Goodbye!')

loop = asyncio.get_event_loop()
task = loop.create_task(main())
# This line blocks the thread (the MainThread in my case) until the
# coroutine is finished.
loop.run_until_complete(task)
# asyncio.all_tasks() returns the set of not yet finished Task objects run by
# the loop. By that definition, pending should always be an empty set here.
pending = asyncio.all_tasks(loop=loop)
for task in pending:
    task.cancel()
group = asyncio.gather(*pending, return_exceptions=True)
loop.run_until_complete(group)
loop.close()
I think asyncio.all_tasks() should be used before the loop.run_until_complete() call. Besides, I have found many other places where it is useful, but this example absolutely does not make sense to me. I am really interested in why the author did that. What was the point?
What you are thinking is correct. There is no point in calling .all_tasks() here, as it always returns an empty set: you only have one task and you pass it to .run_until_complete(), so that call blocks until the task is done.
But things change when you have another task that takes longer than your main coroutine:
import asyncio
import time

async def longer_task():
    print("inside longer coroutine")
    await asyncio.sleep(2)

async def main():
    print(f"{time.ctime()} Hello!")
    await asyncio.sleep(1.0)
    print(f"{time.ctime()} Goodbye!")

loop = asyncio.new_event_loop()
task1 = loop.create_task(main())
task2 = loop.create_task(longer_task())
loop.run_until_complete(task1)
pending = asyncio.all_tasks(loop=loop)
print(pending)
for task in pending:
    task.cancel()
group = asyncio.gather(*pending, return_exceptions=True)
loop.run_until_complete(group)
loop.close()
The event loop only cares about finishing task1, so task2 will still be pending.
I think asyncio.all_tasks() should be used before
loop.run_until_complete() function.
As soon as you create_task(), it will be included in the set returned by all_tasks() even if the loop has not started yet.
Just a side note: on Python 3.10, since you don't have a running event loop, .get_event_loop() will emit a deprecation warning; use .new_event_loop() instead.
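For what it's worth, on Python 3.7+ asyncio.run() performs essentially the same shutdown sequence for you (it cancels whatever is still pending, gathers it, and closes the loop). A rough sketch of the same example using it (my own variation, not from the book):

import asyncio
import time

async def longer_task():
    print("inside longer coroutine")
    await asyncio.sleep(2)

async def main():
    # still pending when main() returns; asyncio.run() cancels it on shutdown
    asyncio.create_task(longer_task())
    print(f"{time.ctime()} Hello!")
    await asyncio.sleep(1.0)
    print(f"{time.ctime()} Goodbye!")

asyncio.run(main())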
I just watched an async/await tutorial video on YouTube.
To my understanding of await, if an await is inside a task, then while executing the task, control returns to the event loop whenever it encounters that await.
So if the await is inside a for loop (say, 10 iterations), the task would be paused 10 times, and I thought I would need 10 awaits in the event loop to finish the task, like this:
import asyncio

async def print_numbers():
    for i in range(10):
        print(i)
        await asyncio.sleep(0.25)

async def main():
    task2 = asyncio.create_task(print_numbers())
    for i in range(10):
        await task2

asyncio.run(main())
But in fact the task can be finished with only one await, like this:
import asyncio

async def print_numbers():
    for i in range(10):
        print(i)
        await asyncio.sleep(0.25)

async def main():
    task2 = asyncio.create_task(print_numbers())
    await task2

asyncio.run(main())
What am I missing here?
it would turn back to the event-loop while it encounter the await inside of the task
It does, but you wait for task[0] to finish before you start task[1], so there is simply no other task in the event loop to run. Your code just ends up sleeping and doing nothing else.
and I should use 10 await in the event-loop in order to finished the task
Yes, you will need to await the 10 tasks you started, so your code only continues once all 10 tasks are done. But you should use asyncio.wait or asyncio.gather so the individual tasks can run concurrently and don't have to wait for the previous one to finish.
import asyncio
import random

async def print_number(i):
    print(i, 'start')
    await asyncio.sleep(random.random())
    print(i, 'done')

async def main():
    await asyncio.wait([
        asyncio.create_task(print_number(i))
        for i in range(10)
    ])
    print('main done')

asyncio.run(main())
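If you prefer asyncio.gather, an equivalent sketch (my own addition) would be the following; unlike asyncio.wait, it also collects the results in order:

async def main():
    await asyncio.gather(*(print_number(i) for i in range(10)))
    print('main done')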
I want to make a timer which is started in a normal (synchronous) function, but inside the timer function it should be able to call an async function.
I want to do something like this:
startTimer()
while True:
    print("e")

def startTimer(waitForSeconds: int):
    # Wait for `waitForSeconds`
    await myAsyncFunc()

async def myAsyncFunc():
    print("in my async func")
The while True loop should keep doing its own work, and after waitForSeconds the timer should call the other async function; the waiting shouldn't block any other actions and shouldn't need to be awaited.
If something isn't understandable, I'm sorry; I'll try to explain it further.
Thanks
If you want to run your synchronous and asynchronous code in parallel, you will need to run one of them in a separate thread. For example:
import asyncio

def sync_code():
    while True:
        print("e")

async def async_func():
    # stand-in for the async function the timer should call
    print("in my async func")

async def start_timer(secs):
    await asyncio.sleep(secs)
    await async_func()

async def main():
    asyncio.create_task(start_timer(1))
    loop = asyncio.get_event_loop()
    # use run_in_executor to run sync code in a separate thread
    # while this thread runs the event loop
    await loop.run_in_executor(None, sync_code)

asyncio.run(main())
If the above is not acceptable to you (e.g. because it turns the whole program into an asyncio program), you can also run the event loop in a background thread and submit tasks to it using asyncio.run_coroutine_threadsafe. That approach allows startTimer to have the signature (and interface) you wanted:
import asyncio
import threading

def startTimer(waitForSeconds):
    loop = asyncio.new_event_loop()
    threading.Thread(daemon=True, target=loop.run_forever).start()

    async def sleep_and_run():
        await asyncio.sleep(waitForSeconds)
        await myAsyncFunc()

    asyncio.run_coroutine_threadsafe(sleep_and_run(), loop)

async def myAsyncFunc():
    print("in my async func")

startTimer(1)
while True:
    print("e")
I'm pretty sure you are familiar with concurrent processing, but you didn't show exactly what you want. If I understand you correctly, you want two concurrent activities: the first only runs the while True loop, and the second is the timer, which waits (e.g. 5 s) and then calls the async task. I assume you are using asyncio, judging by the tags:
import asyncio

async def myAsyncFunc():
    print("in my async func")

async def call_after(delay):
    await asyncio.sleep(delay)
    await myAsyncFunc()

async def while_true():
    while True:
        await asyncio.sleep(1)  # sleep here to avoid too much output
        print("e")

async def main():
    task1 = asyncio.create_task(while_true())
    task2 = asyncio.create_task(call_after(5))
    # while_true() never finishes, so awaiting task1 keeps the program
    # running; the timer fires after 5 seconds.
    await task1
    await task2

asyncio.run(main())
I am new to asyncio and not sure how to describe my question, but here is a minimal example:
import asyncio

async def work():
    await asyncio.sleep(3)

async def check_it():
    task = asyncio.create_task(work())
    await task
    while True:
        if task.done():
            print("Done")
            break
        print("Trying...")

asyncio.run(check_it())
My idea is very simple:
Create an async task in check_it() and await it.
Use a while loop to check whether the task is finished.
If task.done() returns True, break out of the while loop, then exit the script.
If my question is a duplicate, please flag it. Thanks!
Try asyncio.wait with a timeout, or use asyncio.sleep between checks. Otherwise your program will produce a lot of output without any pause.
import asyncio

async def work():
    await asyncio.sleep(3)

async def check_it():
    task = asyncio.create_task(work())
    # Don't "await task" here: that blocks until the task is finished.
    timeout = 0  # probably the first timeout can be 0
    while True:
        done, pending = await asyncio.wait({task}, timeout=timeout)
        if task in done:
            print('Done')
            # Awaiting here is favourable in case the task raised an exception.
            await task
            break
        print('Trying...')
        timeout = 1

asyncio.run(check_it())
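A simpler variant of the same idea (my own sketch, not from the original answer): poll task.done() directly, but yield control with asyncio.sleep so the task can actually make progress between checks.

async def check_it():
    task = asyncio.create_task(work())
    while not task.done():
        print("Trying...")
        await asyncio.sleep(0.5)
    print("Done")
    await task  # re-raises any exception the task may have raised

asyncio.run(check_it())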
I need to wrap a coroutine that returns data. If the data is returned, it is no longer available afterwards; if the coroutine is cancelled, the data is still available on the next call. I need the wrapping coroutine to have the same behavior, but sometimes it gets cancelled even though the wrapped coroutine has already finished.
I can reproduce this behavior with the following code.
import asyncio

loop = asyncio.get_event_loop()
fut = asyncio.Future()

async def wait():
    return await fut

task = asyncio.ensure_future(wait())

async def test():
    await asyncio.sleep(0.1)
    fut.set_result('data')
    print('fut', fut)
    print('task', task)
    task.cancel()
    await asyncio.sleep(0.1)
    print('fut', fut)
    print('task', task)

loop.run_until_complete(test())
The output clearly shows that the wrapping coroutine was cancelled after the inner future finished, meaning that the data is lost forever. I can't simply shield the call either, because if I'm cancelled I have no data to return anyway.
fut <Future finished result='data'>
task <Task pending coro=<wait() running at <ipython-input-8-6d115ded09c6>:7> wait_for=<Future finished result='data'>>
fut <Future finished result='data'>
task <Task cancelled coro=<wait() done, defined at <ipython-input-8-6d115ded09c6>:6>>
In my case this is due to having two futures, the one completing the wrapped coroutine and the one cancelling the wrapping coroutine, which are sometimes completed together. I could probably delay the cancel (via asyncio.sleep(0)), but can I be sure this will never happen by accident?
The problem makes more sense with a task:
import asyncio

loop = asyncio.get_event_loop()

data = []
fut_data = asyncio.Future()

async def get_data():
    while not data:
        await asyncio.shield(fut_data)
    return data.pop()

fut_wapper = asyncio.Future()

async def wrapper_data():
    task = asyncio.ensure_future(get_data())
    return await task

async def test():
    task = asyncio.ensure_future(wrapper_data())
    await asyncio.sleep(0)
    data.append('data')
    fut_data.set_result(None)
    await asyncio.sleep(0)
    print('wrapper_data', task)
    task.cancel()
    await asyncio.sleep(0)
    print('wrapper_data', task)
    print('data', data)

loop.run_until_complete(test())
task <Task cancelled coro=<wrapper_data() done, defined at <ipython-input-2-93645b78e9f7>:16>>
data []
The data has been consumed but the task has been cancelled, so the data cannot be retrieved. Awaiting get_data() directly would work, but then it could not be cancelled.
I think you need to first shield the awaited future from cancellation, then detect your own cancellation. If the future hasn't completed, propagate the cancellation into it (effectively undoing the shield()) and out. If the future has completed, ignore the cancellation and return the data.
The code would look like this, also changed to avoid global variables and to use asyncio.run() (which you can turn back into run_until_complete() if you're on Python 3.6):
import asyncio

async def wait(fut):
    try:
        return await asyncio.shield(fut)
    except asyncio.CancelledError:
        if fut.done():
            # we've been cancelled, but we have the data - ignore the
            # cancel request
            return fut.result()
        # otherwise, propagate the cancellation into the future...
        fut.cancel()
        # ...and to the caller
        raise

async def test():
    loop = asyncio.get_event_loop()
    fut = loop.create_future()
    task = asyncio.create_task(wait(fut))
    await asyncio.sleep(0.1)
    fut.set_result('data')
    print('fut', fut)
    print('task', task)
    task.cancel()
    await asyncio.sleep(0.1)
    print('fut', fut)
    print('task', task)

asyncio.run(test())
Note that ignoring the cancel request can be thought of as abuse of the cancellation mechanism. But if the task is known to proceed afterwards (ideally immediately finishing), it might be the right thing in your situation. Caution is advised.
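A minimal sketch (my own addition, not part of the original answer) exercising the other branch: if the task is cancelled while the future is still pending, the cancellation is propagated into the future and out to the caller as usual.

async def test_cancel_first():
    loop = asyncio.get_event_loop()
    fut = loop.create_future()
    task = asyncio.create_task(wait(fut))
    await asyncio.sleep(0.1)
    task.cancel()            # fut has no result yet
    await asyncio.sleep(0.1)
    print('fut', fut)        # <Future cancelled>
    print('task', task)      # <Task cancelled ...>

asyncio.run(test_cancel_first())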