I'm trying to create a Python script that receives messages from a websocket connection; every time it receives a new message, it needs to run an asyncio task in the background.
To "simulate" this process, I made a blocking function that starts counting inside a while True loop. The expected output is that every time a new message is received from the ws connection a new count starts, but in my case, as soon as I run the script, the count function blocks the whole program. How can I solve this?
Here is what I tried:
import asyncio
import websockets
import json
import time

# this is the blocking function..
def counter():
    count = 0
    while True:
        print(count)
        count += 1
        time.sleep(0.5)

async def main():
    while True:
        try:
            async with websockets.connect('MY-URL') as websocket:
                while True:
                    msg = await asyncio.wait_for(websocket.recv(), 500)
                    try:
                        data = json.loads(msg)
                        await loop.create_task(counter())
                    except Exception as e:
                        print(e)
        except Exception as e:
            print(e)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
You have two major problems here. The first is that counter contains an infinite loop and you call it when you try to pass it to create_task; because that call never returns, create_task is never even reached.
The second, obvious problem is that you pass a regular function to create_task, while it expects a coroutine.
Define your counter again as a coroutine using async def and replace time.sleep with asyncio.sleep, and I think it will work.
As a general note: you cannot have blocking code in the same thread as your event loop. This means never, ever use time.sleep in asynchronous code...
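A minimal sketch of that fix, reusing the names from the question (assuming Python 3.7+ for asyncio.run and asyncio.create_task):

import asyncio
import json
import websockets

async def counter():
    # asyncio.sleep suspends the coroutine and lets the event loop run other tasks
    count = 0
    while True:
        print(count)
        count += 1
        await asyncio.sleep(0.5)

async def main():
    async with websockets.connect('MY-URL') as websocket:
        while True:
            msg = await asyncio.wait_for(websocket.recv(), 500)
            data = json.loads(msg)  # parse the message as before
            # schedule a new counter in the background; do not await it here
            asyncio.create_task(counter())

asyncio.run(main())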
I am writing a Python program that scans a QR code and exits; if the QR code is not read within X seconds, the program needs to end.
import asyncio
from camera.QRGetter import QRGetter

async def getQR(qr_reader):
    data = await qr_reader.getQRData()
    print(data)

async def main():
    qr_reader = QRGetter()
    try:
        await asyncio.wait_for(getQR(qr_reader), timeout=5.0)
    except asyncio.TimeoutError:
        qr_reader.close()
        print('timeout!')

asyncio.run(main())
The class QRGetter works fine without errors. Inside it there is a while loop that iterates over the camera input frame by frame. If a QR code is shown, it returns the data and exits correctly.
The main problem is that the timeout doesn't trigger; I waited two minutes and it never fired. How can I fix it?
I try to cancel a process when a timeout is reached, but asyncio.wait_for is not working. How do I cancel the process when the timeout is reached? My code is below:
import asyncio

async def process():
    # do something that takes a long time, like this
    for i in range(0, 10000000000, 1):
        for j in range(0, 10000000000, 1):
            continue
    print('done!')

async def main():
    # I want to cancel process when the timeout is reached
    try:
        await asyncio.wait_for(process(), timeout=1.0)
    except asyncio.TimeoutError:
        print('timeout!')

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
This doesn't work because your process function is async in name only - it doesn't await anything. That means that it finishes in its entirety without giving the event loop a chance to interrupt it. Since asyncio is cooperative (as are other async/await based systems), such a function is not a correctly written async function and cannot be interrupted.
If you add an await asyncio.sleep(0.001) into the inner loop (or anything else that awaits something that actually suspends), your code will work fine.
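A minimal sketch of that suggestion applied to the question's code (using asyncio.run for brevity; the exact sleep interval is a judgment call):

import asyncio

async def process():
    # do something that takes a long time, like this
    for i in range(0, 10000000000, 1):
        for j in range(0, 10000000000, 1):
            # suspending here lets the event loop run, so wait_for
            # can deliver the cancellation when the timeout expires
            await asyncio.sleep(0.001)
    print('done!')

async def main():
    try:
        await asyncio.wait_for(process(), timeout=1.0)
    except asyncio.TimeoutError:
        print('timeout!')

asyncio.run(main())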
I have code like the following:
import asyncio

def render():
    loop = asyncio.get_event_loop()

    async def test():
        await asyncio.sleep(2)
        print("hi")
        return 200

    if loop.is_running():
        result = asyncio.ensure_future(test())
    else:
        result = loop.run_until_complete(test())
When the loop is not running it is quite easy: just use loop.run_until_complete and it returns the coro's result. But if the loop is already running (my blocking code runs in an app that is already running the loop), I cannot use loop.run_until_complete since it will raise an exception; when I call asyncio.ensure_future the task gets scheduled and run, but I want to wait there for the result. Does anybody know how to do this? The docs are not very clear on it.
I tried passing a concurrent.futures.Future, calling set_result inside the coro and then calling Future.result() from my blocking code, but it doesn't work: it blocks there and doesn't let anything else run. Any help would be appreciated.
To implement render() with the proposed design, you would need a way to single-step the event loop from a callback running inside it. Asyncio explicitly forbids recursive event loops, so this approach is a dead end.
Given that constraint, you have two options:
1. make render() itself a coroutine;
2. execute render() (and its callers) in a thread different from the thread that runs the asyncio event loop.
Assuming #1 is out of the question, you can implement the #2 variant of render() like this:
def render():
    loop = _event_loop  # can't call get_event_loop()

    async def test():
        await asyncio.sleep(2)
        print("hi")
        return 200

    future = asyncio.run_coroutine_threadsafe(test(), loop)
    result = future.result()
Note that you cannot use asyncio.get_event_loop() in render because the event loop is not (and should not be) set for that thread. Instead, the code that spawns the runner thread must call asyncio.get_event_loop() and send it to the thread, or just leave it in a global variable or a shared structure.
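For illustration, a rough sketch of one such arrangement (the worker function and its print are my assumptions, not part of the original answer): the main thread grabs the loop, publishes it in the _event_loop global used by render above, spawns the thread that calls render, and then keeps driving the loop.

import asyncio
import threading

_event_loop = asyncio.get_event_loop()  # captured before the worker thread starts

def worker():
    # runs in its own thread; render() schedules work onto _event_loop
    result = render()
    print("render returned", result)

threading.Thread(target=worker).start()
_event_loop.run_forever()  # the main thread keeps running the event loop
# (a real program would eventually call _event_loop.call_soon_threadsafe(_event_loop.stop))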
Waiting Synchronously for an Asynchronous Coroutine
If an asyncio event loop has been started with loop.run_forever, it blocks the executing thread until loop.stop is called [see the docs]. Therefore, the only way to wait synchronously is to run the event loop on a dedicated thread, schedule the asynchronous function on the loop, and wait for it synchronously from another thread.
For this I have composed my own minimal solution following the answer by user4815162342. I have also added the parts for cleaning up the loop when all work is finished [see loop.close].
The main function in the code below runs the event loop on a dedicated thread, schedules several tasks on the event loop, plus the task the result of which is to be awaited synchronously. The synchronous wait will block until the desired result is ready. Finally, the loop is closed and cleaned up gracefully along with its thread.
The dedicated thread and the functions stop_loop, run_forever_safe, and await_sync can be encapsulated in a module or a class.
For thread-safety considerations, see the section “Concurrency and Multithreading” in the asyncio docs.
import asyncio
import threading

#----------------------------------------
def stop_loop(loop):
    ''' stops an event loop '''
    loop.stop()
    print(".: LOOP STOPPED:", loop.is_running())

def run_forever_safe(loop):
    ''' run a loop for ever and clean up after being stopped '''
    loop.run_forever()
    # NOTE: loop.run_forever returns after calling loop.stop
    #-- cancel all tasks and close the loop gracefully
    print(".: CLOSING LOOP...")
    # source: <https://xinhuang.github.io/posts/2017-07-31-common-mistakes-using-python3-asyncio.html>
    loop_tasks_all = asyncio.Task.all_tasks(loop=loop)
    for task in loop_tasks_all: task.cancel()
    # NOTE: `cancel` does not guarantee that the Task will be cancelled
    for task in loop_tasks_all:
        if not (task.done() or task.cancelled()):
            try:
                # wait for task cancellations
                loop.run_until_complete(task)
            except asyncio.CancelledError: pass
    #END for
    print(".: ALL TASKS CANCELLED.")
    loop.close()
    print(".: LOOP CLOSED:", loop.is_closed())

def await_sync(task):
    ''' synchronously waits for a task '''
    while not task.done(): pass
    print(".: AWAITED TASK DONE")
    return task.result()
#----------------------------------------
async def asyncTask(loop, k):
    ''' asynchronous task '''
    print("--start async task %s" % k)
    await asyncio.sleep(3, loop=loop)
    print("--end async task %s." % k)
    key = "KEY#%s" % k
    return key

def main():
    loop = asyncio.new_event_loop()  # construct a new event loop
    #-- closures for running and stopping the event-loop
    run_loop_forever = lambda: run_forever_safe(loop)
    close_loop_safe = lambda: loop.call_soon_threadsafe(stop_loop, loop)
    #-- make a dedicated thread for running the event loop
    thread = threading.Thread(target=run_loop_forever)
    #-- add some tasks along with my particular task
    myTask = asyncio.run_coroutine_threadsafe(asyncTask(loop, 100200300), loop=loop)
    otherTasks = [asyncio.run_coroutine_threadsafe(asyncTask(loop, i), loop=loop)
                  for i in range(1, 10)]
    #-- begin the thread to run the event-loop
    print(".: EVENT-LOOP THREAD START")
    thread.start()
    #-- _synchronously_ wait for the result of my task
    result = await_sync(myTask)  # blocks until task is done
    print("* final result of my task:", result)
    #... do lots of work ...
    print("*** ALL WORK DONE ***")
    #========================================
    # close the loop gracefully when everything is finished
    close_loop_safe()
    thread.join()
#----------------------------------------
main()
Here is my case: my whole program is async, but it calls some sync lib, which then calls back into my async func. Following the answer by user4815162342:
import asyncio

async def asyncTask(k):
    ''' asynchronous task '''
    print("--start async task %s" % k)
    # await asyncio.sleep(3, loop=loop)
    await asyncio.sleep(3)
    print("--end async task %s." % k)
    key = "KEY#%s" % k
    return key

def my_callback():
    print("here i want to call my async func!")
    future = asyncio.run_coroutine_threadsafe(asyncTask(1), LOOP)
    return future.result()

def sync_third_lib(cb):
    print("here will call back to your code...")
    cb()

async def main():
    print("main start...")
    print("call sync third lib ...")
    await asyncio.to_thread(sync_third_lib, my_callback)
    # await loop.run_in_executor(None, func=sync_third_lib)
    print("another work...keep async...")
    await asyncio.sleep(2)
    print("done!")

LOOP = asyncio.get_event_loop()
LOOP.run_until_complete(main())
I'm trying to asynchronously download a file in Python, using wget in a subprocess. My code looks like this:
import asyncio
import time

async def download(url, filename):
    wget = await asyncio.create_subprocess_exec(
        'wget', url,
        '-O', filename
    )
    await wget.wait()

def main(url):
    loop = asyncio.get_event_loop()
    future = asyncio.ensure_future(download(url, 'test.zip'), loop=loop)
    print("Downloading..")
    time.sleep(15)
    print("Still downloading...")
    loop.run_until_complete(future)
    loop.close()
What I'm trying to do is see "Downloading.." printed and then, 15 seconds later, "Still downloading...", all while the file download is already in progress. What I'm actually seeing is that the download only starts when the code hits loop.run_until_complete(future).
My understanding is that asyncio.ensure_future should start running the code of the download coroutine, but apparently I'm missing something.
When passed a coroutine, asyncio.ensure_future converts it to a task - a special kind of future that knows how to drive the coroutine - and enqueues it in the event loop. "Enqueue" means that the code inside the coroutine will be executed by a running event loop that schedules the coroutines. If the event loop is not running, then none of the coroutines will get a chance to run either. The loop is told to run by a call to loop.run_forever() or loop.run_until_complete(some_future). In the question the event loop is only started after the call to time.sleep(), so the beginning of the download is delayed by 15 seconds.
time.sleep should never be called in a thread that runs the asyncio event loop. The correct way to sleep is with asyncio.sleep, which yields the control to the event loop while waiting. asyncio.sleep returns a future that can be submitted to the event loop or awaited from a coroutine:
# ... definition of download omitted ...
async def report():
print("Downloading..")
await asyncio.sleep(15)
print("Still downloading...")
def main(url):
loop = asyncio.get_event_loop()
dltask = loop.create_task(download(url, 'test.zip'))
loop.create_task(report())
loop.run_until_complete(dltask)
loop.close()
The above code has a different problem: when the download takes less than 15 seconds, it results in a Task was destroyed but it is pending! warning being printed. The problem is that the report task was never canceled when the download task finished and the loop closed; it was simply abandoned. Since this often indicates a bug or a misunderstanding of how asyncio works, asyncio flags it with a warning.
The obvious way to eliminate the warning is to explicitly cancel the task of the report coroutine, but the resulting code ends up being verbose and not very elegant. A simpler and shorter fix is to change report to await the download task, specifying a timeout for displaying the "Still downloading..." message:
async def dl_and_report(dltask):
    print("Downloading..")
    try:
        await asyncio.wait_for(asyncio.shield(dltask), 15)
    except asyncio.TimeoutError:
        print("Still downloading...")
        # assuming we want the download to continue; otherwise
        # remove the shield(), and dltask will be canceled
        await dltask

def main(url):
    loop = asyncio.get_event_loop()
    dltask = loop.create_task(download(url, 'test.zip'))
    loop.run_until_complete(dl_and_report(dltask))
    loop.close()
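For comparison, the more verbose explicit-cancellation approach mentioned above might look roughly like this (a sketch, not code from the original answer; dl_then_cancel is a hypothetical helper and download/report are the coroutines defined earlier):

async def dl_then_cancel(dltask, reptask):
    # wait for the download, then cancel the leftover report task
    # so the loop can be closed without the "pending task" warning
    await dltask
    reptask.cancel()
    try:
        await reptask
    except asyncio.CancelledError:
        pass

def main(url):
    loop = asyncio.get_event_loop()
    dltask = loop.create_task(download(url, 'test.zip'))
    reptask = loop.create_task(report())
    loop.run_until_complete(dl_then_cancel(dltask, reptask))
    loop.close()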
I am trying to translate this key "debouncing" logic from Javascript to Python.
function handle_key(key) {
    if (this.state == null) {
        this.state = ''
    }
    this.state += key
    clearTimeout(this.timeout)
    this.timeout = setTimeout(() => {
        console.log(this.state)
    }, 500)
}

handle_key('a')
handle_key('b')
handle_key('a')
handle_key('b')
The idea is that subsequent key presses extend the timeout. The Javascript version prints:
ab
I don't want to translate the JS timeout functions, I'd rather have idiomatic Python using asyncio. My attempt in Python (3.5) is below, but it doesn't work as global_state is not actually updated when I expect.
import asyncio

global_state = ''

@asyncio.coroutine
def handle_key(key):
    global global_state
    global_state += key
    local_state = global_state
    yield from asyncio.sleep(0.5)
    # if another call hasn't modified global_state we print it
    if local_state == global_state:
        print(global_state)

@asyncio.coroutine
def main():
    yield from handle_key('a')
    yield from handle_key('b')

ioloop = asyncio.get_event_loop()
ioloop.run_until_complete(main())
It prints:
a
ab
I have looked into asyncio Event, Queue and Condition but it isn't clear to me how to use them for this. How would you implement the desired behavior using Python's asyncio?
EDIT
Some more details on how I'd like to use handle_keys. I have an async function that checks for key presses.
@asyncio.coroutine
def check_keys():
    keys = driver.get_keys()
    for key in keys:
        yield from handle_key(key)
This, in turn, is scheduled along with the other program tasks:
@asyncio.coroutine
def main():
    while True:
        yield from check_keys()
        yield from do_other_stuff()

ioloop = asyncio.get_event_loop()
ioloop.run_until_complete(main())
Qeek's use of asyncio.create_task and asyncio.gather makes sense. But how would I use it within a loop like this? Or is there another way to schedule the async tasks that would allow handle_keys calls to "overlap"?
Actual code on GitHub if you are interested.
Why doesn't your code work now?
Neither of the handle_key JavaScript calls blocks execution. Each just clears the timeout callback and sets a new one. This happens immediately.
Coroutines work differently: using yield from (or the newer syntax await) on a coroutine means that we want to resume the execution flow only after this coroutine is fully done:
async def a():
    await asyncio.sleep(1)

async def b():
    await asyncio.sleep(1)

async def main():
    await a()
    await b()  # this line would be reached only after a() is done - after a 1 second delay
The asyncio.sleep(0.5) in your code is not setting a callback with a timeout; it is code that has to finish before handle_key itself finishes.
Let's try to make the code work
You can create a task to start executing some coroutine "in the background". You can also cancel a task (just like you do with clearTimeout(this.timeout)) if you don't want it to finish.
A Python version that emulates your JavaScript snippet:
import asyncio
from contextlib import suppress

global_state = ''
timeout = None

async def handle_key(key):
    global global_state, timeout
    global_state += key

    # cancel previous callback (clearTimeout(this.timeout))
    if timeout:
        timeout.cancel()
        with suppress(asyncio.CancelledError):
            await timeout

    # set new callback (this.timeout = setTimeout ...)
    async def callback():
        await asyncio.sleep(0.5)
        print(global_state)

    timeout = asyncio.ensure_future(callback())

async def main():
    await handle_key('a')
    await handle_key('b')

    # both handle_key functions done, but task isn't finished yet
    # you need to await for task before exit main() coroutine and close loop
    if timeout:
        await timeout

loop = asyncio.get_event_loop()
try:
    loop.run_until_complete(main())
finally:
    loop.close()
Idiomatic?
While the code above works, it is not how asyncio should be used. Your JavaScript code is based on callbacks, while asyncio is usually about avoiding callbacks.
It's hard to demonstrate the difference on your example, since it is callback-based by nature (key handling is some sort of global callback) and doesn't have more async logic. But this understanding will become important later, when you add more async operations.
Right now I advise you to read about async/await in modern JavaScript (it's similar to Python's async/await) and look at examples comparing it to callbacks/promises. This article looks good.
It'll help you understand how you can use the coroutine-based approach in Python.
Upd:
Since buttons.check needs to periodically call driver.get_buttons(), you'll have to use a loop. But it can run as a background task alongside your event loop.
If you had some sort of button_handler(callback) (this is usually how different libs let you handle user input), you could use it to set some asyncio.Future directly and avoid the polling loop, as sketched below.
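For illustration, a rough sketch of that idea (button_handler is a hypothetical registration API of the input library, not something from the question):

import asyncio

loop = asyncio.get_event_loop()
key_pressed = loop.create_future()

def on_button(key):
    # called by the (hypothetical) library, possibly from another thread,
    # so hand the value to the event loop thread-safely
    loop.call_soon_threadsafe(key_pressed.set_result, key)

button_handler(on_button)  # hypothetical: register the callback with the lib

async def wait_for_key():
    global key_pressed
    key = await key_pressed
    key_pressed = loop.create_future()  # re-arm for the next press
    return key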
Consider writing some little GUI app with asyncio from the beginning. I think it may help you to better understand how you can adapt your existing project.
Here's some pseudo-code that shows a background task handling the buttons, and asyncio being used to handle some simple UI event/state logic:
import asyncio
from contextlib import suppress

# GUI logic:
async def main():
    while True:
        print('We at main window, popup closed')
        key = await key_pressed
        if key == 'Enter':
            print('Enter - open some popup')
            await popup()
            # this place won't be reached until the popup is closed
            print('Popup was closed')
        elif key == 'Esc':
            print('Esc - exit program')
            return

async def popup():
    while True:
        key = await key_pressed
        if key == 'Esc':
            print('Esc inside popup, let us close it')
            return
        else:
            print('Non escape key inside popup, play sound')

# Event loop logic:
async def button_check():
    # Where 'key_pressed' is some global asyncio.Future
    # that can be used by your coroutines to know some key is pressed
    while True:
        global key_pressed
        for key in get_buttons():
            key_pressed.set_result(key)
            key_pressed = asyncio.Future()
        await asyncio.sleep(0.01)

def run_my_loop(coro):
    loop = asyncio.get_event_loop()
    # Run background task to process input
    buttons_task = asyncio.ensure_future(button_check())
    try:
        loop.run_until_complete(coro)
    finally:
        # Shutdown task
        buttons_task.cancel()
        with suppress(asyncio.CancelledError):
            loop.run_until_complete(buttons_task)
        loop.close()

if __name__ == '__main__':
    run_my_loop(main())
What's wrong
Basically, yield from xy() is very similar to a normal function call. The difference is that a function call immediately starts processing the called function, while yield from inserts the called coroutine into the event loop's queue and gives control to the event loop, which decides which coroutine in its queue will be processed next.
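To make that difference concrete, here is a small illustration of my own (not part of the original answer), using the same generator-based syntax as the rest of the answer:

import asyncio

@asyncio.coroutine
def work(name):
    yield from asyncio.sleep(0.5)
    print(name, 'done')

@asyncio.coroutine
def sequential():
    # yield from waits for each coroutine to finish before moving on
    yield from work('a')
    yield from work('b')  # starts only after work('a') is fully done

@asyncio.coroutine
def concurrent(loop):
    # create_task only enqueues the coroutines; both run concurrently
    a = loop.create_task(work('a'))
    b = loop.create_task(work('b'))
    yield from asyncio.gather(a, b)

loop = asyncio.get_event_loop()
loop.run_until_complete(sequential())      # takes about 1 second
loop.run_until_complete(concurrent(loop))  # takes about 0.5 seconds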
Here is the explanation of what your code does:
1. It adds main into the event loop's queue.
2. The event loop starts processing the coroutines in the queue.
3. The queue contains only the main coroutine, so it starts that.
4. The code hits yield from handle_key('a').
5. It adds handle_key('a') to the event loop's queue.
6. The event loop now contains main and handle_key('a'), but main cannot be resumed because it is waiting for the result of handle_key('a').
7. So the event loop starts handle_key('a').
8. It does some work until it hits yield from asyncio.sleep(0.5).
9. Now the event loop contains main(), handle_key('a') and sleep(0.5).
10. main() is waiting for the result from handle_key('a').
11. handle_key('a') is waiting for the result from sleep(0.5).
12. The sleep has no dependency, so it can be started.
13. asyncio.sleep(0.5) returns None after 0.5 seconds.
14. The event loop takes the None and returns it into the handle_key('a') coroutine.
15. The return value is ignored because it isn't assigned to anything.
16. handle_key('a') prints the key (because nothing changed the state).
17. The handle_key coroutine returns None at the end (because there is no return statement).
18. The None is returned to main.
19. Again, the return value is ignored.
20. The code hits yield from handle_key('b') and starts processing the new key.
21. It runs the same steps from step 5 (but with the key 'b').
How to fix it
Replace the main coroutine with this:
@asyncio.coroutine
def main(loop=asyncio.get_event_loop()):
    a_task = loop.create_task(handle_key('a'))
    b_task = loop.create_task(handle_key('b'))
    yield from asyncio.gather(a_task, b_task)
The loop.create_task calls add handle_key('a') and handle_key('b') to the event loop's queue, and then yield from asyncio.gather(a_task, b_task) gives control to the event loop. From this point the event loop contains handle_key('a'), handle_key('b'), gather(...) and main().
main() is waiting for the result from gather().
gather() waits until all the tasks given as parameters are finished.
handle_key('a') and handle_key('b') have no dependencies, so they can be started.
The event loop now contains 2 coroutines that can start, but which one will it pick? Well... who knows, it is implementation dependent. So for a better simulation of pressed keys, this replacement should be a little better:
@asyncio.coroutine
def main(loop=asyncio.get_event_loop()):
    a_task = loop.create_task(handle_key('a'))
    yield from asyncio.sleep(0.1)
    b_task = loop.create_task(handle_key('b'))
    yield from asyncio.gather(a_task, b_task)
Python 3.5 bonus
From the documentation:
Coroutines used with asyncio may be implemented using the async def statement.
The async def type of coroutine was added in Python 3.5, and is recommended if there is no need to support older Python versions.
It means that you can replace:
@asyncio.coroutine
def main():
with the newer statement
async def main():
If you start using the new syntax then you have to also replace yield from with await.
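For example, the fixed handle_key and main from above could be written with the newer syntax roughly like this (a sketch for Python 3.5+):

import asyncio

global_state = ''

async def handle_key(key):
    global global_state
    global_state += key
    local_state = global_state
    await asyncio.sleep(0.5)
    # only print if no later key press modified global_state in the meantime
    if local_state == global_state:
        print(global_state)

async def main(loop=asyncio.get_event_loop()):
    a_task = loop.create_task(handle_key('a'))
    await asyncio.sleep(0.1)
    b_task = loop.create_task(handle_key('b'))
    await asyncio.gather(a_task, b_task)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())  # prints: ab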