Python doesn't want to run a function asynchronously - python

I just want to run an async function truly asynchronously, but it blocks the main thread for 5 seconds:
import asyncio
import time

async def sideTask():
    await asyncio.sleep(5)  # this will pause main thread for no reason
    print("doing side job")

global test
test = 0
# main work
while True:
    test += 1
    time.sleep(1)
    if test > 3:
        test = 0
        asyncio.run(sideTask())
I am expecting the async function to actually run asynchronously.

What are you trying to achieve by running sideTask() in the loop?
If you're trying to see the output of "doing side job" while asyncio.sleep(5) is executing, you should simply put the print before asyncio.sleep(5).
asyncio.sleep() is not pausing the main thread. asyncio.run() blocks because it runs sideTask() to completion, and your own time.sleep(1) calls block as well, so nothing here ever runs concurrently; it only looks like asyncio is not functioning.
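For what it's worth, here is a minimal sketch (not from the answer above) of how to keep the counter loop running while the side task sleeps: make the main loop a coroutine as well and schedule sideTask() with asyncio.create_task() instead of asyncio.run().

# Sketch: run the counter loop and the side task inside one event loop so
# neither blocks the other. Everything async, no time.sleep().
import asyncio

async def sideTask():
    await asyncio.sleep(5)      # suspends only this coroutine, not the loop
    print("doing side job")

async def main():
    test = 0
    background = set()          # keep references so tasks aren't garbage-collected
    while True:
        test += 1
        await asyncio.sleep(1)  # non-blocking replacement for time.sleep(1)
        if test > 3:
            test = 0
            task = asyncio.create_task(sideTask())  # schedule without waiting for it
            background.add(task)
            task.add_done_callback(background.discard)

asyncio.run(main())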

Related

Python asyncio wait and notify

I am trying to do something similar to C#'s ManualResetEvent, but in Python.
I have attempted it in Python, but it doesn't seem to work.
import asyncio

cond = asyncio.Condition()

async def main():
    some_method()
    cond.notify()

async def some_method():
    print("Starting...")
    await cond.acquire()
    await cond.wait()
    cond.release()
    print("Finished...")

main()
I want some_method to start, then wait until it is signaled to continue.
This code is not complete. First of all, you need to use asyncio.run() to bootstrap the event loop - that is why your code is not running at all.
Secondly, some_method() never actually starts. You need to start some_method() concurrently using asyncio.create_task(). When you call an async def function (the more correct term is coroutine function), it returns a coroutine object; that object needs to be driven by the event loop, either by awaiting it or by wrapping it in a task with the aforementioned function.
Your code should look more like this:
import asyncio

async def main():
    cond = asyncio.Condition()
    t = asyncio.create_task(some_method(cond))
    # The event loop hasn't had any time to start the task
    # until you await again. Sleeping for 0 seconds will let
    # the event loop start the task before continuing.
    await asyncio.sleep(0)
    # notify() requires the condition's lock to be held,
    # so acquire it briefly around the call:
    async with cond:
        cond.notify()
    # You should never really "fire and forget" tasks,
    # the same way you never do with threading. Wait for
    # it to complete before returning:
    await t

async def some_method(cond):
    print("Starting...")
    await cond.acquire()
    await cond.wait()
    cond.release()
    print("Finished...")

asyncio.run(main())
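As an aside that the answer above does not cover: if the goal really is a ManualResetEvent-style signal, asyncio.Event can be a simpler fit than Condition - a rough sketch:

# Rough sketch using asyncio.Event instead of Condition (an alternative,
# shown only for comparison with the answer above).
import asyncio

async def some_method(event):
    print("Starting...")
    await event.wait()      # suspend until the event is set
    print("Finished...")

async def main():
    event = asyncio.Event()
    t = asyncio.create_task(some_method(event))
    await asyncio.sleep(0)  # let the task start and reach event.wait()
    event.set()             # signal it to continue
    await t

asyncio.run(main())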

Async Function Call Inside While Loop

I have a queue stored in Redis lists. I'm trying to create an async consumer for this queue, but I can't get an async function called inside the loop to behave asynchronously - it works like a sync function when I call it.
import asyncio

async def worker():
    print("starting sleep")
    await asyncio.sleep(2)
    print("slept")

async def main():
    while True:
        await worker()

asyncio.run(main())
The above is a short, simplified example of my implementation. I'm expecting to keep seeing 'starting sleep' messages until the first 'slept' message, i.e. for about 2 seconds.
main is literally awaiting the completion of worker. Until worker is done, main won't progress. async tasks don't run in the background like in multithreading.
What you want is to keep launching new workers without awaiting each one of them. However, if you just keep doing this in a loop like this:

while True:
    worker()

then you will never see any output from those workers: calling worker() only creates coroutine objects, and this endless loop never gives the event loop a chance to run anything else. You need to schedule the workers as tasks and "break" the loop in some way so they can make progress. Here's an example of that:
import asyncio

async def worker():
    print("starting sleep")
    await asyncio.sleep(2)
    print("slept")

async def main():
    while True:
        asyncio.ensure_future(worker())
        await asyncio.sleep(0.5)

asyncio.run(main())
This will produce the expected outcome:
starting sleep
starting sleep
starting sleep
starting sleep
slept
starting sleep
slept
...
The await inside main transfers control back to the event loop, which now has the chance to run the piled-up worker tasks. When those worker tasks await, they in turn transfer control back to the event loop, which hands it back to either main or a worker as the awaited sleep completes.
Note that this is only for illustration purposes; if and when you interrupt this program, you'll see notices about unawaited tasks which haven't completed. You should keep track of your tasks and await them all to completion at the end somewhere.
Here is an example using asyncio.wait:
import asyncio

async def worker():
    print("starting sleep")
    await asyncio.sleep(2)
    print("slept")

async def main():
    # wrap the coroutines in tasks; newer Python versions no longer
    # accept bare coroutines in asyncio.wait()
    tasks = [asyncio.create_task(worker()) for _ in range(10)]
    await asyncio.wait(tasks)

asyncio.run(main())
It spawns all the workers together.
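Building on the earlier note about keeping track of your tasks, here is a minimal sketch (not from either answer) that staggers the launches like the first example but still awaits every task before exiting; the fixed count of 10 workers is an assumption made for the example:

# Sketch: staggered launches plus a clean shutdown that awaits all tasks.
import asyncio

async def worker(n):
    print(f"worker {n}: starting sleep")
    await asyncio.sleep(2)
    print(f"worker {n}: slept")

async def main():
    tasks = []
    for n in range(10):
        tasks.append(asyncio.create_task(worker(n)))
        await asyncio.sleep(0.5)   # stagger the launches
    await asyncio.gather(*tasks)   # await every task to completion

asyncio.run(main())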

Cancel a process when a timeout is reached with asyncio in Python

I'm trying to cancel a long-running process on timeout, but asyncio.wait_for is not working. How do I cancel the process when the timeout is reached? My code is below:
import asyncio

async def process():
    # do something that takes a long time, like this
    for i in range(0, 10000000000, 1):
        for j in range(0, 10000000000, 1):
            continue
    print('done!')

async def main():
    # I want to cancel process when the timeout is reached
    try:
        await asyncio.wait_for(process(), timeout=1.0)
    except asyncio.TimeoutError:
        print('timeout!')

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
This doesn't work because your process function is async in name only - it doesn't await anything. That means that it finishes in its entirety without giving the event loop a chance to interrupt it. Since asyncio is cooperative (as are other async/await based systems), such a function is not a correctly written async function and cannot be interrupted.
If you add an await asyncio.sleep(0.001) into the inner loop (or anything else that awaits something that actually suspends), your code will work fine.
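For illustration, here is a sketch of what that fix might look like; the loop sizes and the placement of the sleep are adjusted purely so the example demonstrates the timeout quickly:

# Sketch based on the suggestion above: the loop now awaits periodically,
# so wait_for() can cancel it. Loop sizes are shrunk from the question's values.
import asyncio

async def process():
    for i in range(0, 10000):
        for j in range(0, 10000):
            continue
        await asyncio.sleep(0.001)   # suspend briefly; cancellation lands here
    print('done!')

async def main():
    try:
        await asyncio.wait_for(process(), timeout=1.0)
    except asyncio.TimeoutError:
        print('timeout!')

asyncio.run(main())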

rewriting sync as async: not waiting on a func

The more I read about async in Python, the more confused I get. So I decided to ask for a direct answer: how can I change the following code (using async or a similar approach) to achieve the desired result? Additionally, how can I do it in Flask or Sanic?
import time

def long_job():
    print('long job started')
    time.sleep(5)
    print('long job ended')

def main_job():
    long_job()
    time.sleep(1)
    print('main job returned')

main_job()

# expected result:
# 'long job started'
# 'main job returned'
# 'long job ended'
Basically, I do NOT want to wait for long_job to end before returning from main_job. Thank you in advance. :)
Await asyncio's sleep() to yield time to other jobs (if you don't need to await something else).
Use create_task() instead of await to start a job without blocking.
Finally, you have to start the main job using the event loop.
# Written in Python 3.7
import asyncio

async def long_job():
    print('long job started')
    await asyncio.sleep(5)
    print('long job ended')

async def main_job():
    asyncio.create_task(long_job())
    await asyncio.sleep(1)
    print('main job returned')
Your framework should start the event loop; you don't have to start it yourself. You can await main_job() or call create_task() on it from an async def function called by your framework, depending on whether you want to block or not.
If you want to test this without a framework, you'll have to start the loop yourself using asyncio.run(). This will stop immediately after its task completes, even if other tasks haven't finished yet. But this is easy enough to work around:
async def loop_job():
    asyncio.create_task(main_job())
    # asyncio.all_tasks() replaces the old asyncio.Task.all_tasks()
    while len(asyncio.all_tasks()) > 1:  # Any task besides loop_job()?
        await asyncio.sleep(0.2)

asyncio.run(loop_job())
If you're implementing a framework yourself, you can use the more primitive loop.run_forever(), but you'd have to stop() it yourself.
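For completeness, a rough sketch of that lower-level variant; the shutdown logic (waiting for leftover tasks, then calling stop()) is my own assumption rather than part of the answer, and it assumes long_job() and main_job() from above are already defined:

# Sketch of the run_forever()/stop() variant mentioned above.
import asyncio

async def run_and_stop(loop):
    await main_job()
    # wait for anything main_job() left running (e.g. long_job()) ...
    pending = [t for t in asyncio.all_tasks() if t is not asyncio.current_task()]
    await asyncio.gather(*pending)
    loop.stop()   # ... then shut the loop down ourselves

loop = asyncio.new_event_loop()
loop.create_task(run_and_stop(loop))
loop.run_forever()
loop.close()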

Understanding Python Concurrency with Asyncio

I was wondering how concurrency works in Python 3.6 with asyncio. My understanding is that when the interpreter hits an await statement, it suspends there until the awaited operation is complete and moves on to execute another coroutine task. But what I see in the code below is not like that: the program runs synchronously, executing the tasks one by one.
What is wrong with my understanding and my implementation code?
import asyncio
import time

async def myWorker(lock, i):
    print("Attempting to attain lock {}".format(i))
    # acquire lock
    with await lock:
        # run critical section of code
        print("Currently Locked")
        time.sleep(10)
    # our worker releases lock at this point
    print("Unlocked Critical Section")

async def main():
    # instantiate our lock
    lock = asyncio.Lock()
    # await the execution of 2 myWorker coroutines
    # each with our same lock instance passed in
    # await asyncio.wait([myWorker(lock), myWorker(lock)])
    tasks = []
    for i in range(0, 100):
        tasks.append(asyncio.ensure_future(myWorker(lock, i)))
    await asyncio.wait(tasks)

# Start up a simple loop and run our main function
# until it is complete
loop = asyncio.get_event_loop()
loop.run_until_complete(main())
print("All Tasks Completed")
loop.close()
Invoking a blocking call such as time.sleep in an asyncio coroutine blocks the whole event loop, defeating the purpose of using asyncio.
Change time.sleep(10) to await asyncio.sleep(10), and the code will behave like you expect.
asyncio uses an event loop to run everything; await yields control back to the loop so it can schedule the next coroutine to run.
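Putting that together, here is a sketch of the corrected worker; replacing with await lock with async with lock is my modernization of the 3.6-era syntax (not something the answer requires), and the worker count is shrunk to keep the demo short:

# Sketch: the blocking time.sleep(10) becomes await asyncio.sleep(10),
# so each worker suspends without blocking the event loop.
import asyncio

async def myWorker(lock, i):
    print("Attempting to attain lock {}".format(i))
    async with lock:
        print("Currently Locked")
        await asyncio.sleep(10)   # suspends this worker, not the whole loop
    print("Unlocked Critical Section")

async def main():
    lock = asyncio.Lock()
    # three workers instead of 100, just to keep the example quick
    tasks = [asyncio.create_task(myWorker(lock, i)) for i in range(3)]
    await asyncio.wait(tasks)

asyncio.run(main())
print("All Tasks Completed")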
