What happens when 2 different functions call the same function? - Python

I was experimenting with asyncio in Python and wondered what would happen if two async functions running concurrently both called the same non-async function. So I did it like this:
def calc(number):
    while True:
        return number * number

async def one():
    while True:
        a = calc(5)
        print(a)
        await asyncio.sleep(0)

async def two():
    while True:
        a = calc(2)
        print(a)
        await asyncio.sleep(0)

if __name__ == '__main__':
    import os
    import uvloop
    import asyncio

    loop = uvloop.new_event_loop()
    asyncio.set_event_loop(loop)
    loop.create_task(one())
    loop.create_task(two())
    loop.run_forever()
I thought the program would freeze in the calc function's while loop, but the program is printing results concurrently. Could anyone explain why this is not getting stuck in the while loop? Thanks.
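A minimal check (my own sketch, not from the question) shows why calc never loops: the return statement exits the function on the very first iteration, so the while True is effectively dead code.

```python
def calc(number):
    while True:
        # return exits the function immediately; the loop body runs only once
        return number * number

result = calc(5)
print(result)  # 25
```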

Related

How to run a blocking task asynchronously with ProcessPoolExecutor and asyncio?

I'm trying to run a blocking task asynchronously with ProcessPoolExecutor (it works with ThreadPoolExecutor, but I need ProcessPoolExecutor for a CPU-bound task). Here is my code:
import asyncio
import time
from concurrent.futures import ProcessPoolExecutor

async def run_in_thread(task, *args):
    with ProcessPoolExecutor() as process_pool:
        loop = asyncio.get_event_loop()
        result = await loop.run_in_executor(process_pool, task, *args)
        return result

async def main_task():
    while True:
        await asyncio.sleep(1)
        print("ticker")

async def main():
    asyncio.create_task(main_task())

    global blocking_task
    def blocking_task():
        time.sleep(5)
        print("blocking task done!")

    await run_in_thread(blocking_task)

if __name__ == "__main__":
    asyncio.run(main())
And I get this error :
result = await loop.run_in_executor(process_pool, task, *args)
concurrent.futures.process.BrokenProcessPool: A process in the process pool was terminated abruptly while the future was running or pending.
I don't understand where the issue is; can someone please help me? I'd also like to understand why it works with ThreadPoolExecutor but not ProcessPoolExecutor.
I was expecting the code to print:
ticker
ticker
ticker
ticker
ticker
blocking task done!
Move the definition of blocking_task to the top level of the module. As the script stands, this function is invisible to other processes: the code of the function isn't sent to the other process, only its name. The other process performs its own separate import of the script, but the name isn't defined at the top level there.
It's the same logic as if you tried to import this script into another script. Let's say this script is in a file named foo.py. After you do import foo, there is no function named foo.blocking_task so you would be unable to call it.
This would be a little clearer if you looked at the whole traceback instead of just the last line.
Incidentally, using the global statement in front of the function definition isn't the same thing as moving the definition to the top level. In your script the name blocking_task does not exist at module level until the main() function actually runs (which the secondary Process never does). In the working script below, the name blocking_task exists as soon as the module is imported.
import asyncio
import time
from concurrent.futures import ProcessPoolExecutor

async def run_in_thread(task, *args):
    with ProcessPoolExecutor() as process_pool:
        loop = asyncio.get_event_loop()
        result = await loop.run_in_executor(process_pool, task, *args)
        return result

async def main_task():
    while True:
        await asyncio.sleep(1)
        print("ticker")

def blocking_task():
    time.sleep(5)
    print("blocking task done!")

async def main():
    asyncio.create_task(main_task())
    await run_in_thread(blocking_task)

if __name__ == "__main__":
    asyncio.run(main())
This prints exactly what you were expecting.
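The name-based transfer can be checked directly with pickle (my own sketch, not from the answer): a module-level function pickles fine because only its qualified name is serialized, while a function defined inside another function cannot be pickled at all.

```python
import pickle

def top_level():
    return "ok"

def outer():
    def nested():
        return "ok"
    return nested

data = pickle.dumps(top_level)   # works: only the qualified name is stored
restored = pickle.loads(data)    # looks the name up again in the module
print(restored is top_level)     # True

try:
    pickle.dumps(outer())        # local function: no importable name
    nested_pickled = True
except Exception as e:
    nested_pickled = False
    print(f"cannot pickle nested function: {e}")
```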

Async function in Python, basic example

Can you help me see what I have understood wrong here, please? I have two functions, and I would like the second one to run regardless of the status of the first one (whether it is finished or not). Hence I was thinking to make the first function asynchronous. This is what I have done:
import os
import asyncio
from datetime import datetime

async def do_some_iterations():
    for i in range(10):
        print(datetime.now().time())
        await asyncio.sleep(1)
    print('... Cool!')

async def main():
    task = asyncio.create_task(do_some_iterations())
    await task

def do_something():
    print('something...')

if __name__ == "__main__":
    asyncio.run(main())
    do_something()
The output is:
00:46:00.145024
00:46:01.148533
00:46:02.159751
00:46:03.169868
00:46:04.179915
00:46:05.187242
00:46:06.196356
00:46:07.207614
00:46:08.215997
00:46:09.225066
Cool!
something...
which looks like the traditional way, where one function has to finish before moving to the next call.
I was hoping instead to execute do_something() before the asynchronous function started generating the print statements (or at least near the very top of those statements).
What am I doing wrong, please? How should I edit the script?
They both need to be part of the event loop that you created. asyncio.run() itself is not async, which means it will block until the loop ends. One easy way to do this is to use gather():
import asyncio
from datetime import datetime

async def do_some_iterations():
    for i in range(10):
        print(datetime.now().time())
        await asyncio.sleep(1)
    print('... Cool!')

async def do_something():
    print('something...')

async def main():
    await asyncio.gather(
        do_some_iterations(),
        do_something()
    )

if __name__ == "__main__":
    asyncio.run(main())
    print("done")
This will print:
16:08:38.921879
something...
16:08:39.922565
16:08:40.923709
16:08:41.924823
16:08:42.926004
16:08:43.927044
16:08:44.927877
16:08:45.928724
16:08:46.929589
16:08:47.930453
... Cool!
done
You can also simply add another task:
async def main():
    task = asyncio.create_task(do_some_iterations())
    task2 = asyncio.create_task(do_something())
In both cases the functions need to be coroutines (awaitable), and the tasks should still be awaited before main() returns; otherwise they are cancelled when the event loop shuts down.
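A runnable version of the task-based variant (my own sketch; the loop is shortened so the example finishes quickly, and the missing awaits are added):

```python
import asyncio

results = []

async def do_some_iterations():
    for i in range(3):              # shortened from 10 for the example
        results.append('tick')
        await asyncio.sleep(0.01)
    results.append('... Cool!')

async def do_something():
    results.append('something...')

async def main():
    task = asyncio.create_task(do_some_iterations())
    task2 = asyncio.create_task(do_something())
    await task    # without these awaits the tasks would be
    await task2   # cancelled when main() returns

asyncio.run(main())
print(results)  # 'something...' lands right after the first 'tick'
```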

Python async function is never awaited

I implemented an async function in Python's asynchronous framework FastAPI.
The function looks like:
async def func2(num):
    time.sleep(3)
    return num

async def func1():
    text = await func2(5)
    print(text)
    print('inside func1')

async def my_async_func():
    print('start')
    await func1()
    print('finish')
Here, when I execute my_async_func, I'm expecting async behavior and the values to be printed as:
start
finish
inside func1
5
But it prints synchronously as
start
5
inside func1
finish
How do I handle concurrent operation and implement the coroutines asynchronously?
Use asyncio.create_task. (Note: 5 will still print before inside func1, since that is the order of the print calls inside func1.)
Try code below:
import asyncio
import time
async def func2(num):
time.sleep(3)
return num
async def func1():
text = await func2(5)
print(text)
print('inside func1')
async def my_async_func():
print('start')
asyncio.create_task(func1())
print('finish')
asyncio.run(my_async_func())
Result:
start
finish
5
inside func1
Also notice that time.sleep blocks the whole thread; inside a coroutine, use await asyncio.sleep instead.
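To see the difference, here is a sketch of mine: replacing time.sleep with await asyncio.sleep in func2 lets the event loop print 'finish' before the result appears.

```python
import asyncio

order = []

async def func2(num):
    await asyncio.sleep(0.1)   # non-blocking: yields to the event loop
    return num

async def func1():
    text = await func2(5)
    order.append(text)
    order.append('inside func1')

async def my_async_func():
    order.append('start')
    task = asyncio.create_task(func1())
    order.append('finish')
    await task  # keep the loop alive until func1 completes

asyncio.run(my_async_func())
print(order)  # ['start', 'finish', 5, 'inside func1']
```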
Asynchronous behavior shows up when several independent(ish) tasks take turns executing in an event loop, but here you only run the one task, my_async_func. my_async_func then calls func1, which then calls func2; your program is executing in exactly the order you wrote it.
This chain of function calls shouldn't really be called synchronous, though, because there is only one independent task. You can actually see asynchronous behavior if you queue up two my_async_func tasks.
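A sketch of that last point (mine, with a real await added so the tasks can take turns): running two copies of a coroutine in one loop makes the interleaving visible.

```python
import asyncio

order = []

async def my_async_func(name):
    order.append(f'{name} start')
    await asyncio.sleep(0.01)   # a real await lets the other task run
    order.append(f'{name} finish')

async def main():
    await asyncio.gather(my_async_func('A'), my_async_func('B'))

asyncio.run(main())
print(order)  # ['A start', 'B start', 'A finish', 'B finish']
```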

asyncio.sleep causes script to end immediately

In my simple asyncio Python program below, bar_loop is supposed to run continuously with a 1 second delay between loops.
Things run as expected when we have simply
async def bar_loop(self):
    while True:
        print('bar')
However, when we add an await asyncio.sleep(1), the loop will end instead of looping:
async def bar_loop(self):
    while True:
        print('bar')
        await asyncio.sleep(1)
Why does asyncio.sleep() cause bar_loop to exit immediately? How can we let it loop with a 1 sec delay?
Full Example:
import asyncio
from typing import Optional

class Foo:
    def __init__(self):
        self.bar_loop_task: Optional[asyncio.Task] = None

    async def start(self):
        self.bar_loop_task = asyncio.create_task(self.bar_loop())

    async def stop(self):
        if self.bar_loop_task is not None:
            self.bar_loop_task.cancel()

    async def bar_loop(self):
        while True:
            print('bar')
            await asyncio.sleep(1)

if __name__ == '__main__':
    try:
        foo = Foo()
        asyncio.run(foo.start())
    except KeyboardInterrupt:
        asyncio.run(foo.stop())
Using Python 3.9.5 on Ubuntu 20.04.
This behavior has nothing to do with calling asyncio.sleep; it is the expected behavior of creating a task and doing nothing else with it.
Tasks run concurrently in the asyncio loop, while code that uses plain coroutines and await expressions can be thought of as running in a linear pattern. However, because tasks are "out of the way" of that visible path of execution, they also won't keep it alive.
In this case, your program simply reaches the end of the start method with nothing left being awaited, so the asyncio loop finishes its execution.
If you have no explicit code to run in parallel to bar_loop, just await the task. Change your start method to read:
async def start(self):
    self.bar_loop_task = asyncio.create_task(self.bar_loop())
    try:
        await self.bar_loop_task
    except XXX:
        # handle exceptions that might have taken place inside the task
        pass
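As a concrete sketch (my own, with the infinite loop made finite so the example terminates), awaiting the task is what keeps the program alive until the loop is done:

```python
import asyncio

printed = []

async def bar_loop(n):
    # a finite stand-in for the infinite while True loop
    for _ in range(n):
        print('bar')
        printed.append('bar')
        await asyncio.sleep(0.01)

async def start(n):
    task = asyncio.create_task(bar_loop(n))
    await task  # without this await, asyncio.run() would return
                # and cancel the task almost immediately

asyncio.run(start(3))  # prints 'bar' three times
```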

Can I start a parallel process from a while loop in Python?

I need to start another process to run in parallel with the while loop:
while True:
    # bunch of stuff happening
    if something_happens:
        # do something (here I have something that takes time, and the while
        # loop will 'pause' until this finishes. I need the while loop to
        # somehow continue looping in parallel with this process.)
I tried something like this:
while True:
    # bunch of stuff happening
    if something_happens:
        # here I tried to call another script, but the while loop won't
        # continue; it just runs this script and finishes. I need this second
        # script to run in parallel with the while loop.
        exec(open("filename.py").read())
You could use multiprocessing for this; check the module's documentation. Here's a minimalistic example, hope this helps you:
import multiprocessing

number_of_processes = 5

def exec_process(filename):
    # your exec code goes here
    ...

p = multiprocessing.Pool(processes=number_of_processes)
while True:
    if ...:  # something happens
        p.apply_async(exec_process, (filename,))
p.close()
p.join()
Additionally, it is good to use a callback, which acts like a master for your processes and where you can define your terminating conditions. Your definition could look like:
def exec_process(filename):
    try:
        # do what it does
        return True
    except:
        return False

def callback(result):
    if not result:
        # do what you want to do in case of failure,
        # something like p.terminate();
        # indicate failure to global variables
        ...

# Now the apply call becomes:
p.apply_async(exec_process, (filename,), callback=callback)
You can use asyncio to do that. Here's a fully working example of a basic producer/consumer:
import asyncio
import random
from datetime import datetime

from pydantic import BaseModel

class Measurement(BaseModel):
    data: float
    time: datetime

async def measure(queue: asyncio.Queue):
    while True:
        # Replicate blocking call to receive data
        await asyncio.sleep(1)
        print("Measurement complete!")
        for i in range(3):
            data = Measurement(
                data=random.random(),
                time=datetime.utcnow()
            )
            await queue.put(data)
        await queue.put(None)

async def process(queue: asyncio.Queue):
    while True:
        data = await queue.get()
        print(f"Got measurement! {data}")
        # Replicate pause for http request
        await asyncio.sleep(0.3)
        print("Sent data to server")

async def main():
    # asyncio.Queue no longer accepts a loop argument (removed in Python 3.10),
    # so create the queue inside the running event loop
    queue = asyncio.Queue()
    measurement = measure(queue)
    processor = process(queue)
    await asyncio.gather(processor, measurement)

asyncio.run(main())
