I've been trying to run two asyncio loops in parallel, but I am failing to find meaningful instruction on how to do so. I want to execute two async functions at the same time, while both of them depend on one global variable.
My code looks something like the following:
import asyncio

#a---------------------------------------------------------------
async def foo(n):
    print("Executing foo(n)")
    return n**2

async def main_a():
    print("Executing main_a()")
    n = await foo(3)
    return n+1

x = 1

async def periodic_a():
    global x
    i = 0
    while True:
        i += 2
        x = await main_a()
        x += i
        await asyncio.sleep(1)

#b-----------------------------------------------------------------
async def periodic_b():
    global x
    while True:
        print(f"There are {x} ducks in the pond")
        await asyncio.sleep(5)

#execution---------------------------------------------------------
loop = asyncio.get_event_loop()
task = loop.create_task(periodic_a())
try:
    loop.run_until_complete(task)
except asyncio.CancelledError:
    pass
except KeyboardInterrupt:
    task.cancel()
    loop.close()
I am trying to get functions periodic_a and periodic_b to run at the same time, and provide the output of print(f"There are {x} ducks in the pond") every five seconds. Thank you in advance for any help!
You should create a task for each function you want to run concurrently and then await them with asyncio.gather. Also note you should use asyncio.run instead of driving the event loop directly; it handles creating and shutting down the loop for you, which makes the code cleaner. Modify the execution section of your code to the following:
async def main():
    periodic_a_task = asyncio.create_task(periodic_a())
    periodic_b_task = asyncio.create_task(periodic_b())
    await asyncio.gather(periodic_a_task, periodic_b_task)

asyncio.run(main())
Also note you mention multiple processes, but there isn't any need for multiprocessing in the example you describe. If you do need multiple processes, a global variable won't be shared between them, and you'll need a different approach such as shared memory.
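For completeness, here's a minimal sketch (not part of your example) of what shared state between processes can look like, using multiprocessing.Value; note that on Windows/macOS the spawning code must sit under an `if __name__ == '__main__':` guard:

```python
from multiprocessing import Process, Value

def worker(counter):
    # each process increments the shared counter 1000 times
    for _ in range(1000):
        with counter.get_lock():   # the lock prevents lost updates
            counter.value += 1

counter = Value('i', 0)            # 'i' = a C int living in shared memory
procs = [Process(target=worker, args=(counter,)) for _ in range(4)]
for p in procs:
    p.start()
for p in procs:
    p.join()
print(counter.value)               # 4000: updates from all processes are visible
```

Unlike a plain global, the Value object is backed by shared memory, so every process sees the same underlying integer.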
Related
I have issue about run multiple function using Telethon
For example, I want to run a bot management command and a tracker function at the same time. I know I should use multithreading, but in my script below the two never run at the same time.
def Checker():
    print('I am Running')
    while True:
        if isStart:
            for i in SpesificDictionary:
                Element = SpesificDictionary[i]
                poster(Element, i)
        time.sleep(10)

async def poster(Element, chatId):
    text = Element.API.getText()
    if text != None:
        luckyNews = await randomAds()
        if luckyNews != None:
            print(f"Sending to {luckyNews[0]} with {luckyNews[1]}")
            text += f"\n\n <b>π Ad's:</b> '<a href='{luckyNews[0]}'><b>{luckyNews[1]}</b></a>'"
        else:
            text += f"\n\n <b>π Ad's:</b> <b>Ads your projectπ</b>"
        if len(SpesificButtonAdvertise) != 0:
            keyboard = [[Button.url(str(SpesificButtonAdvertise[1]), str(SpesificButtonAdvertise[0]))]]
        else:
            keyboard = [[Button.url('Advertise your project here π', "https://t.me/contractchecker")]]
        # chat = BOT.get_entity(-1001639775918) #-1001639775918 test # main -1001799563725 # sohbet : -1001648583714
        chat = BOT.get_entity(chatId)
        await BOT.send_file(chat, 'giphy.gif', caption=text, buttons=keyboard, parse_mode='HTML')
    else:
        print("Waiting for the next update")

def main():
    BOT.start(bot_token=BOT_TOKEN)
    loop = asyncio.get_event_loop()
    tasks = [loop.create_task(Checker()),
             loop.create_task(BOT.run_until_disconnected())]
    loop.run_until_complete(asyncio.wait(tasks))
    loop.close()
There are several problems with the listed code.
Your def Checker() is not an async def. It's going to run immediately when you call it, and loop.create_task(Checker()) won't work at all.
You are calling poster, which is an async def, without using await. This means it won't run at all.
You are using time.sleep, which blocks the entire thread, meaning asyncio cannot run its loop, and therefore any tasks created won't run either.
BOT.get_entity is also an async def. It should be await-ed.
Checker would look like this:
async def Checker():
    print('I am Running')
    while True:
        if isStart:
            for i in SpesificDictionary:
                Element = SpesificDictionary[i]
                await poster(Element, i)
        await asyncio.sleep(10)
And don't forget to await BOT.get_entity(chatId).
But I strongly recommend reading through the asyncio documentation and being more comfortable with asyncio before attempting to write more complex code.
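To make the time.sleep point above concrete, here is a small self-contained sketch (the names are made up) showing that time.sleep freezes every task on the loop, while asyncio.sleep lets the other tasks keep running:

```python
import asyncio
import time

async def ticker(log):
    # a periodic task that should keep running alongside the other task
    for n in range(3):
        log.append(f"tick {n}")
        await asyncio.sleep(0.1)

async def blocking(log):
    time.sleep(0.35)           # blocks the whole event loop: no ticks run meanwhile
    log.append("blocking done")

async def cooperative(log):
    await asyncio.sleep(0.35)  # yields to the loop: ticks keep running
    log.append("cooperative done")

async def demo(worker, log):
    await asyncio.gather(ticker(log), worker(log))

blocked_log = []
asyncio.run(demo(blocking, blocked_log))
print(blocked_log)   # ['tick 0', 'blocking done', 'tick 1', 'tick 2']

coop_log = []
asyncio.run(demo(cooperative, coop_log))
print(coop_log)      # ['tick 0', 'tick 1', 'tick 2', 'cooperative done']
```

With time.sleep, the ticker only gets one tick in before the loop freezes; with asyncio.sleep, all three ticks interleave with the sleeping task.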
I have a function in python that should kick off an asynchronous 'slow' function, and return before that 'slow' function completes.
async def doSomethingSlowButReturnQuickly():
    asyncio.create_task(_mySlowFunction())
    return "Returning early! Slow function is still running."

async def _mySlowFunction():
    # Do some really slow stuff that isn't blocking for our other function.
Running this still seems to cause the 'return' to not happen until AFTER my async task has completed.
How do I correct this?
(And apologies, this is day #2 writing Python)
This isn't exactly what you asked, but you can run blocking code in an executor and continue while it is still running:
from concurrent.futures import ThreadPoolExecutor
from time import sleep

def blocking_code():
    sleep(2)
    print('inner')
    return 'result'

def main():
    with ThreadPoolExecutor(max_workers=1) as executor:
        future = executor.submit(blocking_code)
        print('after')
        print(future.result())

if __name__ == '__main__':
    main()
Output:
after
inner
result
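If you want the same pattern while staying inside asyncio, loop.run_in_executor wraps the blocking call in an awaitable future; here is a sketch using the same blocking_code as above:

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor
from time import sleep

def blocking_code():
    sleep(1)
    return 'result'

async def main():
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor(max_workers=1) as executor:
        future = loop.run_in_executor(executor, blocking_code)
        print('after')        # runs immediately, before blocking_code finishes
        return await future   # suspends this coroutine until the thread is done

result = asyncio.run(main())
print(result)                 # 'result'
```

The prints come out in the same order as the thread-pool version, but the waiting is done with await, so other asyncio tasks could run in the meantime.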
Maybe it's a dull answer, but if you want to stick with asyncio, I would advise making sure you await something in every async def function.
async def doSomethingSlowButReturnQuickly():
    task = asyncio.create_task(_mySlowFunction())
    # must await something here!
    return "Returning early! Slow function is still running."

async def _mySlowFunction():
    # must await something here
If you don't need to get the slow function return value, why don't you directly call the two coroutines from a main function?
import asyncio

async def doSomethingSlowButReturnQuickly():
    await asyncio.sleep(0.1)  # must await something here!
    return "Returning early! Slow function is still running."

async def _mySlowFunction():
    await asyncio.sleep(10)  # must await something here

async def main():
    tasks = [_mySlowFunction(), doSomethingSlowButReturnQuickly()]
    await asyncio.gather(*tasks)

asyncio.run(main())
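The "must await something" advice matters because asyncio.run cancels any still-pending tasks once the main coroutine returns, so a fire-and-forget task silently dies unless the caller keeps the loop alive. A self-contained sketch (names made up):

```python
import asyncio

finished = []

async def slow():
    await asyncio.sleep(0.2)
    finished.append('slow')

async def fire_and_forget():
    asyncio.create_task(slow())
    return "returned early"     # main exits here: the pending task is cancelled

async def keep_loop_alive():
    task = asyncio.create_task(slow())
    print("returned early")     # we still reach this before slow() completes
    await asyncio.sleep(0.3)    # keep the loop alive so the task can finish

asyncio.run(fire_and_forget())
after_fire_and_forget = list(finished)
print(after_fire_and_forget)    # [] - slow() was cancelled before it could finish
asyncio.run(keep_loop_alive())
print(finished)                 # ['slow']
```

So "returning early" works fine, as long as something later in the same loop awaits long enough for the background task to complete.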
I'm new to this, so I apologize for mistakes. I'm trying to figure out a way to iterate over a for loop range, calling an async function without waiting for a response. Here's my code:
import asyncio
from random import randint
import time
import threading

async def print_i(i):
    number = 0
    if (number % 2) == 0:  # check for even number
        time.sleep(5)
    while number != 5:
        number = randint(0, 100)
    print("id-", i)

for i in range(0, 100):
    asyncio.run(print_i(i))
    # thread = threading.Thread(target=print_i(i))
    # thread.start()
Both asyncio.run and thread.start() execute the function sequentially, whereas I was hoping the for loop would launch all iterations in one go, with only the even values of "i" hitting the time.sleep(5).
Is this possible?
Here are some basic examples of how to achieve concurrency in asyncio, threading, and trio. Treat the range() calls as a stand-in for your list in these cases.
If you wonder why Trio is included: it's a better alternative to asyncio built around an idea called Structured Concurrency, and it uses a different method for spawning concurrent tasks. You might stumble on it one day.
For asyncio:
import asyncio

async def task(num: int):
    print(f"task {num} started.")
    # async function need something 'awaitable' to be asynchronous
    await asyncio.sleep(3)
    print(f"task {num} finished.")

async def spawn_task():
    task_list = []
    for n in range(5):
        task_list.append(asyncio.create_task(task(n)))
    await asyncio.gather(*task_list)

asyncio.run(spawn_task())
For threading:
import threading
import time

def thread_workload(num: int):
    print(f"task {num} started.")
    # most of python's IO functions (including time.sleep) release GIL,
    # allowing other thread to run.
    # GIL prevents more than 1 thread running the python code.
    time.sleep(3)
    print(f"task {num} finished.")

def spawn_thread():
    for n in range(5):
        t = threading.Thread(target=thread_workload, args=(n,))
        t.start()

spawn_thread()
For Trio:
import trio

async def task(num: int):
    print(f"task {num} started.")
    # async function need something 'awaitable' to be asynchronous
    await trio.sleep(3)
    print(f"task {num} finished.")

async def spawn_task():
    async with trio.open_nursery() as nursery:
        # explicit task spawning area. Nursery for tasks!
        for n in range(5):
            nursery.start_soon(task, n)

trio.run(spawn_task)
Output:
task 0 started.
task 1 started.
task 2 started.
task 3 started.
task 4 started.
task 0 finished.
task 1 finished.
task 2 finished.
task 3 finished.
task 4 finished.
My Source Code:
import asyncio

async def mycoro(number):
    print(f'Starting {number}')
    await asyncio.sleep(1)
    print(f'Finishing {number}')
    return str(number)

c = mycoro(3)
task = asyncio.create_task(c)
loop = asyncio.get_event_loop()
loop.run_until_complete(task)
loop.close()
Error:
RuntimeError: no running event loop
sys:1: RuntimeWarning: coroutine 'mycoro' was never awaited
I was watching a tutorial; the same code clearly works in the video, but when I run it I'm told the coroutine was never awaited.
Simply run the coroutine directly without creating a task for it:
import asyncio

async def mycoro(number):
    print(f'Starting {number}')
    await asyncio.sleep(1)
    print(f'Finishing {number}')
    return str(number)

c = mycoro(3)
loop = asyncio.get_event_loop()
loop.run_until_complete(c)
loop.close()
The purpose of asyncio.create_task is to create an additional task from inside a running task. Since it immediately schedules the new task on the running event loop, it must be used inside one, hence the error when using it outside.
Use loop.create_task(c) if a task must be created from outside a task.
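A sketch of that loop.create_task variant, using new_event_loop explicitly (since calling get_event_loop outside a running loop is deprecated in newer Python versions):

```python
import asyncio

async def mycoro(number):
    await asyncio.sleep(0.1)
    return str(number)

loop = asyncio.new_event_loop()      # make a loop explicitly
task = loop.create_task(mycoro(3))   # attach the task to this (not yet running) loop
result = loop.run_until_complete(task)
loop.close()
print(result)                        # '3'
```

loop.create_task works here because the task is bound to a specific loop object rather than looked up from a "currently running" loop.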
In more recent versions of asyncio, use asyncio.run to avoid having to handle the event loop explicitly:
c = mycoro(3)
asyncio.run(c)
In general, use asyncio.create_task only to increase concurrency. Avoid it when the creating task is just going to await the new task immediately.
# bad task usage: concurrency stays the same due to blocking
async def bad_task():
    task = asyncio.create_task(mycoro(0))
    await task

# no task usage: concurrency stays the same due to stacking
async def no_task():
    await mycoro(0)

# good task usage: concurrency is increased via multiple tasks
async def good_task():
    tasks = [asyncio.create_task(mycoro(i)) for i in range(3)]
    print('Starting done, sleeping now...')
    await asyncio.sleep(1.5)
    await asyncio.gather(*tasks)  # ensure subtasks finished
Alternatively, change the line task = asyncio.create_task(c) to task = asyncio.Task(c), which on older Python versions attaches the task via get_event_loop without requiring a running loop.
I want to use both ThreadPoolExecutor from concurrent.futures and async functions.
My program repeatedly submits a function with different input values to a thread pool. The final sequence of tasks that are executed in that larger function can be in any order, and I don't care about the return value, just that they execute at some point in the future.
So I tried to do this
async def startLoop():
    while 1:
        for item in clients:
            arrayOfFutures.append(await config.threadPool.submit(threadWork, obj))
        wait(arrayOfFutures, timeout=None, return_when=ALL_COMPLETED)
where the function submitted is:
async def threadWork(obj):
    bool = do_something()  # needs to execute before next functions
    if bool:
        do_a()  # can be executed at any time
        do_b()  # ^
where do_b and do_a are async functions. The problem with this is that I get the error TypeError: object Future can't be used in 'await' expression, and if I remove the await, I get another error saying I need to add it.
I guess I could make everything use threads, but I don't really want to do that.
I recommend a careful readthrough of Python 3's asyncio development guide, particularly the "Concurrency and Multithreading" section.
The main conceptual issue in your example is that event loops are single-threaded, so it doesn't make sense to execute an async coroutine in a thread pool. There are a few ways for event loops and threads to interact:
Event loop per thread. For example:
async def threadWorkAsync(obj):
    b = do_something()
    if b:
        # Run a and b as concurrent tasks
        task_a = asyncio.create_task(do_a())
        task_b = asyncio.create_task(do_b())
        await task_a
        await task_b

def threadWork(obj):
    # Create a run loop for this thread and block until completion
    asyncio.run(threadWorkAsync(obj))

def startLoop():
    while 1:
        arrayOfFutures = []
        for item in clients:
            arrayOfFutures.append(config.threadPool.submit(threadWork, item))
        wait(arrayOfFutures, timeout=None, return_when=ALL_COMPLETED)
Execute blocking code in an executor. This allows you to use async futures instead of concurrent futures as above.
async def startLoop():
    loop = asyncio.get_running_loop()
    while 1:
        arrayOfFutures = []
        for item in clients:
            arrayOfFutures.append(loop.run_in_executor(
                config.threadPool, threadWork, item))
        await asyncio.gather(*arrayOfFutures)
Use threadsafe functions to submit tasks to event loops across threads. For example, instead of creating a run loop for each thread you could run all async coroutines in the main thread's run loop:
def threadWork(obj, loop):
    b = do_something()
    if b:
        future_a = asyncio.run_coroutine_threadsafe(do_a(), loop)
        future_b = asyncio.run_coroutine_threadsafe(do_b(), loop)
        concurrent.futures.wait([future_a, future_b])

async def startLoop():
    loop = asyncio.get_running_loop()
    while 1:
        arrayOfFutures = []
        for item in clients:
            arrayOfFutures.append(loop.run_in_executor(
                config.threadPool, threadWork, item, loop))
        await asyncio.gather(*arrayOfFutures)
Note: This example should not be used literally as it will result in all coroutines executing in the main thread while the thread pool workers just block. This is just to show an example of the run_coroutine_threadsafe() method.