In an API built with the FastAPI framework, I am using asyncio to make calls to Solr collections, but when I call asyncio.run(man(a,b)) from a Python file query.py, I get "asyncio.run() cannot be called from a running event loop".
In controller.py:

@router.api_route()
async def api_call(a, b):
    # calling a function
    resp = query(a, b)
    return resp
In query.py:

def query(a, b):
    result = asyncio.run(man(a, b))
    return result
In database_call.py:

async def man(a, b):
    async with aiohttp.ClientSession() as session:
        url = ''
        async with session.get(url) as resp:
            result = await resp.json()
            return result
When I call asyncio.run(man(a,b)) from query, I get "asyncio.run() cannot be called from a running event loop". Kindly help me resolve the issue.
I tried, in query.py:

def query(a, b):
    loop = asyncio.get_event_loop()
    result = loop.create_task(man(a, b))
    return result

Then I get <coroutine object main at 0x0394999e>.
The docs say that you should have only one call to asyncio.run in a program. "Should" doesn't mean the same thing as "must", so it's not a requirement. But it's a good guideline.
Solution 1: get rid of query entirely, and just await the coroutine man() directly.
@router.api_route()
async def api_call(a, b):
    return await man(a, b)
Solution 2: declare query to be an async def function, and await it:
@router.api_route()
async def api_call(a, b):
    # calling a function
    return await query(a, b)

async def query(a, b):
    return await man(a, b)
Solution 3: Do not declare query to be async def, but have it return an awaitable object. This is similar to what you tried in your last listing, but you need to await the result.
@router.api_route()
async def api_call(a, b):
    # calling a function
    return await query(a, b)

def query(a, b):
    return asyncio.create_task(man(a, b))
Solution 4: run query in another thread using the asyncio.to_thread function.
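A minimal sketch of this approach (assuming query keeps its original asyncio.run(man(a, b)) call; asyncio.to_thread requires Python 3.9+):

@router.api_route()
async def api_call(a, b):
    # query runs in a worker thread, where no event loop is running,
    # so its internal asyncio.run(man(a, b)) call is legal there
    return await asyncio.to_thread(query, a, b)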
Solution 5: run query in a ThreadPool or ProcessPool.
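A sketch of the thread-pool variant (names follow the question's code; a ProcessPool would work similarly for CPU-bound work):

import asyncio
from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor()

@router.api_route()
async def api_call(a, b):
    loop = asyncio.get_running_loop()
    # the pool thread has no running loop, so query's asyncio.run is fine there
    return await loop.run_in_executor(pool, query, a, b)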
The whole idea of asyncio (or any other form of parallel processing) is to allow the use of a task or a coroutine that does not finish immediately. It runs at a later time, and its result is not available until enough time has passed. Whether you use threading, multiprocessing or asyncio the situation is the same. If you need the answer from the coroutine/task/function in order for your program to continue, you need to wait for it somehow.
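For instance, a task can be started early and awaited only when its result is actually needed (a self-contained sketch with a stand-in coroutine):

import asyncio

async def fetch():
    await asyncio.sleep(0.1)  # stand-in for real I/O
    return 42

async def caller():
    task = asyncio.create_task(fetch())  # starts running concurrently
    # ... do other work here while fetch() is in flight ...
    result = await task  # suspend until the result is ready
    print(result)

asyncio.run(caller())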
Related
In short, the problem is that the future returned by asyncio.run_coroutine_threadsafe blocks when I call future.result().
The problem is also documented in the following question with (currently) no satisfactory answer: Future from asyncio.run_coroutine_threadsafe hangs forever?
What I'm trying to achieve is to call async code from sync code, where the sync code is actually itself wrapped in async code with an existing running event loop (to make things more concrete: it's a Jupyter notebook).
I would want to send async tasks from nested sync code to the existing 'outer' event loop and 'await' its results within the nested sync code. Implied constraint: I do not want to run those tasks on a new event loop (multiple reasons).
Since it's not possible to just 'await' an async result from sync code without blocking and without using asyncio.run which creates a new event loop, I thought using a separate thread would somehow help.
From the documentation description, asyncio.run_coroutine_threadsafe sounds like the perfect candidate.
But it's still blocking...
Below is the full snippet, with a timeout when calling the future's result.
How can I get this code to work correctly?
import asyncio
from concurrent.futures import ThreadPoolExecutor

async def gather_coroutines(*coroutines):
    return await asyncio.gather(*coroutines)

def run_th_safe(loop, coroutines):
    future = asyncio.run_coroutine_threadsafe(gather_coroutines(*coroutines), loop)
    res = future.result(timeout=3)  # **** BLOCKING ****
    return res

def async2sync(*coroutines):
    try:
        loop = asyncio.get_running_loop()
    except RuntimeError:
        return asyncio.run(gather_coroutines(*coroutines))

    # BELOW DOESN'T WORK BECAUSE run_th_safe IS BLOCKING
    with ThreadPoolExecutor(max_workers=1) as ex:
        thread_future = ex.submit(run_th_safe, loop, coroutines)
        return thread_future.result()
# Testing
async def some_async_task(n):
    """Some async function to test"""
    print('Task running with n =', n)
    await asyncio.sleep(n/10)
    print('Inside coro', n)
    return list(range(n))

async def main_async():
    coro3 = some_async_task(30)
    coro1 = some_async_task(10)
    coro2 = some_async_task(20)
    results = async2sync(coro3, coro1, coro2)
    return results

def main_sync():
    coro3 = some_async_task(30)
    coro1 = some_async_task(10)
    coro2 = some_async_task(20)
    results = async2sync(coro3, coro1, coro2)
    return results
if __name__ == '__main__':
    # Testing functionality with asyncio.run()
    # This works
    print(main_sync())

    # Testing functionality with an outer loop (asyncio.run) and nested asyncio.run_coroutine_threadsafe
    # **DOESN'T WORK**
    print(asyncio.run(main_async()))
I am trying to use grequests to make a single HTTP call asynchronously.
All goes fine until I try to call a function (handle_cars) to handle the response.
The problem is that the function is an async function, and I don't know how to await it while passing it as the response hook.
Is this even possible?
I need the function to be async because there is another async function I need to call from it. Another solution would be to be able to call and await the other async function (send_cars) from inside the sync function.
async def handle_cars(res):
    print("GOT IT")
    await send_cars()

async def get_cars():
    req = grequests.get(URL, hooks=dict(response=handle_cars))
    job = grequests.send(req, grequests.Pool(1))
How do I set the response argument to await the function? Or how do I await send_cars if I make handle_cars synchronous?
Thanks
According to the OP's comments, he wishes to create a request in the background, and call a callback when it finishes.
One way to do so is to use asyncio.create_task() together with task.add_done_callback().
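A minimal sketch of that variant (fetch_cars and on_done are hypothetical stand-ins, not names from the question):

import asyncio

async def fetch_cars(url):
    await asyncio.sleep(0.1)  # stand-in for the real request logic
    return 'response for ' + url

def on_done(task):
    # done-callbacks are plain functions, so they cannot await send_cars directly
    print('GOT IT:', task.result())

async def main():
    task = asyncio.create_task(fetch_cars('http://example.com'))
    task.add_done_callback(on_done)
    await asyncio.sleep(1)  # keep the loop alive for the demo

asyncio.run(main())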
A simpler way, however, is to create a new coroutine and run it in the background.
Demonstrated using aiohttp, it will work in the following way:
async def main_loop():
    task = asyncio.create_task(handle_cars(URL))
    while True:
        ...

async def handle_cars(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            await send_cars()
I am using FastAPI and have one endpoint.
I have two long-running functions which I want to run concurrently using asyncio.
Therefore, I have created two functions:
async def get_data_one():
    return 'done_one'

async def get_data_two():
    return 'done_two'
These functions get data from external webservices.
I want to execute them concurrently, so I have created another function that does it:
async def get_data():
    loop = asyncio.get_event_loop()
    asyncio.set_event_loop(loop)
    task_1 = loop.create_task(get_data_one)
    task_2 = loop.create_task(get_data_two)
    tasks = (task_1, task_2)
    first, second = loop.run_until_complete(asyncio.gather(*tasks))
    loop.close()
    # I will then perform cpu intensive computation on the results
    # for now - assume i am just concatenating the results
    return first + second
Finally, I have my endpoint:
#app.post("/result")
async def get_result_async():
r = await get_data()
return r
Even this simple example breaks and I get the following exception when I hit the endpoint:
RuntimeError: This event loop is already running
ERROR: _GatheringFuture exception was never retrieved
future: <_GatheringFuture finished exception=AttributeError("'function' object has no attribute 'send'",)>
AttributeError: 'function' object has no attribute 'send'
This is simplified code, but I would really appreciate guidance on how to do it the right way.
When in a FastAPI context, you never need to run an asyncio loop yourself; it's always running for as long as your server process lives.
Therefore, all you need is
import asyncio

async def get_data_one():
    return "done_one"

async def get_data_two():
    return "done_two"

async def get_data():
    a, b = await asyncio.gather(get_data_one(), get_data_two())
    return a + b

# route decorator here...
async def get_result_async():
    r = await get_data()
    print("Get_data said:", r)
    return r

# (this is approximately what is done under the hood,
# presented here to make this a self-contained example)
asyncio.run(get_result_async())
It's as simple as:
async def get_data():
    first, second = await asyncio.gather(
        get_data_one(),
        get_data_two(),
    )
    return first + second
I am trying to create a periodic task for an asyncio event loop, as shown below; however, I am getting a "RuntimeError: cannot reuse already awaited coroutine" exception. Apparently, asyncio does not allow the same awaitable to be awaited more than once, as discussed in this bug thread. This is how I tried to implement it:
import asyncio

class AsyncEventLoop:
    def __init__(self):
        self._loop = asyncio.get_event_loop()

    def add_periodic_task(self, async_func, interval):
        async def wrapper(_async_func, _interval):
            while True:
                await _async_func  # This is where it goes wrong
                await asyncio.sleep(_interval)
        self._loop.create_task(wrapper(async_func, interval))
        return

    def start(self):
        self._loop.run_forever()
        return
Because of my while loop, the same awaitable function (_async_func) would be executed with a sleep interval in between. I got my inspiration for the implementation of periodic tasks from How can I periodically execute a function with asyncio?
From the bug thread mentioned above, I infer that the idea behind the RuntimeError was so that developers wouldn't accidentally await the same coroutine twice or more, as the coroutine would be marked as done and yield None instead of the result. Is there a way I can await the same function more than once?
It seems you are confusing async functions (coroutine functions) with coroutines - values that these async functions produce.
Consider this async function:
async def sample():
    await asyncio.sleep(3.14)
You are passing the result of its call: add_periodic_task(sample(), 5).
Instead, you should pass the async function object itself, add_periodic_task(sample, 5), and call it within your wrapper:

while True:
    await _async_func()
    await asyncio.sleep(_interval)
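Putting it together, the method from the question would look like this (a sketch; only the await line changes, so each iteration calls the function and gets a fresh coroutine):

def add_periodic_task(self, async_func, interval):
    async def wrapper(_async_func, _interval):
        while True:
            await _async_func()  # a new coroutine is created and awaited each time
            await asyncio.sleep(_interval)
    self._loop.create_task(wrapper(async_func, interval))

It would then be registered as add_periodic_task(sample, 5).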
I'm trying to consume multiple queues concurrently using Python, asyncio and asynqp.
I don't understand why my asyncio.sleep() call does not have any effect. The code doesn't pause there. To be fair, I actually don't understand in which context the callback is executed, and whether I can yield control back to the event loop at all (so that the asyncio.sleep() call would make sense).
What if I had to use an aiohttp.ClientSession.get() call in my process_msg callback function? I can't, since it's not a coroutine. There has to be a way that is beyond my current understanding of asyncio.
#!/usr/bin/env python3
import asyncio
import asynqp

USERS = {'betty', 'bob', 'luis', 'tony'}

def process_msg(msg):
    asyncio.sleep(10)
    print('>> {}'.format(msg.body))
    msg.ack()

async def connect():
    connection = await asynqp.connect(host='dev_queue', virtual_host='asynqp_test')
    channel = await connection.open_channel()
    exchange = await channel.declare_exchange('inboxes', 'direct')

    # we have 10 users. Set up a queue for each of them
    # use different channels to avoid any interference
    # during message consumption, just in case.
    for username in USERS:
        user_channel = await connection.open_channel()
        queue = await user_channel.declare_queue('Inbox_{}'.format(username))
        await queue.bind(exchange, routing_key=username)
        await queue.consume(process_msg)

    # deliver 10 messages to each user
    for username in USERS:
        for msg_idx in range(10):
            msg = asynqp.Message('Msg #{} for {}'.format(msg_idx, username))
            exchange.publish(msg, routing_key=username)

loop = asyncio.get_event_loop()
loop.run_until_complete(connect())
loop.run_forever()
I don't understand why my asyncio.sleep() function call does not have any effect.
Because asyncio.sleep() returns a coroutine object that has to be awaited on an event loop (or via async/await semantics).
You can't use await inside a plain def function, and the callback is invoked outside of the async/await context that is attached to the event loop under the hood. In other words, mixing callback style with async/await style is quite tricky.
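For example, calling asyncio.sleep() without awaiting it merely creates a coroutine object that is immediately discarded (a tiny illustration, not from the original code):

import asyncio

def callback():
    asyncio.sleep(10)  # creates a coroutine and throws it away; nothing pauses
    # Python warns: "RuntimeWarning: coroutine 'sleep' was never awaited"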
The simple solution though is to schedule the work back to the event loop:
async def process_msg(msg):
    await asyncio.sleep(10)
    print('>> {}'.format(msg.body))
    msg.ack()

def _process_msg(msg):
    loop = asyncio.get_event_loop()
    loop.create_task(process_msg(msg))
    # or, if the loop is always the same one, a single line is enough:
    # asyncio.ensure_future(process_msg(msg))

# some code
await queue.consume(_process_msg)
Note that there is no recursion in the _process_msg function, i.e. the body of process_msg is not executed while in _process_msg. The inner process_msg coroutine will run once control goes back to the event loop.
This can be generalized with the following code:
def async_to_callback(coro):
    def callback(*args, **kwargs):
        asyncio.ensure_future(coro(*args, **kwargs))
    return callback

async def process_msg(msg):
    ...  # the body

# some code
await queue.consume(async_to_callback(process_msg))
See Drizzt1991's response on github for a solution.