Issue with asynchronous (non-blocking) methods with Sanic - Python

So I'm trying to use Sanic to do some asynchronous web requests, because I have some special ones that take a few seconds to come back, and I want to serve other requests from the client in the meantime. Here is an example method that seems to still be blocking other calls from the client while it's waiting for lib.getAlarmState() to come back. (lib.getAlarmState() is a call to a C library made with Python's ctypes; it takes about 3 seconds to return and returns an int.)
According to what I'm seeing in the documentation for Sanic, simply defining the method as async should do what I'm looking for? I've tried adding an await in front of lib.getAlarmState(), but I'm not sure I'm using it quite right.
@app.route('/processjson')
async def processjson(request):
    vals = lib.getAlarmState()
    return response.json({"alarm": vals})
I expect that while the shown method is off doing its thing, I should be able to call other methods from the client and get responses.

That is correct. Sanic cannot turn a blocking call into an asynchronous one for you. What you can try instead is to run the blocking call in a separate thread via the event loop's executor, so the loop stays free to handle other requests:

import asyncio

async def get_alarm_state_wrapper():
    # push the blocking ctypes call onto the default thread pool
    # so it doesn't stall the event loop
    loop = asyncio.get_event_loop()
    return await loop.run_in_executor(None, lib.getAlarmState)

@app.route('/processjson')
async def processjson(request):
    vals = await get_alarm_state_wrapper()
    return response.json({"alarm": vals})
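To check that the handler no longer blocks, here is a small client-side sketch, assuming the app is listening locally on port 8000 and exposes some other fast route (the '/ping' name is hypothetical); it fires both requests concurrently with aiohttp, and the fast one should return while the slow one is still in flight:

import asyncio
import time

import aiohttp

async def fetch(session, url):
    start = time.monotonic()
    async with session.get(url) as resp:
        await resp.read()
    print(url, "returned after", round(time.monotonic() - start, 2), "s")

async def main():
    async with aiohttp.ClientSession() as session:
        # '/processjson' is the slow endpoint above; '/ping' is a
        # hypothetical fast route used only for the comparison
        await asyncio.gather(
            fetch(session, "http://localhost:8000/processjson"),
            fetch(session, "http://localhost:8000/ping"),
        )

asyncio.run(main())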

Related

How to make a function behave asynchronously?

I've recently had a run-in with asynchronous functions in Python, and I wonder how one could turn a synchronous function into an asynchronous one.
For example, take pygoogletranslation, a library for translation via the Google API. One might well wonder how to translate many different words asynchronously. Of course, you could put them all into one request, but then the Google API would treat them as a single text and translate them accordingly, which would produce incorrect results.
How could one turn this code:
from pygoogletranslation import Translator

translator = Translator()
translations = []
words = ['partying', 'sightseeing', 'sleeping', 'catering']

for word in words:
    translations.append(translator.translate(word, src='en', dest='es'))

print(translations)
Into this:
from pygoogletranslation import Translator
import asyncio

translator = Translator()
translation_tasks = []
words = ['partying', 'sightseeing', 'sleeping', 'catering']

for word in words:
    translation_tasks.append(asyncio.create_task(translator.translate(word, src='en', dest='es')))

translations = asyncio.run(
    asyncio.gather(*translation_tasks, return_exceptions=True)
)
print(translations)
Considering that the translate function doesn't have a built-in async implementation?
You will have to create an async function and then run it. Though if translate doesn't have built-in async support or is blocking, wrapping it in async will not make it faster. It's probably better to use multithreading/multiprocessing, as suggested in the comments.
async def main():
    output = []

    async def one_iteration(word):
        # translator.translate() still blocks, so this gains nothing
        # from asyncio; see the note above
        output.append(translator.translate(word, src='en', dest='es'))

    coros = [one_iteration(word) for word in words]
    await asyncio.gather(*coros)

asyncio.run(main())
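Since multithreading is the suggested route, here is a minimal sketch of it, assuming the same translator and words objects from the question; each blocking translate() call is pushed onto the event loop's default thread pool so the calls overlap:

import asyncio
from functools import partial

async def main():
    loop = asyncio.get_running_loop()
    # each blocking translate() call runs in a worker thread,
    # so the calls overlap instead of running one after another
    futures = [
        loop.run_in_executor(None, partial(translator.translate, word, src='en', dest='es'))
        for word in words
    ]
    translations = await asyncio.gather(*futures)
    print(translations)

asyncio.run(main())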
As mentioned in other answers, calling a blocking function gains nothing from asyncio. In this particular case, I suggest you use google-cloud-translate, the official translation library from Google.
You could have done something like this in your current library:
async def do_task(word):
    return translator.translate(word, ...)

async def main():
    # Create translator
    ...
    await asyncio.gather(*(do_task(word) for word in words))
But this will just run the tasks the same way as without asyncio. The real gain of asyncio is that, when something is pending or waiting, it can do something else: e.g., while waiting for a response from one server, it can send another request.
How will Python know that some work is pending? Only when the function (a coroutine here) notifies the event loop via the await keyword. So you definitely need to use a library that natively supports async operations. The above-mentioned google-cloud-translate is such a library. You can do:
import asyncio

from google.cloud import translate

async def main():
    # async-capable Google translation client
    client = translate.TranslationServiceAsyncClient()
    words = ['partying', 'sightseeing', 'sleeping', 'catering']
    results = await asyncio.gather(*[
        client.translate_text(
            parent=f"projects/{project_name}",
            contents=[word],
            source_language_code="en",
            target_language_code="es",
        )
        for word in words
    ])
    print(results)

asyncio.run(main())
You can see that this client actually takes a list of strings as input, so you could pass your whole list directly. According to the docs, the limit is 1024 strings per request, so if your list is bigger, you have to use the for loop above.
You might have to set up credentials etc. for this client, but that's outside the scope of this question.
To make a function async, you need to define it with async def and change it to use other async functions for anything that might block: for example, instead of requests you'd use aiohttp, and so on. The point of the effort is that the function can then be executed by an event loop along with other such functions. Whenever an async function needs to wait for something, as signaled by the await keyword, it suspends to the event loop and gives others a chance to execute. The event loop seamlessly coordinates concurrent execution of a possibly large number of such async functions. See e.g. this answer for more details.
If a critical blocking function that you depend on doesn't have an async implementation, you can use run_in_executor (or, beginning with Python 3.9, asyncio.to_thread) to make it async. Note, however, that such solutions are "cheating" because they use threads under the hood, so they will not provide the benefits normally associated with asyncio, such as the ability to scale beyond the number of threads in the thread pool, or the ability to cancel execution of coroutines.
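For concreteness, a minimal runnable sketch of that "cheating" approach; blocking_fetch here is a hypothetical stand-in for whatever synchronous call you depend on:

import asyncio
import time

def blocking_fetch(n):
    # hypothetical blocking function standing in for requests, ctypes, etc.
    time.sleep(1)
    return n * 2

async def main():
    # Python 3.9+: asyncio.to_thread runs each blocking call in a thread
    results = await asyncio.gather(*(asyncio.to_thread(blocking_fetch, n) for n in range(5)))
    # pre-3.9 equivalent:
    #   loop = asyncio.get_running_loop()
    #   futures = [loop.run_in_executor(None, blocking_fetch, n) for n in range(5)]
    #   results = await asyncio.gather(*futures)
    print(results)  # the five calls overlap: ~1 s total instead of ~5 s

asyncio.run(main())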

How do I make my list comprehension (and the function it calls) run asynchronously?

class Class1():
    def func1(self):
        self.conn.send('something')
        data = self.conn.recv()
        return data

class Class2():
    def func2(self):
        [class1.func1() for class1 in self.classes]
How do I make that last line run asynchronously in Python? I've been googling but can't understand async/await, and I don't know which functions I should be putting async in front of. In my case, all the class1.func1 calls need to send before any of them can receive anything. I was also seeing that __aiter__ and __anext__ need to be implemented, but I don't know how those are used in this context. Thanks!
It is indeed possible to fire off multiple requests and asynchronously wait for them. Because Python is traditionally a synchronous language, you have to be very careful about what libraries you use with asynchronous Python. Any library that blocks the main thread (such as requests) will break your entire asynchronicity. aiohttp is a common choice for asynchronously making web API calls in Python. What you want is to create a bunch of future objects inside a Python list and await it. A future is an object that represents a value that will eventually resolve to something.
EDIT: Since the function that actually makes the API call is synchronous and blocking and you don't have control over it, you will have to run that function in a separate thread.
See also: Async List Comprehensions in Python
import asyncio

async def main():
    loop = asyncio.get_event_loop()
    # get_data is your blocking API-call function; data_name_list holds its inputs
    futures = [
        asyncio.ensure_future(loop.run_in_executor(None, get_data, data))
        for data in data_name_list
    ]
    await asyncio.gather(*futures)  # wait for all the future objects to resolve
    # Do something with the futures (each one now holds its result)
    # ...

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
loop.close()
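Applied to the classes in the question, a hedged sketch (assuming self.classes and the conn objects behave as shown) might look like this; each blocking func1() call runs in a worker thread, so all the sends go out without waiting for the receives:

import asyncio

class Class2():
    def func2(self):
        async def gather_all():
            loop = asyncio.get_running_loop()
            # every func1() blocks on conn.recv(), so each one gets
            # its own worker thread and they all proceed in parallel
            futures = [
                loop.run_in_executor(None, class1.func1)
                for class1 in self.classes
            ]
            return await asyncio.gather(*futures)

        return asyncio.run(gather_all())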

Calling another function asynchronously and never waiting for it to finish in Python

I am working on a chatbot where, before I reply to the user, I make a DB call to save the chat in a table. This happens each time the user types something, and it increases the response time.
To decrease the response time, we need to make this call asynchronously.
How do I do this in Python 3?
I have read tutorials on the asyncio library, but I did not understand it completely and could not work out how to apply it here.
Another workaround is to use a queuing system, but that sounds like overkill.
Example:
request = get_request_from_chat()
res = call_some_function_to_prepare_response()
save_data()  # this should be called asynchronously
reply()      # this should not wait for save_data() to finish
Any suggestions are welcome.
Use loop.create_task(some_async_function()) to run an async function "in the background". For example, this answer shows how to do that in the case of a trivial client-server communication.
In your case the pseudo-code would look like this:
request = await get_request_from_chat()
res = call_some_function_to_prepare_response()
loop = asyncio.get_event_loop()
loop.create_task(save_data()) # runs in the "background"
reply() # doesn't wait for save_data() to finish
For this to work, the program must of course be written for asyncio, and save_data must be a coroutine. For a chat server that's a good approach to follow anyway, so I would recommend giving asyncio a chance.
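Here is a minimal self-contained sketch of that fire-and-forget pattern; save_data and reply are illustrative stand-ins, with asyncio.sleep simulating the slow DB write. Note that it's worth keeping a reference to the task, since the event loop holds only a weak reference to it:

import asyncio

async def save_data():
    await asyncio.sleep(2)  # simulates the slow DB write
    print("chat saved")

def reply():
    print("replied to user")

async def handle_message():
    # schedule save_data() without awaiting it; keep a reference
    # so the task isn't garbage-collected mid-flight
    task = asyncio.get_running_loop().create_task(save_data())
    reply()  # runs immediately, before save_data() finishes
    return task

async def main():
    task = await handle_message()
    await task  # a real server's loop would simply keep running instead

asyncio.run(main())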
Because you mentioned
"Another workaround is to use a queuing system, but that sounds like overkill."
I assume you are open to other solutions, so I will propose a multi-threading approach:
from concurrent.futures import ThreadPoolExecutor
from time import sleep

def long_running_function(param1):
    print(param1)
    sleep(10)
    return "Complete"

with ThreadPoolExecutor(max_workers=10) as executor:
    future = executor.submit(long_running_function, "Param1")
    print(future.result(timeout=12))
Steps:
1) You create a ThreadPoolExecutor and define the maximum number of concurrent tasks.
2) You submit a function with the arguments it needs.
3) You call result() on the return value of submit() when you need the result.
Note that result() can throw an exception if an exception was thrown in the submitted function.
You can also check whether the result of your call is ready with future.done(), which returns True or False.
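A short sketch of that polling pattern, reusing the long_running_function defined above:

from concurrent.futures import ThreadPoolExecutor
from time import sleep

with ThreadPoolExecutor(max_workers=10) as executor:
    future = executor.submit(long_running_function, "Param1")
    while not future.done():  # poll instead of blocking on result()
        print("still working...")
        sleep(1)
    print(future.result())  # safe now; re-raises if the function raised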

When to use and when not to use Python 3.5 `await` ?

I'm getting the flow of using asyncio in Python 3.5, but I haven't seen a description of what things I should await and what things I should not, or where it would be negligible. Do I just have to use my best judgement, as in "this is an IO operation and thus should be awaited"?
By default all your code is synchronous. You can make it asynchronous by defining functions with async def and "calling" these functions with await. A more correct question would be "When should I write asynchronous code instead of synchronous?". The answer is "when you can benefit from it". In cases where you work with I/O operations, as you noted, you will usually benefit:
# Synchronous way:
download(url1)  # takes 5 sec.
download(url2)  # takes 5 sec.
# Total time: 10 sec.

# Asynchronous way:
await asyncio.gather(
    async_download(url1),  # takes 5 sec.
    async_download(url2),  # takes 5 sec.
)
# Total time: only 5 sec. (+ little overhead for using asyncio)
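A runnable toy version of that comparison, using asyncio.sleep to stand in for the network wait (the URLs and the 5-second delay are illustrative):

import asyncio
import time

async def async_download(url):
    await asyncio.sleep(5)  # stands in for the real network I/O
    return f"contents of {url}"

async def main():
    start = time.monotonic()
    pages = await asyncio.gather(
        async_download("url1"),
        async_download("url2"),
    )
    print(pages, f"in {time.monotonic() - start:.1f} s")  # ~5 s, not ~10 s

asyncio.run(main())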
Of course, a function that uses asynchronous code should be asynchronous itself (defined with async def). But any asynchronous function can freely use synchronous code. It makes no sense to cast synchronous code to asynchronous without some reason:
# extract_links(url) should be async because it uses the async func async_download() inside
async def extract_links(url):
    # async_download() was made async to benefit from I/O concurrency
    html = await async_download(url)
    # parse() doesn't do I/O, so there's no sense in making it async
    links = parse(html)
    return links
One very important thing: any long synchronous operation (longer than, say, 50 ms; it's hard to say exactly) will freeze all your asynchronous operations for that time:
async def extract_links(url):
    data = await download(url)
    links = parse(data)
    # if search_in_very_big_file() takes a long time to run,
    # all your other running async funcs (elsewhere in the code)
    # will be frozen; you need to avoid this situation
    links_found = search_in_very_big_file(links)
You can avoid this by running long synchronous functions in a separate process (and awaiting the result):
import asyncio
from concurrent.futures import ProcessPoolExecutor

executor = ProcessPoolExecutor(2)
loop = asyncio.get_event_loop()

async def extract_links(url):
    data = await download(url)
    links = parse(data)
    # the main process can run other async functions while the
    # separate process does the heavy work
    links_found = await loop.run_in_executor(executor, search_in_very_big_file, links)
One more example: when you need to use requests inside asyncio. requests.get is just a synchronous long-running function, which you shouldn't call inside async code (again, to avoid freezing). But it runs long because of I/O, not because of long calculations. In that case, you can use ThreadPoolExecutor instead of ProcessPoolExecutor to avoid some multiprocessing overhead:
from concurrent.futures import ThreadPoolExecutor
import requests

executor = ThreadPoolExecutor(2)
loop = asyncio.get_event_loop()

async def download(url):
    response = await loop.run_in_executor(executor, requests.get, url)
    return response.text
You do not have much freedom. If you need to call a function, you need to find out whether it is a usual function or a coroutine. You must use the await keyword if and only if the function you are calling is a coroutine.
If async functions are involved, there should be an "event loop" that orchestrates these async functions. Strictly speaking it's not necessary (you can "manually" run an async method by sending values to it), but you probably don't want to do that. The event loop keeps track of not-yet-finished coroutines and chooses the next one to continue running. The asyncio module provides an implementation of an event loop, but this is not the only possible implementation.
Consider these two lines of code:
x = get_x()
do_something_else()
and
x = await aget_x()
do_something_else()
The semantics are absolutely the same: call a method that produces some value, and when the value is ready, assign it to the variable x and do something else. In both cases the do_something_else function will be called only after the previous line of code has finished. It doesn't even mean that control will be yielded to the event loop before, after, or during the execution of the asynchronous aget_x method.
Still, there are some differences:
- the second snippet can appear only inside another async function;
- aget_x is not a usual function but a coroutine (that is, it is either declared with the async keyword or decorated as a coroutine);
- aget_x is able to "communicate" with the event loop, that is, to yield some objects to it. The event loop should be able to interpret these objects as requests to perform some operations (e.g. to send a network request and wait for a response, or just to suspend this coroutine for n seconds). A usual get_x function is not able to communicate with the event loop.
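To make the "communicating with the event loop" point concrete, here is a tiny runnable sketch (aget_x and the delays are illustrative): while one coroutine is suspended at its await, the loop runs another one:

import asyncio

async def aget_x():
    # the await hands an object to the event loop that means
    # "suspend me for 1 second"; other coroutines run meanwhile
    await asyncio.sleep(1)
    return 42

async def heartbeat():
    for _ in range(3):
        print("other work happening")
        await asyncio.sleep(0.3)

async def main():
    x, _ = await asyncio.gather(aget_x(), heartbeat())
    print("x =", x)

asyncio.run(main())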

Parallelise web tasks with asyncio in Python

I'm trying to wrap my head around asyncio and aiohttp and for the first time in years programming makes me feel utterly stupid and incapable. Which is kind of beautiful, in a weirdo Zen way. But alas, there's work to get done.
I've got an existing class that can do numerous wondrous things on the web, like signing up to a web site, getting data, the works. And now I need like, 100 or 1000 of these little worker bees to sign up. Code looks roughly like this:
class Worker(object):
    def signup(self, ...):
        ...
        data = self.make_request(url, data)
        self.user_id = data.get("user_id")
        return self

    def make_request(self, url, data):
        response = requests.post(url, data=data)
        return response.json()

workers = [Worker().signup() for n in range(100)]
As you can see, we're using the requests module to make a POST request. However, this is blocking, so we'll have to wait for worker N to finish signing up before we start signing up worker N+1. Fortunately, the original author of the Worker class (that sounds charmingly Marxist) in her infinite wisdom wrapped every HTTP call in the self.make_request method, so making the whole Worker non-blocking should just be a matter of swapping out the requests library for a non-blocking one aaaaand Bob's your uncle, right? This is how far I got:
class AsyncWorker(Worker):
    @asyncio.coroutine
    def make_request(self, url, data):
        response = yield from aiohttp.request('post', url, data=data)
        return (yield from response.json())

coroutines = [AsyncWorker().signup() for n in range(100)]

loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(coroutines))
loop.close()
But this raises an AttributeError: 'generator' object has no attribute 'get' in the signup method where I do self.user_id = data.get("user_id"). And beyond that, I still don't have the workers in a neat dictionary. I'm aware that I'm most likely completely misunderstanding how asyncio works; I already spent a day reading through various docs, mind-shattering tutorials by David Beazley, and masses of toy examples that are simple enough for me to understand but too simple to apply to this situation. How should I structure my worker and my async loop to sign up 100 workers in parallel and eventually get a list of all workers after they have signed up?
Once you use yield (or yield from) in a function, this function becomes a coroutine. That means you can't get a result by just calling it: you will get a generator object instead. You must at least do this:
@asyncio.coroutine
def some_coroutine(*args):
    # ...
    # ...
    result = yield from tasty.asyncio.function()
    return result

def coroutine_user():
    # data = some_coroutine() would give you a generator object instead of a result
    data = yield from some_coroutine()
    return data  # data here is a plain result: you can call your .get or whatever
Guess what happens when you call coroutine_user():
>>> coroutine_user()
<generator object coroutine_user at 0x7fe13b8a47e0>
The lack of an asyncio.coroutine decorator doesn't help at all: coroutines are contagious! To get a result inside a function, you must use yield from, and that turns your function into another coroutine!
Though things aren't always that bad (usually you can manually iterate a generator object without relying on yield from), asyncio will specifically stop you from doing it: it breaks some internals (you can do it only from a Future or an asyncio.coroutine). So just use concurrent.futures or something similar unless you're going to turn all of your code into coroutines. As an alternative, isolate all users of aiohttp.request from the usual methods and work with both coroutine-based async workers and synchronous plain old code. Diving into asyncio and actually refactoring all your code is obviously an option too: you basically need to put yield from before every call to any asyncio-infected method.
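If you do go the full-refactor route, here is a hedged sketch of how the whole thing might look on modern Python (async def/await and aiohttp.ClientSession instead of the question's @asyncio.coroutine/yield from style; signup_url and signup_data are hypothetical placeholders):

import asyncio

import aiohttp

class AsyncWorker:
    async def signup(self, session, url, data):
        # every step on the await path is a coroutine now
        payload = await self.make_request(session, url, data)
        self.user_id = payload.get("user_id")  # payload is a real dict here
        return self

    async def make_request(self, session, url, data):
        async with session.post(url, data=data) as response:
            return await response.json()

async def main():
    async with aiohttp.ClientSession() as session:
        # signup_url and signup_data are hypothetical placeholders
        return await asyncio.gather(
            *(AsyncWorker().signup(session, signup_url, signup_data) for _ in range(100))
        )

workers = asyncio.run(main())  # the list of signed-up workers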
