I've got a situation where I can't be sure whether a function will be sync or async. In other words, the function has a type signature of Union[Callable[..., Awaitable[Any]], Callable[..., Any]].
I can't seem to find a good (non-deprecated) way of turning the above type signature into a consistent type of Callable[..., Awaitable[Any]].
The only way I was able to find to do this is the following:
import asyncio
import inspect
from typing import Any, Awaitable, Callable, List, Union
def force_awaitable(function: Union[Callable[..., Awaitable[Any]], Callable[..., Any]]) -> Callable[..., Awaitable[Any]]:
    if inspect.iscoroutinefunction(function):
        # Already a coroutine function, calling it yields an awaitable
        return function
    else:
        # Make it awaitable
        return asyncio.coroutine(function)
However, asyncio.coroutine (which is normally used as a decorator) has been deprecated since Python 3.8: https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine
The alternative provided does not work for me here, since I don't use asyncio.coroutine as a decorator, and unlike a decorator, the async keyword can't be applied to an existing function object.
How do I turn a sync function (not awaitable) into an async function (awaitable)?
Other considered options
Above I've shown that you can detect whether something is a coroutine function. This allows us to change how we call it, like so:
def my_function():
    pass

if inspect.iscoroutinefunction(my_function):
    await my_function()
else:
    my_function()
However, this feels clunky, can create messy code, and adds unnecessary checks inside big loops. I want to be able to define how to call the function before I enter my loop.
In NodeJS I would simply await the sync function:
// Note: Not Python!
function sync() {}
await sync();
When I try to do the same thing in Python, I'm greeted with an error:
def sync():
    pass

await sync()  # AttributeError: 'method' object has no attribute '__await__'
You can return an async function with the original function object in its scope:
def to_coroutine(f: Callable[..., Any]):
    async def wrapper(*args, **kwargs):
        return f(*args, **kwargs)
    return wrapper
def force_awaitable(function: Union[Callable[..., Awaitable[Any]], Callable[..., Any]]) -> Callable[..., Awaitable[Any]]:
    if inspect.iscoroutinefunction(function):
        return function
    else:
        return to_coroutine(function)
Now, if function is not a coroutine function, force_awaitable will return a coroutine function that wraps function.
def test_sync(*args, **kwargs):
    pass

async def main():
    await force_awaitable(test_sync)('foo', 'bar')

asyncio.run(main())
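If the sync function might block, a slight variant of the same idea offloads the call to a worker thread instead of running it directly on the event loop. This is only a sketch, assuming Python 3.9+ for asyncio.to_thread; force_awaitable_threaded is just an illustrative name:

import asyncio
import inspect
from typing import Any, Awaitable, Callable, Union

def force_awaitable_threaded(function: Union[Callable[..., Awaitable[Any]], Callable[..., Any]]) -> Callable[..., Awaitable[Any]]:
    if inspect.iscoroutinefunction(function):
        return function

    async def wrapper(*args, **kwargs):
        # Run the sync function in a worker thread so it can block safely.
        return await asyncio.to_thread(function, *args, **kwargs)
    return wrapper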
Related
The following code works as expected for the synchronous part, but gives me a TypeError for the async call (TypeError: object NoneType can't be used in 'await' expression), presumably because the mock constructor can't properly deal with the spec. How do I properly tell Mockito that it needs to set up an asynchronous mock for async_method?
class MockedClass():
    def sync_method(self):
        pass

    async def async_method(self):
        pass

class TestedClass():
    def do_something_sync(self, mocked: MockedClass):
        mocked.sync_method()

    async def do_something_async(self, mocked: MockedClass):
        await mocked.async_method()

@pytest.mark.asyncio
async def test():
    my_mock = mock(spec=MockedClass)
    tested_class = TestedClass()

    tested_class.do_something_sync(my_mock)
    verify(my_mock).sync_method()

    await tested_class.do_something_async(my_mock)  # <- Fails here
    verify(my_mock).async_method()
Edit:
For reference, this is how it works with the standard mocks (the behavior that I expect):
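A minimal sketch of that behaviour, assuming Python 3.8+, where MagicMock(spec=MockedClass) detects the async def method and sets it up as an AsyncMock (the test name is just illustrative):

import pytest
from unittest.mock import MagicMock

@pytest.mark.asyncio
async def test_with_standard_mock():
    my_mock = MagicMock(spec=MockedClass)
    tested_class = TestedClass()

    tested_class.do_something_sync(my_mock)
    my_mock.sync_method.assert_called_once()

    # async_method is an AsyncMock here, so awaiting it works out of the box.
    await tested_class.do_something_async(my_mock)
    my_mock.async_method.assert_awaited_once()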
In mockito, my_mock.async_method() does not return anything useful by default and without further configuration. (That is, it returns None, which is not awaitable.)
What I did in the past:
# a helper function
def future(value=None):
    f = asyncio.Future()
    f.set_result(value)
    return f

# your code
@pytest.mark.asyncio
async def test():
    my_mock = mock(spec=MockedClass)
    when(my_mock).async_method().thenReturn(future(None))  # fill in whatever you expect the method to return
    # ....
I'm currently working on a small Telegram bot library that uses PTB under the hood.
One of the syntactic sugar features is this decorator function I use to ensure code locality.
Unfortunately, the inner wrapper is never called (in my current library version, I have an additional wrapper called @aexec that I place below @async_command, but that's not ideal).
def async_command(name: str):
    def decorator(func):
        updater = PBot.updater
        updater.dispatcher.add_handler(CommandHandler(name, func, run_async=True))

        def wrapper(update: Update, context: CallbackContext):
            loop = asyncio.new_event_loop()
            asyncio.set_event_loop(loop)
            loop.run_until_complete(func(update, context))
            loop.close()
        return wrapper
    return decorator
@async_command(name="start")
async def start_command(update: Update, context: CallbackContext) -> None:
    update.message.reply_text(text="Hello World")
"def wrapper" inside "async_command" is never called, so the function is never executed.
Any idea how to achieve this without needing an additional decorator to start a new asyncio event loop?
Note: PBot is just a simple class that contains one static "updater" that can be re-used anywhere in the code (PTB uses "updater" instances, which is not ideal for my use cases)
EDIT: I notice that the issue of the inner wrapper not being called only happens with async functions. Is this a case of differing calling conventions?
I actually figured it out myself.
I took the existing @aexec and chained the two functions internally, creating a new decorator.
def aexec(func):
    def wrapper(update: Update, context: CallbackContext):
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
        loop.run_until_complete(func(update, context))
        loop.close()
    return wrapper

def _async_command(name: str):
    def wrapper(func):
        updater = PBot.updater
        updater.dispatcher.add_handler(CommandHandler(name, func, run_async=True))
    return wrapper

def async_command(name: str):
    run_first = _async_command(name)
    def wrapper(func):
        return run_first(aexec(func))
    return wrapper
Now I can use @async_command(name="name_of_command") and it will first call _async_command, then aexec.
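A short usage sketch of the chained decorator, assuming the same PBot/PTB setup as before (the /start handler is just illustrative):

@async_command(name="start")
async def start_command(update: Update, context: CallbackContext) -> None:
    # _async_command registers aexec(start_command) with the dispatcher,
    # and aexec runs this coroutine to completion on a fresh event loop
    # whenever /start is received.
    update.message.reply_text(text="Hello World")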
I have a function that has a decorator @retry, which retries the function if a certain exception happens. I want to test that this function executes the correct number of times, for which I have the following code, which works:
@pytest.mark.asyncio
async def test_redis_failling(mocker):
    sleep_mock = mocker.patch.object(retry, '_sleep')
    with pytest.raises(ConnectionError):
        retry_store_redis()
    assert sleep_mock.call_count == 4

@retry(ConnectionError, initial_wait=2.0, attempts=5)
def retry_store_redis():
    raise ConnectionError()
But if I modify retry_store_redis() to be an async function, sleep_mock.call_count is 0.
So you define "retry" as a function. Then you define a test, then you define some code that uses @retry.
@retry, as a decorator, is called at import time. So the order of operations is:
1. declare retry
2. declare the test
3. call retry with retry_store_redis as an argument
4. start your test
5. patch out retry
6. call the function you defined in step 3
So "retry" gets called once (at import time), and your mock gets called zero times. To get the behavior you want (ensuring that retry is actually re-calling the underlying function), I would do:
@pytest.mark.asyncio
async def test_redis_failling(mocker):
    fake_function = MagicMock(side_effect=ConnectionError)
    decorated_function = retry(ConnectionError, initial_wait=2.0, attempts=5)(fake_function)
    with pytest.raises(ConnectionError):
        decorated_function()
    assert fake_function.call_count == 4
If you wanted to test this as built (instead of writing a test specifically for the decorator), you would have to mock out the original function inside the decorated function, which depends on how you implemented the decorator. The default way (without any libraries) means you would have to inspect the function's __closure__ attribute. You can, however, build the object to retain a reference to the original function; here is an example:
def wrap(func):
    class Wrapper:
        def __init__(self, func):
            self.func = func

        def __call__(self, *args, **kwargs):
            return self.func(*args, **kwargs)
    return Wrapper(func)

@wrap
def wrapped_func():
    return 42
In this scenario, you can patch the wrapped function at wrapped_func.func.
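A minimal test sketch of that idea, assuming the wrap/wrapped_func definitions above and the standard library's unittest.mock (the test name is just illustrative):

from unittest.mock import patch

def test_wrapped_func_inner_is_patched():
    # Replace the original function stored on the Wrapper instance.
    with patch.object(wrapped_func, "func", return_value=99) as inner:
        assert wrapped_func() == 99
    assert inner.call_count == 1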
I use Python 3.7 and have the following decorator:
from typing import Callable

def decorator(success_check: Callable):
    def wrap(func):
        async def inner(root, info, **args):
            func_result = await func(root, info, **args)
            if not success_check(func_result):
                pass  # do some action
            return func(root, info, **args)
        return inner
    return wrap
In the current implementation, func is awaited twice. Can I make it work with func awaited only once?
If you write return await func(root, info, **args), or, even better, just return func_result, it will most likely solve your issue.
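A minimal sketch of the decorator with that change applied, reusing func_result instead of calling func a second time:

from typing import Callable

def decorator(success_check: Callable):
    def wrap(func):
        async def inner(root, info, **args):
            func_result = await func(root, info, **args)  # func is awaited exactly once
            if not success_check(func_result):
                pass  # do some action
            return func_result
        return inner
    return wrap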
There's one class that extends another and overrides a coroutine that returns an async iterator:
class Repository:
    async def run(self, query: Query) -> AsyncIterator[int]:
        ...

class MyRepository(Repository):
    async def run(self, query: Query) -> AsyncIterator[int]:
        ...
Running mypy returns this error:
error: Return type "AsyncIterator[int]" of "run" incompatible with return type "Coroutine[Any, Any, AsyncIterator[int]]" in supertype "Repository"
Coroutines are typed like normal functions, so I'm not sure what the right approach is.
Using ABC classes won't fix it:
class Repository(metaclass=ABCMeta):
    @abstractmethod
    async def run(self, query: Query) -> AsyncIterator[int]:
        ...
Found it thanks to this issue:
I think you shouldn't make the protocol function async def, but just def. Conceptually, an async generator is a callable that returns an AsyncIterator (or more precisely, an AsyncGenerator). But an async def function without a yield returns an Awaitable of whatever its declared return type is, so that's how mypy interprets your protocol.
So, replacing async def run with def run works.
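A minimal sketch of the resulting code, with Query stubbed out as a placeholder type: the base class declares run as a plain def returning AsyncIterator[int], and the concrete implementation is an async generator (async def with yield), which mypy accepts as a compatible override:

from abc import ABCMeta, abstractmethod
from typing import AsyncIterator

class Query:  # placeholder for the real query type
    ...

class Repository(metaclass=ABCMeta):
    @abstractmethod
    def run(self, query: Query) -> AsyncIterator[int]:
        ...

class MyRepository(Repository):
    async def run(self, query: Query) -> AsyncIterator[int]:
        # async def + yield makes this an async generator, i.e. a callable that
        # returns an AsyncIterator[int], matching the base class declaration.
        yield 0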