Call count not working with async function - python

I have a function that has a decorator @retry, which retries the function if a certain exception is raised. I want to test that this function executes the correct number of times, for which I have the following code, which is working:
@pytest.mark.asyncio
async def test_redis_failling(mocker):
    sleep_mock = mocker.patch.object(retry, '_sleep')
    with pytest.raises(ConnectionError):
        retry_store_redis()
    assert sleep_mock.call_count == 4

@retry(ConnectionError, initial_wait=2.0, attempts=5)
def retry_store_redis():
    raise ConnectionError()
But if I modify retry_store_redis() to be an async function, sleep_mock.call_count is 0.

So you define "retry" as a function. Then you define a test, then you define some code that uses @retry.
@retry, as a decorator, is called at import time. So the order of operations is:
1. declare retry
2. declare the test
3. call retry with retry_store_redis as an argument
4. start your test
5. patch out retry
6. call the function you defined in step 3
So "retry" gets called once (at import time) and your mock gets called zero times. To get the behavior you want (ensuring that retry actually re-calls the underlying function), I would do:
from unittest.mock import MagicMock

@pytest.mark.asyncio
async def test_redis_failling(mocker):
    fake_function = MagicMock(side_effect=ConnectionError)
    decorated_function = retry(ConnectionError, initial_wait=2.0, attempts=5)(fake_function)
    with pytest.raises(ConnectionError):
        decorated_function()
    assert fake_function.call_count == 4
If you wanted to test this as built (instead of writing a test specifically for the decorator), you would have to mock out the original function inside the decorated function, which would depend on how you implemented the decorator. The default way (without any libraries) means you would have to inspect the __closure__ attribute. You can build the object to retain a reference to the original function, though; here is an example:
def wrap(func):
    class Wrapper:
        def __init__(self, func):
            self.func = func

        def __call__(self, *args, **kwargs):
            return self.func(*args, **kwargs)

    return Wrapper(func)

@wrap
def wrapped_func():
    return 42
In this scenario, you could patch the wrapped function at wrapped_func.func.
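For instance, a minimal sketch of such a test, assuming the pytest-mock mocker fixture from the question (the test name and patched return value are invented for illustration):
def test_wrapped_func_patched(mocker):
    # replace the original function that the Wrapper instance holds on to
    mocker.patch.object(wrapped_func, 'func', return_value=0)
    assert wrapped_func() == 0
    assert wrapped_func.func.call_count == 1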

Related

Python decorator for accessing class variable

I have a decorator to control a time limit: if the function's execution exceeds the limit, an error is raised.
def timeout(seconds=10):
    def decorator(func):
        # a timeout decorator
        ...
    return decorator
And I want to build a class, using the constructor to pass the time limit into the class.
class myClass:
    def __init__(self, time_limit):
        self.time_limit = time_limit

    @timeout(self.time_limit)
    def do_something(self):
        # do something
        ...
But this does not work.
File "XX.py", line YY, in myClass
#timeout(self.tlimit)
NameError: name 'self' is not defined
What's the correct way to implement this?
self.time_limit is only available when a method of an instance of your class is called.
The decorator statement prefixing the method, on the other hand, is run when the class body is parsed.
However, the inner part of your decorator, if it will always be applied to methods, will get self as its first parameter, and there you can simply make use of any instance attribute:
import time

def timeout(**decorator_parms):
    def decorator(func):
        def wrapper(self, *args, **kwargs):
            time_limit = self.time_limit
            now = time.time()
            result = func(self, *args, **kwargs)
            # code to check the elapsed time against time_limit
            ...
            return result
        return wrapper
    return decorator
If your decorator is expected to work with time limits other than always self.time_limit, you could pass a string or other sentinel object and check for it inside the innermost wrapper with a simple if statement: if the timeout is that particular string or object, use the instance attribute, otherwise use the passed-in value.
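A minimal sketch of that idea (the sentinel name USE_INSTANCE_LIMIT is invented for illustration, and the actual timeout enforcement is left out):
USE_INSTANCE_LIMIT = object()  # sentinel meaning "read the limit from the instance"

def timeout(seconds=USE_INSTANCE_LIMIT):
    def decorator(func):
        def wrapper(self, *args, **kwargs):
            # fall back to the instance attribute when no explicit limit was given
            limit = self.time_limit if seconds is USE_INSTANCE_LIMIT else seconds
            # ... enforce `limit` around the call here ...
            return func(self, *args, **kwargs)
        return wrapper
    return decorator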
You can also decorate a method in the constructor:
class myClass:
    def __init__(self, time_limit):
        self.do_something = timeout(time_limit)(self.do_something)

    def do_something(self):
        # do something
        ...
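A quick usage sketch of that approach (the value 5 is just illustrative): constructing the instance re-binds do_something through the decorator, so every call goes through the timeout wrapper.
obj = myClass(5)    # __init__ wraps the bound method with timeout(5)
obj.do_something()  # this call now goes through the timeout wrapper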

passing arguments to a decorator in python

I am using the retry function from the retrying package. I want to pass the arguments of the retry decorator from a function and I am not sure how to achieve that.
@retry  # (wait_exponential_multiplier=x, wait_exponential_max=y)
def post(url, json, exponential_multiplier, exponential_max):
    ...
    return(abc)
I want to pass the arguments of retry when calling post(). I know that when the function is compiled, the resulting function object is passed to the decorator, so I am not sure if this is possible, or if I should maybe approach it differently.
If you just want to use the library as is, then you cannot really use the decorator like this. Its arguments are fixed from when it is invoked (short of messing about with mutable arguments). Instead, you could invoke the decorator before each call of the function. This allows you to change the retrying arguments as and when you need to.
e.g.

def post(url, json):
    ...

retry(wait_exponential_multiplier=...)(post)(url=..., json=...)
But at that point, you might as well just skip the decorator altogether, and use what the decorator is using.
from retrying import Retrying

def post(url, json):
    ...

Retrying(wait_exponential_multiplier=...).call(post, url=..., json=...)
Either of these ways allows you to keep the post function pure and abstracted away from the concept of retrying (making it easier to call post when you don't want retrying behaviour).
You could also write a convenience wrapper that fills in defaults for your program, e.g.
def retrier(wait_exponential_multiplier=2, **kwargs):
    return Retrying(wait_exponential_multiplier=wait_exponential_multiplier, **kwargs)

retrier(wait_exponential_max=10).call(post, url=..., json=...)
retrier(wait_exponential_multiplier=3, wait_exponential_max=10).call(post, url=..., json=...)
Generally speaking, there is no good way to achieve this. You can certainly write code like this:
def post(url, json):
    ...
    return(abc)

...

decorated_func = retry(wait_exponential_max=1)(post)
a = decorated_func(url, json)
and it will work. But it looks rather ugly and will construct a decorated object for every call ("regular" decorators are executed once, at import time).
If the decorator itself is not very complex, you can use this approach in a more user-friendly manner:
def _post(url, json):
    return(abc)

def post(url, json, wait_exponential_max=None, **kwargs):
    return retry(wait_exponential_max=wait_exponential_max, **kwargs)(_post)(url, json)
You have to create a new decorator which passes its own arguments down to the decorated function and transforms the function using the retry decorator:
def retry_that_pass_down_arguments(**decorator_arguments):
    def internal_decorator(f):
        def decorated_function(*args, **kwargs):
            # Add the decorator keyword arguments to the keyword arguments of the decorated function
            kwargs.update(decorator_arguments)
            return retry(**decorator_arguments)(f)(*args, **kwargs)
        return decorated_function
    return internal_decorator
Then you can just do:
@retry_that_pass_down_arguments(wait_exponential_multiplier=x, wait_exponential_max=y)
def post(url, json, wait_exponential_multiplier=None, wait_exponential_max=None):
    ...
    return(abc)
This is a complement to Jundiaius's answer to show that you can even use the inspect module to correctly handle the signature of the decorated function:
import collections
import inspect

def deco_and_pass(deco, **kwparams):
    """Decorates a function with a decorator and parameters.

    The parameters are passed to the decorator and forwarded to the function.
    The function must be prepared to receive those parameters, but they will
    be removed from the signature of the decorated function."""
    def outer(f):
        sig = inspect.signature(f)  # remove the parameters from the function signature
        params = collections.OrderedDict(sig.parameters)
        for k in kwparams:
            del params[k]
        def inner(*args, **kwargs):  # define the decorated function
            kwargs.update(kwparams)  # add the parameters
            # and call the function through the parameterized decorator
            return deco(**kwparams)(f)(*args, **kwargs)
        inner.__signature__ = sig.replace(parameters=list(params.values()))
        inner.__doc__ = f.__doc__  # update doc and signature
        return inner
    return outer
Example usage:
@deco_and_pass(retry, wait_exponential_multiplier=x, wait_exponential_max=y)
def post(url, json, wait_exponential_multiplier, wait_exponential_max):
    ...
    return(abc)
...
post(url, json)
The signature of the decorated function is only def post(url, json).
Limitation: the above code only accepts and passes keyword arguments for the decorator.
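To illustrate that, one way to check the narrowed signature (purely illustrative, assuming the definitions above) is:
import inspect

print(inspect.signature(post))  # expected output: (url, json)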

Mock python decorator in unit tests

I am trying to test a function that is decorated. Is there a way to mock a decorator and test the function in isolation, when the decorator is already applied?
import mock

def decorator(func):
    def wrapper(*args, **kwargs):
        return 1
    return wrapper

def mocked(func):
    def wrapper(*args, **kwargs):
        return 2
    return wrapper

@decorator
def f():
    return 0

with mock.patch('test.decorator') as d:
    d.side_effect = mocked
    assert f() == 2  # error
There is no simple solution.
This is a similar question: How to strip decorators from a function in python
You can either modify the original code just for testing, or use something like the undecorated library (https://pypi.python.org/pypi/undecorated) to write a helper that swaps the original wrapper for the testing wrapper:
from undecorated import undecorated
mocked(undecorated(f))()
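Put together, a sketch of a test using that helper (the test name is invented; this assumes undecorated can recover the original f from the decorator's closure):
from undecorated import undecorated

def test_f_with_mocked_decorator():
    # strip the real decorator, then apply the test double instead
    assert mocked(undecorated(f))() == 2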

Python decorator to limit number of calls

I would like to write a decorator that limits the number of calls to the wrapped function. Say, if I want the wrapped function to be called at most 10 times, the decorator should execute that function for the first 10 calls and then return None instead.
Here's what I've come up with:
from functools import wraps

def max_calls(num):
    """Decorator which allows its wrapped function to be called `num` times"""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            calls = getattr(wrapper, 'calls', 0)
            calls += 1
            if calls == num:
                return None
            setattr(wrapper, 'calls', calls)
            return func(*args, **kwargs)
        setattr(wrapper, 'calls', 0)
        return wrapper
    return decorator
However, while this counts calls properly and returns None when the limit is reached, it doesn't reset between program runs. That is, if I execute the program once and the counter reaches 5, then re-execute the program, it continues from 5. What do I need to change so that the decorator works properly?
The problem is that you maintain just one set of call counts. But that means each Flask request shares call counts with all the other requests. What you need to do is to maintain a separate set of call counts for each Flask request.
From reading the API documentation it looks as though the way to make this work is to carry out these three steps:
Make a subclass of flask.Request that can store your function call counts:
import collections
import flask

class MyRequest(flask.Request):
    """A subclass of Request that maintains function call counts."""
    def __init__(self, *args, **kwargs):
        super(MyRequest, self).__init__(*args, **kwargs)
        self.call_counts = collections.defaultdict(int)
Set the application's request_class attribute to your subclass when you set up your application:
app = flask.Flask(__name__)
app.request_class = MyRequest
Rewrite your decorator to store its counts in the global flask.request object:
import functools

def max_calls(num, default=None):
    """Decorator which allows its wrapped function to be called at most `num`
    times per Flask request, after which it returns `default`."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if flask.request.call_counts[func] == num:
                return default
            flask.request.call_counts[func] += 1
            return func(*args, **kwargs)
        return wrapper
    return decorator
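As an illustration (not part of the original answer), the decorator could then be used from a view like this; the route and the expensive_lookup helper are invented for the example:
@max_calls(3, default='(limit reached)')
def expensive_lookup(key):
    return 'value for %s' % key

@app.route('/report')
def report():
    # within a single request, only the first three calls do real work
    results = [expensive_lookup(k) for k in ('a', 'b', 'c', 'd')]
    return '\n'.join(results)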
But having written that, it would be remiss of me not to point out that your question seems very strange. Why do you want to restrict the number of times a function can be called within a Flask request? Are you trying to do some kind of rate limiting? It seems likely that whatever you want to do can be done better using some other approach.

Decorating an instance method and calling it from the decorator

I am using nose's test generator feature to run the same test with different contexts. Since it requires the following boilerplate for each test:
class TestSample(TestBase):
    def test_sample(self):
        for context in contexts:
            yield self.check_sample, context

    def check_sample(self, context):
        """The real test logic is implemented here"""
        pass
I decided to write the following decorator:
def with_contexts(contexts=None):
    if contexts is None:
        contexts = ['twitter', 'linkedin', 'facebook']
    def decorator(f):
        @wraps(f)
        def wrapper(self, *args, **kwargs):
            for context in contexts:
                yield f, self, context  # The line which causes the error
        return wrapper
    return decorator
The decorator is used in the following manner:
class TestSample(TestBase):
    @with_contexts()
    def test_sample(self, context):
        """The real test logic is implemented here"""
        var1 = self.some_valid_attribute
When the tests are executed, an error is thrown saying that the attribute being accessed is not available. However, if I change the line which yields the method to the following, it works fine:
yield getattr(self, f.__name__), service
I understand that the above snippet creates a bound method, whereas in the first one self is passed manually to the function. However, as far as my understanding goes, the first snippet should work fine too. I would appreciate it if anyone could clarify the issue.
The title of the question relates to calling instance methods in decorators in general, but I have kept the description specific to my context.
You can use functools.partial to tie the wrapped function to self, just like a method would be:
from functools import partial, wraps

def decorator(f):
    @wraps(f)
    def wrapper(self, *args, **kwargs):
        for context in contexts:
            yield partial(f, self), context
    return wrapper
Now you are yielding partials instead, which, when called as yieldedvalue(context), will call f(self, context).
As far as I can tell, some things don't fit together. First, your decorator goes like this:
def with_contexts(contexts=None):
    if contexts is None:
        contexts = ['twitter', 'linkedin', 'facebook']
    def decorator(f):
        @wraps(f)
        def wrapper(self, *args, **kwargs):
            for context in contexts:
                yield f, self, context  # The line which causes the error
        return wrapper
    return decorator
but you use it like this:
@with_contexts
def test_sample(self, context):
    """The real test logic is implemented here"""
    var1 = self.some_valid_attribute
This is wrong: this calls with_contexts(test_sample), but you need with_contexts()(test_sample). So do:
@with_contexts()
def test_sample(self, context):
    """The real test logic is implemented here"""
    var1 = self.some_valid_attribute
even if you don't provide the contexts argument.
Second, you decorate the wrong function: your usage shows that the test function yields the check function for each context. The function you want to wrap does the job of the check function, but you have to name it after the test function.
Applying self to a method can be done with partial as Martijn writes, but it can also be done the way Python does it under the hood: with
method.__get__(self, None)
or maybe better
method.__get__(self, type(self))
you can achieve the same thing. (Maybe your original version works as well, yielding the function to be called and the arguments to use; it was not clear to me that this is how it works.)
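A small illustration (not from the original answer) of how __get__ produces a bound method, using a throwaway Greeter class:
class Greeter:
    def greet(self, name):
        return "hello " + name

g = Greeter()
# bind the plain function to the instance, the way attribute access would
bound = Greeter.greet.__get__(g, Greeter)
assert bound("world") == g.greet("world")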
