I would like to write a decorator that limits the number of calls to the wrapped function. Say, if I want the wrapped function to be called at most 10 times, the decorator should execute that function for the first 10 calls and then return None instead.
Here's what I've come up with:
from functools import wraps

def max_calls(num):
    """Decorator which allows its wrapped function to be called `num` times"""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            calls = getattr(wrapper, 'calls', 0)
            calls += 1
            if calls == num:
                return None
            setattr(wrapper, 'calls', calls)
            return func(*args, **kwargs)
        setattr(wrapper, 'calls', 0)
        return wrapper
    return decorator
However, this counts calls properly and returns None when the limit is reached, but it doesn't reset between program runs. That is, if I execute the program once and the counter reaches 5, then re-execute the program, it continues from 5. What do I need to change so that the decorator works properly?
The problem is that you maintain just one set of call counts. But that means each Flask request shares call counts with all the other requests. What you need to do is to maintain a separate set of call counts for each Flask request.
From reading the API documentation it looks as though the way to make this work is to carry out these three steps:
Make a subclass of flask.Request that can store your function call counts:
import collections
import flask

class MyRequest(flask.Request):
    """A subclass of Request that maintains function call counts."""
    def __init__(self, *args, **kwargs):
        super(MyRequest, self).__init__(*args, **kwargs)
        self.call_counts = collections.defaultdict(int)
Set request_class to your subclass when you initialize your application:
app = flask.Flask(__name__)
app.request_class = MyRequest
Rewrite your decorator to store its counts in the global flask.request object:
import functools

def max_calls(num, default=None):
    """Decorator which allows its wrapped function to be called at most `num`
    times per Flask request, but which then returns `default`.
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if flask.request.call_counts[func] == num:
                return default
            flask.request.call_counts[func] += 1
            return func(*args, **kwargs)
        return wrapper
    return decorator
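For illustration, here is a minimal sketch of how this could be used, assuming the app above; the route and helper names are hypothetical, not from your code:

@max_calls(3)
def expensive_lookup(i):
    return i * i

@app.route('/report')
def report():
    # Within one request, only the first three calls run; the rest return None.
    return str([expensive_lookup(i) for i in range(10)])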
But having written that, it would be remiss of me not to point out that your question seems very strange. Why do you want to restrict the number of times a function can be called during a Flask request? Are you trying to do some kind of rate limiting? It seems likely that whatever you want to do can be done better using some other approach.
I have a function that has a decorator @retry, which retries the function if a certain exception happens. I want to test that this function executes the correct number of times, for which I have the following code, which works:
@pytest.mark.asyncio
async def test_redis_failling(mocker):
    sleep_mock = mocker.patch.object(retry, '_sleep')
    with pytest.raises(ConnectionError):
        retry_store_redis()
    assert sleep_mock.call_count == 4

@retry(ConnectionError, initial_wait=2.0, attempts=5)
def retry_store_redis():
    raise ConnectionError()
But if I modify retry_store_redis() to be an async function, sleep_mock.call_count is 0.
So you define retry as a function. Then you define a test, then you define some code that uses @retry.
@retry, as a decorator, is called at import time. So the order of operations is:
1. declare retry
2. declare test
3. call retry with retry_store_redis as an argument
4. start your test
5. patch out retry
6. call the function you defined in step 3
so "retry" gets called once (at import time), your mock gets called zero times. To get the behavior you want, (ensuring that retry is actually re-calling the underlying function) I would do
from unittest.mock import MagicMock

@pytest.mark.asyncio
async def test_redis_failling(mocker):
    fake_function = MagicMock(side_effect=ConnectionError)
    decorated_function = retry(ConnectionError, initial_wait=2.0, attempts=5)(fake_function)
    with pytest.raises(ConnectionError):
        decorated_function()
    assert fake_function.call_count == 4
If you wanted to test this as built (instead of writing a test specifically for the decorator), you would have to mock out the original function inside the decorated function, which depends on how you implemented the decorator. With the default approach (no libraries), you would have to inspect the wrapper's __closure__ attribute. You can, however, build the wrapper object to retain a reference to the original function; here is an example:
def wrap(func):
    class Wrapper:
        def __init__(self, func):
            self.func = func
        def __call__(self, *args, **kwargs):
            return self.func(*args, **kwargs)
    return Wrapper(func)

@wrap
def wrapped_func():
    return 42
In this scenario, you could patch the wrapped function at wrapped_func.func.
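For example, a test along these lines (a sketch using unittest.mock; the test name is illustrative) could replace the stored original with a stub and check that the wrapper delegates to it:

from unittest import mock

def test_wrapped_func_delegates():
    # Temporarily replace the stored original on the wrapper instance.
    with mock.patch.object(wrapped_func, 'func', return_value=0) as stub:
        assert wrapped_func() == 0
        stub.assert_called_once_with()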
I'm trying to replace the marshal_with decorator from flask-restful with a decorator that does something before calling marshal_with. My approach is to try to implement a new decorator that wraps marshal_with.
My code looks like:
from functools import wraps
from flask.ext.restful import marshal_with as restful_marshal_with

def marshal_with(fields, envelope=None):
    def wrapper(f):
        print("Do something with fields and envelope")
        @wraps(f)
        def inner(*args, **kwargs):
            restful_marshal_with(f(*args, **kwargs))
        return inner
    return wrapper
Unfortunately this seems to break things: there are no error messages, but my API returns a null response when it shouldn't. Any insights on what I'm doing wrong?
I don't know the specifics of marshal_with, but it's entirely possible to use multiple decorators on a single function. For instance:
def decorator_one(func):
    def inner(*args, **kwargs):
        print("I'm decorator one")
        func(*args, **kwargs)
    return inner

def decorator_two(text):
    def wrapper(func):
        def inner(*args, **kwargs):
            print(text)
            func(*args, **kwargs)
        return inner
    return wrapper

@decorator_one
@decorator_two("I'm decorator two")
def some_function(a, b):
    print(a, b, a + b)

some_function(4, 7)
The output this gives is:
I'm decorator one
I'm decorator two
4 7 11
You can modify this little script by adding print statements after each inner function call to see the exact flow of control between the decorators as well.
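For instance, a sketch of the same decorators with an extra print after each delegated call makes the unwinding visible:

def decorator_one(func):
    def inner(*args, **kwargs):
        print("I'm decorator one")
        func(*args, **kwargs)
        print("leaving decorator one")
    return inner

def decorator_two(text):
    def wrapper(func):
        def inner(*args, **kwargs):
            print(text)
            func(*args, **kwargs)
            print("leaving decorator two")
        return inner
    return wrapper

With these versions, some_function(4, 7) prints the two entry messages and the sum as before, then "leaving decorator two" and finally "leaving decorator one", showing the wrappers unwinding in reverse order.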
I was doing a couple of things wrong here: first, failing to return the output of restful_marshal_with, as jonrsharpe pointed out; second, failing to understand a decorator written as a class instead of a function, and how to properly pass values to it. The correct code ended up being:
def marshal_with(fields, envelope=None):
    def wrapper(f):
        print("Do something with fields and envelope")
        @wraps(f)
        def inner(*args, **kwargs):
            rmw = restful_marshal_with(fields, envelope)
            return rmw(f)(*args, **kwargs)
        return inner
    return wrapper
As you can see, in addition to not returning rmw(), I needed to properly initialize the restful_marshal_with class before calling it. Finally, it is important to remember that decorators return functions, so the arguments of the original function should be passed to the return value of rmw(f), hence the statement return rmw(f)(*args, **kwargs). This is perhaps more apparent if you take a look at the flask_restful.marshal_with code here.
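For context, the rewritten decorator is then used just like the original flask-restful one, e.g. on a resource method; this is only a sketch using the modern flask_restful import path, and the field names are purely illustrative:

from flask_restful import Resource, fields

user_fields = {'name': fields.String, 'email': fields.String}

class UserResource(Resource):
    @marshal_with(user_fields)
    def get(self, user_id):
        # Keys not listed in user_fields are stripped by marshalling.
        return {'name': 'abc', 'email': 'abc@example.com', 'internal': 'hidden'}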
I'm trying to invoke a Python function from a Robot Framework keyword. The Python function has been decorated to be invoked via run_keyword from the BuiltIn library, because robot logs appear better structured if library functions are invoked via run_keyword rather than called directly. However, this results in an infinite loop. Is there a solution to gracefully accomplish the goal?
Robot keyword:

do something
    # creates a user by calling a function from a python based library
    create user
Python function:

@wrap_with_run_keyword
def create_user():
    pass

def wrap_with_run_keyword(func):
    def func_wrapper(*args, **kwargs):
        return run_keyword(func, *args, **kwargs)
    return func_wrapper
I couldn't solve the problem using partial application. However, I broke the recursive loop by setting and unsetting an attribute, as given below:
def wrap_with_run_keyword(func):
    def func_wrapper(*args, **kwargs):
        if not hasattr(func, 'second'):
            setattr(func, "second", True)
            return run_keyword(func, *args, **kwargs)
        else:
            delattr(func, "second")
            return func(*args, **kwargs)
    return func_wrapper
I have, however, run into another problem. I defined create_user as follows:
def create_user(properties):
    # some code
    pass
On calling this function in the way below:
create_user("name=abc")
I'm getting the following error: got an unexpected keyword argument 'name'.
I ran into the same issue but solved it. I am only wondering whether I can detect the caller, i.e. whether the call comes from Robot Framework or from Python; when the call is made by Robot Framework, it should only do the second call.
@wraps(function)
def wrapper(self, *args, **kwargs):
    if not hasattr(function, 'second'):
        setattr(function, 'second', True)
        ar = list(args)
        for key, value in kwargs.items():
            ar.append(value)
        return BuiltIn().run_keyword('Mylib.' + function.__name__, ar)
    else:
        delattr(function, 'second')
        return function(self, *args[0])
# (this fragment sits inside an enclosing decorator, which is not shown)
return wrapper
Take a look at the partial class from the functools module; I think this might help you. Or take a look at how decorators work in Python.
I'm writing a modding API for my game and I want to allow users to tell the game engine to run their mod's function immediately before or after one of the API functions, in other words to "extend" the functions.
Until now, modders had to rewrite a function to add functionality to it, which meant digging in the game's source code; I think that sucks for them.
Right now I'm thinking of something like this:
def moddersFunction():
    pass

myAPI.extendFunction('functionName', 'before-or-after', moddersFunction, extraArgs=[])
Now, what would be a comparably clean way to implement this?
In other words, what would the internals of myAPI.extendFunction look like if you were to write it?
I'm thinking of having a dictionary which myAPI.extendFunction will add functions to; the functions in my API would then check it and run the mod's function if one has been set ("registered").
But this means adding the code that checks the dictionary to every single function in my modding API that I want to allow to be extended (even if it's just a single function call which does the check and dispatch, it still seems like too much); see the sketch below.
I'm wondering if there's some neat Python trick which will allow such "extending" of functions without "butchering" (maybe I'm exaggerating) the existing source code of my game's API.
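To make the registry idea above concrete, here is a minimal sketch; every name in it (extendFunction, _run_hooks, spawn_enemy) is illustrative rather than part of a real API:

# Hypothetical registry: maps hook timing -> API function name -> list of mod hooks.
_extensions = {'before': {}, 'after': {}}

def extendFunction(functionName, beforeOrAfter, modFunction, extraArgs=None):
    _extensions[beforeOrAfter].setdefault(functionName, []).append((modFunction, extraArgs or []))

def _run_hooks(functionName, beforeOrAfter):
    for modFunction, extraArgs in _extensions[beforeOrAfter].get(functionName, []):
        modFunction(*extraArgs)

def spawn_enemy(kind):
    # Every extensible API function needs these two calls,
    # which is exactly the per-function boilerplate described above.
    _run_hooks('spawn_enemy', 'before')
    print("spawning", kind)
    _run_hooks('spawn_enemy', 'after')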
I would do something along the following lines:
class hookable(object):
    def __init__(self, fn):
        self.pre = []
        self.post = []
        self.fn = fn

    def add_pre(self, hook):
        self.pre.append(hook)

    def add_post(self, hook):
        self.post.append(hook)

    def __call__(self, *args, **kwargs):
        for hook in self.pre:
            hook(*args, **kwargs)
        ret = self.fn(*args, **kwargs)
        for hook in self.post:
            hook(*args, **kwargs)
        return ret
Any "extendable" function can now be decorated like so:
@hookable
def square(x):
    return x**2
Now you can use square.add_pre() and square.add_post() to register functions that would automatically get called before and after each call to square():
print(square(2))

def pre(x): print('pre', x)
def post(x): print('post', x)

square.add_pre(pre)
print(square(2))
print(square(3))

square.add_post(post)
print(square(4))
print(square(5))
I would probably do something like this. Obviously what you actually do is specific to your requirements.
class Extend(object):
    def __init__(self, original_function):
        self.original_function = original_function

    def __call__(self, function):
        def _wrapped(*args, **kwargs):
            yield function(*args, **kwargs)
            yield self.original_function(*args, **kwargs)
        return _wrapped

import time

@Extend(time.asctime)
def funky_time():
    return "This is the time in the UK"

print(list(funky_time()))
I am using the nose test generators feature to run the same test with different contexts. Since it requires the following boilerplate for each test:
class TestSample(TestBase):
    def test_sample(self):
        for context in contexts:
            yield self.check_sample, context

    def check_sample(self, context):
        """The real test logic is implemented here"""
        pass
I decided to write the following decorator:
from functools import wraps

def with_contexts(contexts=None):
    if contexts is None:
        contexts = ['twitter', 'linkedin', 'facebook']
    def decorator(f):
        @wraps(f)
        def wrapper(self, *args, **kwargs):
            for context in contexts:
                yield f, self, context  # The line which causes the error
        return wrapper
    return decorator
The decorator is used in the following manner:
class TestSample(TestBase):
    @with_contexts()
    def test_sample(self, context):
        """The real test logic is implemented here"""
        var1 = self.some_valid_attribute
When the tests are executed, an error is thrown saying that the attribute being accessed is not available. However, if I change the line which yields the method to the following, it works fine:
yield getattr(self, f.__name__), service
I understand that the above snippet creates a bound method, whereas in the first one self is passed manually to the function. However, as far as my understanding goes, the first snippet should work fine too. I would appreciate it if anyone could clarify the issue.
The title of the question is related to calling instance methods in decorators in general but I have kept the description specific to my context.
You can use functools.partial to tie the wrapped function to self, just like a method would be:
from functools import partial

def decorator(f):
    @wraps(f)
    def wrapper(self, *args, **kwargs):
        for context in contexts:
            yield partial(f, self), context
    return wrapper
Now you are yielding partials instead, which, when called as yieldedvalue(context), will call f(self, context).
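As a quick standalone illustration of what the partial is doing (the class and names here are made up for the example):

from functools import partial

class Greeter:
    def greet(self, name):
        return "hello " + name

g = Greeter()
bound = partial(Greeter.greet, g)   # behaves like the bound method g.greet
assert bound("world") == g.greet("world")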
As far as I can tell, some things don't fit together. First, your decorator goes like this:
def with_contexts(contexts=None):
    if contexts is None:
        contexts = ['twitter', 'linkedin', 'facebook']
    def decorator(f):
        @wraps(f)
        def wrapper(self, *args, **kwargs):
            for context in contexts:
                yield f, self, context  # The line which causes the error
        return wrapper
    return decorator
but you use it like
@with_contexts
def test_sample(self, context):
    """The real test logic is implemented here"""
    var1 = self.some_valid_attribute
This is wrong: it calls with_contexts(test_sample), but you need with_contexts()(test_sample). So do:
@with_contexts()
def test_sample(self, context):
    """The real test logic is implemented here"""
    var1 = self.some_valid_attribute
even if you don't provide the contexts argument.
Second, you decorate the wrong function: your usage shows that the test function yields the check function for each context. The function you want to wrap does the job of the check function, but you have to name it after the test function.
Applying self to a method can be done with partial as Martijn writes, but it can also be done the way Python does it under the hood: with
method.__get__(self, None)
or maybe better
method.__get__(self, type(self))
you can achieve the same. (Maybe your original version works as well, yielding the function to be called and the arguments to use; it was not clear to me that this is how it works.)
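As a small standalone illustration of that descriptor binding (the class here is made up for the example):

class Demo:
    def double(self, x):
        return x * 2

d = Demo()
bound = Demo.double.__get__(d, type(d))  # equivalent to d.double
assert bound(3) == d.double(3) == 6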