In my Python app, I'm using events to communicate between different plugins.
Now, instead of registering the methods to the events manually, I thought I might use decorators to do that for me.
I would like to have it look like this:
@events.listento('event.name')
def myClassMethod(self, event):
    ...
I have first tried to do it like this:
def listento(to):
    def listen_(func):
        myEventManager.listen(to, func)
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs)
        return func
    return listen_
When I call myEventManager.listen('event', self.method) from within the instance, everything runs fine. However, if I use the decorator approach, the self argument is never passed.
The other approach that I have tried, after searching for a solution on the Internet, is to use a class as a decorator:
class listen(object):
    def __init__(self, method):
        myEventManager.listen('frontend.route.register', self)
        self._method = method
        self._name = method.__name__
        self._self = None

    def __get__(self, instance, owner):
        self._self = instance
        return self

    def __call__(self, *args, **kwargs):
        return self._method(self._self, *args, **kwargs)
The problem with this approach is that I don't really understand the concept of __get__, and that I don't know how I'd incorporate the parameters.
Just for testing, I have tried using a fixed event to listen to, but with this approach nothing happens. When I add print statements, I can see that __init__ is called.
If I add an additional, "old style" event registration, both __get__ and __call__ get executed, and the event works, despite the new decorator.
What would be the best way to achieve what I'm looking for, or am I just missing some important concept with decorators?
The decorator approach isn't working because the decorator is being called when the class is constructed, not when the instance is constructed. When you say
class Foo(object):
    @some_decorator
    def bar(self, *args, **kwargs):
        # etc etc
then some_decorator will be called when the class Foo is constructed, and it will be passed an unbound method, not the bound method of an instance. That's why self isn't getting passed.
The second method, on the other hand, could work as long as you only ever create one object of each class you use the decorator on, and if you're a bit clever. If you define listen as above and then define
class Foo(object):
    def __init__(self, *args, **kwargs):
        self.some_method = self.some_method # SEE BELOW FOR EXPLANATION
        # etc etc

    @listen
    def some_method(self, *args, **kwargs):
        # etc etc
Then listen.__get__ would be called when someone tried to call f.some_method directly for some f...but the whole point of your scheme is that no-one's doing that! The event call back mechanism is calling the listen instance directly 'cause that's what it gets passed and the listen instance is calling the unbound method it squirrelled away when it was created. listen.__get__ won't ever get called and the _self parameter is never getting set properly...unless you explicitly access self.some_method yourself, as I did in the __init__ method above. Then listen.__get__ will be called upon instance creation and _self will be set properly.
Problem is (a) this is a horrible, horrible hack and (b) if you try to create two instances of Foo then the second one will overwrite the _self set by the first, because there's still only one listen object being created, and that's associated to the class, not the instance. If you only ever use one Foo instance then you're fine, but if you have to have the event trigger two different Foo's then you'll just have to use your "old style" event registration.
The TL,DR version: decorating a method decorates the unbound method of the class, whereas you want your event manager to get passed the bound method of an instance.
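A common way around this is sketched below (this is an illustration, not the asker's real code: EventManager, listento, and Plugin are stand-in names): tag the function at class-definition time, then register the bound methods when the instance is created.

```python
class EventManager:
    def __init__(self):
        self.listeners = {}

    def listen(self, event, callback):
        self.listeners.setdefault(event, []).append(callback)

    def fire(self, event, payload):
        for callback in self.listeners.get(event, []):
            callback(payload)

manager = EventManager()

def listento(event):
    def mark(func):
        func._listens_to = event  # only tag the function; don't register the unbound version
        return func
    return mark

class Plugin:
    def __init__(self):
        # Registration happens here, where bound methods exist.
        for name in dir(self):
            attr = getattr(self, name)
            if callable(attr) and getattr(attr, '_listens_to', None):
                manager.listen(attr._listens_to, attr)

    @listento('event.name')
    def on_event(self, event):
        self.last = event

p = Plugin()
manager.fire('event.name', 'hello')
print(p.last)  # -> 'hello'
```

Because registration walks each new instance, this also works with multiple instances of the same class, which the descriptor hack below cannot do.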
Part of your code is:
def wrapper(*args, **kwargs):
    return func(*args, **kwargs)
return func
which defines wrapper and then completely ignores it, returning func instead. It's hard to say whether this is a real problem in your real code because obviously you're not posting that (as proven by typos such as myEventManagre, myEvnetManager, &c), but if that's what you're doing in your actual code, it is obviously part of your problem.
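For completeness, here is a sketch of what the wrapper version should look like if the wrapper is actually wanted: register and return the wrapper itself, so the registered callable and the decorated name are the same object (FakeEventManager is a stand-in; this still doesn't solve the self problem for methods).

```python
import functools

registered = []

class FakeEventManager:
    # Minimal stand-in for the asker's event manager.
    def listen(self, event, func):
        registered.append((event, func))

myEventManager = FakeEventManager()

def listento(to):
    def listen_(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs)
        myEventManager.listen(to, wrapper)  # register the wrapper...
        return wrapper                      # ...and return the same object
    return listen_

@listento('event.name')
def handler(event):
    return event.upper()

print(registered[0][1] is handler)  # -> True
print(handler('hi'))                # -> 'HI'
```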
Related
I'm trying to implement a decorator which accepts some arguments. Usually decorators with arguments are implemented as double-nested closures, like this:
def mydecorator(param1, param2):
    # do something with params
    def wrapper(fn):
        def actual_decorator(actual_func_arg1, actual_func_arg2):
            print("I'm decorated!")
            return fn(actual_func_arg1, actual_func_arg2)
        return actual_decorator
    return wrapper
But personally I don't like such an approach because it is very unreadable and difficult to understand.
So I ended up with this:
class jsonschema_validate(object):
    def __init__(self, schema):
        self._schema = schema

    def __call__(self, fn):
        self._fn = fn
        return self._decorator

    def _decorator(self, req, resp, *args, **kwargs):
        try:
            jsonschema.validate(req.media, self._schema, format_checker=jsonschema.FormatChecker())
        except jsonschema.ValidationError as e:
            _log.exception('Validation failed: %r', e)
            raise errors.HTTPBadRequest('Bad request')
        return self._fn(req, resp, *args, **kwargs)
The idea is very simple: at instantiation time we just capture the decorator args, and at call time we capture the decorated function and return the decorator instance's method, which is bound. It is important that it be bound, because at the decorator's invocation time we want to access self with all the information stored in it.
Then we use it on some class:
class MyResource(object):
    @jsonschema_validate(my_resource_schema)
    def on_post(self, req, resp):
        pass
Unfortunately, this approach doesn't work. The problem is that at decorator invocation time we lose the context of the decorated instance, because at decoration time (when the class is defined) the decorated method is not bound. Binding occurs later, at attribute access time. But at that moment we already have the decorator's bound method (jsonschema_validate._decorator), with self passed implicitly, and its value isn't the MyResource instance but the jsonschema_validate instance. And we don't want to lose that self value, because we want to access its attributes at decorator invocation time. In the end it results in a TypeError when calling self._fn(req, resp, *args, **kwargs), complaining that the "required positional argument 'resp' is missing", because the passed-in req arg becomes MyResource.on_post's self and all the arguments effectively shift.
So, is there a way to implement a decorator as a class rather than as a bunch of nested functions?
Note
As my first attempt at implementing the decorator as a simple class failed rather quickly, I immediately reverted to nested functions. It seems like a properly implemented class approach is even more unreadable and tangled, but I want to find a solution anyway, for the fun of the thing.
UPDATE
Finally found a solution, see my own answer.
This is fun! Thanks for posting this question.
Writing a simple decorator that doesn't take arguments is pretty easy, but extending that to a class that then gets called three times is a bit more challenging. I opted to use a functools.partial to solve this problem.
from functools import partial, update_wrapper
from unittest import TestCase, main

class SimpleDecorator(object):
    def __new__(cls, func, **params):
        self = super(SimpleDecorator, cls).__new__(cls)
        self.func = func
        self.params = params
        return update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        args, kwargs = self.before(*args, **kwargs)
        return self.after(self.func(*args, **kwargs))

    def after(self, value):
        return value

    def before(self, *args, **kwargs):
        return args, kwargs

class ParamsDecorator(SimpleDecorator):
    def __new__(cls, **params):
        return partial(super(ParamsDecorator, cls).__new__, cls, **params)

class DecoratorTestCase(TestCase):
    def test_simple_decorator(self):
        class TestSimpleDecorator(SimpleDecorator):
            def after(self, value):
                value *= 2
                return super().after(value)

        @TestSimpleDecorator
        def _test_simple_decorator(value):
            """Test simple decorator"""
            return value + 1

        self.assertEqual(_test_simple_decorator.__name__, '_test_simple_decorator')
        self.assertEqual(_test_simple_decorator.__doc__, 'Test simple decorator')
        self.assertEqual(_test_simple_decorator(1), 4)

    def test_params_decorator(self):
        class TestParamsDecorator(ParamsDecorator):
            def before(self, value, **kwargs):
                value *= self.params['factor']
                return super().before(value, **kwargs)

        @TestParamsDecorator(factor=3)
        def _test_params_decorator(value):
            """Test params decorator"""
            return value + 1

        self.assertEqual(_test_params_decorator.__name__, '_test_params_decorator')
        self.assertEqual(_test_params_decorator.__doc__, 'Test params decorator')
        self.assertEqual(_test_params_decorator(2), 7)
As you can see I've opted for a design with hooks for modifying the arguments and responses in methods. Hopefully, most of the time this would prevent needing to touch __call__ or __new__.
I couldn't think of a way to attach params to ParamsDecorator after returning the partial, so I had to opt for putting it into the SimpleDecorator but not using it.
I think that this does a good job of keeping the content flat instead of nested. I also like that this can take care of functools.wraps for you so you shouldn't need to worry about including that on these decorators. The downside to writing a decorator this way is you're now introducing a new module that you would need to install or maintain and then import every time you write a decorator.
Finally got it!
As I wrote, the problem is that a method can't have two selfs, so we need to capture both values in some way. Descriptors and closures to the rescue!
Here is the complete example:
class decorator_with_args(object):
    def __init__(self, arg):
        self._arg = arg

    def __call__(self, fn):
        self._fn = fn
        return self

    def __get__(self, instance, owner):
        if instance is None:
            return self

        def _decorator(self_, *args, **kwargs):
            print(f'decorated! arg: {self._arg}')
            return self._fn(self_, *args, **kwargs)

        return _decorator.__get__(instance, owner)
Let's break it down into pieces!
It starts exactly like my previous attempt. In __init__ we just capture the decorator arguments into private attribute(s).
Things get more interesting in the next part: the __call__ method.
def __call__(self, fn):
    self._fn = fn
    return self
As before, we capture the decorated method in the decorator's private attribute. But then, instead of returning an actual decorator function (def _decorator in the previous example), we return self. So the decorated method becomes an instance of the decorator. This is required to allow it to act as a descriptor. According to the docs:
a descriptor is an object attribute with "binding behavior"
Confusing, huh? Actually, it is easier than it looks. A descriptor is just an object with "magic" (dunder) methods, assigned to another object's attribute. When you try to access that attribute, those dunder methods are invoked with a specific calling convention. We'll return to "binding behavior" a little bit later.
Let's look at the details.
def __get__(self, instance, owner):
A descriptor must implement at least the __get__ dunder (and optionally __set__ & __delete__). This is called the "descriptor protocol" (similar to the "context manager protocol", "collection protocol" and so on).
if instance is None:
    return self
This is by convention. When a descriptor is accessed on the class rather than an instance, it should return itself.
The next part is the most interesting.
def _decorator(self_, *args, **kwargs):
    print(f'decorated! arg: {self._arg}')
    return self._fn(self_, *args, **kwargs)

return _decorator.__get__(instance, owner)
We need to capture the decorator's self as well as the decorated instance's self in some way. As we can't define a function with two selfs (even if we could, Python couldn't understand us), we enclose the decorator's self in a closure - an inner function. In this closure, we actually alter the behavior of the decorated method (print(f'decorated! arg: {self._arg}')) and then call the original one. Again, as there is already an argument named self, we need to choose another name for the instance's self - in this example I named it self_, but really it is self' - "self prime" (kind of a math joke).
return _decorator.__get__(instance, owner)
And finally: usually, when we define a closure, we just return it: def inner(): pass; return inner. But here we can't do that, because of "binding behavior". What we need is for the returned closure to be bound to the decorated instance in order for it to work properly. Let me explain with an example.
class Foo(object):
    def foo(self):
        print(self)
Foo.foo
# <function Foo.foo at 0x7f5b1f56dcb0>
Foo().foo
# <bound method Foo.foo of <__main__.Foo object at 0x7f5b1f586910>>
When you access a method on the class, it is just a plain Python function. What makes it a method is binding. Binding is the act of linking an object's methods with the instance, which is passed implicitly as the first argument. By convention it is called self, but roughly speaking this is not required. You can even store a method in another variable and call it, and it will still have a reference to the instance:
f = Foo()
f.foo()
# <__main__.Foo object at 0x7f5b1f5868d0>
other_foo = f.foo
other_foo()
# <__main__.Foo object at 0x7f5b1f5868d0>
So, we need to bind our returned closure to the decorated instance. How do we do that? Remember when we were looking at the method? It was something like this:
# <bound method Foo.foo of <__main__.Foo object at 0x7f5b1f586910>>
Let's look at its type:
type(f.foo)
# <class 'method'>
Wow! It's actually even a class! Let's create one!
method()
# Traceback (most recent call last):
# File "<stdin>", line 1, in <module>
# NameError: name 'method' is not defined
Unfortunately, we can't do that directly. But, there is types.MethodType:
types.MethodType
# <class 'method'>
Seems like we finally found what we wanted! But actually, we don't need to create the method manually! All we need to do is delegate to the standard mechanics of method creation. And this is how methods actually work in Python - they are just descriptors which bind themselves to an instance when accessed as an instance attribute!
To support method calls, functions include the __get__() method for binding methods during attribute access.
So, we just need to delegate the binding machinery to the function itself:
_decorator.__get__(instance, owner)
and we get a method with the right binding!
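Putting the pieces together, here is the descriptor-based decorator from above in action (Greeter is just an illustrative class):

```python
class decorator_with_args(object):
    def __init__(self, arg):
        self._arg = arg

    def __call__(self, fn):
        self._fn = fn
        return self

    def __get__(self, instance, owner):
        # Accessed on the class: return the descriptor itself, by convention.
        if instance is None:
            return self

        # Closure over the decorator's self; self_ is the instance's self.
        def _decorator(self_, *args, **kwargs):
            print(f'decorated! arg: {self._arg}')
            return self._fn(self_, *args, **kwargs)

        # Delegate binding to the function's own __get__.
        return _decorator.__get__(instance, owner)

class Greeter:
    def __init__(self, name):
        self.name = name

    @decorator_with_args('hello')
    def greet(self):
        return f'hi, {self.name}'

g = Greeter('world')
print(g.greet())  # prints 'decorated! arg: hello', then -> 'hi, world'
```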
I'd like to write a decorator that does somewhat different things when it gets a function or a method.
For example, I'd like to write a cache decorator, but I don't want to have self as part of the key if it's a method.
def cached(f):
    def _internal(*args, **kwargs):
        if ismethod(f):
            key = create_key(*args[1:], **kwargs) # ignore self from args
        else: # this is a regular function
            key = create_key(*args, **kwargs)
        return actual_cache_mechanism(key, f, *args, **kwargs)
    return _internal
class A:
    @cached
    def b(self, something):
        ...

@cached
def c(something):
    ...
The problem is that when @cached is called, it cannot distinguish between methods and functions, as both are of type function.
Can that even be done? As I think about it, I feel that methods actually have no idea about the context in which they are being defined...
Thanks!
This is kind of an ugly hack, but you can use obj.__qualname__ to see if obj was defined in a class, by checking whether it contains a period:
if "." in obj.__qualname__:
    # obj is defined inside a class, so it is a method
I'm not sure if it will work nicely for decorators though, since for this to work the method would need to be defined in the class.
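A runnable sketch of the __qualname__ idea, with the asker's create_key and actual_cache_mechanism replaced by a plain dict (those helpers weren't given). Note the caveat in the comment: nested functions also get dotted qualnames, so this is only a heuristic.

```python
def cached(f):
    # At decoration time the function isn't bound yet, but __qualname__
    # already records where it was defined: 'A.b' for a method, 'c' for a
    # module-level function. Caveat: nested functions get dotted qualnames
    # too ('outer.<locals>.inner'), so this really is a heuristic.
    is_method_like = '.' in f.__qualname__
    cache = {}

    def _internal(*args, **kwargs):
        key_args = args[1:] if is_method_like else args  # drop self for methods
        key = (key_args, tuple(sorted(kwargs.items())))
        if key not in cache:
            cache[key] = f(*args, **kwargs)
        return cache[key]
    return _internal

class A:
    @cached
    def b(self, something):
        return something * 2

@cached
def c(something):
    return something * 3

print(A().b(4))  # -> 8
print(c(4))      # -> 12
```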
I think it is desirable to avoid such an introspecting decorator, in the name of good Pythonic style.
You can always factor out the function to be cached to accept just the required arguments:
@cached
def func(something):
    return ...

class A:
    def b(self, something):
        self.bvalue = func(something)
For the case mentioned in the comments (an object is needed to get the result, but its value does not affect it, e.g. a socket), please refer to these questions: How to ignore a parameter in functools.lru_cache? and Make @lru_cache ignore some of the function arguments
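The factored-out function can also simply be wrapped with functools.lru_cache, since self never reaches it (a sketch of the same pattern as above):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def expensive(something):
    # self never enters the cache key, because it is never an argument here
    return something ** 2

class A:
    def b(self, something):
        self.bvalue = expensive(something)

a1, a2 = A(), A()
a1.b(3)
a2.b(3)  # second call is a cache hit, shared across instances
print(a1.bvalue, expensive.cache_info().hits)  # -> 9 1
```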
I have a whole bunch of Django TestCases that use the setUp method to initialize some properties that are used throughout numerous tests. The way they're constructed and depend on each other is logic I want to move out of the test cases and reuse:
def setUp(self):
    self.property_1 = # some logic
    ...
I wanted to rewrite these as some convenience wrapper that could be injected into the class with a simple inheritance or decorator, e.g.
@with_property_1(x=1, y=2)
def setUp(self):
    ...
def with_property_1(**model_kwargs):
    def wrapper(f):
        def wrapped(*args, **kwargs):
            self = args[0]
            self.property_1 = # logic
            f(*args, **kwargs)
        return wrapped
    return wrapper
but the trouble is that PyCharm doesn't recognize that those instance properties exist, because nothing inside the TestCase class proper ever sets them. Is there either another way I can achieve this nicely, or a way to cajole PyCharm into recognizing these properties are legitimate, given the existence of the decorator?
Disclaimer: this is untested.
The issue here is that Python cannot just magically put self into context in the decorator (that will not work in any IDE). What you may be forgetting is that self is one of the arguments passed into each class method when you call it. Therefore, it's present in your *args and you can manipulate it.
Here's my trial code:
def with_property_1(**model_kwargs):
    def wrapper(f):
        def wrapped(*args, **kwargs):
            for key, value in model_kwargs.items():
                setattr(args[0], key, value)
            f(*args, **kwargs)
        return wrapped
    return wrapper
Explanation:
Iterate over each key/value in your **model_kwargs.
Modify args[0], which should be the self variable, with the kwargs that were provided, using setattr.
Call the function as normal with your updated self variable.
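A quick sanity check of the trial code above, using a plain class as a stand-in for a Django TestCase:

```python
def with_property_1(**model_kwargs):
    def wrapper(f):
        def wrapped(*args, **kwargs):
            # args[0] is self, since the decorated function is a method
            for key, value in model_kwargs.items():
                setattr(args[0], key, value)
            f(*args, **kwargs)
        return wrapped
    return wrapper

class FakeTestCase:  # stand-in for django.test.TestCase
    @with_property_1(x=1, y=2)
    def setUp(self):
        self.ready = True

tc = FakeTestCase()
tc.setUp()
print(tc.x, tc.y, tc.ready)  # -> 1 2 True
```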
I'm trying to write a class method decorator that modifies its class's state. I'm having trouble implementing it at the moment.
Side question: when does a decorator get called? Does it run when the class is instantiated, or at definition time when the class body is read?
What I'm trying to do is this:
import functools

class ObjMeta(object):
    methods = []

    # This should be a decorator that magically updates the 'methods'
    # attribute (or list) of this class that's being read by the proxying
    # class below.
    def method_wrapper(method):
        @functools.wraps(method)
        def wrapper(*args, **kwargs):
            ObjMeta.methods.append(method.__name__)
            return method(*args, **kwargs)
        return wrapper

    # Our methods
    @method_wrapper
    def method1(self, *args):
        return args

    @method_wrapper
    def method2(self, *args):
        return args

class Obj(object):
    klass = None

    def __init__(self, object_class=ObjMeta):
        self.klass = object_class
        self._set_methods(object_class)

    # We dynamically load the method proxies that call out to our meta
    # class, which actually contains the methods. It depends on the meta
    # class' methods attribute, which contains a list of the names of its
    # existing methods. This is where I wanted it to be done automagically
    # with the help of decorators.
    def _set_methods(self, object_class):
        for method_name in object_class.methods:
            setattr(self, method_name, self._proxy_method(method_name))

    # Proxies the method that's being called to our meta class
    def _proxy_method(self, method_name):
        def wrapper(*fargs, **fkwargs):
            return getattr(self.klass(*fargs, **fkwargs), method_name)
        return wrapper()
I think it's ugly to write out a list of methods manually in the class, so perhaps a decorator could fix this.
It's for an open-source project I'm working on that ports underscore.js to Python. I understand it sounds like I should just use itertools or something; I'm doing this for the love of programming and learning. BTW, the project is hosted here
Thanks!
There are a few things wrong here.
Anything inside the inner wrapper is called when the method itself is called. Basically, you're replacing the method with that function, which wraps the original. So, your code as it stands would add the method name to the list each time it is called, which probably isn't what you want. Instead, that append should be at the method_wrapper level, ie outside of the inner wrapper. This is called when the method is defined, which happens the first time the module containing the class is imported.
The second thing wrong is that you never actually call the method - you simply return it. Instead of return method you should be returning the value of calling the method with the supplied args - return method(*args, **kwargs).
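A corrected sketch along the lines of this answer: the append happens once, at definition time, and the wrapper just delegates. A module-level list is used because the class name isn't bound yet while the class body is executing (names here are illustrative).

```python
registered_methods = []

def method_wrapper(method):
    # Runs once per method, when the class body is executed at import time.
    registered_methods.append(method.__name__)
    def wrapper(*args, **kwargs):
        # Actually call the method and return its value.
        return method(*args, **kwargs)
    return wrapper

class ObjMeta:
    methods = registered_methods

    @method_wrapper
    def method1(self, *args):
        return args

    @method_wrapper
    def method2(self, *args):
        return args

print(ObjMeta.methods)          # -> ['method1', 'method2']
print(ObjMeta().method1(1, 2))  # -> (1, 2)
```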
I am trying to write a base CRUD controller class that does the following:
class BaseCrudController:
    model = ""
    field_validation = {}
    template_dir = ""

    @expose(self.template_dir)
    def new(self, *args, **kwargs):
        ....

    @validate(self.field_validation, error_handler=new)
    @expose()
    def post(self, *args, **kwargs):
        ...
My intent is to have my controllers extend this base class, set the model, field_validation, and template locations, and be ready to go.
Unfortunately, decorators (to my understanding) are interpreted when the function is defined, hence they won't have access to the instance's values. Is there a way to pass in dynamic data or values from the subclass?
For example:
class AddressController(BaseCrudController):
    model = Address
    template_dir = "addressbook.templates.addresses"
When I try to load AddressController, it says "self is not defined". I am assuming that the base class is evaluating the decorator before the sub class is initialized.
Thanks,
Steve
Perhaps using a factory to create the class would be better than subclassing:
def CrudControllerFactory(model, field_validation, template_dir):
    class BaseCrudController:
        @expose(template_dir)
        def new(self, *args, **kwargs):
            ....

        @validate(field_validation, error_handler=new)
        @expose()
        def post(self, *args, **kwargs):
            ....
    return BaseCrudController
Unfortunately, decorators (to my understanding), are interpreted when the function is defined. Hence it won't have access to instance's value. Is there a way to pass in dynamic data or values from the sub class?
The template needs to be called with the name of the relevant attribute; the wrapper can then get that attribute's value dynamically. For example:
import functools

def expose(attname=None):
    if attname:
        def makewrapper(f):
            @functools.wraps(f)
            def wrapper(self, *a, **k):
                attvalue = getattr(self, attname, None)
                ...use attvalue as needed...
            return wrapper
        return makewrapper
    else:
        ...same but without the getattr...
Note that the complication is only there because, judging from the code snippets in your Q, you want to allow the expose decorator to be used both with and without an argument. (You could move the if attname guard to live within wrapper, but then you'd uselessly repeat the check at each call; the code within wrapper may also need to be pretty different in the two cases, I imagine, so shoehorning two different control flows into one wrapper may be even more complicated.) BTW, this is a dubious design decision, IMHO. But it's quite separate from your actual Q about "dynamic data".
The point is, by using the attribute name as the argument, you empower your decorator to fetch the value dynamically "just in time" when it's needed. Think of it as "an extra level of indirection", that well-known panacea for all difficulties in programming!-)
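Here is a runnable sketch of that indirection. The real template rendering is faked by returning the looked-up value, just to make the dynamic fetch visible; the class names mirror the question's, but the bodies are illustrative.

```python
import functools

def expose(attname=None):
    # The decorator stores only the *name* of the attribute; the wrapper
    # looks the value up on self at call time, so each subclass's own
    # template_dir is the one that gets used.
    def makewrapper(f):
        @functools.wraps(f)
        def wrapper(self, *a, **k):
            attvalue = getattr(self, attname, None) if attname else None
            # A real version would render a template from attvalue; here we
            # just return it alongside the handler's result.
            return f(self, *a, **k), attvalue
        return wrapper
    return makewrapper

class BaseCrudController:
    template_dir = ""

    @expose('template_dir')
    def new(self, *args, **kwargs):
        return 'new'

class AddressController(BaseCrudController):
    template_dir = "addressbook.templates.addresses"

print(AddressController().new())
# -> ('new', 'addressbook.templates.addresses')
```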