How to pass dynamic data to decorators - Python

I am trying to write a base CRUD controller class that does the
following:

    class BaseCrudController:
        model = ""
        field_validation = {}
        template_dir = ""

        @expose(self.template_dir)
        def new(self, *args, **kwargs):
            ...

        @validate(self.field_validation, error_handler=new)
        @expose()
        def post(self, *args, **kwargs):
            ...
My intent is to have my controllers extend this base class, set the
model, field_validation, and template locations, and be ready to go.
Unfortunately, decorators (to my understanding) are interpreted when
the function is defined, so they won't have access to the instance's
values. Is there a way to pass in dynamic data or values from the
subclass?
For example:

    class AddressController(BaseCrudController):
        model = Address
        template_dir = "addressbook.templates.addresses"

When I try to load AddressController, it says "self is not defined". I am assuming that the base class is evaluating the decorator before the subclass is initialized.
Thanks,
Steve

Perhaps using a factory to create the class would be better than subclassing:
    def CrudControllerFactory(model, field_validation, template_dir):
        class BaseCrudController:
            @expose(template_dir)
            def new(self, *args, **kwargs):
                ...

            @validate(field_validation, error_handler=new)
            @expose()
            def post(self, *args, **kwargs):
                ...
        return BaseCrudController
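With the factory in place, each concrete controller becomes a single call instead of a subclass. For instance (a usage sketch; Address and address_field_validation stand in for your own model class and validation rules):

    AddressController = CrudControllerFactory(
        Address,
        address_field_validation,
        "addressbook.templates.addresses")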

Unfortunately, decorators (to my understanding) are interpreted when the function is defined. Hence it won't have access to the instance's values. Is there a way to pass in dynamic data or values from the subclass?
The decorator needs to be called with the name of the relevant attribute; the wrapper can then get that attribute's value dynamically. For example:
    import functools

    def expose(attname=None):
        if attname:
            def makewrapper(f):
                @functools.wraps(f)
                def wrapper(self, *a, **k):
                    attvalue = getattr(self, attname, None)
                    ...  # use attvalue as needed
                return wrapper
            return makewrapper
        else:
            ...  # same, but without the getattr
Note that the complication is only there because, judging from the code snippets in your question, you want to allow the expose decorator to be used both with and without an argument. (You could move the if attname guard inside wrapper, but then you'd uselessly repeat the check at each call -- and the code within wrapper will probably need to be quite different in the two cases anyway -- so shoehorning two different control flows into one wrapper may end up even more complicated.) BTW, that is a dubious design decision, IMHO; but it's quite separate from your actual question about "dynamic data".
The point is, by using the attribute name as the argument, you empower your decorator to fetch the value dynamically "just in time" when it's needed. Think of it as "an extra level of indirection", that well-known panacea for all difficulties in programming!-)
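Here is a minimal, self-contained sketch of that indirection at work; expose here is a stand-in that just prints the resolved value, not your framework's real decorator:

    import functools

    def expose(attname=None):
        def makewrapper(f):
            @functools.wraps(f)
            def wrapper(self, *args, **kwargs):
                # Resolve the attribute on the *instance*, at call time.
                template = getattr(self, attname, None) if attname else None
                print("rendering with template:", template)
                return f(self, *args, **kwargs)
            return wrapper
        return makewrapper

    class BaseCrudController:
        template_dir = ""

        @expose("template_dir")  # pass the attribute name, not self.template_dir
        def new(self, *args, **kwargs):
            return "new record form"

    class AddressController(BaseCrudController):
        template_dir = "addressbook.templates.addresses"

    AddressController().new()  # prints the subclass's template_dir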

Related

How to inherit from a class instantiated with a builder?

I have a class Document; this class is really complex to instantiate, so I have a builder object to create instances. Both elements are not mine, so I can't change them.
Now, I want to create a subclass of Document, just to add some specific methods. In order to keep using the provided builder I tried this:
    class SpecialDocument(Document):
        def __new__(cls, *args):
            return DocumentBuilder(*args)

        def __init__(self, *args, **kwargs):
            # My initialization
            ...
The problem here is that the __init__ method never executes, because the __new__ method doesn't return a SpecialDocument (it returns a Document).
In my particular case I don't need to build my SpecialDocument differently from how I build a Document. Is there a way to use the same builder? If not, how can I achieve this? I just want to inherit from Document to add particular functionality. Maybe it could be achieved with metaclasses, but I have never used them (probably because I don't fully understand them); a little insight on them would be nice if it can help solve my problem.
You don't actually need a metaclass here - you just have to properly call the superclass's __new__ method. The way you are doing it, the instantiation of the superclass does not "know" it is being called from a subclass at all.
So, just write your code like this instead:
    class SpecialDocument(Document):
        def __new__(cls, *args):
            return super().__new__(cls, *args)

        def __init__(self, *args, **kwargs):
            # My initialization
            ...
Now, that is the ordinary way to do it - and it would work if the code in your "builder" function were correctly placed inside Document's __new__ or __init__.
Since the code there does not do that, and you can't pass your subclass as a parameter to the builder, a working solution might be to create a normal document and swap its class after it has been built:
    def special_document_init(special_document):
        ...

    class SpecialDocument(Document):
        def my_special_method(self, *args, **kwargs):
            ...

        def overridden_method(self):
            ...
            result = super().overridden_method()
            ...

    def build_special_document(*args):
        document = DocumentBuilder(*args)
        document.__class__ = SpecialDocument
        special_document_init(document)
        return document
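For completeness, here is a tiny self-contained demo of the class-swap trick; Plain and plain_builder are hypothetical stand-ins for the third-party Document and DocumentBuilder:

    class Plain:                       # stand-in for the third-party Document
        def __init__(self, text):
            self.text = text

    def plain_builder(text):           # stand-in for the third-party builder
        return Plain(text.strip())

    class SpecialPlain(Plain):
        def shout(self):
            return self.text.upper()

    def build_special_plain(text):
        obj = plain_builder(text)      # build a normal object...
        obj.__class__ = SpecialPlain   # ...then swap in the subclass
        return obj

    print(build_special_plain("  hello  ").shout())  # HELLO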

Python decorators that modify a bound method and its class' state

I'm trying to write a class-method decorator that modifies its class' state. I'm having trouble implementing it at the moment.
Side question: when does a decorator get called? Does it run when the class is instantiated, or at read time, when the class definition is read?
What I'm trying to do is this:
    import functools

    class ObjMeta(object):
        methods = []

        # This should be a decorator that magically updates the 'methods'
        # attribute (or list) of this class that's being read by the proxying
        # class below.
        def method_wrapper(method):
            @functools.wraps(method)
            def wrapper(*args, **kwargs):
                ObjMeta.methods.append(method.__name__)
                return method(*args, **kwargs)
            return wrapper

        # Our methods
        @method_wrapper
        def method1(self, *args):
            return args

        @method_wrapper
        def method2(self, *args):
            return args

    class Obj(object):
        klass = None

        def __init__(self, object_class=ObjMeta):
            self.klass = object_class
            self._set_methods(object_class)

        # We dynamically load the method proxies that call our meta class,
        # which actually contains the methods. It depends on the meta class'
        # methods attribute, which contains a list of names of its existing
        # methods. This is where I wanted it to be done automagically with
        # the help of decorators.
        def _set_methods(self, object_class):
            for method_name in object_class.methods:
                setattr(self, method_name, self._proxy_method(method_name))

        # Proxies the method that's being called to our meta class
        def _proxy_method(self, method_name):
            def wrapper(*fargs, **fkwargs):
                return getattr(self.klass(*fargs, **fkwargs), method_name)
            return wrapper()
I think it's ugly to write out a list of methods manually in the class, so perhaps a decorator could fix this.
It's for an open-source project I'm working on that ports underscore.js to Python. I understand that some will say I should just use itertools or something; I'm doing this for the love of programming and learning. BTW, the project is hosted here.
Thanks!
There are a few things wrong here.
Anything inside the inner wrapper is called when the method itself is called. Basically, you're replacing the method with that function, which wraps the original. So your code as it stands adds the method name to the list each time the method is called, which probably isn't what you want. Instead, the append should happen at the method_wrapper level, i.e. outside of the inner wrapper; that code runs when the method is defined, which happens the first time the module containing the class is imported.
The second thing wrong is that you never actually call the method - you simply return it. Instead of return method you should be returning the value of calling the method with the supplied args - return method(*args, **kwargs).
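Putting the first point into code, one hedged way to restructure the decorator (a sketch, not a drop-in for the project): the registry has to live at module level, because the class object does not exist yet while its own body is executing, so the decorator cannot append to ObjMeta.methods at definition time.

    import functools

    registered = []          # filled in while the class body is being executed

    def method_wrapper(method):
        registered.append(method.__name__)   # runs once, at definition time

        @functools.wraps(method)
        def wrapper(*args, **kwargs):
            return method(*args, **kwargs)
        return wrapper

    class ObjMeta(object):
        methods = registered  # the class attribute and the registry are the same list

        @method_wrapper
        def method1(self, *args):
            return args

        @method_wrapper
        def method2(self, *args):
            return args

    print(ObjMeta.methods)    # ['method1', 'method2'] -- no call needed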

Redefinition of a class method in Python

Context
I'm trying to add some "plugins" (I'm not sure this is the correct term) to my code. By "plugin", I mean a module which defines a model (this is a scientific code) in such a way that its mere existence is enough to use it anywhere else in the code.
Of course these plugins must follow a template which uses some modules/functions/classes defined in my code. Here is a small snippet of the relevant part of my code:
    # [In the code]
    class AllModels():
        def __init__(self):
            """
            Init.
            """
            self.count = 0

        def register(self, name, model):
            """
            Adds a model to the code
            """
            setattr(self, name, model)
            self.count += 1
            return

    class Model():
        def __init__(self, **kwargs):
            """
            Some constants that defines a model
            """
            self.a = kwargs.get("a", None)
            self.b = kwargs.get("b", None)
            # and so on...

        def function1(self, *args, **kwargs):
            """
            A function that all models will have, but which needs:
            - to have a default behavior (when the instance is created)
            - to be redefinable by the "plugin" (ie. the model)
            """
            # default code for the default behavior
            return

    instance = AllModels()
and here is the relevant part of the "plugin":
    # [in the plugin file]
    from code import Model, instance

    newmodel = Model(a="a name", b="some other stuff")

    def function1(*args, **kwargs):
        """
        Work to do by this model
        """
        # some specific model-dependent work
        return

    instance.register("newmodel", newmodel)
Additional information and requirements
- function1 has exactly the same signature for any model plugin, but usually does a different job for each.
- I'd like a default behavior for function1, so that if it is not defined by the plugin I'll still be able to do something (try different possibilities, and/or raise a warning/error).
- In the plugin, function1 may use some other functions that are only defined in that plugin. I'm stating this because the code runs with the multiprocessing module, and I need the instance instance of AllModels to be able to call function1 in child processes. instance is defined in the parent process, as are the model plugins, but it will be used in different child processes (no modification is done to it, though).
- It would be great if function1, when "redefined" by the plugin, could access the attributes of the Model instance (i.e. self).
Problem
I've read many different sources of Python documentation and some SO questions. I only see two or three possible solutions to this problem:
1) not declaring the function1 method in the Model class, but just setting it as an attribute when the plugin creates a new instance of it:
    # [in the plugin file]
    def function1(*args, **kwargs):
        # ....
        return

    newmodel.function1 = function1
and then call it whenever needed. In that case the function1 attribute of the Model object would probably be initialized to None. One caveat is that there is no "default behaviour" for function1 (it has to be dealt with in the code, e.g. by testing if instance.function1 is None: ...), and an even bigger one is that I can't access self this way...
2) somehow using Python decorators. I've never used them, and the documentation I've read is not that simple (I mean, not straightforward, due to the huge number of possibilities for their usage). But it seems to be a good solution. However, I'm worried about the performance impact (I've read that it could slow down the execution of the decorated function/method). If this solution is the best option, then I'd like to know how to use it (a quick snippet maybe), and whether it is possible to use attributes of the Model class:
    # [in the plugin file]
    @mydecorator
    def function1(self, *args, **kwargs):
        """
        I'm not sure I can use *self*, but it would be great since some attributes
        of self are used for some other function similar to *function1*...
        """
        # some stuff using *self*, eg.:
        x = self.var ** 2 + 3.4
        # where self.var has been defined before, eg.: newmodel.var = 100.
3) using the types module and its MethodType... I'm not sure that is relevant in my case, but I may be wrong.
As you can probably see from this long question, I'm not very familiar with such Python features, and my understanding of decorators is really poor for now. While I keep reading documentation, I thought it might be worth asking the question here, since I'm not sure of the direction to take to treat my problem.
Solution
The beauty of Senderle's answer is that it is really simple and obvious... and having missed it is a shame. Sorry for polluting SO with this question.
Well, unless I'm mistaken, you want to subclass Model. This is sort of like creating an instance of Model and replacing its function1 attribute with a function defined in the plugin module (i.e. your option 1); but it's much cleaner, and takes care of all the details for you:
    # [in the plugin file]
    from code import Model, instance

    class MyModel(Model):
        def function1(self, *args, **kwargs):
            """
            Work to do by this model
            """
            # some specific model-dependent work
            return

    newmodel = MyModel(a="a name", b="some other stuff")
    instance.register("newmodel", newmodel)
This way, all the other methods (functions "attached" to a Model instance) are inherited from Model; they will behave in just the same way, but function1 will be overridden, and will follow your customized function1 definition.
Could you write a dummy function1() in the Model class that raises a NotImplementedError? That way, if anyone inherits from Model without implementing function1(), they'll get an exception when they try to run the code. If you're running the code for them, you can catch that error and return a helpful error message to the user.
For example:
    class Model:
        # Your code

        def function1(self):
            raise NotImplementedError("You need to implement function1 "
                                      "when you inherit from Model")
Then, you can do the following when you run the code:

    try:
        modelObj.function1()
    except NotImplementedError as e:
        # Perform error handling here
        ...
EDIT: The official Python documentation for NotImplementedError states: "In user defined base classes, abstract methods should raise this exception when they require derived classes to override the method." That does seem to fit the requirements here.
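A closely related option is the abc module, which raises the error at instantiation time instead of call time; a minimal sketch:

    import abc

    class Model(abc.ABC):
        @abc.abstractmethod
        def function1(self, *args, **kwargs):
            """Each plugin's model must override this."""

    class GoodModel(Model):
        def function1(self, *args, **kwargs):
            return "model-specific work"

    GoodModel().function1()   # fine
    # Model() raises TypeError: Can't instantiate abstract class Model ...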
What you are trying to do can be done in pretty straightforward ways - just using object-oriented techniques and taking advantage of the fact that in Python functions are also ordinary objects.
One simple thing to do is just to have your Model class accept function1 as a parameter and store it as an instance member.
Some code like this, with minimal changes to your code - though much more interesting things are certainly possible:
    # [In the code]
    class AllModels():
        def __init__(self):
            """
            Init.
            """
            self.count = 0

        def register(self, name, **kwargs):
            """
            Adds a model to the code
            """
            model = Model(**kwargs)
            setattr(self, name, model)
            self.count += 1
            return

    class Model():
        def __init__(self, **kwargs):
            """
            Some constants that defines a model
            """
            self.a = kwargs.get("a", None)
            self.b = kwargs.get("b", None)
            # the plugin-supplied implementation, if any
            self.real_function1 = kwargs.get("function1", None)
            # and so on...

        def function1(self, *args, **kwargs):
            """
            A function that all models will have, but which needs:
            - to have a default behavior (when the instance is created)
            - to be redefinable by the "plugin" (ie. the model)
            """
            if self.real_function1:
                return self.real_function1(self, *args, **kwargs)
            # default code for the default behavior
            return

    instance = AllModels()
and
    # [in the plugin file]
    from code import Model, instance

    newmodel = Model(a="a name", b="some other stuff")

    def function1(self, *args, **kwargs):
        """
        Work to do by this model
        """
        # some specific model-dependent work
        return

    instance.register("name", function1=function1, a="a name", b="some other stuff")

Decorating a method

In my Python app, I'm using events to communicate between different plugins.
Now, instead of registering the methods to the events manually, I thought I might use decorators to do that for me.
I would like to have it look like this:
    @events.listento('event.name')
    def myClassMethod(self, event):
        ...
I first tried to do it like this:
    def listento(to):
        def listen_(func):
            myEventManager.listen(to, func)
            def wrapper(*args, **kwargs):
                return func(*args, **kwargs)
            return func
        return listen_
When I call myEventManger.listen('event', self.method) from within the instance, everything runs fine. However, if I use the decorator approach, the self argument is never passed.
The other approach that I have tried, after searching for a solution on the Internet, is to use a class as a decorator:
    class listen(object):
        def __init__(self, method):
            myEventManager.listen('frontend.route.register', self)
            self._method = method
            self._name = method.__name__
            self._self = None

        def __get__(self, instance, owner):
            self._self = instance
            return self

        def __call__(self, *args, **kwargs):
            return self._method(self._self, *args, **kwargs)
The problem with this approach is that I don't really understand the concept of __get__, and I don't know how I'd incorporate the parameters.
Just for testing I have tried using a fixed event to listen to, but with this approach nothing happens. When I add print statements, I can see that __init__ is called.
If I add an additional, "old style" event registration, both __get__ and __call__ get executed, and the event works, despite the new decorator.
What would be the best way to achieve what I'm looking for, or am I just missing some important concept with decorators?
The decorator approach isn't working because the decorator is being called when the class is constructed, not when the instance is constructed. When you say
    class Foo(object):
        @some_decorator
        def bar(self, *args, **kwargs):
            # etc etc
            ...
then some_decorator will be called when the class Foo is constructed, and it will be passed an unbound method, not the bound method of an instance. That's why self isn't getting passed.
The second method, on the other hand, could work as long as you only ever create one object of each class you use the decorator on, and if you're a bit clever. If you define listen as above and then define
    class Foo(object):
        def __init__(self, *args, **kwargs):
            self.some_method = self.some_method  # SEE BELOW FOR EXPLANATION
            # etc etc

        @listen
        def some_method(self, *args, **kwargs):
            # etc etc
            ...
Then listen.__get__ would be called when someone tried to access f.some_method directly for some f... but the whole point of your scheme is that no one does that! The event callback mechanism calls the listen instance directly, because that's what it was passed, and the listen instance calls the unbound method it squirrelled away when it was created. listen.__get__ won't ever get called, and the _self parameter never gets set properly... unless you explicitly access self.some_method yourself, as I did in the __init__ method above. Then listen.__get__ is called on instance creation and _self is set properly.
The problem is that (a) this is a horrible, horrible hack, and (b) if you try to create two instances of Foo, the second one will overwrite the _self set by the first, because there's still only one listen object, and it's associated with the class, not the instance. If you only ever use one Foo instance you're fine, but if you need the event to trigger two different Foos, you'll just have to use your "old style" event registration.
The TL;DR version: decorating a method decorates the unbound function on the class, whereas you want your event manager to be passed the bound method of an instance.
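A hedged sketch of one way around both problems (not taken from the answer above): have the decorator merely tag the function, and have each instance register its own bound methods in __init__, so self is carried along automatically and multiple instances don't clobber each other:

    def listento(event_name):
        def mark(func):
            func._listen_to = event_name        # just record the event name
            return func
        return mark

    class EventManager:                         # stand-in for myEventManager
        def __init__(self):
            self.handlers = {}
        def listen(self, name, callback):
            self.handlers.setdefault(name, []).append(callback)
        def fire(self, name, event):
            for callback in self.handlers.get(name, []):
                callback(event)

    my_event_manager = EventManager()

    class Plugin:
        def __init__(self):
            # Register *bound* methods, so self is passed automatically.
            for name in dir(self):
                attr = getattr(self, name)
                if callable(attr) and hasattr(attr, "_listen_to"):
                    my_event_manager.listen(attr._listen_to, attr)

        @listento("event.name")
        def on_event(self, event):
            print("got", event)

    Plugin()
    my_event_manager.fire("event.name", {"x": 1})   # prints: got {'x': 1}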
Part of your code is:
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return func
which defines wrapper and then completely ignores it, returning func instead. It's hard to say whether this is a real problem in your real code, because obviously you're not posting that (as proven by typos such as myEventManagre and myEvnetManager), but if that's what you're doing in your actual code, it is obviously part of your problem.

Use Python decorators on class methods and subclass methods

Goal: Make it possible to decorate class methods. When a class method gets decorated, it gets stored in a dictionary so that other class methods can reference it by a string name.
Motivation: I want to implement the equivalent of ASP.Net's WebMethods. I am building this on top of google app engine, but that does not affect the point of difficulty that I am having.
How it would look if it worked:
    class UsefulClass(WebmethodBaseClass):
        def someMethod(self, blah):
            print(blah)

        @webmethod
        def webby(self, blah):
            print(blah)

    # the implementation of this class could be completely different, it does not matter
    # the only important thing is having access to the web methods defined in sub classes
    class WebmethodBaseClass():
        def post(self, methodName):
            webmethods[methodName]("kapow")

    ...

    a = UsefulClass()
    a.post("someMethod")  # should error
    a.post("webby")  # prints "kapow"
There could be other ways to go about this. I am very open to suggestions
This is unnecessary. Just use getattr:
    class WebmethodBaseClass():
        def post(self, methodName):
            getattr(self, methodName)("kapow")
The only caveat is that you have to make sure that only methods intended for use as webmethods can be used thus. The simplest solution, IMO, is to adopt the convention that non-webmethods start with an underscore and have the post method refuse to service such names.
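A minimal sketch of that convention, with the underscore check as the only addition:

    class WebmethodBaseClass():
        def post(self, methodName):
            if methodName.startswith("_"):
                raise ValueError("not a webmethod: %s" % methodName)
            getattr(self, methodName)("kapow")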
If you really want to use decorators, try this:
    def webmethod(f):
        f.is_webmethod = True
        return f
and get post to check for the existence of the is_webmethod attribute before calling the method.
This would seem to be the simplest approach to meet your specs as stated:
    webmethods = {}

    def webmethod(f):
        webmethods[f.__name__] = f
        return f
and, in WebmethodBaseClass,
    def post(self, methodName):
        webmethods[methodName](self, "kapow")
I suspect you want something different (e.g., separate namespaces for different subclasses vs a single global webmethods dictionary...?), but it's hard to guess without more info exactly how your desires differ from your specs -- so maybe you can tell us how this simplistic approach fails to achieve some of your desiderata, so it can be enriched according to what you actually want.
    class UsefulClass(WebmethodBaseClass):
        def someMethod(self, blah):
            print(blah)

        @webmethod
        def webby(self, blah):
            print(blah)

    class WebmethodBaseClass():
        def post(self, methodName):
            method = getattr(self, methodName)
            if method.webmethod:
                method("kapow")

    ...

    def webmethod(f):
        f.webmethod = True
        return f

    a = UsefulClass()
    a.post("someMethod")  # should error
    a.post("webby")  # prints "kapow"
