In Python there are two ways to declare decorators:
Class based
class mydecorator(object):
    def __init__(self, f):
        self.f = f

    def __call__(self, *k, **kw):
        # before f actions
        self.f(*k, **kw)
        # after f actions
Function based
def mydecorator(f):
    def decorator(*k, **kw):
        # before f actions
        f(*k, **kw)
        # after f actions
    return decorator
Is there any difference between these declarations?
In which cases each of them should be used?
If you want to keep state in the decorator you should use a class.
For example, this does not work
def mydecorator(f):
    x = 0
    def decorator():
        x += 1  # x is a nonlocal name and can't be rebound here
        return f(x)
    return decorator
There are many workarounds for this (a closure-based alternative is sketched after the class example below), but the simplest way is to use a class:
class mydecorator(object):
    def __init__(self, f):
        self.f = f
        self.x = 0

    def __call__(self, *k, **kw):
        self.x += 1
        return self.f(self.x)
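The closure-based alternative mentioned above is a minimal sketch, assuming Python 3, where the nonlocal statement is available:
def mydecorator(f):
    x = 0
    def decorator():
        nonlocal x  # rebind the enclosing x (Python 3 only)
        x += 1
        return f(x)
    return decorator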
When you're creating a callable returning another callable, the function approach is easier and cheaper. There are two main differences:
The function approach works automatically with methods, while with the class approach you'd have to read up on descriptors and define a __get__ method (a sketch of that follows below).
The class approach makes keeping state easier. You could use a closure, especially in Python 3, but a class approach is generally preferred for keeping state.
Additionally, the function approach allows you to return the original function, after modifying it or storing it.
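As a rough sketch of what the __get__ approach would involve (using functools.partial for the binding; other approaches work too, so treat this as one possible illustration rather than the canonical way):
import functools

class mydecorator(object):
    def __init__(self, f):
        self.f = f

    def __call__(self, *args, **kwargs):
        # before f actions
        return self.f(*args, **kwargs)
        # after f actions

    def __get__(self, instance, owner):
        # Accessed as a class attribute: bind like a normal method by
        # pre-filling the instance as the first argument.
        if instance is None:
            return self
        return functools.partial(self.__call__, instance)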
However, a decorator can return something other than a callable or something more than a callable. With a class, you can:
Add methods and properties to the decorated callable object, or implement operations on them (uh-oh).
Create descriptors that act in a special way when placed in classes (e.g. classmethod, property)
Use inheritance to implement similar but different decorators.
If you have any doubt, ask yourself: Do you want your decorator to return a function that acts exactly like a function should? Use a function returning a function. Do you want your decorator to return a custom object that does something more or something different to what a function does? Create a class and use it as a decorator.
In fact there are no 'two ways'. There is only one way (define a callable object) or as many ways as there are in python to make a callable object (it could be a method of other object, a result of lambda expression, a 'partial' object, anything that is callable).
Function definition is the easiest way to make a callable object and, as the simplest one, is probably the best in most cases. Using a class gives you more possibilities to cleanly code more complicated cases (even in the simplest cases it looks quite elegant), but it is not that obvious what it does.
No, there are (more than) two ways to make callable objects. One is to def a function, which is obviously callable. Another is to define a __call__ method in a class, which will make instances of it callable. And classes themselves are callable objects.
A decorator is nothing more than a callable object, which is intended to accept a function as its sole argument and return something callable. The following syntax:
@decorate
def some_function(...):
    ...
Is just a slightly nicer way of writing:
def some_function(...):
    ...
some_function = decorate(some_function)
The class-based example you give isn't a function that takes a function and returns a function, which is the bog-standard vanilla decorator; it's a class that is initialised with a function and whose instances are callable. To me, this is a little weird if you're not actually using it as a class (Does it have other methods? Does its state change? Do you make several instances of it that have common behaviour encapsulated by the class?). But normal use of your decorated function will not tell the difference (unless it's a particularly invasive decorator), so do whatever feels more natural to you.
Let's just test it!
test_class = """
class mydecorator_class(object):
def __init__(self, f):
self.f = f
def __call__(self, *k, **kw):
# before f actions
print 'hi class'
self.f(*k, **kw)
print 'goodbye class'
# after f actions
#mydecorator_class
def cls():
print 'class'
cls()
"""
test_deco = """
def mydecorator_func(f):
def decorator(*k, **kw):
# before f actions
print 'hi function'
f(*k, **kw)
print 'goodbye function'
# after f actions
return decorator
#mydecorator_func
def fun():
print 'func'
fun()
"""
if __name__ == "__main__":
import timeit
r = timeit.Timer(test_class).timeit(1000)
r2 = timeit.Timer(test_deco).timeit(1000)
print r, r2
I've got results like this: 0.0499339103699 0.0824959278107
Does this mean that the class-based decorator is about twice as fast?
Related
I was wondering if there was a way for a decorated function to refer to an object created by the wrapper of a decorator. My question arose when I was thinking of using a decorator to:
make a wrapper that creates a figure with subplots
inside the wrapper execute the decorated function which would add some plots
finally save the figure in the wrapper
However, the decorated function would need to refer the figure created by the wrapper. How can the decorated function refer to that object ? Do we necessarily have to resort to global variables ?
Here is a short example where I reference in the decorated function a variable created in the wrapper (but I did not manage to do this without tweaking with globals):
def my_decorator(func):
    def my_decorator_wrapper(*args, **kwargs):
        global x
        x = 0
        print("x in wrapper:", x)
        return func(*args, **kwargs)
    return my_decorator_wrapper

@my_decorator
def decorated_func():
    global x
    x += 1
    print("x in decorated_func:", x)

decorated_func()
# prints:
# x in wrapper: 0
# x in decorated_func: 1
I know this would be easily done in a class, but I am asking this question out of curiosity.
Try to avoid using global variables.
Use arguments to pass objects to functions
There is one canonical way to pass a value to a function: arguments.
Pass the object as argument to the decorated function when the wrapper is called.
from functools import wraps

def decorator(f):
    obj = 1
    @wraps(f)
    def wrapper(*args):
        return f(obj, *args)
    return wrapper

@decorator
def func(x):
    print(x)

func()  # prints 1
Use a default argument for passing the same object
If you need to pass the same object to all functions, storing it as default argument of your decorator is an alternative.
from functools import wraps

def decorator(f, obj={}):
    @wraps(f)
    def wrapper(*args):
        return f(obj, *args)
    return wrapper

@decorator
def func(params):
    params['foo'] = True

@decorator
def gunc(params):
    print(params)

func()
# proof that gunc receives the same object
gunc()  # prints {'foo': True}
The above creates a common private dict which can only be accessed by decorated functions. Since a dict is mutable, changes will be reflected across function calls.
Yes, the function can refer to it by looking at itself.
On the decorator's end, it just takes attributes and sets them on the function.
If it looks a bit complicated, that's because decorators that take parameters need this particular structure to work; see Decorators with parameters?
def declare_view(**kwds):
    """declaratively associate a Django view function with resources
    """
    def actual_decorator(func):
        for k, v in kwds.items():
            setattr(func, k, v)
        return func
    return actual_decorator
calling the decorator
@declare_view(
    x=2
)
def decorated_func():
    # the function can look at its own name, because the function exists
    # by the time it gets called.
    print("x in decorated_func:", decorated_func.x)

decorated_func()
output
x in decorated_func: 2
In practice I've used this quite a bit. The idea for me is to associate Django view functions with the particular backend data classes and templates they have to collaborate with. Because it is declarative, I can introspect through all the Django views and track their associated URLs as well as custom data objects and templates. It works very well, but yes, the function does expect certain attributes to exist on itself; it doesn't know that a decorator set them.
Oh, and there's no good reason, in my use cases, for these to be passed as parameters: these variables hold basically hardcoded values which never change, from the POV of the function.
Odd at first, but quite powerful and no runtime or maintenance drawbacks.
Here's some live example that puts that in context.
@declare_view(
    viewmanager_cls=backend.VueManagerDetailPSCLASSDEFN,
    template_name="pssecurity/detail.html",
    objecttype=constants.OBJECTTYPE_PERMISSION_LIST[0],
    bundle_name="pssecurity/detail.psclassdefn",
)
def psclassdefn_detail(request, CLASSID, dbr=None, PORTAL_NAME="EMPLOYEE"):
    """
    """
    f_view = psclassdefn_detail
    viewmanager = f_view.viewmanager_cls(request, mdb, f_view=f_view)
    ...do things based on the parameters...
    return viewmanager.HttpResponse(f_view.template_name)
Classes as Decorators
This article points to classes as decorators, which seems a more elegant way to point to the state defined in the decorator. It relies on function attributes and uses the special .__call__() method in the decorating class.
Here is my example revisited using a class instead of a function as a decorator:
class my_class_decorator:
    def __init__(self, func):
        self.func = func
        self.x = 0

    def __call__(self, *args, **kwargs):
        print("x in wrapper:", self.x)
        return self.func(*args, **kwargs)

@my_class_decorator
def decorated_func():
    decorated_func.x += 1
    print("x in decorated_func:", decorated_func.x)

decorated_func()
# prints:
# x in wrapper: 0
# x in decorated_func: 1
I've only seen examples for setting the __repr__ method in class definitions. Is it possible to change the __repr__ for functions either in their definitions or after defining them?
I've attempted without success...
>>> def f():
...     pass
>>> f
<function f at 0x1026730c8>
>>> f.__repr__ = lambda: '<New repr>'
>>> f
<function __main__.f>
Yes, if you're willing to forgo the function actually being a function.
First, define a class for our new type:
import functools

class reprwrapper(object):
    def __init__(self, repr, func):
        self._repr = repr
        self._func = func
        functools.update_wrapper(self, func)
    def __call__(self, *args, **kw):
        return self._func(*args, **kw)
    def __repr__(self):
        return self._repr(self._func)
Add in a decorator function:
def withrepr(reprfun):
    def _wrap(func):
        return reprwrapper(reprfun, func)
    return _wrap
And now we can define the repr along with the function:
@withrepr(lambda x: "<Func: %s>" % x.__name__)
def mul42(y):
    return y*42
Now repr(mul42) produces '<Func: mul42>'
No, because repr(f) is done as type(f).__repr__(f) instead.
In order to do that, you'd need to change the __repr__ function for the given class, which in this case is the built-in function class (types.FunctionType). Since in Python you cannot edit built-in classes, only subclass them, you cannot.
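A quick way to see this for yourself (a minimal sketch; the exact error message varies between CPython versions):
import types

try:
    types.FunctionType.__repr__ = lambda self: '<New repr>'
except TypeError as exc:
    print(exc)  # built-in/extension types reject attribute assignment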
However, there are two approaches you could follow:
Wrap some functions as kwatford suggested
Create your own representation protocol with your own repr function. For example, you could define a myrepr function that looks for __myrepr__ methods first, which you cannot add to the function class but you can add it to individual function objects as you suggest (as well as your custom classes and objects), then defaults to repr if __myrepr__ is not found. A possible implementation for this would be:
def myrepr(x):
    try:
        x.__myrepr__
    except AttributeError:
        return repr(x)
    else:
        return x.__myrepr__()
Then you could define __myrepr__ methods and use the myrepr function. Alternatively, you could also do __builtins__.repr = myrepr to make your function the default repr and keep using repr. This approach would end up doing exactly what you want, though editing __builtins__ may not always be desirable.
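Continuing the sketch above, per-object __myrepr__ attributes work because the attribute lives on the function object itself rather than on its type (the names here are purely illustrative):
def g(a, b):
    return a + b

g.__myrepr__ = lambda: '<Func: g>'

print(myrepr(g))   # '<Func: g>'
print(myrepr(42))  # falls back to repr(): '42'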
This appears to be difficult. Kwatford's approach only solves this problem partially, since it does not work for functions in classes, because self would be treated like a positional argument, as explained in Decorating Python class methods - how do I pass the instance to the decorator? - However, the solution for that question is not applicable to this case, unfortunately, as using __get__() and functools.partial would override the custom __repr__().
Is it possible to inspect a function/method to see whether it can be used as a decorator? In that it follows the usual way decorators wrap other functions and return a callable? Specifically, I'm looking to validate 3rd party code.
By applying a suspected decorator, catching exceptions, and then testing whether the result contains a __call__ method, you could produce a guess as to whether a given callable is a decorator or not. But it will be only a guess, not a guarantee.
Beyond that, I do not believe what you want will be possible in general, due to the dynamically typed nature of the Python language and to the special treatment of built-in functions in the CPython interpreter. It is not possible to programmatically tell whether a callable will accept another callable as an argument, or what type its return value will have. Also, in CPython, for functions implemented in C, you cannot even inspect a callable to see how many arguments it accepts.
The word "decorator" can be taken to mean different things. One way to define it is, a decorator is any callable that accepts a single (callable) argument and returns a callable.
Note that I have not even used the word "function" in this definition; it would actually be incorrect to do so. Indeed, some commonly used decorators have strange properties:
The built-in classmethod and staticmethod decorators return descriptor objects, not functions.
Since language version 2.6 you can decorate classes, not just functions and methods.
Any class containing an __init__(self, somecallable) method and a __call__(self, *args, **kwargs) method can be used as a decorator.
Since there is no standardized decorator in Python, there's no real way of telling if a function is a decorator unless you know something about the decorator you're looking for.
If the decorator is under your control, you can add a mark to indicate it's a decorated function. Otherwise there is no real unified way of doing this. Take this example for instance:
def decorator(func):
    return g

@decorator
def f():
    pass

def g():
    pass
In the above example, at run time f and g will be identical, and there is no way of telling the two apart.
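If the decorator is under your control, the "mark" mentioned above can be as simple as an attribute set on whatever the decorator returns (a sketch; the attribute name is made up for illustration):
import functools

def decorator(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    wrapper.is_decorated = True  # the mark
    return wrapper

@decorator
def f():
    pass

def g():
    pass

print(getattr(f, 'is_decorated', False))  # True
print(getattr(g, 'is_decorated', False))  # False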
Any callable with the right number of arguments can be used as a decorator. Remember that
@foo
def bar(...):
    ...
is exactly the same as
def bar(...):
    ...
bar = foo(bar)
Naturally, since foo could return anything, you have no way of checking whether a function has been decorated or not. Although foo could be nice and leave a mark, it has no obligation to do so.
If you are given some Python code and you want to find all the things that are decorators, you can do so by parsing the code into an abstract syntax tree and then walking the tree looking for decorated functions. Here's an example, storing the .ids of the decorators. Obviously, you could store the ast objects instead if you wanted to.
>>> import ast
>>> class DecoratorFinder(ast.NodeVisitor):
...     def __init__(self, *args, **kwargs):
...         super(DecoratorFinder, self).__init__(*args, **kwargs)
...         self.decorators = set()
...
...     def visit_FunctionDef(self, node):
...         self.decorators.update(dec.id for dec in node.decorator_list)
...         self.generic_visit(node)
...
>>> finder = DecoratorFinder()
>>> x = ast.parse("""
... @dec
... def foo():
...     pass
... """)
>>> finder.visit(x)
>>> finder.decorators
set(['dec'])
No, this is not possible. Maybe instead of checking whether f is a decorator, you should think about why you need to check that.
If you are expecting some specific decorator, you can directly check for that; if you want some specific behavior/methods/attributes, you can check for those.
If you want to check whether some callable f can be used as a decorator, you can test the decorator behavior by passing in some dummy function, but in general that may not work, or may behave differently for different inputs.
Here is such a naive check:
def decorator1(func):
    def _wrapper(*args, **kwargs):
        print "before"
        func(*args, **kwargs)
        print "after"
    return _wrapper

def dummy_func(): pass

out_func = decorator1(dummy_func)
if callable(out_func) and dummy_func != out_func:
    print "aha decorated!"
I've never done anything like this, but in general python relies on "duck-typing" in situations like this. So you could just try to decorate a dummy function and see if a callable is returned.
How to get all methods of a given class A that are decorated with the @decorator2?
class A():
    def method_a(self):
        pass

    @decorator1
    def method_b(self, b):
        pass

    @decorator2
    def method_c(self, t=5):
        pass
Method 1: Basic registering decorator
I already answered this question here: Calling functions by array index in Python =)
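In short, the idea from that answer is a decorator that records every function it is applied to in a registry; a minimal sketch (the names registry and register are illustrative):
registry = []

def register(func):
    registry.append(func)
    return func

class A(object):
    def method_a(self):
        pass

    @register
    def method_c(self, t=5):
        pass

print([f.__name__ for f in registry])  # ['method_c']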
Method 2: Sourcecode parsing
If you do not have control over the class definition, which is one interpretation of what you'd like to suppose, this is impossible (without code-reading-reflection), since for example the decorator could be a no-op decorator (like in my linked example) that merely returns the function unmodified. (Nevertheless if you allow yourself to wrap/redefine the decorators, see Method 3: Converting decorators to be "self-aware", then you will find an elegant solution)
It is a terrible terrible hack, but you could use the inspect module to read the sourcecode itself, and parse it. This will not work in an interactive interpreter, because the inspect module will refuse to give sourcecode in interactive mode. However, below is a proof of concept.
#!/usr/bin/python3
import inspect

def deco(func):
    return func

def deco2():
    def wrapper(func):
        pass
    return wrapper

class Test(object):
    @deco
    def method(self):
        pass

    @deco2()
    def method2(self):
        pass

def methodsWithDecorator(cls, decoratorName):
    sourcelines = inspect.getsourcelines(cls)[0]
    for i, line in enumerate(sourcelines):
        line = line.strip()
        if line.split('(')[0].strip() == '@' + decoratorName:  # leaving a bit out
            nextLine = sourcelines[i + 1]
            name = nextLine.split('def')[1].split('(')[0].strip()
            yield(name)
It works!:
>>> print(list( methodsWithDecorator(Test, 'deco') ))
['method']
Note that one has to pay attention to parsing and the Python syntax, e.g. @deco and @deco(... are valid results, but @deco2 should not be returned if we merely ask for 'deco'. We notice that according to the official Python syntax at http://docs.python.org/reference/compound_stmts.html decorators are as follows:
decorator ::= "@" dotted_name ["(" [argument_list [","]] ")"] NEWLINE
We breathe a sigh of relief at not having to deal with cases like @(deco). But note that this still doesn't really help you if you have really really complicated decorators, such as @getDecorator(...), e.g.
def getDecorator():
    return deco
Thus, this best-that-you-can-do strategy of parsing code cannot detect cases like this. Though if you are using this method, what you're really after is what is written on top of the method in the definition, which in this case is getDecorator.
According to the spec, it is also valid to have @foo1.bar2.baz3(...) as a decorator. You can extend this method to work with that. You might also be able to extend this method to return a <function object ...> rather than the function's name, with lots of effort. This method however is hackish and terrible.
Method 3: Converting decorators to be "self-aware"
If you do not have control over the decorator definition (which is another interpretation of what you'd like), then all these issues go away because you have control over how the decorator is applied. Thus, you can modify the decorator by wrapping it, to create your own decorator, and use that to decorate your functions. Let me say that yet again: you can make a decorator that decorates the decorator you have no control over, "enlightening" it, which in our case makes it do what it was doing before but also append a .decorator metadata property to the callable it returns, allowing you to keep track of "was this function decorated or not? let's check function.decorator!". And then you can iterate over the methods of the class, and just check to see if the decorator has the appropriate .decorator property! =) As demonstrated here:
def makeRegisteringDecorator(foreignDecorator):
    """
    Returns a copy of foreignDecorator, which is identical in every
    way(*), except also appends a .decorator property to the callable it
    spits out.
    """
    def newDecorator(func):
        # Call to newDecorator(method)
        # Exactly like old decorator, but output keeps track of what decorated it
        R = foreignDecorator(func)  # apply foreignDecorator, like call to foreignDecorator(method) would have done
        R.decorator = newDecorator  # keep track of decorator
        # R.original = func         # might as well keep track of everything!
        return R
    newDecorator.__name__ = foreignDecorator.__name__
    newDecorator.__doc__ = foreignDecorator.__doc__
    # (*)We can be somewhat "hygienic", but newDecorator still isn't signature-preserving,
    # i.e. you will not be able to get a runtime list of parameters. For that, you need
    # hackish libraries... but in this case, the only argument is func, so it's not a big issue
    return newDecorator
Demonstration for @decorator:
deco = makeRegisteringDecorator(deco)

class Test2(object):
    @deco
    def method(self):
        pass

    @deco2()
    def method2(self):
        pass
def methodsWithDecorator(cls, decorator):
    """
    Returns all methods in CLS with DECORATOR as the
    outermost decorator.

    DECORATOR must be a "registering decorator"; one
    can make any decorator "registering" via the
    makeRegisteringDecorator function.
    """
    for maybeDecorated in cls.__dict__.values():
        if hasattr(maybeDecorated, 'decorator'):
            if maybeDecorated.decorator == decorator:
                print(maybeDecorated)
                yield maybeDecorated
It works!:
>>> print(list( methodsWithDecorator(Test2, deco) ))
[<function method at 0x7d62f8>]
However, a "registered decorator" must be the outermost decorator, otherwise the .decorator attribute annotation will be lost. For example in a train of
@decoOutermost
@deco
@decoInnermost
def func(): ...
you can only see metadata that decoOutermost exposes, unless we keep references to "more-inner" wrappers.
sidenote: the above method can also build up a .decorator that keeps track of the entire stack of applied decorators and input functions and decorator-factory arguments. =) For example if you consider the commented-out line R.original = func, it is feasible to use a method like this to keep track of all wrapper layers. This is personally what I'd do if I wrote a decorator library, because it allows for deep introspection.
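For example, if the commented-out R.original = func line above is enabled, a small helper (the name unwrapChain is hypothetical) can walk back through every registered layer:
def unwrapChain(func):
    """Collect func plus every wrapped layer reachable via .original."""
    chain = [func]
    while hasattr(chain[-1], 'original'):
        chain.append(chain[-1].original)
    return chain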
There is also a difference between @foo and @bar(...). While they are both "decorator expressions" as defined in the spec, note that foo is a decorator, while bar(...) returns a dynamically-created decorator, which is then applied. Thus you'd need a separate function makeRegisteringDecoratorFactory, that is somewhat like makeRegisteringDecorator but even MORE META:
def makeRegisteringDecoratorFactory(foreignDecoratorFactory):
    def newDecoratorFactory(*args, **kw):
        oldGeneratedDecorator = foreignDecoratorFactory(*args, **kw)
        def newGeneratedDecorator(func):
            modifiedFunc = oldGeneratedDecorator(func)
            modifiedFunc.decorator = newDecoratorFactory  # keep track of decorator
            return modifiedFunc
        return newGeneratedDecorator
    newDecoratorFactory.__name__ = foreignDecoratorFactory.__name__
    newDecoratorFactory.__doc__ = foreignDecoratorFactory.__doc__
    return newDecoratorFactory
Demonstration for @decorator(...):
def deco2():
    def simpleDeco(func):
        return func
    return simpleDeco

deco2 = makeRegisteringDecoratorFactory(deco2)

print(deco2.__name__)
# RESULT: 'deco2'

@deco2()
def f():
    pass
This generator-factory wrapper also works:
>>> print(f.decorator)
<function deco2 at 0x6a6408>
Bonus: let's even try the following with Method #3:
def getDecorator():  # let's do some dispatching!
    return deco

class Test3(object):
    @getDecorator()
    def method(self):
        pass

    @deco2()
    def method2(self):
        pass
Result:
>>> print(list( methodsWithDecorator(Test3, deco) ))
[<function method at 0x7d62f8>]
As you can see, unlike method2, @deco is correctly recognized even though it was never explicitly written in the class. Unlike method2, this will also work if the method is added at runtime (manually, via a metaclass, etc.) or inherited.
Be aware that you can also decorate a class, so if you "enlighten" a decorator that is used to both decorate methods and classes, and then write a class within the body of the class you want to analyze, then methodsWithDecorator will return decorated classes as well as decorated methods. One could consider this a feature, but you can easily write logic to ignore those by examining the argument to the decorator, i.e. .original, to achieve the desired semantics.
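One way to do that filtering, assuming the registering decorator also stores .original as suggested above (a sketch; the function name is made up for illustration):
import inspect

def methodsWithDecoratorSkippingClasses(cls, decorator):
    for maybeDecorated in cls.__dict__.values():
        if getattr(maybeDecorated, 'decorator', None) == decorator:
            # .original is the callable that was handed to the decorator;
            # skip it if it was a class rather than a function/method
            original = getattr(maybeDecorated, 'original', maybeDecorated)
            if not inspect.isclass(original):
                yield maybeDecorated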
To expand upon @ninjagecko's excellent answer in Method 2: Source code parsing, you can use the ast module introduced in Python 2.6 to perform self-inspection, as long as the inspect module has access to the source code.
def findDecorators(target):
    import ast, inspect
    res = {}
    def visit_FunctionDef(node):
        res[node.name] = [ast.dump(e) for e in node.decorator_list]
    V = ast.NodeVisitor()
    V.visit_FunctionDef = visit_FunctionDef
    V.visit(compile(inspect.getsource(target), '?', 'exec', ast.PyCF_ONLY_AST))
    return res
I added a slightly more complicated decorated method:
@x.y.decorator2
def method_d(self, t=5): pass
Results:
> findDecorators(A)
{'method_a': [],
'method_b': ["Name(id='decorator1', ctx=Load())"],
'method_c': ["Name(id='decorator2', ctx=Load())"],
'method_d': ["Attribute(value=Attribute(value=Name(id='x', ctx=Load()), attr='y', ctx=Load()), attr='decorator2', ctx=Load())"]}
If you do have control over the decorators, you can use decorator classes rather than functions:
class awesome(object):
    def __init__(self, method):
        self._method = method

    def __call__(self, obj, *args, **kwargs):
        return self._method(obj, *args, **kwargs)

    @classmethod
    def methods(cls, subject):
        def g():
            for name in dir(subject):
                method = getattr(subject, name)
                if isinstance(method, awesome):
                    yield name, method
        return {name: method for name, method in g()}

class Robot(object):
    @awesome
    def think(self):
        return 0

    @awesome
    def walk(self):
        return 0

    def irritate(self, other):
        return 0
and if I call awesome.methods(Robot) it returns
{'think': <mymodule.awesome object at 0x000000000782EAC8>, 'walk': <mymodule.awesome object at 0x000000000782EB00>}
For those of us who just want the absolute simplest possible case - namely, a single-file solution where we have total control over both the class we're working with and the decorator we're trying to track, I've got an answer. ninjagecko linked to a solution for when you have control over the decorator you want to track, but I personally found it to be complicated and really hard to understand, possibly because I've never worked with decorators until now. So, I've created the following example, with the goal of being as straightforward and simple as possible. It's a decorator, a class with several decorated methods, and code to retrieve+run all methods that have a specific decorator applied to them.
# our decorator
def cool(func, *args, **kwargs):
    def decorated_func(*args, **kwargs):
        print("cool pre-function decorator tasks here.")
        return_value = func(*args, **kwargs)
        print("cool post-function decorator tasks here.")
        return return_value
    # add is_cool property to function so that we can check for its existence later
    decorated_func.is_cool = True
    return decorated_func

# our class, in which we will use the decorator
class MyClass:
    def __init__(self, name):
        self.name = name

    # this method isn't decorated with the cool decorator, so it won't show up
    # when we retrieve all the cool methods
    def do_something_boring(self, task):
        print(f"{self.name} does {task}")

    @cool
    # thanks to *args and **kwargs, the decorator properly passes method parameters
    def say_catchphrase(self, *args, catchphrase="I'm so cool you could cook an egg on me.", **kwargs):
        print(f"{self.name} says \"{catchphrase}\"")

    @cool
    # the decorator also properly handles methods with return values
    def explode(self, *args, **kwargs):
        print(f"{self.name} explodes.")
        return 4

    def get_all_cool_methods(self):
        """Get all methods decorated with the "cool" decorator.
        """
        cool_methods = {name: getattr(self, name)
                        # get all attributes, including methods, properties, and builtins
                        for name in dir(self)
                        # but we only want methods
                        if callable(getattr(self, name))
                        # and we don't need builtins
                        and not name.startswith("__")
                        # and we only want the cool methods
                        and hasattr(getattr(self, name), "is_cool")
                        }
        return cool_methods

if __name__ == "__main__":
    jeff = MyClass(name="Jeff")
    cool_methods = jeff.get_all_cool_methods()

    for method_name, cool_method in cool_methods.items():
        print(f"{method_name}: {cool_method} ...")
        # you can call the decorated methods you retrieved, just like normal,
        # but you don't need to reference the actual instance to do so
        return_value = cool_method()
        print(f"return value = {return_value}\n")
Running the above example gives us the following output:
explode: <bound method cool.<locals>.decorated_func of <__main__.MyClass object at 0x00000220B3ACD430>> ...
cool pre-function decorator tasks here.
Jeff explodes.
cool post-function decorator tasks here.
return value = 4
say_catchphrase: <bound method cool.<locals>.decorated_func of <__main__.MyClass object at 0x00000220B3ACD430>> ...
cool pre-function decorator tasks here.
Jeff says "I'm so cool you could cook an egg on me."
cool post-function decorator tasks here.
return value = None
Note that the decorated methods in this example have different types of return values and different signatures, so the practical value of being able to retrieve and run them all is a bit dubious. However, in cases where there are many similar methods, all with the same signature and/or type of return value (like if you're writing a connector to retrieve unnormalized data from one database, normalize it, and insert it into a second, normalized database, and you have a bunch of similar methods, e.g. 15 read_and_normalize_table_X methods), being able to retrieve (and run) them all on the fly could be more useful.
Maybe, if the decorators are not too complex (but I don't know if there is a less hacky way). Because each wrapper function new_f is defined inside its decorator, a decorated method's code object reports the decorator's source line via co_firstlineno, which distinguishes it from an undecorated method defined in the class body.
def decorator1(f):
    def new_f():
        print "Entering decorator1", f.__name__
        f()
    new_f.__name__ = f.__name__
    return new_f

def decorator2(f):
    def new_f():
        print "Entering decorator2", f.__name__
        f()
    new_f.__name__ = f.__name__
    return new_f

class A():
    def method_a(self):
        pass

    @decorator1
    def method_b(self, b):
        pass

    @decorator2
    def method_c(self, t=5):
        pass

print A.method_a.im_func.func_code.co_firstlineno
print A.method_b.im_func.func_code.co_firstlineno
print A.method_c.im_func.func_code.co_firstlineno
I don't want to add much, just a simple variation of ninjagecko's Method 2. It works wonders.
Same code, but using list comprehension instead of a generator, which is what I needed.
import inspect

def methodsWithDecorator(cls, decoratorName):
    sourcelines = inspect.getsourcelines(cls)[0]
    return [sourcelines[i + 1].split('def')[1].split('(')[0].strip()
            for i, line in enumerate(sourcelines)
            if line.split('(')[0].strip() == '@' + decoratorName]
A simple way to solve this problem is to put code in the decorator that adds each function/method, that is passed in, to a data set (for example a list).
e.g.
functions = []

def deco(foo):
    functions.append(foo)
    return foo
now every function with the deco decorator will be added to functions.
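Continuing that snippet (and assuming Python 3, where a plain class-attribute lookup returns the function itself rather than an unbound method), a membership test then tells you whether a given method was decorated:
class A:
    @deco
    def method_c(self, t=5):
        pass

    def method_a(self):
        pass

print(A.method_c in functions)  # True
print(A.method_a in functions)  # False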
I'm busy creating a metaclass that replaces a stub function on a class with a new one with a proper implementation. The original function could use any signature. My problem is that I can't figure out how to create a new function with the same signature as the old one. How would I do this?
Update
This has nothing to do with the actual question which is "How do I dynamically create a function with the same signature as another function?" but I'm adding this to show why I can't use subclasses.
I'm trying to implement something like Scala Case Classes in Python. (Not the pattern-matching aspect, just the automatically generated properties and the __eq__, __hash__ and __str__ methods.)
I want something like this:
>>> class MyCaseClass():
...     __metaclass__ = CaseMetaClass
...     def __init__(self, a, b):
...         pass
>>> instance = MyCaseClass(1, 'x')
>>> instance.a
1
>>> instance.b
'x'
>>> str(instance)
MyCaseClass(1, 'x')
As far as I can see, there is no way to that with subclasses.
I believe functools.wraps does not reproduce the original call signature. However, Michele Simionato's decorator module does:
import decorator

class FooType(type):
    def __init__(cls, name, bases, clsdict):
        @decorator.decorator
        def modify_stub(func, *args, **kw):
            return func(*args, **kw) + ' + new'
        setattr(cls, 'stub', modify_stub(clsdict['stub']))

class Foo(object):
    __metaclass__ = FooType
    def stub(self, a, b, c):
        return 'original'

foo = Foo()
help(foo.stub)
# Help on method stub in module __main__:
# stub(self, a, b, c) method of __main__.Foo instance

print(foo.stub(1, 2, 3))
# original + new
Use functools.wraps:
>>> from functools import wraps
>>> def f(a, b):
...     return a + b
>>> @wraps(f)
... def f2(*args):
...     print(args)
...     return f(*args)
>>> f2(2, 5)
(2, 5)
7
It is possible to do this, using inspect.getargspec. There's even a PEP in place to make it easier.
BUT -- this is not a good thing to do. Can you imagine how much of a debugging/maintenance nightmare it would be to have your functions dynamically created at runtime -- and not only that, but done so by a metaclass?! I don't understand why you have to replace the stub dynamically; can't you just change the code when you want to change the function? I mean, suppose you have a class
class Spam( object ):
    def ham( self, a, b ):
        return NotImplemented
Since you don't know what it's meant to do, the metaclass can't actually implement any functionality. If you knew what ham were meant to do, you could do it in ham or one of its parent classes, instead of returning NotImplemented.