Consider this code:
class Bar(object): pass
class Foo(object):
def bar(self): return Bar()
f = Foo()
def Bar(): pass
print(f.bar())
It prints None. But poor bar didn't expect Bar to become something else after it was defined!
My question is: what is the "best" (most elegant, most efficient, most Pythonic, whatever) code I can write inside of Foo (hopefully without polluting outside scopes) that will make sure bar references the Bar that was defined at the time of its declaration rather than at the time of invocation?
(It's not that I can't come up with any solutions, but rather it's that I don't know what the proper solution is.)
To "statically bind" a name at the time of function/method creation, you could use a default argument:
class Bar(object):
pass
class Foo(object):
def bar(self, Bar=Bar):
return Bar()
Per this famous Python "gotcha", default argument values are only evaluated once.
It isn't terribly useful in this case, as it's your own silly fault for naming a function the same as a class (and the ensuing error gives you useful information); binding the name only makes it more difficult to e.g. mock out the class later on. However, it can be useful to solve issues with late binding in nested functions, per this question.
Per ShadowRanger's comment, if you're using Python 3.x you can add a bare * to the argument list (bar(self, *, Bar=Bar)) to make Bar keyword-only, which prevents a caller from accidentally clobbering the default by passing too many positional arguments; any extra positional arguments will raise a TypeError instead.
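As a minimal sketch of that late-binding issue (not from the original question): functions created in a loop all see the final value of the loop variable, while a default argument freezes the value at definition time:
late = [lambda: i for i in range(3)]
print([f() for f in late])     # [2, 2, 2] -- i is looked up when each lambda runs
early = [lambda i=i: i for i in range(3)]
print([f() for f in early])    # [0, 1, 2] -- i=i captured each value at definition time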
Answering my own question, but the answer is to put everything in a closure:
def static_define(locals, super, tuple): # All arguments are bound STATICALLY
# Anything defined here will be exported externally
class Foo(tuple):
def __new__(cls, *args):
return super(Foo, cls).__new__(cls, args)
return locals()
# The following line "exports" all the returned locals into the current global namespace
(lambda f, bi: (lambda g: g.pop(f.__name__) and bi.None or g.update((lambda args: (lambda l: bi.reduce(lambda l, v: l.pop(v, bi.None) and bi.None or l, args, l))(f(*bi.map(lambda v: bi.eval(v, g), args))))(bi.__import__('inspect').getargspec(f).args)))(f.__globals__))(static_define, __builtins__)
super = None # tamper!
print Foo() # unaffected by tampering
Here is what I want to do:
class demo(object):
def a(self):
pass
def b(self, param=self.a): #I tried demo.a as well after making a static
param()
The problem is apparently that one can't access the class in the function declaration line.
Is there a way to add a prototype like in c(++)?
At the moment I use an ugly workaround:
def b(self, param=True): #my real function shall be able to use None, to skip the function call
if param == True:
param = self.a
if param != None: # This explains why I can't take None as the default
# for param; I just needed something as a default which was
# neither None nor a callable function (I don't want to force the user to create dummy lambdas)
param()
So is it possible to achieve something like what's described in the top part without this ugly workaround? Nota bene: I am bound to Jython, which is approximately Python 2.5 (I know there is 2.7 but I can't upgrade)
Short answer: No.
I think the best way to do it, if you want to be able to pass objects like None, True, etc., is to create a custom placeholder object like so:
default_value = object()
class demo(object):
def a(self):
pass
def b(self, param=default_value):
if param is default_value:
self.a()
else:
param()
You can use the function a as the default value for b, like so:
def b(self, param=a):
This will work as long as a is defined before b. But the function a is not the same as the bound method self.a, so you'd need to bind it before you could call it, and you would need some way to distinguish a passed callable from the default method a, so that you could bind the latter but not the former. This would obviously be much messier than the comparatively short and readable code that I suggest.
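For illustration only, here is a rough sketch of that binding step (not a recommendation, and note that it would wrongly re-bind a callable passed in by the user):
from types import MethodType

class demo(object):
    def a(self):
        print("a called on", self)
    def b(self, param=a):                  # param defaults to the plain function a
        bound = MethodType(param, self)    # bind it to this instance before calling
        bound()

demo().b()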
Don't tell anyone I showed you this.
class demo:
def a(self): print(self, "called 'a'")
def b(self, param): param(self)
demo.b.__defaults__ = (demo.a,)
demo().b()
(In 2.x, __defaults__ is spelled func_defaults.)
I'll answer this question again, contradicting my earlier answer:
Short answer: YES! (sort of)
With the help of a method decorator, this is possible. The code is long and somewhat ugly, but the usage is short and simple.
The problem was that we can only use unbound methods as default arguments. Well, what if we create a wrapping function -- a decorator -- which binds the arguments before calling the real function?
First we create a helper class that can perform this task.
from inspect import getcallargs
from types import MethodType
from functools import wraps
class MethodBinder(object):
def __init__(self, function):
self.function = function
def set_defaults(self, args, kwargs):
kwargs = getcallargs(self.function, *args, **kwargs)
# This is the self of the method we wish to call
method_self = kwargs["self"]
# First we build a list of the functions that are bound to self
targets = set()
for attr_name in dir(method_self):
attr = getattr(method_self, attr_name)
# For older python versions, replace __func__ with im_func
if hasattr(attr, "__func__"):
targets.add(attr.__func__)
# Now we check whether any of the arguments are identical to the
# functions we found above. If so, we bind them to self.
ret = {}
for kw, val in kwargs.items():
if val in targets:
ret[kw] = MethodType(val, method_self)
else:
ret[kw] = val
return ret
So instances of MethodBinder are associated with a method (or rather, a function that will become a method). MethodBinder's set_defaults method may be given the arguments used to call the associated method; it will bind any unbound method of the associated method's self and return a kwargs dict that may be used to call the associated method.
Now we can create a decorator using this class:
def bind_args(f):
# f will be b in the below example
binder = MethodBinder(f)
@wraps(f)
def wrapper(*args, **kwargs):
# The wrapper function will get called instead of b, so args and kwargs
# contains b's arguments. Let's bind any unbound function arguments:
kwargs = binder.set_defaults(args, kwargs)
# All arguments have been turned into keyword arguments. Now we
# may call the real method with the modified arguments and return
# the result.
return f(**kwargs)
return wrapper
Now that we've put the ugliness behind us, let's show the simple and pretty usage:
class demo(object):
def a(self):
print("{0}.a called!".format(self))
@bind_args
def b(self, param=a):
param()
def other():
print("other called")
demo().b()
demo().b(other)
This recipe uses a rather new addition to Python, getcallargs from inspect. It's only available in newer versions of Python (2.7 and 3.1 and later).
You can put the method name in the function definition:
class Demo(object):
def a(self):
print 'a'
def b(self, param='a'):
if param:
getattr(self, param)()
However, you'll still have to check whether param has a value or whether it is None. Note that this approach should not be used for untrusted input, as it allows execution of any method of that class.
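For example, a brief usage sketch:
d = Demo()
d.b()        # getattr(self, 'a')() is called and prints 'a'
d.b(None)    # param is falsy, so nothing is called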
I like lazyr's answer but maybe you will like this solution better:
class Demo(object):
def a(self):
pass
def b(self, *args):
if not args:
param=self.a
elif len(args)>1:
raise TypeError("b() takes at most 1 positional argument")
else:
param=args[0]
if param is not None:
param()
I also prefer lazyr's answer (I usually use None as the default parameter), but you can also use keyword arguments to be more explicit about it:
def b(self, **kwargs):
param = kwargs.get('param', self.a)
if param: param()
You can still use None as a parameter, resulting in param not being executed. However, if you don't include the keyword argument param=, it will default to a().
demo.b() #demo.a() executed
demo.b(param=some_func) #some_func() executed
demo.b(param=None) #nothing executed.
Really sorry for the extremely stupid title, but if I knew what it was, I wouldn't be asking here (:
def some_decorator( func ):
# ..
class A:
@some_decorator
def func():
pass
@func.some_decorator # this one here - func.some_decorator ?
def func():
pass
some_decorator decorates func - that's OK. But what is func.some_decorator, and how does some_decorator become a member (or something else?) of func?
P.S. I'm 90% sure that there's such a question here already (as this seems like something basic), but I don't know how to search for it. If there's an exact duplicate, I'll delete this question.
Note: it's not a typo, nor an accident, that both member functions are named func. The decorator is for overloading: the question is related to Decorating method (class methods overloading)
Remember that the function definition with decorator is equivalent to this:
def func():
pass
func = some_decorator(func)
So in the following lines, func doesn't refer to the function you defined but to what the decorator turned it into. Also note that decorators can return any object, not just functions. So some_decorator returns an object with a method (it's unfortunate that the names some_decorator and func are reused in the example - it's confusing, but doesn't change anything about the concept) that is itself a decorator. As the expression after the # is evaluated first, you still have a reference to the first decorator's method after you defined another plain function func. That decorator is applied to this new function. The full example is then equivalent to this:
class A:
def func():
pass
func = some_decorator(func)
_decorator = func.some_decorator
def func():
pass
func = _decorator(func)
One way to clarify this is to demonstrate it with a concrete example that behaves like this, the builtin property descriptor:
class C(object):
@property
def x(self):
"This is a property object, not a function"
return self._x
@x.setter
def x(self, val):
self._x = val
>>> c = C()
>>> c.x = 1
>>> c.x
1
>>> C.x
<property object at 0x2396100>
>>> C.x.__doc__
'This is a property object, not a function'
>>> C.x.getter.__doc__
'Descriptor to change the getter on a property.'
>>> C.x.setter.__doc__
'Descriptor to change the setter on a property.'
>>> C.x.deleter.__doc__
'Descriptor to change the deleter on a property.'
The first invocation of property (as a decorator) means that x is not a function - it is a property descriptor. A feature of properties is that they allow you to initially define just the fget method, and then provide fset and fdel later by using the property.setter and property.deleter decorators (although since each of these creates a new property object, you do need to make sure to use the same name each time).
Something similar will usually be the case whenever you see code using this kind of pattern. Ideally, the naming of the decorators involved will make it reasonably clear what is going on (e.g. most people seem to grasp the idiom for defining property attributes reasonably easily).
How to get all methods of a given class A that are decorated with the @decorator2?
class A():
def method_a(self):
pass
@decorator1
def method_b(self, b):
pass
@decorator2
def method_c(self, t=5):
pass
Method 1: Basic registering decorator
I already answered this question here: Calling functions by array index in Python =)
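(The linked answer isn't reproduced here. As a minimal sketch, a registering decorator simply records each callable it is applied to -- the registered list below is an illustrative name; decorator2 and method_c are taken from the question.)
registered = []

def decorator2(func):
    registered.append(func)   # remember the function...
    return func               # ...and hand it back unchanged

class A(object):
    @decorator2
    def method_c(self, t=5):
        pass

print([f.__name__ for f in registered])   # ['method_c']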
Method 2: Sourcecode parsing
If you do not have control over the class definition, which is one interpretation of what you'd like to suppose, this is impossible (without code-reading-reflection), since for example the decorator could be a no-op decorator (like in my linked example) that merely returns the function unmodified. (Nevertheless if you allow yourself to wrap/redefine the decorators, see Method 3: Converting decorators to be "self-aware", then you will find an elegant solution)
It is a terrible terrible hack, but you could use the inspect module to read the sourcecode itself, and parse it. This will not work in an interactive interpreter, because the inspect module will refuse to give sourcecode in interactive mode. However, below is a proof of concept.
#!/usr/bin/python3
import inspect
def deco(func):
return func
def deco2():
def wrapper(func):
pass
return wrapper
class Test(object):
@deco
def method(self):
pass
@deco2()
def method2(self):
pass
def methodsWithDecorator(cls, decoratorName):
sourcelines = inspect.getsourcelines(cls)[0]
for i,line in enumerate(sourcelines):
line = line.strip()
if line.split('(')[0].strip() == '@'+decoratorName: # leaving a bit out
nextLine = sourcelines[i+1]
name = nextLine.split('def')[1].split('(')[0].strip()
yield(name)
It works!:
>>> print(list( methodsWithDecorator(Test, 'deco') ))
['method']
Note that one has to pay attention to parsing and the python syntax, e.g. @deco and @deco(... are valid results, but @deco2 should not be returned if we merely ask for 'deco'. We notice that according to the official python syntax at http://docs.python.org/reference/compound_stmts.html decorators are as follows:
decorator ::= "@" dotted_name ["(" [argument_list [","]] ")"] NEWLINE
We breathe a sigh of relief at not having to deal with cases like @(deco). But note that this still doesn't really help you if you have really really complicated decorators, such as @getDecorator(...), e.g.
def getDecorator():
return deco
Thus, this best-that-you-can-do strategy of parsing code cannot detect cases like this. Though if you are using this method, what you're really after is what is written on top of the method in the definition, which in this case is getDecorator.
According to the spec, it is also valid to have @foo1.bar2.baz3(...) as a decorator. You can extend this method to work with that. You might also be able to extend this method to return a <function object ...> rather than the function's name, with lots of effort. This method however is hackish and terrible.
Method 3: Converting decorators to be "self-aware"
If you do not have control over the decorator definition (which is another interpretation of what you'd like), then all these issues go away because you have control over how the decorator is applied. Thus, you can modify the decorator by wrapping it, to create your own decorator, and use that to decorate your functions. Let me say that yet again: you can make a decorator that decorates the decorator you have no control over, "enlightening" it, which in our case makes it do what it was doing before but also append a .decorator metadata property to the callable it returns, allowing you to keep track of "was this function decorated or not? let's check function.decorator!". And then you can iterate over the methods of the class, and just check to see if the decorator has the appropriate .decorator property! =) As demonstrated here:
def makeRegisteringDecorator(foreignDecorator):
"""
Returns a copy of foreignDecorator, which is identical in every
way(*), except also appends a .decorator property to the callable it
spits out.
"""
def newDecorator(func):
# Call to newDecorator(method)
# Exactly like old decorator, but output keeps track of what decorated it
R = foreignDecorator(func) # apply foreignDecorator, like call to foreignDecorator(method) would have done
R.decorator = newDecorator # keep track of decorator
#R.original = func # might as well keep track of everything!
return R
newDecorator.__name__ = foreignDecorator.__name__
newDecorator.__doc__ = foreignDecorator.__doc__
# (*)We can be somewhat "hygienic", but newDecorator still isn't signature-preserving, i.e. you will not be able to get a runtime list of parameters. For that, you need hackish libraries...but in this case, the only argument is func, so it's not a big issue
return newDecorator
Demonstration for @decorator:
deco = makeRegisteringDecorator(deco)
class Test2(object):
@deco
def method(self):
pass
@deco2()
def method2(self):
pass
def methodsWithDecorator(cls, decorator):
"""
Returns all methods in CLS with DECORATOR as the
outermost decorator.
DECORATOR must be a "registering decorator"; one
can make any decorator "registering" via the
makeRegisteringDecorator function.
"""
for maybeDecorated in cls.__dict__.values():
if hasattr(maybeDecorated, 'decorator'):
if maybeDecorated.decorator == decorator:
print(maybeDecorated)
yield maybeDecorated
It works!:
>>> print(list( methodsWithDecorator(Test2, deco) ))
[<function method at 0x7d62f8>]
However, a "registered decorator" must be the outermost decorator, otherwise the .decorator attribute annotation will be lost. For example in a train of
@decoOutermost
@deco
@decoInnermost
def func(): ...
you can only see metadata that decoOutermost exposes, unless we keep references to "more-inner" wrappers.
sidenote: the above method can also build up a .decorator that keeps track of the entire stack of applied decorators and input functions and decorator-factory arguments. =) For example if you consider the commented-out line R.original = func, it is feasible to use a method like this to keep track of all wrapper layers. This is personally what I'd do if I wrote a decorator library, because it allows for deep introspection.
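A small sketch of that bookkeeping (it assumes R.original = func is uncommented, and that every decorator in the stack was wrapped with makeRegisteringDecorator):
def decoratorChain(func):
    # Yield the registering decorators applied to func, outermost first,
    # by following the .original references down through the wrapper layers.
    while hasattr(func, 'decorator'):
        yield func.decorator
        func = getattr(func, 'original', None)
        if func is None:
            break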
There is also a difference between @foo and @bar(...). While they are both "decorator expressions" as defined in the spec, note that foo is a decorator, while bar(...) returns a dynamically-created decorator, which is then applied. Thus you'd need a separate function makeRegisteringDecoratorFactory, that is somewhat like makeRegisteringDecorator but even MORE META:
def makeRegisteringDecoratorFactory(foreignDecoratorFactory):
def newDecoratorFactory(*args, **kw):
oldGeneratedDecorator = foreignDecoratorFactory(*args, **kw)
def newGeneratedDecorator(func):
modifiedFunc = oldGeneratedDecorator(func)
modifiedFunc.decorator = newDecoratorFactory # keep track of decorator
return modifiedFunc
return newGeneratedDecorator
newDecoratorFactory.__name__ = foreignDecoratorFactory.__name__
newDecoratorFactory.__doc__ = foreignDecoratorFactory.__doc__
return newDecoratorFactory
Demonstration for @decorator(...):
def deco2():
def simpleDeco(func):
return func
return simpleDeco
deco2 = makeRegisteringDecoratorFactory(deco2)
print(deco2.__name__)
# RESULT: 'deco2'
@deco2()
def f():
pass
This decorator-factory wrapper also works:
>>> print(f.decorator)
<function deco2 at 0x6a6408>
Bonus: let's even try the following with Method 3:
def getDecorator(): # let's do some dispatching!
return deco
class Test3(object):
@getDecorator()
def method(self):
pass
@deco2()
def method2(self):
pass
Result:
>>> print(list( methodsWithDecorator(Test3, deco) ))
[<function method at 0x7d62f8>]
As you can see, unlike method2, @deco is correctly recognized even though it was never explicitly written in the class. Unlike method2, this will also work if the method is added at runtime (manually, via a metaclass, etc.) or inherited.
Be aware that you can also decorate a class, so if you "enlighten" a decorator that is used to both decorate methods and classes, and then write a class within the body of the class you want to analyze, then methodsWithDecorator will return decorated classes as well as decorated methods. One could consider this a feature, but you can easily write logic to ignore those by examining the argument to the decorator, i.e. .original, to achieve the desired semantics.
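A sketch of that check (again assuming the commented-out R.original = func line is enabled, so the wrapped callable is reachable):
import inspect

def methodsOnlyWithDecorator(cls, decorator):
    # Like methodsWithDecorator, but skip anything whose wrapped callable is a class.
    for maybeDecorated in methodsWithDecorator(cls, decorator):
        original = getattr(maybeDecorated, 'original', maybeDecorated)
        if not inspect.isclass(original):
            yield maybeDecorated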
To expand upon @ninjagecko's excellent answer in Method 2: Source code parsing, you can use the ast module introduced in Python 2.6 to perform self-inspection as long as the inspect module has access to the source code.
def findDecorators(target):
import ast, inspect
res = {}
def visit_FunctionDef(node):
res[node.name] = [ast.dump(e) for e in node.decorator_list]
V = ast.NodeVisitor()
V.visit_FunctionDef = visit_FunctionDef
V.visit(compile(inspect.getsource(target), '?', 'exec', ast.PyCF_ONLY_AST))
return res
I added a slightly more complicated decorated method:
@x.y.decorator2
def method_d(self, t=5): pass
Results:
> findDecorators(A)
{'method_a': [],
'method_b': ["Name(id='decorator1', ctx=Load())"],
'method_c': ["Name(id='decorator2', ctx=Load())"],
'method_d': ["Attribute(value=Attribute(value=Name(id='x', ctx=Load()), attr='y', ctx=Load()), attr='decorator2', ctx=Load())"]}
If you do have control over the decorators, you can use decorator classes rather than functions:
class awesome(object):
def __init__(self, method):
self._method = method
def __call__(self, obj, *args, **kwargs):
return self._method(obj, *args, **kwargs)
@classmethod
def methods(cls, subject):
def g():
for name in dir(subject):
method = getattr(subject, name)
if isinstance(method, awesome):
yield name, method
return {name: method for name,method in g()}
class Robot(object):
@awesome
def think(self):
return 0
@awesome
def walk(self):
return 0
def irritate(self, other):
return 0
and if I call awesome.methods(Robot) it returns
{'think': <mymodule.awesome object at 0x000000000782EAC8>, 'walk': <mymodule.awesome object at 0x000000000782EB00>}
For those of us who just want the absolute simplest possible case - namely, a single-file solution where we have total control over both the class we're working with and the decorator we're trying to track, I've got an answer. ninjagecko linked to a solution for when you have control over the decorator you want to track, but I personally found it to be complicated and really hard to understand, possibly because I've never worked with decorators until now. So, I've created the following example, with the goal of being as straightforward and simple as possible. It's a decorator, a class with several decorated methods, and code to retrieve+run all methods that have a specific decorator applied to them.
# our decorator
def cool(func, *args, **kwargs):
def decorated_func(*args, **kwargs):
print("cool pre-function decorator tasks here.")
return_value = func(*args, **kwargs)
print("cool post-function decorator tasks here.")
return return_value
# add is_cool property to function so that we can check for its existence later
decorated_func.is_cool = True
return decorated_func
# our class, in which we will use the decorator
class MyClass:
def __init__(self, name):
self.name = name
# this method isn't decorated with the cool decorator, so it won't show up
# when we retrieve all the cool methods
def do_something_boring(self, task):
print(f"{self.name} does {task}")
@cool
# thanks to *args and **kwargs, the decorator properly passes method parameters
def say_catchphrase(self, *args, catchphrase="I'm so cool you could cook an egg on me.", **kwargs):
print(f"{self.name} says \"{catchphrase}\"")
@cool
# the decorator also properly handles methods with return values
def explode(self, *args, **kwargs):
print(f"{self.name} explodes.")
return 4
def get_all_cool_methods(self):
"""Get all methods decorated with the "cool" decorator.
"""
cool_methods = {name: getattr(self, name)
# get all attributes, including methods, properties, and builtins
for name in dir(self)
# but we only want methods
if callable(getattr(self, name))
# and we don't need builtins
and not name.startswith("__")
# and we only want the cool methods
and hasattr(getattr(self, name), "is_cool")
}
return cool_methods
if __name__ == "__main__":
jeff = MyClass(name="Jeff")
cool_methods = jeff.get_all_cool_methods()
for method_name, cool_method in cool_methods.items():
print(f"{method_name}: {cool_method} ...")
# you can call the decorated methods you retrieved, just like normal,
# but you don't need to reference the actual instance to do so
return_value = cool_method()
print(f"return value = {return_value}\n")
Running the above example gives us the following output:
explode: <bound method cool.<locals>.decorated_func of <__main__.MyClass object at 0x00000220B3ACD430>> ...
cool pre-function decorator tasks here.
Jeff explodes.
cool post-function decorator tasks here.
return value = 4
say_catchphrase: <bound method cool.<locals>.decorated_func of <__main__.MyClass object at 0x00000220B3ACD430>> ...
cool pre-function decorator tasks here.
Jeff says "I'm so cool you could cook an egg on me."
cool post-function decorator tasks here.
return value = None
Note that the decorated methods in this example have different types of return values and different signatures, so the practical value of being able to retrieve and run them all is a bit dubious. However, in cases where there are many similar methods, all with the same signature and/or type of return value (like if you're writing a connector to retrieve unnormalized data from one database, normalize it, and insert it into a second, normalized database, and you have a bunch of similar methods, e.g. 15 read_and_normalize_table_X methods), being able to retrieve (and run) them all on the fly could be more useful.
Maybe, if the decorators are not too complex (but I don't know if there is a less hacky way).
def decorator1(f):
def new_f():
print "Entering decorator1", f.__name__
f()
new_f.__name__ = f.__name__
return new_f
def decorator2(f):
def new_f():
print "Entering decorator2", f.__name__
f()
new_f.__name__ = f.__name__
return new_f
class A():
def method_a(self):
pass
@decorator1
def method_b(self, b):
pass
@decorator2
def method_c(self, t=5):
pass
print A.method_a.im_func.func_code.co_firstlineno
print A.method_b.im_func.func_code.co_firstlineno
print A.method_c.im_func.func_code.co_firstlineno
I don't want to add much, just a simple variation of ninjagecko's Method 2. It works wonders.
Same code, but using list comprehension instead of a generator, which is what I needed.
def methodsWithDecorator(cls, decoratorName):
sourcelines = inspect.getsourcelines(cls)[0]
return [ sourcelines[i+1].split('def')[1].split('(')[0].strip()
for i, line in enumerate(sourcelines)
if line.split('(')[0].strip() == '@'+decoratorName]
A simple way to solve this problem is to put code in the decorator that adds each function/method it is applied to into a data set (for example, a list).
e.g.
functions = []

def deco(foo):
    functions.append(foo)
    return foo
now every function with the deco decorator will be added to functions.
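For instance, applied to the class from the question (a quick sketch):
class A(object):
    @deco
    def method_b(self, b):
        pass

    def method_a(self):        # undecorated, so it never reaches the registry
        pass

print([f.__name__ for f in functions])   # ['method_b']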
As far as I know, self is just a very powerful convention and it's not really a reserved keyword in Python. Java and C# have this as a keyword. I really find it weird that they didn't make a reserved keyword for it in Python. Is there any reason behind this?
Because self is just a parameter to a function, like any other parameter. For example, the following call:
a = A()
a.x()
essentially gets converted to:
a = A()
A.x(a)
Not making self a reserved word also has the fortunate result that, for class methods, you can rename the first parameter to something else (normally cls). And of course for static methods, the first parameter has no relationship to the instance it is called on, e.g.:
class A:
def method(self):
pass
@classmethod
def class_method(cls):
pass
@staticmethod
def static_method():
pass
class B(A):
pass
b = B()
b.method() # self is b
b.class_method() # cls is B
b.static_method() # no parameter passed
Guido van Rossum has blogged on why explicit self has to stay: http://neopythonic.blogspot.com/2008/10/why-explicit-self-has-to-stay.html
I believe that post provides some insight into the design decisions behind explicit self.
Because a method is just a function whose first parameter is used to pass the object. You can write a method like this:
class Foo(object):
pass
def setx(self, x):
self.x = x
Foo.setx = setx
foo = Foo()
foo.setx(42)
print foo.x # Prints 42.
Whatever the merit or otherwise of this philosophy, it does result in a more unified notion of functions and methods.
I want to write a decorator that acts differently depending on whether it is applied to a function or to a method.
def some_decorator(func):
if the_magic_happens_here(func): # <---- Point of interest
print 'Yay, found a method ^_^ (unbound yet)'
else:
print 'Meh, just an ordinary function :/'
return func
class MyClass(object):
@some_decorator
def method(self):
pass
@some_decorator
def function():
pass
I tried inspect.ismethod(), inspect.ismethoddescriptor() and inspect.isfunction() but no luck. The problem is that a method actually is neither a bound nor an unbound method but an ordinary function as long as it is accessed from within the class body.
What I really want to do is to delay the actions of the decorator to the point the class is actually instantiated because I need the methods to be callable in their instance scope. For this, I want to mark methods with an attribute and later search for these attributes when the .__new__() method of MyClass is called. The classes for which this decorator should work are required to inherit from a class that is under my control. You can use that fact for your solution.
In the case of a normal function the delay is not necessary and the decorator should take action immediately. That is why I want to differentiate these two cases.
I would rely on the convention that functions that will become methods have a first argument named self, and other functions don't. Fragile, but then, there's no really solid way.
So (pseudocode as I have comments in lieu of what you want to do in either case...):
import inspect
import functools
def decorator(f):
args = inspect.getargspec(f).args
if args and args[0] == 'self':
# looks like a (future) method...
else:
# looks like a "real" function
@functools.wraps(f)
def wrapper(*args, **kwargs):  # etc etc
One way to make it a bit more solid, as you say all classes involved inherit from a class under your control, is to have that class provide a metaclass (which will also of course be inherited by said classes) which checks things at the end of the class body. Make the wrapped function accessible e.g. by wrapper._f = f and the metaclass's __init__ can check that all wrapped methods did indeed have self as the first argument.
Unfortunately there's no easy way to check that other functions (non-future-methods) being wrapped didn't have such a first argument, since you're not in control of the environment in that case. The decorator might check for "top-level" functions (ones whose def is a top-level statement in their module), via the func_globals (globals dict, i.e., the module's dict) and func_name attributes of the function -- if the function's such a global, presumably it won't later be assigned as an attribute of the class (thereby becoming a future-method anyway;-), so the self named first arg, if there, can be diagnosed as wrong and warned about (while still treating the function as a real function;-).
One alternative would be to do the decoration in the decorator itself under the hypothesis of a real function, but also make available the original function object as wrapper._f. Then, the metaclass's __init__ can re-do the decoration for all functions in the class body that it sees have been marked this way. This approach is much more solid than the convention-relying one I just sketched, even with the extra checks. Still, something like
class Foo(Bar): ... # no decorations
@decorator
def f(*a, **k): ...
Foo.f = f # "a killer"... function becomes method!
would still be problematic -- you could try intercepting this with a __setattr__ in your metaclass (but then other assignments to class attributes after the class statement can become problematic).
The more the user's code has freedom to do funky things (and Python generally leaves the programmer a lot of such freedom), the harder time your "framework-y" code has keeping things under tight control instead, of course;-).
From Python 3.3 onwards by using PEP 3155:
def some_decorator(func):
if func.__name__ != func.__qualname__:
print('Yay, found a method ^_^ (unbound yet)')
else:
print('Meh, just an ordinary function :/')
return func
A method x of class A will have a __qualname__ that is A.x while a function x will have a __qualname__ of x.
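For instance, a quick check (the printed lines assume CPython 3.3+):
@some_decorator
def standalone():            # __qualname__ == 'standalone' == __name__
    pass                     # prints: Meh, just an ordinary function :/

class MyClass(object):
    @some_decorator
    def method(self):        # __qualname__ == 'MyClass.method' != 'method'
        pass                 # prints: Yay, found a method ^_^ (unbound yet)
Note that a function nested inside another function also gets a dotted __qualname__ (e.g. outer.<locals>.inner), so this test is a heuristic rather than a guarantee.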
Do you need to have the magic happen where you choose which wrapper to return, or can you defer the magic until the function is actually called?
You could always try a parameter to your decorator to indicate which of the two wrappers it should use, like
from functools import wraps

def some_decorator(clams):
    def _mydecor(func):
        @wraps(func)
        def wrapping(*args, **kwargs):
            # ... whatever you want to do for plain functions ...
            return func(*args, **kwargs)
        return wrapping

    def _myclassdecor(func):
        @wraps(func)
        def wrapping(*args, **kwargs):
            # ... whatever you want to do for (future) methods ...
            return func(*args, **kwargs)
        return wrapping

    return _mydecor if clams else _myclassdecor
The other thing that I might suggest is to create a metaclass and define the __init__ method in the metaclass to look for methods decorated with your decorator and revise them accordingly, like Alex hinted at. Use this metaclass with your base class, and since all the classes that will use the decorator will inherit from the base class, they'll also get the metaclass type and use its __init__ as well.
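A rough sketch of that metaclass idea (Python 3 syntax; _marked and finish_decoration are hypothetical names standing in for your real marking and rewriting logic):
def finish_decoration(func):
    # placeholder: whatever per-class rework the decorator deferred
    return func

def some_decorator(func):
    func._marked = True        # only mark it for now; the real work happens in the metaclass
    return func

class DeferredDecorationMeta(type):
    def __init__(cls, name, bases, namespace):
        super().__init__(name, bases, namespace)
        for attr, value in namespace.items():
            if getattr(value, '_marked', False):
                setattr(cls, attr, finish_decoration(value))   # "revise" it now that the class exists

class Base(metaclass=DeferredDecorationMeta):
    pass

class MyClass(Base):
    @some_decorator
    def method(self):
        pass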
You just need to check to see if the function being decorated has an im_func attribute. If it does, then it is a method. If it doesn't then it is a function.
Note that the code sample below does the detection at call time but you can do it at decoration time as well. Just move the hasattr check to the outer decorator generator.
Python 2.6.4 (r264:75706, Dec 7 2009, 18:45:15)
[GCC 4.4.1] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> def deco(f):
... def _wrapper(*args, **kwargs):
... if hasattr(f, 'im_func'):
... print 'method'
... else:
... print 'function'
... return _wrapper
...
>>> deco(lambda x: None)()
function
>>> def f(x):
... return x + 5
...
>>> deco(f)()
function
>>> class A:
... def f(self, x):
... return x + 5
...
>>> a = A()
>>> deco(a.f)()
method
>>> deco(A.f)()
method
>>>
Edit
Oh snap! And I get it totally wrong. I so should have read Alex's post more thoroughly.
>>> class B:
@deco
... def f(self, x):
... return x +5
...
>>> b = B()
>>> b.f()
function