I have a class with many methods. How can I modify my methods so that they can also be accessed directly as functions, without creating an object of the class? Is that possible?
The methods will be "unbound" (meaning, essentially, that they have no self to work with). If the functions do not operate on self, you can turn them into static methods (which do not take a self first argument) and then assign them to variables to be used like functions.
Like so:
class MyClass(object):
    @staticmethod
    def myfunc():
        return "It works!"

myfunc = MyClass.myfunc
myfunc()  # returns "It works!"
Essentially, you need to ask yourself "What data does my method need to (er) function?" Depending on your answer, you can use @staticmethod (as above) or @classmethod (sketched below), or you may find that you do in fact need a self, in which case you will need to create an object before trying to use its methods.
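For comparison, a @classmethod needs only the class, not an instance, so it too can be pulled out and called directly (a minimal sketch; the count attribute is just for illustration):

class MyClass(object):
    count = 0

    @classmethod
    def how_many(cls):
        # receives the class itself as cls, so no instance is needed
        return cls.count

how_many = MyClass.how_many
how_many()  # returns 0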
The case where you really do need self would look something like:
myobj = MyClass()
del MyClass # This is a singleton class
myfunc = myobj.myfunc
All of that aside, if you find that all of your methods are actually static methods, it's better style to refactor them out of the class into plain old functions, which is what they really are already. You may have picked up this "class as namespace" style from Java, but it isn't idiomatic in Python: Python namespaces are represented by modules.
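For illustration, the module-as-namespace style looks like this (the module name utils is made up):

# utils.py -- plain functions grouped in a module
def myfunc():
    return "It works!"

# elsewhere:
import utils
utils.myfunc()  # "It works!"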
Unbound Methods
To create an unbound method (i.e., one whose first parameter isn't self), decorate the method with the @staticmethod built-in decorator. If decorators or any of that is not making sense, check out the wiki, this simple explanation, decorators as syntactic sugar, and learn how to write a good one.
>>> class foo(object):
...     @staticmethod
...     def bar(blah_text):
...         print "Unbound method `bar` of Class `foo`"
...         return blah_text
...
>>> foobar = foo.bar
>>> foobar("We are the Knights who say 'Ni'!")
Unbound method `bar` of Class `foo`
"We are the Knights who say 'Ni'!"
Bound Methods
These methods are not technically 'bound', but are meant to be bound when called. You just have to grab a reference to them and, voilà, you have a reference to that method. Then you just have to pass in a valid instance of that class:
>>> class foo:
...     def __init__(self, bar_value='bar'):
...         self.bar_value = bar_value
...     def bar(self, blah_text):
...         return self.bar_value + blah_text
...
>>> bar = foo.bar
>>> bar(foo('We are the Knights who say '), "'Ni'")
"We are the Knights who say 'Ni'"
Edit: as pointed out in the comments, it seems my usage of 'binding' is wrong. Could somebody with knowledge of it edit/correct my post?
You can call the function with the class name as a parameter, if you do not want to lose self:
class something:
    def test(self):
        print('Hello')

something.test(something)
# prints "Hello"
Edit: I'm using Python 3 (some people asked).
I think this is just a syntax question, but I want to be sure there's nothing I'm missing. Notice the syntax difference in how Foo and Bar are implemented. They achieve the same thing and I want to make sure they're really doing the same thing. The output suggests that there are just two ways to do the same thing. Is that the case?
Code:
class X:
    def some_method(self):
        print("X.some_method called")

class Y:
    def some_method(self):
        print("Y.some_method called")

class Foo(X, Y):
    def some_method(self):
        X().some_method()
        Y().some_method()
        print("Foo.some_method called")

class Bar(X, Y):
    def some_method(self):
        X.some_method(self)
        Y.some_method(self)
        print("Bar.some_method called")

print("=== Fun with Foo ===")
foo_instance = Foo()
foo_instance.some_method()

print("=== Fun with Bar ===")
bar_instance = Bar()
bar_instance.some_method()
Output:
=== Fun with Foo ===
X.some_method called
Y.some_method called
Foo.some_method called
=== Fun with Bar ===
X.some_method called
Y.some_method called
Bar.some_method called
PS - Hopefully it goes without saying, but this is just an abstract example; let's not worry about why I'd want to call some_method on both ancestors. I'm just trying to understand the syntax and mechanics of the language here. Thanks, all!
You should be using new-style classes. If this is Python 3, you are; if you are using Python 2, you should inherit from object (or some other new-style class).
The usual way to invoke ancestor methods is with super. Read about it in the standard docs and the other excellent articles on how it operates. Invoking the methods the way you are doing is not recommended, because (a) it is fragile in the face of further inheritance, and (b) it increases the maintenance effort by hardcoding references to classes.
Update: Here is an example showing how to use super to achieve this: http://ideone.com/u3si2
Also look at: http://rhettinger.wordpress.com/2011/05/26/super-considered-super/
Update 2: Here's a little library for python 2 that adds a __class__ variable and a no-args super to every method to avoid hardcoding the current name: https://github.com/marcintustin/superfixer
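For reference, here is a minimal Python 3 sketch of the cooperative super() style for the diamond in the question (the Base class is my addition, so the chain terminates cleanly):

class Base:
    def some_method(self):
        pass  # end of the cooperative chain

class X(Base):
    def some_method(self):
        print("X.some_method called")
        super().some_method()

class Y(Base):
    def some_method(self):
        print("Y.some_method called")
        super().some_method()

class Bar(X, Y):
    def some_method(self):
        super().some_method()
        print("Bar.some_method called")

Bar().some_method()
# X.some_method called
# Y.some_method called
# Bar.some_method called

A single super().some_method() in Bar walks the whole MRO (Bar, X, Y, Base), so both ancestors run without hardcoding their names.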
They aren't the same. X() creates a new object of class X. When you do X().some_method(), you create a new object and then call the method on that object, not on self. X.some_method(self) is what you want, since it calls the inherited method on the same object.
You will see the difference if your method actually does anything to the self object. For instance, if you put self.blah = 8 into your method, then after X.some_method(self) the object you call it on will have the blah attribute set, but after X().some_method() it will not. (Instead, you will have created a new object, set blah on that, and then thrown the new object away without using it, leaving the original object untouched.)
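A minimal sketch of that point (the method name set_blah is made up):

class X(object):
    def set_blah(self):
        self.blah = 8

class Bar(X):
    pass

b = Bar()
X.set_blah(b)               # runs on b itself
print(b.blah)               # 8

b2 = Bar()
X().set_blah()              # runs on a throwaway X instance
print(hasattr(b2, 'blah'))  # False -- b2 was never touched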
Here is a simple example modifying your code:
>>> class X:
...     def some_method(self):
...         print("X.some_method called on", self)
...
>>> class Y:
...     def some_method(self):
...         print("Y.some_method called on", self)
...
>>> class Foo(X, Y):
...     def some_method(self):
...         X().some_method()
...         Y().some_method()
...         print("Foo.some_method called on", self)
...
>>> class Bar(X, Y):
...     def some_method(self):
...         X.some_method(self)
...         Y.some_method(self)
...         print("Bar.some_method called on", self)
...
>>> Foo().some_method()
('X.some_method called on', <__main__.X instance at 0x0142F3C8>)
('Y.some_method called on', <__main__.Y instance at 0x0142F3C8>)
('Foo.some_method called on', <__main__.Foo instance at 0x0142F3A0>)
>>> Bar().some_method()
('X.some_method called on', <__main__.Bar instance at 0x0142F3C8>)
('Y.some_method called on', <__main__.Bar instance at 0x0142F3C8>)
('Bar.some_method called on', <__main__.Bar instance at 0x0142F3C8>)
Note that when I use Foo, the objects printed are not the same; one is an X instance, one is a Y instance, and the last is the original Foo instance that I called the method on. When Bar is used, it is the same object in each method call.
(You can also use super in some cases to avoid naming the base classes explicitly, e.g., super(Foo, self).some_method(), or in Python 3 just super().some_method(). However, if you need to directly call inherited methods from two base classes, super might not be a good fit. It is generally aimed at cases where each method calls super just once, passing control to the next version of the method in the inheritance chain, which then passes it along to the next, and so on.)
Really sorry for the extremely stupid title, but if I knew what it was, I wouldn't be writing here (:
def some_decorator(func):
    # ...

class A:
    @some_decorator
    def func():
        pass

    @func.some_decorator  # this one here - func.some_decorator?
    def func():
        pass
some_decorator decorates func - that's OK. But what is func.some_decorator, and how does some_decorator become a member (or something else?) of func?
P.S. I'm 90% sure there's already such a question here (as this seems like something basic), but I don't know how to search for it. If there's an exact duplicate, I'll delete this question.
Note: it's not a typo, nor an accident, that both member functions are named func. The decorator is for overloading; the question is related to Decorating method (class methods overloading).
Remember that the function definition with a decorator is equivalent to this:
def func():
pass
func = some_decorator(func)
So in the following lines, func doesn't refer to the function you defined but to whatever the decorator turned it into. Also note that decorators can return any object, not just functions. So some_decorator returns an object with a method that is itself a decorator (it's unfortunate that the names some_decorator and func are reused in the example - it's confusing, but doesn't change the concept). Since the expression after the @ is evaluated first, you still have a reference to the first decorator's method after you define another plain function func, and that decorator is then applied to the new function. The full example is therefore equivalent to this:
class A:
    def func():
        pass
    func = some_decorator(func)

    _decorator = func.some_decorator

    def func():
        pass
    func = _decorator(func)
One way to clarify this is to demonstrate it with a concrete example that behaves like this: the built-in property descriptor:
class C(object):
    @property
    def x(self):
        "This is a property object, not a function"
        return self._x

    @x.setter
    def x(self, val):
        self._x = val
>>> c = C()
>>> c.x = 1
>>> c.x
1
>>> C.x
<property object at 0x2396100>
>>> C.x.__doc__
'This is a property object, not a function'
>>> C.x.getter.__doc__
'Descriptor to change the getter on a property.'
>>> C.x.setter.__doc__
'Descriptor to change the setter on a property.'
>>> C.x.deleter.__doc__
'Descriptor to change the deleter on a property.'
The first invocation of property (as a decorator) means that x is not a function - it is a property descriptor. A feature of properties is that they allow you to initially define just the fget method, and then provide fset and fdel later by using the property.setter and property.deleter decorators (although since each of these creates a new property object, you do need to make sure to use the same name each time).
Something similar will usually be the case whenever you see code using this kind of pattern. Ideally, the naming of the decorators involved will make it reasonably clear what is going on (e.g. most people seem to grasp the idiom for defining property attributes reasonably easily).
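For instance, here is a toy sketch of how such a decorator pair could be built (this is my own illustration, not the actual overloading decorator the question refers to):

class some_decorator(object):
    """Decorating with @some_decorator replaces func with this object,
    which then offers its own some_decorator method as a further decorator."""
    def __init__(self, func):
        self.first = func

    def some_decorator(self, func):
        # used as @func.some_decorator on the second definition
        self.second = func
        return self

class A(object):
    @some_decorator
    def func(self):
        return "first"

    @func.some_decorator  # func is the some_decorator instance at this point
    def func(self):
        return "second"

print(A.func.first, A.func.second)  # both plain functions are captured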
As far as I know, self is just a very strong convention; it isn't really a reserved keyword in Python. Java and C# have this as a keyword. I find it really weird that Python didn't make it a reserved keyword. Is there any reason behind this?
Because self is just a parameter to a function, like any other parameter. For example, the following call:
a = A()
a.x()
essentially gets converted to:
a = A()
A.x(a)
Not making self a reserved word also has the fortunate result that, for class methods, you can rename the first parameter to something else (normally cls). And of course, for static methods, the first parameter has no relationship to the instance the method is called on. For example:
class A:
    def method(self):
        pass

    @classmethod
    def class_method(cls):
        pass

    @staticmethod
    def static_method():
        pass

class B(A):
    pass

b = B()
b.method()        # self is b
b.class_method()  # cls is B
b.static_method() # no parameter passed
Guido van Rossum has blogged on why explicit self has to stay: http://neopythonic.blogspot.com/2008/10/why-explicit-self-has-to-stay.html
I believe that post provides some insight into the design decisions behind explicit self.
Because a method is just a function whose first parameter is used to pass the object. You can write a method like this:
class Foo(object):
    pass

def setx(self, x):
    self.x = x

Foo.setx = setx
foo = Foo()
foo.setx(42)
print(foo.x)  # prints 42
Whatever the merit or otherwise of this philosophy, it does result in a more unified notion of functions and methods.
I'm busy creating a metaclass that replaces a stub function on a class with a new one with a proper implementation. The original function could use any signature. My problem is that I can't figure out how to create a new function with the same signature as the old one. How would I do this?
Update
This has nothing to do with the actual question which is "How do I dynamically create a function with the same signature as another function?" but I'm adding this to show why I can't use subclasses.
I'm trying to implement something like Scala case classes in Python. (Not the pattern-matching aspect, just the automatically generated properties and the __eq__, __hash__ and __str__ methods.)
I want something like this:
>>> class MyCaseClass():
...     __metaclass__ = CaseMetaClass
...     def __init__(self, a, b):
...         pass
...
>>> instance = MyCaseClass(1, 'x')
>>> instance.a
1
>>> instance.b
'x'
>>> str(instance)
"MyCaseClass(1, 'x')"
As far as I can see, there is no way to do that with subclasses.
I believe functools.wraps does not reproduce the original call signature. However, Michele Simionato's decorator module does:
import decorator

class FooType(type):
    def __init__(cls, name, bases, clsdict):
        @decorator.decorator
        def modify_stub(func, *args, **kw):
            return func(*args, **kw) + ' + new'
        setattr(cls, 'stub', modify_stub(clsdict['stub']))

class Foo(object):
    __metaclass__ = FooType
    def stub(self, a, b, c):
        return 'original'

foo = Foo()
help(foo.stub)
# Help on method stub in module __main__:
# stub(self, a, b, c) method of __main__.Foo instance

print(foo.stub(1, 2, 3))
# original + new
Use functools.wraps:
>>> from functools import wraps
>>> def f(a, b):
...     return a + b
...
>>> @wraps(f)
... def f2(*args):
...     print(args)
...     return f(*args)
...
>>> f2(2, 5)
(2, 5)
7
It is possible to do this, using inspect.getargspec. There's even a PEP in place to make it easier.
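For instance, here is a minimal sketch of how a dynamically created function can report a copied signature in modern Python (the __signature__ attribute is honored by inspect.signature() and tools built on it; the names here are made up):

import inspect

def make_replacement(original):
    def replacement(*args, **kwargs):
        return 'replaced'
    replacement.__name__ = original.__name__
    # Python 3.3+: inspect.signature() will now report the
    # original signature for the new function
    replacement.__signature__ = inspect.signature(original)
    return replacement

def stub(self, a, b, c):
    return 'original'

print(inspect.signature(make_replacement(stub)))  # (self, a, b, c)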
BUT -- this is not a good thing to do. Can you imagine how much of a debugging/maintenance nightmare it would be to have your functions dynamically created at runtime -- and not only that, but done so by a metaclass?! I don't understand why you have to replace the stub dynamically; can't you just change the code when you want to change the function? I mean, suppose you have a class
class Spam(object):
    def ham(self, a, b):
        return NotImplemented
Since you don't know what it's meant to do, the metaclass can't actually implement any functionality. If you knew what ham were meant to do, you could do it in ham or one of its parent classes, instead of returning NotImplemented.
I want to write a decorator that acts differently depending on whether it is applied to a function or to a method.
def some_decorator(func):
    if the_magic_happens_here(func):  # <---- point of interest
        print 'Yay, found a method ^_^ (unbound yet)'
    else:
        print 'Meh, just an ordinary function :/'
    return func

class MyClass(object):
    @some_decorator
    def method(self):
        pass

@some_decorator
def function():
    pass
I tried inspect.ismethod(), inspect.ismethoddescriptor() and inspect.isfunction() but no luck. The problem is that a method actually is neither a bound nor an unbound method but an ordinary function as long as it is accessed from within the class body.
What I really want to do is to delay the actions of the decorator to the point the class is actually instantiated because I need the methods to be callable in their instance scope. For this, I want to mark methods with an attribute and later search for these attributes when the .__new__() method of MyClass is called. The classes for which this decorator should work are required to inherit from a class that is under my control. You can use that fact for your solution.
In the case of a normal function, the delay is not necessary and the decorator should take action immediately. That is why I want to differentiate between these two cases.
I would rely on the convention that functions that will become methods have a first argument named self, and other functions don't. Fragile, but then, there's no really solid way.
So (pseudocode as I have comments in lieu of what you want to do in either case...):
import inspect
import functools

def decorator(f):
    spec = inspect.getargspec(f)
    if spec.args and spec.args[0] == 'self':
        # looks like a (future) method...
    else:
        # looks like a "real" function
    @functools.wraps(f)
    def wrapper # etc etc
One way to make it a bit more solid, as you say all classes involved inherit from a class under your control, is to have that class provide a metaclass (which will also of course be inherited by said classes) which checks things at the end of the class body. Make the wrapped function accessible e.g. by wrapper._f = f and the metaclass's __init__ can check that all wrapped methods did indeed have self as the first argument.
Unfortunately there's no easy way to check that other functions (non-future-methods) being wrapped didn't have such a first argument, since you're not in control of the environment in that case. The decorator might check for "top-level" functions (ones whose def is a top-level statement in their module) via the function's func_globals (the module's dict) and __name__ attributes -- if the function is such a global, presumably it won't later be assigned as an attribute of the class (thereby becoming a future-method anyway;-), so a first argument named self, if present, can be diagnosed as wrong and warned about (while still treating the function as a real function;-).
One alternative would be to do the decoration in the decorator itself under the hypothesis of a real function, but also make available the original function object as wrapper._f. Then, the metaclass's __init__ can re-do the decoration for all functions in the class body that it sees have been marked this way. This approach is much more solid than the convention-relying one I just sketched, even with the extra checks. Still, something like
class Foo(Bar): ...  # no decorations

@decorator
def f(*a, **k): ...

Foo.f = f  # "a killer"... function becomes method!
would still be problematic -- you could try intercepting this with a __setattr__ in your metaclass (but then other assignments to class attributes after the class statement can become problematic).
The more the user's code has freedom to do funky things (and Python generally leaves the programmer a lot of such freedom), the harder time your "framework-y" code has keeping things under tight control instead, of course;-).
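For concreteness, here is a minimal Python 3 sketch of the re-decoration approach described above (helper names like _f and method_decorator are my own):

import functools

def decorator(f):
    # decorate under the "real function" hypothesis...
    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        print('function-style wrapper')
        return f(*args, **kwargs)
    wrapper._f = f  # ...but keep the original for the metaclass
    return wrapper

def method_decorator(f):
    @functools.wraps(f)
    def wrapper(self, *args, **kwargs):
        print('method-style wrapper')
        return f(self, *args, **kwargs)
    return wrapper

class Meta(type):
    def __init__(cls, name, bases, namespace):
        super().__init__(name, bases, namespace)
        for attr, value in namespace.items():
            original = getattr(value, '_f', None)
            if original is not None:
                # re-do the decoration, now knowing it's a method
                setattr(cls, attr, method_decorator(original))

class Bar(metaclass=Meta):
    @decorator
    def f(self):
        return 42

print(Bar().f())  # method-style wrapper, then 42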
From Python 3.3 onwards by using PEP 3155:
def some_decorator(func):
    if func.__name__ != func.__qualname__:
        print('Yay, found a method ^_^ (unbound yet)')
    else:
        print('Meh, just an ordinary function :/')
    return func
A method x of class A will have a __qualname__ that is A.x while a function x will have a __qualname__ of x.
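A quick illustration (Python 3.3+):

def function():
    pass

class A:
    def method(self):
        pass

print(function.__qualname__)  # 'function'
print(A.method.__qualname__)  # 'A.method'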
Do you need to have the magic happen where you choose which wrapper to return, or can you defer the magic until the function is actually called?
You could always try a parameter to your decorator to indicate which of the two wrappers it should use, like
from functools import wraps

def some_decorator(clams):
    def _mydecor(func):
        @wraps(func)
        def wrapping(*args, **kwargs):
            ...
        return wrapping

    def _myclassdecor(func):
        @wraps(func)
        def wrapping(*args, **kwargs):
            ...
        return wrapping

    return _mydecor if clams else _myclassdecor
The other thing that I might suggest is to create a metaclass and define the __init__ method in the metaclass to look for methods decorated with your decorator and revise them accordingly, as Alex hinted at. Use this metaclass with your base class; since all the classes that will use the decorator inherit from the base class, they'll also get the metaclass type and use its __init__ as well.
You just need to check to see if the function being decorated has an im_func attribute. If it does, then it is a method. If it doesn't then it is a function.
Note that the code sample below does the detection at call time but you can do it at decoration time as well. Just move the hasattr check to the outer decorator generator.
Python 2.6.4 (r264:75706, Dec 7 2009, 18:45:15)
[GCC 4.4.1] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> def deco(f):
...     def _wrapper(*args, **kwargs):
...         if hasattr(f, 'im_func'):
...             print 'method'
...         else:
...             print 'function'
...     return _wrapper
...
>>> deco(lambda x: None)()
function
>>> def f(x):
...     return x + 5
...
>>> deco(f)()
function
>>> class A:
...     def f(self, x):
...         return x + 5
...
>>> a = A()
>>> deco(a.f)()
method
>>> deco(A.f)()
method
>>>
Edit
Oh snap! I got it totally wrong. I really should have read Alex's post more thoroughly.
>>> class B:
...     @deco
...     def f(self, x):
...         return x + 5
...
>>> b = B()
>>> b.f()
function