Change method definition - python

I want to change the method definition of a class. This is my case
(I am importing these classes from another file):
class A(object):
    def __init__(self, str):
        self.str = str

    def method_a(self):
        print self.str

class B(object):
    def __init__(self, str):
        self.a = A(str)

    def method_b(self):
        self.a.method_a()

#######################################
from module import A, B

def main():
    b = B('hello')

    def my_method_a(self):
        print self.str + 'other definition'

    b.a.method_a = my_method_a
    b.method_b()

if __name__ == '__main__':
    main()
When I try to execute it, I get:
my_method_a() takes exactly 1 argument (0 given)
because it does not receive 'self'. Any help, please?

If you were to run type(b.a.method_a) before patching the method, you would see <type 'instancemethod'>. Running the same code after the patch produces <type 'function'>. In order for a function to work properly as a method, it must be an attribute of the class, not of an instance of the class. The following would work, as you are manually invoking the magic that produces a method from a function:
b.a.method_a = my_method_a.__get__(b.a, A)
See https://wiki.python.org/moin/FromFunctionToMethod for more information.
The difference is that when you call b.a.method_a() after the patch, method_a is an attribute of the instance b.a, not of the class A. As a result, the function's __get__ method is never called to produce an instancemethod object which already has b.a bound to the first argument of method_a.
From one perspective, b.a.method_a() is identical to A.method_a(b.a). How does Python make that transition? You need to understand the descriptor protocol. All function objects implement the __get__ method to return an instancemethod object, which you can think of as the original function with the first argument bound to the appropriate object. Consider this code:
b = B('hello')
b.a.method_a()
Does b have an attribute called a? Yes; we set it in B.__init__.
Does b.a have an attribute method_a? No.
Does type(b.a) (that is, A) have an attribute method_a? Yes.
Call A.method_a.__get__(b.a, A), since method_a was looked up for an instance. The result is an instance method object, with its first argument bound to b.a. (This is why you can consider b.a.method_a() identical to A.method_a(b.a)).
Call the resulting instance method with zero arguments.
Now consider this code.
b = B('hello')
b.a.method_a = my_method_a
b.a.method_a()
Does b have an attribute called a? Yes; we set it in B.__init__.
Does b.a have an attribute method_a? Yes. We set it just before we tried to call it.
Since b.a.method_a was an instance lookup, not a class lookup, the descriptor protocol is not invoked and b.a.method_a.__get__ is not called, even though my_method_a has a __get__ function just like every other function.
Call b.a.method_a with zero arguments.
This produces the error, since the function expects one argument.
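If you prefer a standard-library helper over calling __get__ directly, types.MethodType performs the same binding. A minimal self-contained sketch (Python 2 syntax, with the question's classes inlined so it runs on its own):
import types

class A(object):
    def __init__(self, str):
        self.str = str
    def method_a(self):
        print self.str

class B(object):
    def __init__(self, str):
        self.a = A(str)
    def method_b(self):
        self.a.method_a()

def my_method_a(self):
    print self.str + ' other definition'

b = B('hello')
# types.MethodType binds the plain function to the instance, so it receives
# `self` automatically -- the same effect as my_method_a.__get__(b.a, A)
b.a.method_a = types.MethodType(my_method_a, b.a)
b.method_b()  # prints: hello other definition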

Why not just use inheritance and method overrides?
from module import A, B

class myA(A):
    def method_a(self):
        print self.str + ' other definition'

class myB(B):
    def __init__(self, str):
        self.a = myA(str)

def main():
    b = myB('hello')
    b.method_b()

if __name__ == '__main__':
    main()

Related

Self Attributes Live in Function Pointer?

Suppose I have a simple python3 class like so:
class Test():
def __init__(self):
self.a = 'a'
def checkIsA(self, checkA):
return self.a == checkA
And some further code such as:
def tester(func, item):
    return func(item)

testObject = Test()
print(tester(testObject.checkIsA, 'a'))  # prints True
How is the function pointer(?) checkIsA still aware of its class member variables (defined by self) when used independently by another function?
I want to use this functionality in a program I'm writing but I'm worried I'm not understanding these semantics correctly.
testObject.checkIsA is what's called a bound method - it remembers the instance it was taken from, so its self parameter gets automatically filled in with that instance.
You can easily see the difference between a plain class function (unbound method) and an instance's bound method; Python will quite happily tell you all the details:
testObject = Test()
print(Test.checkIsA)        # <function Test.checkIsA at 0x00000000041236A8>
print(testObject.checkIsA)  # <bound method Test.checkIsA of
                            #  <__main__.Test object at 0x0000000004109390>>
You can simulate a call to testObject.checkIsA() directly through the unbound Test.checkIsA function by passing your instance, too, e.g.:
def tester(func, instance, item):
    return func(instance, item)

testObject = Test()
print(tester(Test.checkIsA, testObject, 'a'))  # prints True
Or, with functools.partial:
import functools

def tester(func, item):
    return func(item)

testObject = Test()
print(tester(functools.partial(Test.checkIsA, testObject), 'a'))  # prints True
And that's exactly what a bound instance method does for you behind the scenes, by supplying its stored __self__ value as the first (self) argument. You can check that, too:
testObject = Test()
print(testObject.checkIsA.__self__ is testObject) # True
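Related to that, a bound method also carries the underlying plain function in its __func__ attribute. A small sketch (Python 3, repeating the question's Test class) showing that a bound method is essentially __func__ plus __self__:
class Test():
    def __init__(self):
        self.a = 'a'
    def checkIsA(self, checkA):
        return self.a == checkA

testObject = Test()
bound = testObject.checkIsA

# a bound method is just the plain function plus a stored instance
print(bound.__func__ is Test.checkIsA)  # True
print(bound.__self__ is testObject)     # True

# calling it is equivalent to passing the instance explicitly
print(bound('a') == bound.__func__(bound.__self__, 'a'))  # True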

Why does Python understand "self", "this" and "that"?

I am new to Python with a Java background, and the concept of "self" in functions confuses me. I understand that the first argument "self" means the object itself, but I do not understand how Python makes this work. I also know that I could use "this" or "that" or "somethingElse", and Python would still understand that I mean the object.
I copied some code from a reddit post:
class A():
    def __init__(self):
        self.value = ""

    def b(this):
        this.value = "b"

    def c(that):
        that.value = "c"

a = A()
print(a.value)
a.b()
print(a.value)   # "b"
a.c()
print(a.value)   # "c"
How does Python know that I do not mean to pass an object as the first argument here? For example, I modified the above code a bit:
class A():
    def __init__(self):
        self.value = ""

    def b(this):
        this.value = "b"

    def c(that):
        that.value = "c"

    def somethingElse(someObjectIWantToPass):
        someObjectIWantToPass.value = "still referring A.value"

class B():
    def __init__(self):
        self.value = ""

a = A()
print(a.value)
a.b()
print(a.value)
a.c()
print(a.value)
a.somethingElse()
print(a.value)
b = B()
a.somethingElse(b)
print(b.value)
And it broke:
b
c
still referring A.value
Traceback (most recent call last):
  File "D:/Documents/test.py", line 32, in <module>
    a.somethingElse(b)
TypeError: somethingElse() takes 1 positional argument but 2 were given
A method's first argument is always¹ its instance. Calling it self is idiomatic in Python, but that name is strictly a convention.
class A():
    def some_method(me):  # not called `self`
        print(str(id(me)))

a = A()
a.some_method()
print(id(a))
If you're trying to pass another arbitrary object in, it has to be the second argument.
class B():
    def another_method(self, other):
        print(id(other))

b = B()
b.another_method(a)
print(id(b))  # different!
print(id(a))  # the same.
¹ Not actually always. @classmethod-decorated methods use cls as their first argument, and @staticmethod-decorated methods have nothing passed to their first argument by default.
class C():
    @classmethod
    def some_classmethod(cls, other, arguments):
        # first argument is not the instance, but
        # the class C itself.
        pass

    @staticmethod
    def something_related(other, arguments):
        # the first argument gets neither the instance
        # nor the class.
        pass
You are too focused on syntactic sugar. Just realize that the first parameter of a non-static member function in Python is the reference to the current object. Whether you want to call it this, that, foobar, or poop, it doesn't matter. The first parameter of a member function is considered the reference to the object on which the method is called.
The use of self is just the universal name everyone understands and the one Python recommends - a convention, if you will.
The same goes for **kwargs and *args. These are simply conventions that have permeated the Python ecosystem and everyone just uses them that way, but it doesn't mean you can't give them different names.
Your last example broke because the method you are calling (A.somethingElse) takes no parameters besides the instance itself. This will make sense if you understood what I said earlier about the first parameter of a non-static member function being a reference to the object on which the method is called.
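A minimal sketch of a possible fix, assuming the intent was for somethingElse to receive another object in addition to the instance it is called on: give it a second parameter.
class A():
    def __init__(self):
        self.value = ""

    # the second parameter receives whatever object the caller passes in
    def somethingElse(self, someObjectIWantToPass):
        someObjectIWantToPass.value = "set from A.somethingElse"

class B():
    def __init__(self):
        self.value = ""

a = A()
b = B()
a.somethingElse(b)   # `a` fills the first parameter, `b` the second
print(b.value)       # set from A.somethingElse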

How to pass arguments to python function whose first parameter is self?

Take the following simplified example.
class A(object):
    variable_A = 1
    variable_B = 2

    def functionA(self, param):
        print(param + self.variable_A)

print(A.functionA(3))
In the above example, I get the following error
Traceback (most recent call last):
  File "python", line 8, in <module>
TypeError: functionA() missing 1 required positional argument: 'param'
But if I remove self from the function declaration, I am not able to access the variables variable_A and variable_B of the class, and I get the following error:
Traceback (most recent call last):
  File "python", line 8, in <module>
  File "python", line 6, in functionA
NameError: name 'self' is not defined
So, how do I access the class variables and not get the param error here?
I am using Python 3 FYI.
You must first create an instance of the class A
class A(object):
    variable_A = 1
    variable_B = 2

    def functionA(self, param):
        return (param + self.variable_A)

a = A()
print(a.functionA(3))
You can use the staticmethod decorator if you don't want to use an instance.
Static methods are a special case of methods. Sometimes, you'll write code that belongs to a class, but that doesn't use the object itself at all.
class A(object):
    variable_A = 1
    variable_B = 2

    @staticmethod
    def functionA(param):
        return (param + A.variable_A)

print(A.functionA(3))
Another option is to use the classmethod decorator.
Class methods are methods that are not bound to an object, but to a class!
class A(object):
    variable_A = 1
    variable_B = 2

    @classmethod
    def functionA(cls, param):
        return (param + cls.variable_A)

print(A.functionA(3))
functionA in your snippet above is an instance method. You do not pass "self" directly to it. Instead, you need to create an instance in order to use it. The "self" argument of the function is, in fact, the instance it's called on. E.g.:
a = A()
a.functionA(3)
P.S.
Note that your functionA calls print but doesn't return anything, meaning it implicitly returns None. You should either have it return a value and print it from the caller, or, as I have done above, call it and let it print on its own.
Create an object of A first.
a = A()
a.functionA(3)
When a function object (what the def statement creates) is an attribute of a class AND is looked up (using the obj.attrname scheme) on the class or an instance of the class, it gets turned into a method object. This method object is itself a callable. If the lookup happens on an instance, this instance will be "magically" inserted as the first argument to the function. If not, you will have to provide it by yourself (just like you would for any other argument).
You can read more about this (and how the "magic" happens) here: https://wiki.python.org/moin/FromFunctionToMethod
In your case, you look up the function on the class, so it expects two arguments (self and param), but you only pass param, hence the error.
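For illustration only (creating an instance as in the other answers is the usual fix), you could keep the class-level lookup and supply the instance yourself:
a = A()
# looked up on the class, the function is not bound, so `self` must be passed explicitly
A.functionA(a, 3)   # equivalent to a.functionA(3); prints 4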
You defined variable_A and variable_B as class attributes (attributes that will be shared between all instances of the class). If that's really the intention, and you want a method you can call on the class itself and that will be able to access class attributes, you can make functionA a classmethod (it works the same as an "instance" method except it's the class that is 'magically' inserted as first argument):
class A(object):
    variable_A = 1
    variable_B = 2

    @classmethod
    def functionA(cls, param):
        return param + cls.variable_A
Then you can call functionA either directly on the class itself:
print(A.functionA(42))
or on an instance if you already have one at hand:
a = A()
# ...
print(a.functionA(42))
Now if you really wanted variable_A and variable_B to be per-instance attributes (each instance of A has its own distinct values), you need to 1/ create those attributes on the instance itself in the initializer method and 2/ call functionA on some A instance, i.e.:
class A(object):
    def __init__(self, variable_A=1, variable_B=2):
        self.variable_A = variable_A
        self.variable_B = variable_B

    def functionA(self, param):
        return param + self.variable_A

a1 = A()  # using default values
print(a1.functionA(42))

a2 = A(5)  # custom value for variable_A
print(a2.functionA(42))
class A(object):
    variable_A = 1
    variable_B = 2

    def functionA(self, param):
        print(param + self.variable_A)

A().functionA(3)

A() is calling the class to create an instance. Output:

4
[Program finished]
You can also use return in the function and then print the result at the end. I posted this answer following the OP's template; the accepted answer and the other answers are the recommended way to do it.

python: super()-like proxy object that starts the MRO search at a specified class

According to the docs, super(cls, obj) returns
a proxy object that delegates method calls to a parent or sibling
class of type cls
I understand why super() offers this functionality, but I need something slightly different: I need to create a proxy object that delegates method calls (and attribute lookups) to class cls itself; and as in super, if cls doesn't implement the method/attribute, my proxy should continue looking in the MRO order (of the new, not the original class). Is there any function I can write that achieves this?
Example:
class X:
    def act(self):
        ...

class Y:
    def act(self):
        ...

class A(X, Y):
    def act(self):
        ...

class B(X, Y):
    def act(self):
        ...

class C(A, B):
    def act(self):
        ...

c = C()
b = some_magic_function(B, c)
# `b` needs to delegate calls to `act` to B, and look up attribute `s` in B
# I will pass `b` somewhere else, and have no control over it
Of course, I could do b = super(A, c), but that relies on knowing the exact class hierarchy and the fact that B follows A in the MRO. It would silently break if any of these two assumptions change in the future. (Note that super doesn't make any such assumptions!)
If I just needed to call b.act(), I could use B.act(c). But I am passing b to someone else, and have no idea what they'll do with it. I need to make sure it doesn't betray me and start acting like an instance of class C at some point.
A separate question, the documentation for super() (in Python 3.2) only talks about its method delegation, and does not clarify that attribute lookups for the proxy are also performed the same way. Is it an accidental omission?
EDIT
The updated Delegate approach works in the following example as well:
class A:
    def f(self):
        print('A.f')

    def h(self):
        print('A.h')
        self.f()

class B(A):
    def g(self):
        self.f()
        print('B.g')

    def f(self):
        print('B.f')

    def t(self):
        super().h()

a_true = A()
a_true.h()  # instance of A ends up executing A.f

b = B()
a_proxy = Delegate(A, b)
a_proxy.h()  # *unlike* super(), the updated `Delegate` implementation would call A.f, not B.f
Note that the updated class Delegate is closer to what I want than super() for two reasons:
super() only does its proxying for the first call; subsequent calls happen as normal, since by then the underlying object is used, not its proxy.
super() does not allow attribute access.
Thus, my question as asked has a (nearly) perfect answer in Python.
It turns out that, at a higher level, I was trying to do something I shouldn't (see my comments here).
This class should cover the most common cases:
class Delegate:
def __init__(self, cls, obj):
self._delegate_cls = cls
self._delegate_obj = obj
def __getattr__(self, name):
x = getattr(self._delegate_cls, name)
if hasattr(x, "__get__"):
return x.__get__(self._delegate_obj)
return x
Use it like this:
b = Delegate(B, c)
(with the names from your example code.)
Restrictions:
You cannot retrieve some special attributes like __class__ etc. from the class you pass in the constructor via this proxy. (This restriction also applies to super.)
This might behave weirdly if the attribute you want to retrieve is some weird kind of descriptor.
Edit: If you want the code in the update to your question to work as desired, you can use the following code:
class Delegate:
    def __init__(self, cls):
        self._delegate_cls = cls

    def __getattr__(self, name):
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self)
        return x
This passes the proxy object as self parameter to any called method, and it doesn't need the original object at all, hence I deleted it from the constructor.
If you also want instance attributes to be accessible you can use this version:
class Delegate:
    def __init__(self, cls, obj):
        self._delegate_cls = cls
        self._delegate_obj = obj

    def __getattr__(self, name):
        if name in vars(self._delegate_obj):
            return getattr(self._delegate_obj, name)
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self)
        return x
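A short usage sketch for this last version, assuming the A and B classes from the EDIT above: instance attributes resolve through the wrapped object, while method lookup starts at the class given to the constructor.
b = B()
b.marker = 42            # a plain instance attribute

a_proxy = Delegate(A, b)
print(a_proxy.marker)    # 42 -- found in the wrapped instance's __dict__
a_proxy.h()              # lookup starts at A, so this prints 'A.h' then 'A.f'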
A separate question, the documentation for super() (in Python 3.2)
only talks about its method delegation, and does not clarify that
attribute lookups for the proxy are also performed the same way. Is it
an accidental omission?
No, this is not accidental. super() does nothing for attribute lookups. The reason is that attributes on an instance are not associated with a particular class, they're just there. Consider the following:
class A:
    def __init__(self):
        self.foo = 'foo set from A'

class B(A):
    def __init__(self):
        super().__init__()
        self.bar = 'bar set from B'

class C(B):
    def method(self):
        self.baz = 'baz set from C'

class D(C):
    def __init__(self):
        super().__init__()
        self.foo = 'foo set from D'
        self.baz = 'baz set from D'

instance = D()
instance.method()
instance.bar = 'not set from a class at all'
Which class "owns" foo, bar, and baz?
If I wanted to view instance as an instance of C, should it have a baz attribute before method is called? How about afterwards?
If I view instance as an instance of A, what value should foo have? Should bar be invisible because it was only added in B, or visible because it was set to a value outside the class?
All of these questions are nonsense in Python. There's no possible way to design a system with the semantics of Python that could give sensible answers to them. __init__ isn't even special in terms of adding attributes to instances of the class; it's just a perfectly ordinary method that happens to be called as part of the instance creation protocol. Any method (or indeed code from another class altogether, or not from any class at all) can create attributes on any instance it has a reference to.
In fact, all of the attributes of instance are stored in the same place:
>>> instance.__dict__
{'baz': 'baz set from C', 'foo': 'foo set from D', 'bar': 'not set from a class at all'}
There's no way to tell which of them were originally set by which class, or were last set by which class, or whatever measure of ownership you want. There's certainly no way to get at "the A.foo being shadowed by D.foo", as you would expect from C++; they're the same attribute, and any writes to it by one class (or from elsewhere) will clobber a value left in it by the other class.
The consequence of this is that super() does not perform attribute lookups the same way it does method lookups; it can't, and neither can any code you write.
In fact, from running some experiments, neither super nor Sven's Delegate actually support direct attribute retrieval at all!
class A:
    def __init__(self):
        self.spoon = 1
        self.fork = 2

    def foo(self):
        print('A.foo')

class B(A):
    def foo(self):
        print('B.foo')

b = B()
d = Delegate(A, b)
s = super(B, b)
Then both work as expected for methods:
>>> d.foo()
A.foo
>>> s.foo()
A.foo
But:
>>> d.fork
Traceback (most recent call last):
  File "<pyshell#43>", line 1, in <module>
    d.fork
  File "/tmp/foo.py", line 6, in __getattr__
    x = getattr(self._delegate_cls, name)
AttributeError: type object 'A' has no attribute 'fork'
>>> s.spoon
Traceback (most recent call last):
  File "<pyshell#45>", line 1, in <module>
    s.spoon
AttributeError: 'super' object has no attribute 'spoon'
So they both only really work for calling some methods on, not for passing to arbitrary third party code to pretend to be an instance of the class you want to delegate to.
They don't behave the same way in the presence of multiple inheritance unfortunately. Given:
class Delegate:
    def __init__(self, cls, obj):
        self._delegate_cls = cls
        self._delegate_obj = obj

    def __getattr__(self, name):
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self._delegate_obj)
        return x

class A:
    def foo(self):
        print('A.foo')

class B:
    pass

class C(B, A):
    def foo(self):
        print('C.foo')

c = C()
d = Delegate(B, c)
s = super(C, c)
Then:
>>> d.foo()
Traceback (most recent call last):
  File "<pyshell#50>", line 1, in <module>
    d.foo()
  File "/tmp/foo.py", line 6, in __getattr__
    x = getattr(self._delegate_cls, name)
AttributeError: type object 'B' has no attribute 'foo'
>>> s.foo()
A.foo
Because Delegate ignores the full MRO of whatever class _delegate_obj is an instance of, only using the MRO of _delegate_cls. Whereas super does what you asked in the question, but the behaviour seems quite strange: it's not wrapping an instance of C to pretend it's an instance of B, because direct instances of B don't have foo defined.
Here's my attempt:
class MROSkipper:
    def __init__(self, cls, obj):
        self.__cls = cls
        self.__obj = obj

    def __getattr__(self, name):
        mro = self.__obj.__class__.__mro__
        i = mro.index(self.__cls)
        if i == 0:
            # It's at the front anyway, just behave as getattr
            return getattr(self.__obj, name)
        else:
            # Check __dict__ not getattr, otherwise we'd find methods
            # on classes we're trying to skip
            try:
                return self.__obj.__dict__[name]
            except KeyError:
                return getattr(super(mro[i - 1], self.__obj), name)
I rely on the __mro__ attribute of classes to properly figure out where to start from, then I just use super. You could walk the MRO chain from that point yourself checking class __dict__s for methods instead if the weirdness of going back one step to use super is too much.
I've made no attempt to handle unusual attributes; those implemented with descriptors (including properties), or those magic methods looked up behind the scenes by Python, which often start at the class rather than the instance directly. But this behaves as you asked moderately well (with the caveat expounded on ad nauseam in the first part of my post; looking up attributes this way will not give you any different results than looking them up directly in the instance).
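A usage sketch, assuming the A, B and C classes from the multiple-inheritance example above: MROSkipper starts the search at the class you name and then follows the instance's own MRO.
c = C()

skipper = MROSkipper(B, c)
skipper.foo()            # A.foo -- starts at B in C's MRO and falls through to A

MROSkipper(C, c).foo()   # C.foo -- asking for the front class is plain attribute access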

Creating a function from a member of an instance for another instance in python

Imagine that I have f, which is a bound method taken from a class instance:
class A:
    def b(self):
        print 'hey'

a = A()
f = a.b
If I have another instance of the same class, let's say c = A(), how can I reconstruct a new ff using only f and c, so that calling ff() results in c.b() instead of a.b()?
c = A()
ff = some_python_kungfu(f,c)
ff() #it is calling c.b()
Can you use a method reference for the class instead of the instance reference?
class A:
    def whoami(self):
        print 'I am %s' % id(self)

a = A()
c = A()

func = A.whoami
func(a)
func(c)
So you want to know how to rebind an already bound method to another instance, using only the bound method and the other instance. It can be done like this:
def some_python_kungfu(meth, obj):
    return type(meth)(meth.__func__, obj, obj.__class__)
The __func__ attribute is really the same as Ned Batchelder's im_func, but __func__ is forward-compatible with Python 3.
There is one case where this will not work: methods of built-in classes. The __func__ and im_func attributes are only available on user-defined classes. Therefore, this will fail:
a = "that's no ordinary rabbit"
b = "consult the book of armaments"
b_split = some_python_kungfu(a.split, b)
A slight modification of Ned's solution will work on both built-in and user-defined classes:
def some_python_kungfu(meth, obj):
    return getattr(obj, meth.__name__)
So will this always work then? Well... no, but the stumbling block is a rather obscure and (I guess) seldom occurring problem: if the name of the method (meth.__name__) is not the same as the name it has in the class dictionary ('b'), then getattr will either return the wrong attribute or raise an AttributeError. For example:
def external(self):
    pass

class A(object):
    b = external
Here A.b.__name__ == 'external' instead of 'b', so getattr(obj, 'external') will be called instead of getattr(obj, 'b').
While both previous approaches have problems, one with built-in classes and one with patched-together classes, both problems do not occur simultaneously in any circumstance. Therefore, a combination will work in all cases:
def some_python_kungfu(meth, obj):
    try:
        return type(meth)(meth.__func__, obj, obj.__class__)
    except AttributeError:
        # meth is a built-in method, so meth.__name__ is always correct
        return getattr(obj, meth.__name__)
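A quick usage sketch under the same assumptions (Python 2 syntax, matching the question's code):
class A:
    def b(self):
        print 'hey from %s' % id(self)

a = A()
c = A()

f = a.b
ff = some_python_kungfu(f, c)
ff()   # now effectively calls c.b(), printing the id of c

# the getattr fallback handles built-in methods too
split_b = some_python_kungfu("spam eggs".split, "holy grail")
print split_b()   # ['holy', 'grail']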
As explained elsewhere on this page, your best bet would probably be to ignore this whole mess and do it in some cleaner way, for instance using the unbound method and passing in the first argument (self) manually, as in Cixate's answer. But who knows, this may prove useful to some of you some day, in a somewhat bizarre set of circumstances. ;)
I'm not sure this would work in all cases, but:
def some_python_kungfu(meth, obj):
    """Get a bound method on `obj` corresponding to the method `meth`."""
    return getattr(obj, meth.im_func.__name__)
