I am working through the O'Reilly Python Cookbook and have been struggling with the code below. It has to do with calling a method on a parent class using super():
class Proxy:
    def __init__(self, obj):
        self._obj = obj

    # Delegate attribute lookup to internal obj
    def __getattr__(self, name):
        return getattr(self._obj, name)

    # Delegate attribute assignment
    def __setattr__(self, name, value):
        if name.startswith('_'):
            super().__setattr__(name, value)  # Call original __setattr__
        else:
            setattr(self._obj, name, value)
if __name__ == '__main__':
    class A:
        def __init__(self, x):
            self.x = x

        def spam(self):
            print('A.spam')

    a = A(42)
    p = Proxy(a)
    print(p.x)
    print(p.spam())
    p.x = 37
    print('Should be 37:', p.x)
    print('Should be 37:', a.x)
The book states:
In this code the implementation of __setattr__() includes a name
check. If the name starts with an underscore it invokes the original
implementation of __setattr__() using super(). Otherwise, it delegates
to the internally held object self._obj.
I am confused. How does super() work then if there is no explicit base class listed?
What exactly then is super() referring to?
There is always a base class; with none explicitly mentioned, Proxy inherits directly from object.
Each class defines a method resolution order (MRO), determined recursively from its base class(es) and their ancestors. When super() is called, it resolves to a proxy for the next class in the MRO of type(self), whether or not that class appears in the MRO of the class you are currently defining.
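In the Proxy example above, that chain is very short:

# Proxy lists no bases, so its MRO is just (Proxy, object)
print(Proxy.__mro__)   # (<class '__main__.Proxy'>, <class 'object'>)
# Inside Proxy.__setattr__, super().__setattr__(name, value) therefore
# resolves to object.__setattr__, which stores the attribute on the Proxy
# instance itself instead of delegating it to self._obj.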
Consider the following classes:
class A:
    def foo(self):
        print("A.foo")

class B(A):
    def foo(self):
        super().foo()
        print("B.foo")

class C(A):
    def foo(self):
        super().foo()
        print("C.foo")

class D(C):
    def foo(self):
        super().foo()
        print("D.foo")

class E(B, D):
    def foo(self):
        super().foo()
        print("E.foo")
e = E()
The MRO of E is [E, B, D, C, A, object]. When you call e.foo(), you start a chain of calls in MRO order. In particular, the call to super in B.foo does not invoke A.foo, but D.foo, a method in a class B knows nothing about, as D is not an ancestor of B. But both B and D are ancestors of E, which is what matters.
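Running it shows the whole chain; each method defers to the next class in E's MRO before doing its own printing:

e.foo()
# A.foo
# C.foo
# D.foo
# B.foo
# E.foo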
Related
I came across the code below:
class MetaStrategy(StrategyBase.__class__): pass
I am not sure why it is not just written like this:
class MetaStrategy(StrategyBase): pass
Definition schematic
class StrategyBase(DataAccessor):
    pass

class DataAccessor(LineIterator):
    pass

class LineIterator(with_metaclass(MetaLineIterator, LineSeries)):
    pass

def with_metaclass(meta, *bases):
    class metaclass(meta):
        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, str('temporary_class'), (), {})
If you access self.__class__ on an instance of a subclass, self.__class__ will be the type of that subclass.
A class that is named explicitly, by contrast, is used exactly as written.
Take the example below:
class Foo(object):
    def create_new(self):
        return self.__class__()

    def create_new2(self):
        return Foo()

class Bar(Foo):
    pass

b = Bar()
c = b.create_new()
print type(c)  # We got an instance of Bar

d = b.create_new2()
print type(d)  # We got an instance of Foo
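To tie this back to the snippet in the question: for a class object, __class__ is its metaclass, so StrategyBase.__class__ is not StrategyBase but the metaclass that created it. A minimal sketch, with a made-up Meta standing in for the real metaclass built by with_metaclass:

class Meta(type):
    pass

# version-neutral way to create a class whose metaclass is Meta
StrategyBase = Meta('StrategyBase', (object,), {})

print(StrategyBase.__class__ is Meta)           # True: a class's __class__ is its metaclass

class MetaStrategy(StrategyBase.__class__):     # subclasses the metaclass Meta
    pass

class PlainStrategy(StrategyBase):              # subclasses StrategyBase itself
    pass

print(issubclass(MetaStrategy, type))           # True: MetaStrategy is itself a metaclass
print(issubclass(PlainStrategy, StrategyBase))  # True: PlainStrategy is an ordinary subclass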
Imagine a class MyMixInClass that is being used in a multiple inheritance hierarchy. When using super() to call some method, is there some way to inspect or drill into it to extract the class that this method came from?
example:
class MyMixInClass:
    def __init__(self):
        initfunc = getattr(super(), '__init__')
        # can we figure out which class the __init__ came from?
For each class in the MRO, you can check whether there is an __init__ method in the class's __dict__:
class A:
    def __init__(self):
        pass

class B(A):
    def __init__(self):
        super().__init__()

class C(A):
    pass

class D(B, C):
    pass

if __name__ == '__main__':
    for cls in D.__mro__:
        if '__init__' in cls.__dict__:
            print(f'{cls.__name__} has its own init method')
        else:
            print(f'{cls.__name__} has no init method')
output:
D has no init method
B has its own init method
C has no init method
A has its own init method
object has its own init method
In this output, the first class after D in the MRO that has its own __init__ (here B) is the one whose __init__ runs when you call D(). The same rule tells you where a super().__init__() call lands: it goes to the first class after the current one in the MRO that defines __init__ in its own __dict__.
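A minimal sketch along those lines (resolve_super_method is a hypothetical helper, not part of the standard library) that walks the MRO to report which class a super() call made from a given class would reach:

def resolve_super_method(start_cls, instance, name='__init__'):
    """Return the class whose `name` method super() would reach
    when called from start_cls on this instance."""
    mro = type(instance).__mro__
    for cls in mro[mro.index(start_cls) + 1:]:
        if name in cls.__dict__:
            return cls
    raise AttributeError(name)

# Using the D/B/C/A classes above:
print(resolve_super_method(B, D()).__name__)  # A: super().__init__() in B resolves to A.__init__
print(resolve_super_method(D, D()).__name__)  # B: the first class after D that defines __init__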
I was just trying something with multiple inheritance in Python and came up with this:
class ParentOne:
    def foo(self):
        print("ParentOne foo is called")

class ParentTwo:
    def foo(self):
        print("ParentTwo foo is called")

class Child(ParentOne, ParentTwo):
    # how is this working?
    def call_parent_two_foo(self):
        super(ParentOne, self).foo()

    # This does not work
    def call_parent_foo(self):
        super(ParentTwo, self).foo()

    def call_super_foo(self):
        super(Child, self).foo()

    def foo(self):
        print("Child foo is called")

if __name__ == "__main__":
    child = Child()
    child.foo()
    child.call_super_foo()
    child.call_parent_two_foo()
    # child.call_parent_foo()  # This gives the below error:
    # super(ParentTwo, self).foo()
    # AttributeError: 'super' object has no attribute 'foo'
and it gives the following output
Child foo is called
ParentOne foo is called
ParentTwo foo is called
I am confused about how the call super(ParentOne, self).foo() is evaluated in this case. As per my understanding, the ParentOne class does not have any idea of the methods and attributes of the ParentTwo class. How does super work in the case of multiple inheritance?
Python constructs a method resolution order (MRO) when it builds a class. The MRO is always linear. If Python cannot create a linear MRO, a TypeError is raised. In this case, your MRO looks like:
Child -> ParentOne -> ParentTwo -> object
Now when Python sees a super(cls, self), it basically looks at self and figures out the MRO. It then uses cls to determine where we currently are in the MRO, and finally it returns an object which delegates to the next class in the MRO. So, in this case, a super(Child, self) call would return an object that delegates to ParentOne. A super(ParentOne, self) call would return an object that delegates to ParentTwo. Finally, a super(ParentTwo, self) call would delegate to object. In other words, you can think of super as a fancier version of the following code:
import inspect

def kinda_super(cls, self):
    mro = inspect.getmro(type(self))
    idx = mro.index(cls)
    return Delegate(mro[idx + 1])  # for a suitably defined `Delegate`
Note that since super(ParentTwo, self) returns a "delegate" to object, we can see why you get an AttributeError when you try super(ParentTwo, self).foo(): object has no foo method.
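You can confirm the order super() walks through by printing Child's MRO directly:

print([cls.__name__ for cls in Child.__mro__])
# ['Child', 'ParentOne', 'ParentTwo', 'object']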
class X1:
    def run(self):
        print("x1")

class X2:
    def run(self):
        print("x2")

class X3:
    def run(self):
        print("x3")
class Y(X1, X2, X3):
    def run(self):
        print("y")
Given an instance:
y = Y()
To call the base class functions:

super(Y, y).run()
super(X1, y).run()
super(X2, y).run()
y.run()
Output
x1
x2
x3
y
Similarly:

super(Y, y).run()
for cls in y.__class__.__bases__:
    if cls != X3:
        super(cls, y).run()
y.run()
Output
x1
x2
x3
y
You can think of Child(ParentOne, ParentTwo) as if it were a chain of single inheritances: Child(ParentOne(ParentTwo)). ParentOne does not actually inherit from ParentTwo; they are two separate classes, but super() behaves as if such a chain existed (this only matters with multiple inheritance). I like this example for understanding what's going on (Python 3.x):
class P:
    def m(self):
        print("P")

class A(P):
    def m(self):
        super().m()  # -> B, if we inherit like C(A, B)
        print("A")

class B(P):
    def m(self):
        super().m()  # -> P, if we inherit like C(A, B)
        print("B")

class C(A, B):
    def m(self):
        super().m()  # -> A
        print("C")
        A.m(self)
        B.m(self)

c = C()
c.m()
It also covers the case where two parents inherit from the same base class. The script above prints:
P
B
A
C
P
B
A
P
B
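The arrows in the comments follow from C's MRO, which you can check directly:

print([cls.__name__ for cls in C.__mro__])
# ['C', 'A', 'B', 'P', 'object']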
I need a parent class with a decorator defined internally that saves all decorated functions in a class to a list stored as an attribute of that class. All children of this class must be able to use the decorator, but each should store to a list owned by that specific child.
After numerous attempts at defining such a decorator, I am at a loss for how it would be done. Any help would be greatly appreciated! An example of my preferred usage is shown below.
class Parent:
    decorated_functions = []

    # insert decorator definition

class ChildOne(Parent):
    @decorator
    def a(self):
        return 'a'

    @decorator
    def b(self):
        return 'b'

class ChildTwo(Parent):
    @decorator
    def c(self):
        return 'c'

class ChildThree(Parent):
    @decorator
    def d(self):
        return 'd'

    @decorator
    def e(self):
        return 'e'

    @decorator
    def f(self):
        return 'f'

ChildOne().decorated_functions
# [<function __main__.ChildOne.a>, <function __main__.ChildOne.b>]

ChildTwo().decorated_functions
# [<function __main__.ChildTwo.c>]

ChildThree().decorated_functions
# [<function __main__.ChildThree.d>, <function __main__.ChildThree.e>, <function __main__.ChildThree.f>]
Update #1
Using Brendan Abel's metaclass, I have tried using the following code.
class Child(Parent):
    @decorator
    def a(self):
        return 'a'

    @decorator
    def b(self):
        return 'b'

print(Child().decorated_functions)
However, Child() does not seem to have an attribute decorated_functions.
AttributeError: type object 'Child' has no attribute 'decorated_functions'
Update #2
The above code now works with Brendan Abel's solution! The issue was the change in metaclass syntax between Python 2 and Python 3.
You probably won't be able to do this without turning decorated_functions into a property (which allows it to be computed after the class has been created), or using a class decorator or a metaclass. I never thought I'd say this, but a metaclass might be the simplest solution here:
def decorator(f):
    f.decorated = True
    return f

class DecoMeta(type):
    def __new__(cls, name, bases, attrs):
        decorated_functions = []
        for v in attrs.values():
            if getattr(v, 'decorated', None):
                decorated_functions.append(v)
        attrs['decorated_functions'] = decorated_functions
        return super(DecoMeta, cls).__new__(cls, name, bases, attrs)
class Parent(object):
    __metaclass__ = DecoMeta
Edit:
In Python 3, the metaclass hook is slightly different
class Parent(object, metaclass=DecoMeta):
    ...
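Putting the pieces together, here is a minimal Python 3 sketch of the same idea (the not_collected method is only there to show that it is not picked up):

def decorator(f):
    f.decorated = True
    return f

class DecoMeta(type):
    def __new__(cls, name, bases, attrs):
        # Collect only the functions defined and marked in this class body,
        # so each child gets its own list.
        attrs['decorated_functions'] = [
            v for v in attrs.values() if getattr(v, 'decorated', False)
        ]
        return super().__new__(cls, name, bases, attrs)

class Parent(metaclass=DecoMeta):
    pass

class ChildOne(Parent):
    @decorator
    def a(self):
        return 'a'

    @decorator
    def b(self):
        return 'b'

    def not_collected(self):
        return 'ignored'

print([f.__name__ for f in ChildOne().decorated_functions])  # ['a', 'b']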
In Python, is there a way, when a class is created, to change its superclass depending on the value of a class attribute? Here's an example of what I want to do. First I have these classes:
class A(object):
    pass

class B(A):
    # extend and override class A
    pass

class C(A or B):
    # extend and override class A
    pass
Secondly, I want to create other classes that inherit from class C, but in some cases I want C to inherit from A and in other cases to inherit from B:

class D(C):
    # C inherits only from A
    from_B = False

class E(C):
    # C inherits from B because the attribute from_B = True
    from_B = True
I tried with a metaclass, but that set the base class of C (to A or B) for all subclasses (D, E, ...) when the first subclass was created. So, if the first subclass to be created had from_B = True, all subclasses of C ended up with C(B) as parent, whatever their own from_B was set to. My code was something like this:
class MetaC(type):
    def __new__(cls, name, bases, attrs):
        if C in bases and getattr(attrs, 'from_B', False):
            C.__bases__[C.__bases__.index(A)] = B
        return super(MetaC, cls).__new__(cls, name, bases, attrs)

class C(A):
    __metaclass__ = MetaC
My goal is to avoid duplicating the code of the C class while keeping the option of having, or not having, the added functionality of the B class. I should mention that I don't have control over the A and B classes.
UPDATE
I think I got it with this metaclass (code is a bit rough at the moment):
class MetaC(type):
    def __new__(cls, name, bases, attrs):
        for base in bases:
            if base.__name__ == 'C':
                if attrs.has_key('from_B'):
                    list_bases = list(base.__bases__)
                    list_bases[list_bases.index(A)] = B
                    base.__bases__ = tuple(list_bases)
                elif B in base.__bases__:
                    list_bases = list(base.__bases__)
                    list_bases[list_bases.index(B)] = A
                    base.__bases__ = tuple(list_bases)
                break
        return super(MetaC, cls).__new__(cls, name, bases, attrs)
UPDATE 2
This solution doesn't work because I'm always modifying the base class C itself. So, when a subclass is instantiated, it will use the C class in its current state.
I ended up using cooperative multiple inheritance. It works fine. The only drawback is that for methods that need to run in several parent classes (like methods that are present in A, B and C), we need to be sure there is a super() call in each method definition of each class, and that they have the same calling signature in every case. Fortunately for me, my B classes respect this.
Example:
class A(object):
    def some_method(self, arg1, arg2, karg1=None):
        do_some_stuff(arg1, arg2, karg1)

class B(A):
    # extend and override class A
    def some_method(self, arg1, arg2, karg1=None):
        super(B, self).some_method(arg1, arg2, karg1)
        do_more_stuff(arg1, arg2, karg1)

class C(B, A):
    # extend and override class A; B must come before A since B subclasses A,
    # so the MRO is C -> B -> A
    def some_method(self, arg1, arg2, karg1=None):
        do_other_stuff(arg1, arg2, karg1)
        super(C, self).some_method(arg1, arg2, karg1)
This way, when some_method is called on C or on C's children, the work in these methods is done in this order:
C.some_method
A.some_method
B.some_method
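Here is a minimal runnable sketch of that ordering, with print calls standing in for the do_*_stuff placeholders:

class A(object):
    def some_method(self):
        print("A's work")

class B(A):
    def some_method(self):
        super(B, self).some_method()
        print("B's work")

class C(B, A):
    def some_method(self):
        print("C's work")
        super(C, self).some_method()

C().some_method()
# C's work
# A's work
# B's work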
Check The wonders of cooperative inheritance for more info on the subject.
This looks painful enough that you should consider composition/delegation instead of contorting inheritance this way. What do you think of something like this?
class A(object):
    def from_B(self):
        return False

class B(object):
    def from_B(self):
        return True

class C(object):
    pass

class PolyClass(object):
    def __init__(self, *args):
        self.delegates = [c() for c in args[::-1]]

    def __getattr__(self, attr):
        for d in self.delegates:
            if hasattr(d, attr):
                return getattr(d, attr)
        raise AttributeError(attr + "? what the heck is that?")

    def __repr__(self):
        return "<instance of (%s)>" % ','.join(d.__class__.__name__
                                               for d in self.delegates[::-1])

pc1 = PolyClass(A, B)
pc2 = PolyClass(A, C)
pc3 = PolyClass(B, C)

for p in (pc1, pc2, pc3):
    print p, p.from_B()
print pc1.from_C()
print pc1.from_C()
Prints:
<instance of (A,B)> True
<instance of (A,C)> False
<instance of (B,C)> True
Traceback (most recent call last):
File "varying_delegation.py", line 33, in <module>
print pc1.from_C()
File "varying_delegation.py", line 21, in __getattr__
raise AttributeError(attr + "? what the heck is that?")
AttributeError: from_C? what the heck is that?
EDIT:
Here's how to take the not-in-your-control classes A and B, and create custom C classes that look like they extend either an A or a B:
# Django admin classes
class A(object):
    def from_B(self):
        return False

class B(A):
    def from_B(self):
        return True

# Your own class, which might get created with an A or B instance
class C(object):
    def __init__(self, obj):
        self.obj = obj

    def __getattr__(self, attr):
        return getattr(self.obj, attr)

# these are instantiated some way, not in your control
a, b = A(), B()

# now create different C's
c1 = C(a)
c2 = C(b)
print c1.from_B()
print c2.from_B()
prints:
False
True
And to create your subclasses D and E, create an interim subclass of C (I called it SubC because I lack imagination), which will auto-init the C superclass with a specific global variable, either a or b.
# a subclass of C for subclasses pre-wired to delegate to a specific
# global object
class SubC(C):
    c_init_obj = None

    def __init__(self):
        super(SubC, self).__init__(self.c_init_obj)

class D(SubC): pass
class E(SubC): pass

# assign globals to C subclasses so they build with the correct contained
# global object
D.c_init_obj = a
E.c_init_obj = b

d = D()
e = E()
print d.from_B()
print e.from_B()
Again, prints:
False
True