Class-level read-only properties in Python

Is there some way to make a class-level read-only property in Python? For instance, if I have a class Foo, I want to say:
x = Foo.CLASS_PROPERTY
but prevent anyone from saying:
Foo.CLASS_PROPERTY = y
EDIT:
I like the simplicity of Alex Martelli's solution, but not the syntax that it requires. Both his and ~unutbu's answers inspired the following solution, which is closer to the spirit of what I was looking for:
class const_value(object):
    def __init__(self, value):
        self.__value = value
    def make_property(self):
        return property(lambda cls: self.__value)

class ROType(type):
    def __new__(mcl, classname, bases, classdict):
        class UniqueROType(mcl):
            pass
        for attr, value in classdict.items():
            if isinstance(value, const_value):
                setattr(UniqueROType, attr, value.make_property())
                classdict[attr] = value.make_property()
        return type.__new__(UniqueROType, classname, bases, classdict)

class Foo(object):
    __metaclass__ = ROType
    BAR = const_value(1)
    BAZ = 2

class Bit(object):
    __metaclass__ = ROType
    BOO = const_value(3)
    BAN = 4
Now, I get:

Foo.BAR
# 1
Foo.BAZ
# 2
Foo.BAR = 2
# Traceback (most recent call last):
#   File "<stdin>", line 1, in <module>
# AttributeError: can't set attribute
Foo.BAZ = 3
# (no error: BAZ is a plain attribute, so it stays writable)
I prefer this solution because:
The members get declared inline instead of after the fact, as with type(X).foo = ...
The members' values are set in the actual class's code as opposed to in the metaclass's code.
It's still not ideal because:
I have to set the __metaclass__ in order for const_value objects to be interpreted correctly.
The const_values don't "behave" like plain values. For example, I couldn't use one as a default value for a parameter to a method in the class (see the sketch below).
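To illustrate that last point, here is a minimal sketch (the method name is hypothetical, not from the question): a default argument is evaluated when the def statement runs, at which point BAR is still the raw const_value wrapper rather than 1.

class const_value(object):
    def __init__(self, value):
        self.__value = value

class Foo(object):
    BAR = const_value(1)
    def method(self, x=BAR):  # binds the const_value object itself, not 1
        return x

print(Foo().method())  # prints a const_value instance, not 1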

The existing solutions are a bit complex -- what about just ensuring that each class in a certain group has a unique metaclass, then setting a normal read-only property on the custom metaclass. Namely:
>>> class Meta(type):
...     def __new__(mcl, *a, **k):
...         uniquemcl = type('Uniq', (mcl,), {})
...         return type.__new__(uniquemcl, *a, **k)
...
>>> class X: __metaclass__ = Meta
...
>>> class Y: __metaclass__ = Meta
...
>>> type(X).foo = property(lambda *_: 23)
>>> type(Y).foo = property(lambda *_: 45)
>>> X.foo
23
>>> Y.foo
45
>>>
This is really much simpler, because it's based on nothing more than the fact that when you get an instance's attribute, descriptors are looked up on the class (so, of course, when you get a class's attribute, descriptors are looked up on the metaclass), and making the class/metaclass unique isn't terribly hard.
Oh, and of course:
>>> X.foo = 67
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: can't set attribute
just to confirm it IS indeed read-only!
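For what it's worth, here is a hedged translation of the same recipe to Python 3, where the __metaclass__ attribute is ignored and the metaclass keyword is used instead; the mechanics are otherwise identical:

class Meta(type):
    def __new__(mcl, *a, **k):
        uniquemcl = type('Uniq', (mcl,), {})  # a fresh metaclass per class
        return type.__new__(uniquemcl, *a, **k)

class X(metaclass=Meta):
    pass

class Y(metaclass=Meta):
    pass

type(X).foo = property(lambda *_: 23)
type(Y).foo = property(lambda *_: 45)
print(X.foo, Y.foo)   # 23 45
try:
    X.foo = 67
except AttributeError as e:
    print(e)          # can't set attribute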

The ActiveState solution that Pynt references makes instances of ROClass have read-only attributes. Your question seems to ask if the class itself can have read-only attributes.
Here is one way, based on Raymond Hettinger's comment:
#!/usr/bin/env python
def readonly(value):
    return property(lambda self: value)

class ROType(type):
    CLASS_PROPERTY = readonly(1)

class Foo(object):
    __metaclass__ = ROType

print(Foo.CLASS_PROPERTY)
# 1
Foo.CLASS_PROPERTY = 2
# AttributeError: can't set attribute
The idea is this: Consider first Raymond Hettinger's solution:
class Bar(object):
    CLASS_PROPERTY = property(lambda self: 1)

bar = Bar()
bar.CLASS_PROPERTY = 2  # AttributeError: can't set attribute
It shows a relatively simple way to give bar a read-only property.
Notice that you have to add the CLASS_PROPERTY = property(lambda self: 1) line to the definition of the class of bar, not to bar itself.
So, if you want the class Foo itself to have a read-only property, then the class of Foo has to have CLASS_PROPERTY = property(lambda self: 1) defined.
The class of a class is its metaclass, hence we define ROType as the metaclass:
class ROType(type):
    CLASS_PROPERTY = readonly(1)
Then we make ROType the metaclass of Foo:

class Foo(object):
    __metaclass__ = ROType
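One caveat worth noting (my observation, not part of the answer): because CLASS_PROPERTY lives on the metaclass, it is visible on the class but not on its instances, since instance lookup searches type(inst).__mro__ and never reaches ROType. A sketch in Python 3 syntax:

def readonly(value):
    return property(lambda self: value)

class ROType(type):
    CLASS_PROPERTY = readonly(1)

class Foo(metaclass=ROType):
    pass

print(Foo.CLASS_PROPERTY)        # 1
try:
    print(Foo().CLASS_PROPERTY)  # instances never see the metaclass property
except AttributeError as e:
    print(e)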

Found this on ActiveState:
# simple read-only attributes with meta-class programming

# method factory for an attribute get method
def getmethod(attrname):
    def _getmethod(self):
        return self.__readonly__[attrname]
    return _getmethod

class metaClass(type):
    def __new__(cls, classname, bases, classdict):
        readonly = classdict.get('__readonly__', {})
        for name, default in readonly.items():
            classdict[name] = property(getmethod(name))
        return type.__new__(cls, classname, bases, classdict)

class ROClass(object):
    __metaclass__ = metaClass
    __readonly__ = {'a': 1, 'b': 'text'}

if __name__ == '__main__':
    def test1():
        t = ROClass()
        print t.a
        print t.b

    def test2():
        t = ROClass()
        t.a = 2

    test1()
Note that if you try to set a read-only attribute (t.a = 2, as in test2), Python will raise an AttributeError.

Related

How to create object of derived class inside base class in Python?

I have a code like this:
class Base:
    def __init__(self):
        pass
    def new_obj(self):
        return Base()  # ← return Derived()

class Derived(Base):
    def __init__(self):
        pass
In the commented line I actually want not exactly a Derived object, but an object of whatever class self really is.
Here is a real-life example from Mercurial.
How to do that?
def new_obj(self):
    return self.__class__()
I can't think of a really good reason to do this, but as D.Shawley pointed out:
def new_obj(self):
    return self.__class__()
will do it.
That's because when you call a method on a derived class, if it doesn't exist on that class, Python uses the method resolution order to figure out which method to call along its inheritance chain. In this case there's only one implementation, so it calls Base.new_obj and passes in the instance as the first argument (i.e. self).
All instances have a __class__ attribute, that refers to the class that they are an instance of. So given
class Base:
    def new_obj(self):
        return self.__class__()

class Derived(Base): pass

derived = Derived()
The following lines are functionally equivalent:
derived.new_obj()
# or
Base.new_obj(derived)
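A quick self-contained check of that equivalence (a sketch; the assertions are mine):

class Base(object):
    def new_obj(self):
        return self.__class__()

class Derived(Base):
    pass

derived = Derived()
assert type(derived.new_obj()) is Derived      # the usual spelling
assert type(Base.new_obj(derived)) is Derived  # the explicit spelling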
You may have encountered a relative of this if you've either forgotten to add the self parameter to your function declaration, or not provided enough arguments to a function and seen a stack trace that looks like this:
>>> f.bar()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: bar() takes exactly 2 arguments (1 given)
You can use a classmethod:
class Base:
    def __init__(self):
        pass

    @classmethod
    def new_obj(cls):
        return cls()

class Derived(Base):
    def __init__(self):
        pass
>>> b = Base()
>>> b.new_obj()
<__main__.Base at 0x10fc12208>
>>> d = Derived()
>>> d.new_obj()
<__main__.Derived at 0x10fdfce80>
You can also do this with a class method, which you create with a decorator.
In [1]: class Base:
   ...:     @classmethod
   ...:     def new_obj(cls):
   ...:         return cls()
   ...:

In [2]: class Derived(Base): pass

In [3]: print type(Base.new_obj())
<type 'instance'>

In [4]: print Base.new_obj().__class__
__main__.Base

In [5]: print Derived.new_obj().__class__
__main__.Derived
Incidentally (you may know this), you don't have to create __init__ methods if you don't do anything with them.

Questions about details of @property in Python

Assume that I have a class as following:
class MyClass(object):
    def __init__(self, value=None):
        self.attr = value

    @property
    def attr(self):
        # This acts as a getter?
        # Let's call the function "attr_1" as alias
        return self.__attr

    @attr.setter
    def attr(self, value):
        # This acts as a setter?
        # Let's call the function "attr_2" as alias
        self.__attr = value

inst = MyClass(1)
I read the documentation on descriptors and looked at the implementation of the property class.
As far as I know, when I type inst.attr, the following happens:

1. The first attr (whose alias is attr_1) is found, and attr is now an instance of the property class, which is a data descriptor.
2. Therefore, it overrides the instance dictionary, which means type(inst).__dict__['attr'].__get__(inst, type(inst)) is invoked.
3. attr.__get__(inst, type(inst)) invokes attr.fget(inst), where fget() is in fact attr(self) (the "raw" attr_1 function); finally, attr.fget(inst) returns inst.__attr.

Here comes the first question: the class MyClass does not have an attribute __attr, so how should inst.__attr in step 3 be interpreted?
Similarly, in the emulated setter, how does Python find an attribute inst.__attr to assign the value to?
And a trivial question: since property is a class, why not Property instead of property?
Your question is not directly related to properties actually, or to the way they work as data descriptors. It's just the way Python fakes private attributes whose names start with two underscores.
>>> inst.__attr
Traceback (most recent call last):
  File "<pyshell#4>", line 1, in <module>
    inst.__attr
AttributeError: 'MyClass' object has no attribute '__attr'
Consider what happens if you write the code using an internal variable with a single underscore (the usual convention for "you shouldn't touch this, but I won't enforce it; do so at your own risk"):
>>> class MyClass2(object):
        def __init__(self, value=None):
            self.attr = value
        @property
        def attr(self):
            # This acts as a getter?
            # Let's call the function "attr_1" as alias
            return self._attr
        @attr.setter
        def attr(self, value):
            # This acts as a setter?
            # Let's call the function "attr_2" as alias
            self._attr = value

>>> inst2 = MyClass2(1)
>>> inst2._attr
1
And you can see the trick by peeking at the object's __dict__
>>> inst2.__dict__
{'_attr': 1}
>>> inst.__dict__
{'_MyClass__attr': 1}
Just some more to convince you that this has nothing to do with properties:
>>> class OtherClass(object):
        def __init__(self, value):
            self.__attr = value
        def get_attr(self):
            return self.__attr
        def set_attr(self, value):
            self.__attr = value

>>> other_inst = OtherClass(1)
>>> other_inst.get_attr()
1
>>> other_inst.__attr
Traceback (most recent call last):
  File "<pyshell#17>", line 1, in <module>
    other_inst.__attr
AttributeError: 'OtherClass' object has no attribute '__attr'
>>> other_inst.__dict__
{'_OtherClass__attr': 1}
>>> other_inst._OtherClass__attr
1
>>> other_inst._OtherClass__attr = 24
>>> other_inst.get_attr()
24
>>> inst._MyClass__attr = 23
>>> inst.attr
23
Concerning your last question: there is no convention in Python that classes must start with an uppercase letter. property is not an isolated case (datetime, itemgetter, csv.reader, ...).
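One more detail that may help (my addition, not part of the answer): the mangling is purely textual and happens at compile time, only for identifiers of the form __name written inside a class body; string-based access is not rewritten.

class MyClass(object):
    def __init__(self):
        self.__attr = 1  # compiled as self._MyClass__attr

inst = MyClass()
print(getattr(inst, '_MyClass__attr'))  # 1: the mangled name is the real one
print(vars(inst))                       # {'_MyClass__attr': 1}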

In Python, how to enforce an abstract method to be static on the child class?

This is the setup I want:
A should be an abstract base class with a static & abstract method f(). B should inherit from A. Requirements:
1. You should not be able to instantiate A
2. You should not be able to instantiate B, unless it implements a static f()
Taking inspiration from this question, I've tried a couple of approaches. With these definitions:
import abc

class abstractstatic(staticmethod):
    __slots__ = ()
    def __init__(self, function):
        super(abstractstatic, self).__init__(function)
        function.__isabstractmethod__ = True
    __isabstractmethod__ = True

class A:
    __metaclass__ = abc.ABCMeta
    @abstractstatic
    def f():
        pass

class B(A):
    def f(self):
        print 'f'

class A2:
    __metaclass__ = abc.ABCMeta
    @staticmethod
    @abc.abstractmethod
    def f():
        pass

class B2(A2):
    def f(self):
        print 'f'
Here A2 and B2 are defined using the usual Python conventions, while A and B are defined in the way suggested in this answer. Following are some operations I tried, and their undesired results.
With classes A/B:

>>> B().f()
f
# This should have thrown, since B doesn't implement a static f()

With classes A2/B2:

>>> A2()
<__main__.A2 object at 0x105beea90>
# This should have thrown, since A2 should be an uninstantiable abstract class
>>> B2().f()
f
# This should have thrown, since B2 doesn't implement a static f()
Since neither of these approaches give me the output I want, how do I achieve what I want?
You can't do what you want with just ABCMeta. ABC enforcement doesn't do any type checking; only the presence of an attribute with the correct name is enforced.
Take for example:
>>> from abc import ABCMeta, abstractmethod, abstractproperty
>>> class Abstract(object):
...     __metaclass__ = ABCMeta
...     @abstractmethod
...     def foo(self): pass
...     @abstractproperty
...     def bar(self): pass
...
>>> class Concrete(Abstract):
...     foo = 'bar'
...     bar = 'baz'
...
>>> Concrete()
<__main__.Concrete object at 0x104b4df90>
I was able to construct Concrete() even though both foo and bar are simple attributes.
The ABCMeta metaclass only tracks which names are still bound to objects whose __isabstractmethod__ attribute is true; when creating a class from the metaclass (ABCMeta.__new__ is called), the cls.__abstractmethods__ attribute is then set to a frozenset object with all the names that are still abstract.
Instance creation then tests for that frozenset and throws a TypeError if it is non-empty.
You'd have to produce your own __new__ method here; subclass ABCMeta and add type checking in a new __new__ method. That method should look for __abstractmethods__ sets on the base classes, find the corresponding objects with the __isabstractmethod__ attribute in the MRO, then do type checking on the current class attributes.
This'd mean that you'd throw the exception when defining the class, not when creating an instance, however. For that to work you'd add a __call__ method to your ABCMeta subclass and have that throw the exception based on information gathered by your own __new__ method about what types were wrong; a similar two-stage process to what ABCMeta and the instantiation check do at the moment. Alternatively, update the __abstractmethods__ set on the class to add any names that were implemented with the wrong type, and leave it to the normal abstract-class instantiation check to throw the exception.
The following implementation takes that last tack; add names back to __abstractmethods__ if the implemented type doesn't match (using a mapping):
from types import FunctionType

class ABCMetaTypeCheck(ABCMeta):
    _typemap = {  # map abstract type to expected implementation type
        abstractproperty: property,
        abstractstatic: staticmethod,
        # abstractmethods return function objects
        FunctionType: FunctionType,
    }

    def __new__(mcls, name, bases, namespace):
        cls = super(ABCMetaTypeCheck, mcls).__new__(mcls, name, bases, namespace)
        wrong_type = set()
        seen = set()
        abstractmethods = cls.__abstractmethods__
        for base in bases:
            for name in getattr(base, "__abstractmethods__", set()):
                if name in seen or name in abstractmethods:
                    continue  # still abstract or later overridden
                value = base.__dict__.get(name)  # bypass descriptors
                if getattr(value, "__isabstractmethod__", False):
                    seen.add(name)
                    expected = mcls._typemap[type(value)]
                    if not isinstance(namespace[name], expected):
                        wrong_type.add(name)
        if wrong_type:
            cls.__abstractmethods__ = abstractmethods | frozenset(wrong_type)
        return cls
With this metaclass you get your expected output:
>>> class Abstract(object):
...     __metaclass__ = ABCMetaTypeCheck
...     @abstractmethod
...     def foo(self): pass
...     @abstractproperty
...     def bar(self): pass
...     @abstractstatic
...     def baz(): pass
...
>>> class ConcreteWrong(Abstract):
...     foo = 'bar'
...     bar = 'baz'
...     baz = 'spam'
...
>>> ConcreteWrong()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class ConcreteWrong with abstract methods bar, baz, foo
>>>
>>> class ConcreteCorrect(Abstract):
...     def foo(self): return 'bar'
...     @property
...     def bar(self): return 'baz'
...     @staticmethod
...     def baz(): return 'spam'
...
>>> ConcreteCorrect()
<__main__.ConcreteCorrect object at 0x104ce1d10>

Is there any way that I can restrict a child class from inheriting some of its parent's methods?

class Liquid(object):
    def foo(self):
        pass
    def bar(self):
        pass

class Water(Liquid):
    pass
Say I have the two classes above, where Water inherits from Liquid. Is there any way I can prevent Water from inheriting one of the parent's methods, say bar()?
Sort of. But don't do it.
class Liquid(object):
    def foo(self):
        pass
    def bar(self):
        pass

class Water(Liquid):
    def __getattribute__(self, name):
        if name == 'bar':
            raise AttributeError("'Water' object has no attribute 'bar'")
        return super(Water, self).__getattribute__(name)

l = Liquid()
l.bar()  # works

w = Water()
w.bar()  # raises AttributeError
You can override the method to be a no-op, but you can't remove it. Doing so would violate one of the core principles of object-oriented design, namely that any object that inherits from some parent should be able to be used anywhere the parent is used. This is known as the Liskov Substitution Principle.
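A minimal sketch of that "override, don't remove" option (raising instead of a silent no-op is my choice here, not the answer's):

class Liquid(object):
    def bar(self):
        print('bar')

class Water(Liquid):
    def bar(self):
        # Water still *has* bar, but the override refuses to perform
        # the Liquid behaviour instead of silently doing nothing.
        raise NotImplementedError('Water does not support bar()')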
You can, as the other answers say, break one of the inherited methods.
The alternative is to refactor out the "optional" methods, and inherit from a baseclass that doesn't have them:
class BaseLiquid(object):
    def foo(self):
        pass

class Barised(object):
    def bar(self):
        pass

class Liquid(BaseLiquid, Barised): pass

class Water(BaseLiquid):
    def drip(self):
        pass
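A hedged illustration of the trade-off this refactoring makes: Water simply never receives bar, at the cost of no longer being a Liquid.

class BaseLiquid(object):
    def foo(self):
        pass

class Barised(object):
    def bar(self):
        pass

class Liquid(BaseLiquid, Barised):
    pass

class Water(BaseLiquid):
    def drip(self):
        pass

w = Water()
print(hasattr(w, 'bar'))          # False: bar was never inherited
print(isinstance(w, Liquid))      # False: Water sits beside Liquid, not below it
print(isinstance(w, BaseLiquid))  # True: the shared interface survives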
This is probably not a good idea, but you could always use metaclasses to implement private attributes:
def private_attrs(name, bases, attrs):
    def get_base_attrs(base):
        result = {}
        for deeper_base in base.mro()[1:]:
            result.update(get_base_attrs(deeper_base))
        priv = []
        if "__private__" in base.__dict__:
            priv = base.__private__
        for attr in base.__dict__:
            if attr not in priv:
                result.update({attr: base.__dict__[attr]})
        return result

    final_attrs = {}
    for base in bases:
        final_attrs.update(get_base_attrs(base))
    final_attrs.update(attrs)
    return type(name, (), final_attrs)

class Liquid(object):
    __metaclass__ = private_attrs
    __private__ = ['bar']

    def foo(self):
        pass
    def bar(self):
        pass

class Water(Liquid):
    __metaclass__ = private_attrs

print Water.foo
print Water.bar
Output is:

<unbound method Water.foo>
Traceback (most recent call last):
  File "testing-inheritance.py", line 41, in <module>
    print Water.bar
AttributeError: type object 'Water' has no attribute 'bar'
EDIT: This will mess up isinstance() because it doesn't modify the bases of the class.
I suspect you can do this by using the __slots__ attr: http://docs.python.org/release/2.5.2/ref/slots.html
It might also be possible by implementing a __getattribute__ method and throwing the appropriate exception when bar is looked up (a plain __getattr__ would not be enough, since it only fires when normal lookup fails, and the inherited bar would be found).
However, I agree that you don't want to do this in practice, since it's a sign of bad design.

python: super()-like proxy object that starts the MRO search at a specified class

According to the docs, super(cls, obj) returns

    a proxy object that delegates method calls to a parent or sibling class of type cls

I understand why super() offers this functionality, but I need something slightly different: I need to create a proxy object that delegates method calls (and attribute lookups) to class cls itself; and as with super, if cls doesn't implement the method/attribute, my proxy should continue looking in the MRO order (of the new, not the original, class). Is there any function I can write that achieves that?
Example:
class X:
    def act():
        #...

class Y:
    def act():
        #...

class A(X, Y):
    def act():
        #...

class B(X, Y):
    def act():
        #...

class C(A, B):
    def act():
        #...

c = C()
b = some_magic_function(B, c)
# `b` needs to delegate calls to `act` to B, and look up attribute `s` in B
# I will pass `b` somewhere else, and have no control over it
Of course, I could do b = super(A, c), but that relies on knowing the exact class hierarchy and the fact that B follows A in the MRO. It would silently break if either of these two assumptions changed in the future. (Note that super doesn't make any such assumptions!)
If I just needed to call b.act(), I could use B.act(c). But I am passing b to someone else, and have no idea what they'll do with it. I need to make sure it doesn't betray me and start acting like an instance of class C at some point.
A separate question: the documentation for super() (in Python 3.2) only talks about its method delegation, and does not clarify that attribute lookups for the proxy are also performed the same way. Is it an accidental omission?
EDIT
The updated Delegate approach works in the following example as well:
class A:
    def f(self):
        print('A.f')
    def h(self):
        print('A.h')
        self.f()

class B(A):
    def g(self):
        self.f()
        print('B.g')
    def f(self):
        print('B.f')
    def t(self):
        super().h()

a_true = A()
# instance of A ends up executing A.f
a_true.h()

b = B()
a_proxy = Delegate(A, b)
# *unlike* super(), the updated `Delegate` implementation would call A.f, not B.f
a_proxy.h()
Note that the updated class Delegate is closer to what I want than super() for two reasons:

1. super() only does its proxying for the first call; subsequent calls happen as normal, since by then it's the object itself that is used, not its proxy.
2. super() does not allow attribute access.

Thus, my question as asked has a (nearly) perfect answer in Python.
It turns out that, at a higher level, I was trying to do something I shouldn't (see my comments here).
This class should cover the most common cases:
class Delegate:
    def __init__(self, cls, obj):
        self._delegate_cls = cls
        self._delegate_obj = obj
    def __getattr__(self, name):
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self._delegate_obj)
        return x
Use it like this:
b = Delegate(B, c)
(with the names from your example code.)
Restrictions:

- You cannot retrieve some special attributes like __class__ etc. from the class you passed to the constructor via this proxy. (This restriction also applies to super.)
- This might behave weirdly if the attribute you want to retrieve is some weird kind of descriptor.
Edit: If you want the code in the update to your question to work as desired, you can use the following code:
class Delegate:
    def __init__(self, cls):
        self._delegate_cls = cls
    def __getattr__(self, name):
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self)
        return x
This passes the proxy object as the self parameter to any called method, and it doesn't need the original object at all, so I deleted it from the constructor.
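For example, a self-contained sketch based on the classes in the question's EDIT: because the proxy itself becomes self, later self.* calls stay within A's part of the hierarchy.

class Delegate:
    def __init__(self, cls):
        self._delegate_cls = cls
    def __getattr__(self, name):
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self)
        return x

class A:
    def f(self):
        print('A.f')
    def h(self):
        print('A.h')
        self.f()

class B(A):
    def f(self):
        print('B.f')

a_proxy = Delegate(A)
a_proxy.h()      # prints 'A.h' then 'A.f': self is the proxy, so self.f()
                 # resolves through A again

b = B()
super(B, b).h()  # by contrast prints 'A.h' then 'B.f'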
If you also want instance attributes to be accessible you can use this version:
class Delegate:
    def __init__(self, cls, obj):
        self._delegate_cls = cls
        self._delegate_obj = obj
    def __getattr__(self, name):
        if name in vars(self._delegate_obj):
            return getattr(self._delegate_obj, name)
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self)
        return x
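And a usage sketch for this instance-aware variant (the class and attribute names are mine): instance attributes come from the wrapped object, while class-level lookups still start at the class you name.

class Delegate:
    def __init__(self, cls, obj):
        self._delegate_cls = cls
        self._delegate_obj = obj
    def __getattr__(self, name):
        if name in vars(self._delegate_obj):
            return getattr(self._delegate_obj, name)
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self)
        return x

class A:
    def who(self):
        return 'A.who'

class B(A):
    def who(self):
        return 'B.who'

b = B()
b.color = 'blue'    # a plain instance attribute

proxy = Delegate(A, b)
print(proxy.color)  # 'blue': found in vars(b) before any class lookup
print(proxy.who())  # 'A.who': the class search starts at A, not type(b)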
    A separate question, the documentation for super() (in Python 3.2) only talks about its method delegation, and does not clarify that attribute lookups for the proxy are also performed the same way. Is it an accidental omission?
No, this is not accidental. super() does nothing for attribute lookups. The reason is that attributes on an instance are not associated with a particular class; they're just there. Consider the following:
class A:
    def __init__(self):
        self.foo = 'foo set from A'

class B(A):
    def __init__(self):
        super().__init__()
        self.bar = 'bar set from B'

class C(B):
    def method(self):
        self.baz = 'baz set from C'

class D(C):
    def __init__(self):
        super().__init__()
        self.foo = 'foo set from D'
        self.baz = 'baz set from D'

instance = D()
instance.method()
instance.bar = 'not set from a class at all'
Which class "owns" foo, bar, and baz?
If I wanted to view instance as an instance of C, should it have a baz attribute before method is called? How about afterwards?
If I view instance as an instance of A, what value should foo have? Should bar be invisible because it was only added in B, or visible because it was set to a value outside the class?
All of these questions are nonsense in Python. There's no possible way to design a system with the semantics of Python that could give sensible answers to them. __init__ isn't even special in terms of adding attributes to instances of the class; it's just a perfectly ordinary method that happens to be called as part of the instance creation protocol. Any method (or indeed code from another class altogether, or not from any class at all) can create attributes on any instance it has a reference to.
In fact, all of the attributes of instance are stored in the same place:
>>> instance.__dict__
{'baz': 'baz set from C', 'foo': 'foo set from D', 'bar': 'not set from a class at all'}
There's no way to tell which of them were originally set by which class, or were last set by which class, or whatever measure of ownership you want. There's certainly no way to get at "the A.foo being shadowed by D.foo", as you would expect from C++; they're the same attribute, and any writes to it by one class (or from elsewhere) will clobber the value left in it by the other class.
The consequence of this is that super() does not perform attribute lookups the same way it does method lookups; it can't, and neither can any code you write.
In fact, from running some experiments, neither super nor Sven's Delegate actually supports direct attribute retrieval at all!
class A:
    def __init__(self):
        self.spoon = 1
        self.fork = 2
    def foo(self):
        print('A.foo')

class B(A):
    def foo(self):
        print('B.foo')

b = B()
d = Delegate(A, b)
s = super(B, b)
Then both work as expected for methods:
>>> d.foo()
A.foo
>>> s.foo()
A.foo
But:
>>> d.fork
Traceback (most recent call last):
  File "<pyshell#43>", line 1, in <module>
    d.fork
  File "/tmp/foo.py", line 6, in __getattr__
    x = getattr(self._delegate_cls, name)
AttributeError: type object 'A' has no attribute 'fork'
>>> s.spoon
Traceback (most recent call last):
  File "<pyshell#45>", line 1, in <module>
    s.spoon
AttributeError: 'super' object has no attribute 'spoon'
So both of them really only work for calling methods, not for passing to arbitrary third-party code and pretending to be an instance of the class you want to delegate to.
Unfortunately, they don't behave the same way in the presence of multiple inheritance. Given:
class Delegate:
    def __init__(self, cls, obj):
        self._delegate_cls = cls
        self._delegate_obj = obj
    def __getattr__(self, name):
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self._delegate_obj)
        return x

class A:
    def foo(self):
        print('A.foo')

class B:
    pass

class C(B, A):
    def foo(self):
        print('C.foo')

c = C()
d = Delegate(B, c)
s = super(C, c)
Then:
>>> d.foo()
Traceback (most recent call last):
  File "<pyshell#50>", line 1, in <module>
    d.foo()
  File "/tmp/foo.py", line 6, in __getattr__
    x = getattr(self._delegate_cls, name)
AttributeError: type object 'B' has no attribute 'foo'
>>> s.foo()
A.foo
That's because Delegate ignores the full MRO of whatever class _delegate_obj is an instance of, using only the MRO of _delegate_cls. super, meanwhile, does what you asked for in the question, but the behaviour seems quite strange: it isn't wrapping an instance of C to pretend it's an instance of B, because direct instances of B don't have foo defined.
Here's my attempt:
class MROSkipper:
    def __init__(self, cls, obj):
        self.__cls = cls
        self.__obj = obj

    def __getattr__(self, name):
        mro = self.__obj.__class__.__mro__
        i = mro.index(self.__cls)
        if i == 0:
            # It's at the front anyway, just behave as getattr
            return getattr(self.__obj, name)
        else:
            # Check __dict__ not getattr, otherwise we'd find methods
            # on classes we're trying to skip
            try:
                return self.__obj.__dict__[name]
            except KeyError:
                return getattr(super(mro[i - 1], self.__obj), name)
I rely on the __mro__ attribute of classes to properly figure out where to start from, then I just use super. You could instead walk the MRO chain from that point yourself, checking class __dict__s for methods, if the weirdness of going back one step to use super is too much.
I've made no attempt to handle unusual attributes: those implemented with descriptors (including properties), or the magic methods Python looks up behind the scenes, which often start at the class rather than the instance directly. But this behaves as you asked moderately well (with the caveat expounded on ad nauseam in the first part of my post: looking up attributes this way will not give you any different results than looking them up directly in the instance).
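A short usage sketch with the same multiple-inheritance layout as above, where Delegate raised an AttributeError but MROSkipper continues along C's MRO (it assumes the MROSkipper class just defined):

class A:
    def foo(self):
        print('A.foo')

class B:
    pass

class C(B, A):
    def foo(self):
        print('C.foo')

c = C()
m = MROSkipper(B, c)  # MROSkipper as defined above
m.foo()               # prints 'A.foo': the search starts at B in C's MRO
                      # and falls through to A, instead of finding C.foo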
