Creating a dynamic ABC class based on a user-defined class - python

I'm writing a plugin framework and I want to be able to write a decorator, interface, which will convert a user class to an ABC and substitute all of its methods with abstract methods. I cannot get it working, and I suspect the problem is connected with a wrong MRO, but I may be wrong.
I basically need to be able to write:
@interface
class X:
    def test(self):
        pass

x = X()  # should fail, because test will be an abstract method
Substituting methods with their abstract versions is straightforward (you iterate over the class's functions and replace each func with abc.abstractmethod(func)), but I've got a problem with creating a dynamic type that has ABCMeta as its metaclass.
Right now I've got something like:
import inspect
from abc import ABCMeta, abstractmethod

class Interface(metaclass=ABCMeta):
    pass

def interface(cls):
    newcls = type(cls.__name__, (Interface, cls), {})
    # substitute all methods with abstract ones
    for name, func in inspect.getmembers(newcls, predicate=inspect.isfunction):
        setattr(newcls, name, abstractmethod(func))
    return newcls
but it does not work - I can instantiate class X without errors.
With standard usage of ABC in Python, we can write:
class X(metaclass=ABCMeta):
    @abstractmethod
    def test(self):
        pass

x = X()  # it will fail
How can I create a dynamic type in Python 3 that behaves as if it had the ABCMeta metaclass and substitutes all functions with abstract ones?

The trick is not to use setattr to reset each of the attributes, but instead to pass those modified attributes to the type function as a dictionary:
import inspect
from abc import ABCMeta, abstractmethod

class Interface(metaclass=ABCMeta):
    pass

def interface(cls):
    attrs = {n: abstractmethod(f)
             for n, f in inspect.getmembers(cls, predicate=inspect.isfunction)}
    return type(cls.__name__, (Interface, cls), attrs)

@interface
class X(metaclass=ABCMeta):
    def test(self):
        pass

x = X()
# does fail:
# Traceback (most recent call last):
#   File "test.py", line 19, in <module>
#     x = X() # should fail, because test will be abstract method.
# TypeError: Can't instantiate abstract class X with abstract methods test
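For what it's worth, the reason the setattr version in the question fails is that ABCMeta computes the __abstractmethods__ set once, when the class is created; attributes added afterwards are never re-scanned. A minimal sketch demonstrating this (and, on Python 3.10+, abc.update_abstractmethods() as a way to force a recalculation):

import inspect
from abc import ABCMeta, abstractmethod

class Interface(metaclass=ABCMeta):
    pass

class X:
    def test(self):
        pass

# reproduce the question's approach: create the class first, then setattr
newcls = type(X.__name__, (Interface, X), {})
for name, func in inspect.getmembers(newcls, predicate=inspect.isfunction):
    setattr(newcls, name, abstractmethod(func))

print(newcls.__abstractmethods__)  # frozenset() -- computed at creation time
newcls()                           # instantiates without error

# On Python 3.10+ you could instead repair the class after the setattr calls:
#     from abc import update_abstractmethods
#     update_abstractmethods(newcls)
#     newcls()  # now raises TypeError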

Related

Using metaclasses in order to define methods, class methods / instance methods

I am trying to understand more deeply how metaclasses work in Python. My problem is the following: I want to use a metaclass to define a method for each class, one which uses a class attribute set by the metaclass. For instance, this has applications for registration.
Here is a working example:
import functools

def dec_register(func):
    @functools.wraps(func)
    def wrapper_register(*args, **kwargs):
        args[0].__class__.list_register_instances.append(args[0])
        return func(*args, **kwargs)
    return wrapper_register

dict_register_classes = {}

class register(type):
    def __new__(meta, name, bases, attrs):
        dict_register_classes[name] = cls = type.__new__(meta, name, bases, attrs)  # assignment from right to left
        cls.list_register_instances = []
        cls.print_register = meta.print_register
        return cls

    def print_register(self):
        for element in self.list_register_instances:
            print(element)

    def print_register_class(cls):
        for element in cls.list_register_instances:
            print(element)

class Foo(metaclass=register):
    @dec_register
    def __init__(self):
        pass

    def print_register(self):
        pass

class Boo(metaclass=register):
    @dec_register
    def __init__(self):
        pass

    def print_register(self):
        pass

f = Foo()
f_ = Foo()
b = Boo()
print(f.list_register_instances)
print(b.list_register_instances)
print(dict_register_classes)

print("1")
f.print_register()
print("2")
Foo.print_register_class()
print("3")
f.print_register_class()
print("4")
Foo.print_register()
The tests I am making at the end do not work as I expected. I apologize in advance if I am not using the proper terminology; I am trying to be as clear as possible:
I was thinking that the line cls.print_register = meta.print_register defines a method within the class using the method defined within the metaclass. Thus it is a method that I can use on an object. I can also use it as a class method, since it is defined in the metaclass. However, though the following works:
print("1")
f.print_register()
this does not work correctly:
print("4")
Foo.print_register()
with the error:
Foo.print_register()
TypeError: print_register() missing 1 required positional argument: 'self'
The same goes for tests 2 and 3, where I was expecting that if a method is defined at the class level, it should also be defined at the object level. However, test 3 raises an error.
print("2")
Foo.print_register_class()
print("3")
f.print_register_class()
Hence, can you please explain how my understanding of class methods is wrong? I would like to be able to call the method print_register either on the class or on an object.
Perhaps it could help to know that, in fact, I was trying to reproduce the following very simple example:
# example without anything fancy:
class Foo:
    list_register_instances = []

    def __init__(self):
        self.__class__.list_register_instances.append(self)

    @classmethod
    def print_register(cls):
        for element in cls.list_register_instances:
            print(element)
Am I not doing the exact same thing with a metaclass? A classmethod can be used either on a class or on objects.
Also, if you have any tips about code structure, I would greatly appreciate them. I must be very bad at the syntax of metaclasses.
Fundamentally, because you have shadowed print_register on your instance of the metaclass (your class).
So when you do Foo.print_register, it finds the print_register you defined in
class Foo(metaclass=register):
    ...
    def print_register(self):
        pass
Which, of course, is just the plain function print_register, which requires the self argument.
This is (almost) the same thing that would happen with just a regular class and its instances:
class Foo:
    def bar(self):
        print("I am a bar")

foo = Foo()
foo.bar = lambda x: print("I've hijacked bar")
foo.bar()
Note:
In [1]: class Meta(type):
   ...:     def print_register(self):
   ...:         print('hi')
   ...:

In [2]: class Foo(metaclass=Meta):
   ...:     pass
   ...:

In [3]: Foo.print_register()
hi

In [4]: class Foo(metaclass=Meta):
   ...:     def print_register(self):
   ...:         print('hello')
   ...:

In [5]: Foo.print_register()
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-5-a42427fde947> in <module>
----> 1 Foo.print_register()

TypeError: print_register() missing 1 required positional argument: 'self'
However, you do this in your metaclass constructor as well!
cls.print_register = meta.print_register
Which is effectively like defining that function in your class definition... I'm not sure why you are doing this, though.
You are not doing the exact same thing as using a classmethod, which is a custom descriptor that handles the binding of methods to instances in just the way you'd need to be able to call it on a class or on an instance. That is not the same as defining a method on the class and on the instance! You could just do this in your metaclass __new__, i.e. cls.print_register = classmethod(meta.print_register) and leave def print_register(self) out of your class definitions:
import functools

def dec_register(func):
    @functools.wraps(func)
    def wrapper_register(*args, **kwargs):
        args[0].__class__.list_register_instances.append(args[0])
        return func(*args, **kwargs)
    return wrapper_register

dict_register_classes = {}

class register(type):
    def __new__(meta, name, bases, attrs):
        dict_register_classes[name] = cls = type.__new__(meta, name, bases, attrs)  # assignment from right to left
        cls.list_register_instances = []
        cls.print_register = classmethod(meta.print_register)  # just create the classmethod manually!
        return cls

    def print_register(self):
        for element in self.list_register_instances:
            print(element)

    def print_register_class(cls):
        for element in cls.list_register_instances:
            print(element)

class Foo(metaclass=register):
    @dec_register
    def __init__(self):
        pass
Note, print_register doesn't have to be defined inside your metaclass; indeed, in this case, I would just define it at the module level:
def print_register(self):
    for element in self.list_register_instances:
        print(element)

...

class register(type):
    def __new__(meta, name, bases, attrs):
        dict_register_classes[name] = cls = type.__new__(meta, name, bases, attrs)  # assignment from right to left
        cls.list_register_instances = []
        cls.print_register = classmethod(print_register)
        return cls

...
I think you actually understand metaclasses sufficiently; it is your understanding of classmethod that is incorrect, as far as I can tell. If you want to understand how classmethod works, indeed, how method-instance binding works for regular functions, you need to understand descriptors. Here's an enlightening link. Function objects are descriptors: they bind the instance as the first argument to themselves when called on an instance (rather, they create a method object and return that, but it is basically partial application). classmethod objects are another kind of descriptor, one that binds the class as the first argument of the function it decorates when called on either the class or the instance. The link describes how you could write classmethod using pure Python.
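For reference, here is a rough pure-Python sketch of what such a descriptor does, along the lines of the descriptor HOWTO (the real classmethod is implemented in C; the name ClassMethod here is made up for illustration):

class ClassMethod:
    # emulate classmethod: bind the owning class, not the instance
    def __init__(self, func):
        self.func = func

    def __get__(self, obj, objtype=None):
        if objtype is None:
            objtype = type(obj)

        def bound(*args, **kwargs):
            # partial application of the class as the first argument
            return self.func(objtype, *args, **kwargs)

        return bound

Whether you access it through the class (Foo.print_register) or through an instance (f.print_register), __get__ is invoked and returns a function with the class already bound, which is exactly the dual behaviour the question was after.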

Create an ABC with abstract methods defined from json keys

Say I have a JSON file that looks like:
{
    "foo": ["hi", "there"],
    "bar": ["nothing"]
}
I'd like to create an abstract base class (ABC) whose abstract method names are the keys of the JSON above, i.e.:
from abc import ABCMeta, abstractmethod

class MyABC(metaclass=ABCMeta):
    @abstractmethod
    def foo(self):
        pass

    @abstractmethod
    def bar(self):
        pass
The problem is that the JSON file actually has lots of keys. I wonder if there's any way to do something like:
import json

with open("the_json.json") as f:
    the_json = json.load(f)

class MyABC(metaclass=ABCMeta):
    # for k in the_json.keys():
    #     create abstract method k
Thanks for the suggestions from the comments, but somehow they don't work as expected. Here is what I tried:
class MyABC(metaclass=ABCMeta):
    pass

def f(self):
    pass

setattr(MyABC, "foo", abstractmethod(f))
# I also tried
# setattr(MyABC, "foo", abstractmethod(lambda self: ...))

# Try to define another class that inherits MyABC
class MyClass(MyABC):
    pass

c = MyClass()
# Now this should trigger TypeError but it doesn't
# I can even call c.foo() without getting any errors
This may work:
import json
from abc import ABCMeta, abstractmethod

with open("the_json.json") as f:
    the_json = json.load(f)

class MyABC(metaclass=ABCMeta):
    def func(self):
        pass

    for k in the_json:
        locals()[k] = abstractmethod(func)

# Deleting the attributes "func" and "k" is a must; abstractmethod() flags
# func itself, so otherwise it becomes an additional abstract method in MyABC
# (and the loop variable k lingers as a class attribute)
delattr(MyABC, "func")
delattr(MyABC, "k")

class MyClass(MyABC):
    pass

MyClass()
# TypeError: Can't instantiate abstract class MyClass with abstract methods bar, foo
It will correctly throw an error if you try to instantiate MyABC, or a subclass of MyABC that doesn't implement the abstract methods.
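An alternative sketch that avoids the locals() trick and the delattr cleanup entirely: build the namespace dict first and create the class through ABCMeta in one call, so the abstract methods are already present at class-creation time (same hypothetical the_json.json, and a made-up _stub helper):

import json
from abc import ABCMeta, abstractmethod

with open("the_json.json") as f:
    the_json = json.load(f)

def _stub(self):
    pass

# every key maps to the stub, which abstractmethod() flags as abstract;
# ABCMeta records the names in __abstractmethods__ at class creation
namespace = {key: abstractmethod(_stub) for key in the_json}
MyABC = ABCMeta("MyABC", (), namespace)

class MyClass(MyABC):
    pass

MyClass()  # TypeError: Can't instantiate abstract class MyClass ...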

Overriding __contains__ method for a class

I need to simulate enums in Python, and did it by writing classes like:
class Spam(Enum):
    k = 3
    EGGS = 0
    HAM = 1
    BAKEDBEANS = 2
Now I'd like to test if some constant is a valid choice for a particular Enum-derived class, with the following syntax:
if (x in Foo):
    print("seems legit")
Therefore I tried to create an "Enum" base class where I override the __contains__ method like this:
class Enum:
    """
    Simulates an enum.
    """
    k = 0  # overwrite in subclass with number of constants

    @classmethod
    def __contains__(cls, x):
        """
        Test for valid enum constant x:
        x in Enum
        """
        return (x in range(cls.k))
However, when using the in keyword on the class (like the example above), I get the error:
TypeError: argument of type 'type' is not iterable
Why is that? Can I somehow get the syntactic sugar I want?
Why is that?
When you use special syntax like a in Foo, the __contains__ method is looked up on the type of Foo. However, your __contains__ implementation exists on Foo itself, not its type. Foo's type is type, which doesn't implement this (or iteration), thus the error.
The same situation occurs if you instantiate an object and then, after it is created, add a __contains__ function to the instance variables. That function won't be called:
>>> class Empty: pass
...
>>> x = Empty()
>>> x.__contains__ = lambda: True
>>> 1 in x
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: argument of type 'Empty' is not iterable
Can I somehow get the syntactic sugar I want?
Yes. As mentioned above, the method is looked up on Foo's type. The type of a class is called a metaclass, so you need a new metaclass that implements __contains__.
Try this one:
class MetaEnum(type):
    def __contains__(cls, x):
        return x in range(cls.k)
As you can see, the methods on a metaclass take the metaclass instance -- the class -- as their first argument. This should make sense. It's very similar to a classmethod, except that the method lives on the metaclass and not the class.
Inheritance from a class with a custom metaclass also inherits the metaclass, so you can create a base class like so:
class BaseEnum(metaclass=MetaEnum):
    pass

class MyEnum(BaseEnum):
    k = 3

print(1 in MyEnum)  # True
My use case was to test against the names of the members of my Enum.
With a slight modification to this solution:
from enum import Enum, EnumMeta, auto

class MetaEnum(EnumMeta):
    def __contains__(cls, item):
        return item in cls.__members__.keys()

class BaseEnum(Enum, metaclass=MetaEnum):
    pass

class LogSections(BaseEnum):
    configuration = auto()
    debug = auto()
    errors = auto()
    component_states = auto()
    alarm = auto()

if __name__ == "__main__":
    print('configuration' in LogSections)
    print('b' in LogSections)

True
False

Python - Testing an abstract base class

I am looking for ways / best practices for testing methods defined in an abstract base class. One thing I can think of directly is performing the tests on all concrete subclasses of the base class, but that seems excessive at times.
Consider this example:
import abc

class Abstract(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractproperty
    def id(self):
        return

    @abc.abstractmethod
    def foo(self):
        print "foo"

    def bar(self):
        print "bar"
Is it possible to test bar without doing any subclassing?
In newer versions of Python you can use unittest.mock.patch():

import unittest
from unittest.mock import patch

class MyAbcClassTest(unittest.TestCase):
    @patch.multiple(MyAbcClass, __abstractmethods__=set())
    def test(self):
        self.instance = MyAbcClass()  # Ha!
Here is what I have found: if you set the __abstractmethods__ attribute to be an empty set, you'll be able to instantiate the abstract class. This behaviour is specified in PEP 3119:
If the resulting __abstractmethods__ set is non-empty, the class is considered abstract, and attempts to instantiate it will raise TypeError.
So you just need to clear this attribute for the duration of tests.
>>> import abc
>>> class A(metaclass = abc.ABCMeta):
...     @abc.abstractmethod
...     def foo(self): pass
You can't instantiate A:
>>> A()
Traceback (most recent call last):
TypeError: Can't instantiate abstract class A with abstract methods foo
If you override __abstractmethods__ you can:
>>> A.__abstractmethods__=set()
>>> A() #doctest: +ELLIPSIS
<....A object at 0x...>
It works both ways:
>>> class B(object): pass
>>> B() #doctest: +ELLIPSIS
<....B object at 0x...>
>>> B.__abstractmethods__={"foo"}
>>> B()
Traceback (most recent call last):
TypeError: Can't instantiate abstract class B with abstract methods foo
You can also use unittest.mock (from 3.3) to temporarily override ABC behaviour.
>>> class A(metaclass = abc.ABCMeta):
...     @abc.abstractmethod
...     def foo(self): pass
>>> from unittest.mock import patch
>>> p = patch.multiple(A, __abstractmethods__=set())
>>> p.start()
{}
>>> A() #doctest: +ELLIPSIS
<....A object at 0x...>
>>> p.stop()
>>> A()
Traceback (most recent call last):
TypeError: Can't instantiate abstract class A with abstract methods foo
As properly put by lunaryon, it is not possible. The very purpose of ABCs containing abstract methods is that they are not instantiable as declared.
However, it is possible to create a utility function that introspects an ABC and creates a dummy, non-abstract class on the fly. This function can be called directly inside your test method/function and spare you from having to write boilerplate code in the test file just for testing a few methods.
def concreter(abclass):
    """
    >>> import abc
    >>> class Abstract(metaclass=abc.ABCMeta):
    ...     @abc.abstractmethod
    ...     def bar(self):
    ...         return None
    >>> c = concreter(Abstract)
    >>> c.__name__
    'dummy_concrete_Abstract'
    >>> c().bar() # doctest: +ELLIPSIS
    (<abc_utils.Abstract object at 0x...>, (), {})
    """
    if "__abstractmethods__" not in abclass.__dict__:
        return abclass
    new_dict = abclass.__dict__.copy()
    for abstractmethod in abclass.__abstractmethods__:
        # replace each abc method or property with an identity function:
        new_dict[abstractmethod] = lambda x, *args, **kw: (x, args, kw)
    # creates a new class, with the overridden ABCs:
    return type("dummy_concrete_%s" % abclass.__name__, (abclass,), new_dict)
You can use multiple inheritance to gain access to the implemented methods of the abstract class. Obviously, following such a design decision depends on the structure of the abstract class, since you need to implement its abstract methods (at least their signatures) in your test case.
Here is the example for your case:
import abc
import unittest

class Abstract(metaclass=abc.ABCMeta):
    @abc.abstractproperty
    def id(self):
        return

    @abc.abstractmethod
    def foo(self):
        print("foo")

    def bar(self):
        print("bar")

class AbstractTest(unittest.TestCase, Abstract):
    def foo(self):
        pass

    def test_bar(self):
        self.bar()
        self.assertTrue(1 == 1)
No, it's not. The very purpose of abc is to create classes that cannot be instantiated unless all abstract attributes are overridden with concrete implementations. Hence you need to derive from the abstract base class and override all abstract methods and properties.
Perhaps a more compact version of the concreter proposed by @jsbueno could be:
def concreter(abclass):
    class concreteCls(abclass):
        pass
    concreteCls.__abstractmethods__ = frozenset()
    return type('DummyConcrete' + abclass.__name__, (concreteCls,), {})
The resulting class still has all original abstract methods (which can be now called, even if this is not likely to be useful...) and can be mocked as needed.
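Either concreter can then be called directly inside a test. A usage sketch, assuming the Abstract class from the question rewritten with the Python 3 metaclass spelling (and bar returning a value so there is something to assert):

import abc
import unittest

class Abstract(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def foo(self):
        pass

    def bar(self):
        return "bar"

class BarTest(unittest.TestCase):
    def test_bar(self):
        Dummy = concreter(Abstract)  # non-abstract subclass built on the fly
        self.assertEqual(Dummy().bar(), "bar")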

python: super()-like proxy object that starts the MRO search at a specified class

According to the docs, super(cls, obj) returns
a proxy object that delegates method calls to a parent or sibling class of type cls
I understand why super() offers this functionality, but I need something slightly different: I need to create a proxy object that delegates method calls (and attribute lookups) to class cls itself; and, as in super, if cls doesn't implement the method/attribute, my proxy should continue looking in the MRO order (of the new, not the original, class). Is there any function I can write that achieves that?
Example:
class X:
    def act():
        #...

class Y:
    def act():
        #...

class A(X, Y):
    def act():
        #...

class B(X, Y):
    def act():
        #...

class C(A, B):
    def act():
        #...

c = C()
b = some_magic_function(B, c)
# `b` needs to delegate calls to `act` to B, and look up attribute `s` in B
# I will pass `b` somewhere else, and have no control over it
Of course, I could do b = super(A, c), but that relies on knowing the exact class hierarchy and the fact that B follows A in the MRO. It would silently break if either of these two assumptions changes in the future. (Note that super doesn't make any such assumptions!)
If I just needed to call b.act(), I could use B.act(c). But I am passing b to someone else, and have no idea what they'll do with it. I need to make sure it doesn't betray me and start acting like an instance of class C at some point.
A separate question: the documentation for super() (in Python 3.2) only talks about its method delegation, and does not clarify that attribute lookups for the proxy are also performed the same way. Is it an accidental omission?
EDIT
The updated Delegate approach works in the following example as well:
class A:
    def f(self):
        print('A.f')

    def h(self):
        print('A.h')
        self.f()

class B(A):
    def g(self):
        self.f()
        print('B.g')

    def f(self):
        print('B.f')

    def t(self):
        super().h()

a_true = A()
# instance of A ends up executing A.f
a_true.h()

b = B()
a_proxy = Delegate(A, b)
# *unlike* super(), the updated `Delegate` implementation would call A.f, not B.f
a_proxy.h()
Note that the updated class Delegate is closer to what I want than super() for two reasons:
super() only does its proxying for the first call; subsequent calls will happen as normal, since by then the object itself is used, not its proxy.
super() does not allow attribute access.
Thus, my question as asked has a (nearly) perfect answer in Python.
It turns out that, at a higher level, I was trying to do something I shouldn't (see my comments here).
This class should cover the most common cases:
class Delegate:
    def __init__(self, cls, obj):
        self._delegate_cls = cls
        self._delegate_obj = obj

    def __getattr__(self, name):
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self._delegate_obj)
        return x
Use it like this:
b = Delegate(B, c)
(with the names from your example code.)
Restrictions:
You cannot retrieve some special attributes like __class__ etc. from the class you pass in the constructor via this proxy. (This restriction also applies to super.)
This might behave weirdly if the attribute you want to retrieve is some weird kind of descriptor.
Edit: If you want the code in the update to your question to work as desired, you can use the following code:
class Delegate:
    def __init__(self, cls):
        self._delegate_cls = cls

    def __getattr__(self, name):
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self)
        return x
This passes the proxy object as the self parameter to any called method, and it doesn't need the original object at all, hence I deleted it from the constructor.
If you also want instance attributes to be accessible you can use this version:
class Delegate:
    def __init__(self, cls, obj):
        self._delegate_cls = cls
        self._delegate_obj = obj

    def __getattr__(self, name):
        if name in vars(self._delegate_obj):
            return getattr(self._delegate_obj, name)
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self)
        return x
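A quick check of this instance-attribute-aware version, using classes mirroring the experiments further down:

class A:
    def __init__(self):
        self.spoon = 1

    def foo(self):
        print('A.foo')

class B(A):
    def foo(self):
        print('B.foo')

b = B()
d = Delegate(A, b)
d.foo()         # A.foo -- looked up on the class passed in, bound to the proxy
print(d.spoon)  # 1 -- found in the instance's __dict__ first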
A separate question, the documentation for super() (in Python 3.2) only talks about its method delegation, and does not clarify that attribute lookups for the proxy are also performed the same way. Is it an accidental omission?
No, this is not accidental. super() does nothing for attribute lookups. The reason is that attributes on an instance are not associated with a particular class, they're just there. Consider the following:
class A:
    def __init__(self):
        self.foo = 'foo set from A'

class B(A):
    def __init__(self):
        super().__init__()
        self.bar = 'bar set from B'

class C(B):
    def method(self):
        self.baz = 'baz set from C'

class D(C):
    def __init__(self):
        super().__init__()
        self.foo = 'foo set from D'
        self.baz = 'baz set from D'

instance = D()
instance.method()
instance.bar = 'not set from a class at all'
Which class "owns" foo, bar, and baz?
If I wanted to view instance as an instance of C, should it have a baz attribute before method is called? How about afterwards?
If I view instance as an instance of A, what value should foo have? Should bar be invisible because it was only added in B, or visible because it was set to a value outside the class?
All of these questions are nonsense in Python. There's no possible way to design a system with the semantics of Python that could give sensible answers to them. __init__ isn't even special in terms of adding attributes to instances of the class; it's just a perfectly ordinary method that happens to be called as part of the instance creation protocol. Any method (or indeed code from another class altogether, or not from any class at all) can create attributes on any instance it has a reference to.
In fact, all of the attributes of instance are stored in the same place:
>>> instance.__dict__
{'baz': 'baz set from C', 'foo': 'foo set from D', 'bar': 'not set from a class at all'}
There's no way to tell which of them were originally set by which class, or were last set by which class, or whatever measure of ownership you want. There's certainly no way to get at "the A.foo being shadowed by D.foo", as you would expect from C++; they're the same attribute, and any writes to it by one class (or from elsewhere) will clobber the value left in it by the other class.
The consequence of this is that super() does not perform attribute lookups the same way it does method lookups; it can't, and neither can any code you write.
In fact, from running some experiments, neither super nor Sven's Delegate actually supports direct attribute retrieval at all!
class A:
    def __init__(self):
        self.spoon = 1
        self.fork = 2

    def foo(self):
        print('A.foo')

class B(A):
    def foo(self):
        print('B.foo')

b = B()
d = Delegate(A, b)
s = super(B, b)
Then both work as expected for methods:
>>> d.foo()
A.foo
>>> s.foo()
A.foo
But:
>>> d.fork
Traceback (most recent call last):
  File "<pyshell#43>", line 1, in <module>
    d.fork
  File "/tmp/foo.py", line 6, in __getattr__
    x = getattr(self._delegate_cls, name)
AttributeError: type object 'A' has no attribute 'fork'
>>> s.spoon
Traceback (most recent call last):
  File "<pyshell#45>", line 1, in <module>
    s.spoon
AttributeError: 'super' object has no attribute 'spoon'
So they both really only work for calling some methods on, not for passing to arbitrary third-party code to pretend to be an instance of the class you want to delegate to.
They don't behave the same way in the presence of multiple inheritance, unfortunately. Given:
class Delegate:
    def __init__(self, cls, obj):
        self._delegate_cls = cls
        self._delegate_obj = obj

    def __getattr__(self, name):
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self._delegate_obj)
        return x

class A:
    def foo(self):
        print('A.foo')

class B:
    pass

class C(B, A):
    def foo(self):
        print('C.foo')

c = C()
d = Delegate(B, c)
s = super(C, c)
Then:
>>> d.foo()
Traceback (most recent call last):
  File "<pyshell#50>", line 1, in <module>
    d.foo()
  File "/tmp/foo.py", line 6, in __getattr__
    x = getattr(self._delegate_cls, name)
AttributeError: type object 'B' has no attribute 'foo'
>>> s.foo()
A.foo
Because Delegate ignores the full MRO of whatever class _delegate_obj is an instance of, only using the MRO of _delegate_cls. Whereas super does what you asked in the question, but the behaviour seems quite strange: it's not wrapping an instance of C to pretend it's an instance of B, because direct instances of B don't have foo defined.
Here's my attempt:
class MROSkipper:
    def __init__(self, cls, obj):
        self.__cls = cls
        self.__obj = obj

    def __getattr__(self, name):
        mro = self.__obj.__class__.__mro__
        i = mro.index(self.__cls)
        if i == 0:
            # It's at the front anyway, just behave as getattr
            return getattr(self.__obj, name)
        else:
            # Check __dict__ not getattr, otherwise we'd find methods
            # on classes we're trying to skip
            try:
                return self.__obj.__dict__[name]
            except KeyError:
                return getattr(super(mro[i - 1], self.__obj), name)
I rely on the __mro__ attribute of classes to properly figure out where to start from, then I just use super. You could walk the MRO chain from that point yourself checking class __dict__s for methods instead if the weirdness of going back one step to use super is too much.
I've made no attempt to handle unusual attributes: those implemented with descriptors (including properties), or the magic methods looked up behind the scenes by Python, which often start at the class rather than the instance directly. But this behaves as you asked moderately well (with the caveat expounded on ad nauseam in the first part of my post: looking up attributes this way will not give you any different results than looking them up directly in the instance).
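To see the difference from both super() and Delegate, here is the earlier multiple-inheritance experiment rerun against MROSkipper (a sketch; A, B and C are as in the Delegate experiment above):

class A:
    def foo(self):
        print('A.foo')

class B:
    pass

class C(B, A):
    def foo(self):
        print('C.foo')

c = C()
m = MROSkipper(B, c)
m.foo()  # A.foo -- the search starts at B in C's MRO and continues past it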
