Get attributes for class and instance in Python

In Python, the following code works:
class MyClass(object):
    field = 1

>>> MyClass.field
1
>>> MyClass().field
1
When I want to return values for custom (computed) attributes, I use the following code:
class MyClass(object):
    def __getattr__(self, name):
        if name.startswith('fake'):
            return name
        raise AttributeError("%r object has no attribute %r" %
                             (type(self).__name__, name))
>>> MyClass().fake
fake
But:
>>> MyClass.fake
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: class MyClass has no attribute 'fake'
Ok, for classes I can use the following code:
class MyClassMeta(type):
    def __getattr__(cls, name):
        if name.startswith('fake'):
            return name
        raise AttributeError("%r object has no attribute %r" %
                             (cls.__name__, name))

class MyClass(object):
    __metaclass__ = MyClassMeta
>>> MyClass.fake
fake
But:
>>> MyClass().fake
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'MyClass' object has no attribute 'fake'
To resolve this problem I use the following code:
class FakeAttrMixin():
    def __getattr__(self, name):
        if name.startswith('fake'):
            return name
        raise AttributeError("%r object has no attribute %r" %
                             (type(self).__name__, name))

class MyClassMeta(type, FakeAttrMixin):
    pass

class MyClass(object, FakeAttrMixin):
    __metaclass__ = MyClassMeta
>>> MyClass.fake
fake
>>> MyClass().fake
fake
MyClass.fake calls __getattr__ with MyClass and 'fake' as arguments. MyClass().fake calls __getattr__ with a MyClass instance and 'fake' as arguments. This works as long as I implement the __getattr__ logic only in my mixin and don't rely on the self argument.
Can I write custom value resolution for the class and the instance more elegantly? And why does attribute resolution behave the same for MyClass.field and MyClass().field (with the field = 1 definition) but differently for __getattr__? When I access field, it is looked up first on the instance and then on the class, but I can't understand why __getattr__ works the other way.
Similar questions: __getattr__ on a class and not (or as well as) an instance and Difference between accessing an instance attribute and a class attribute.

No, if you have to support both arbitrary attribute lookup on the class as well as the instance, then your only option is to implement a __getattr__ hook method on both the metaclass and the class, one each to support lookups on the class and the instance.
This is because special hook methods are always looked up on the type, so type(obj).__getattr__. Hence, for MyClass.fake the metaclass __getattr__ is used. See Special method lookup for new-style classes; I explained why this is in a previous answer.
The short reason is that in your case, MyClass.fake would translate into MyClass.__getattr__('fake') and __getattr__ is then an unbound method expecting two arguments (self and name), which would fail.
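In Python 3 syntax the same pair of hooks can be sketched as follows (the name FakeAttrMeta is illustrative, not from the question):

```python
# Python 3 sketch: __getattr__ on the metaclass handles class-level
# lookups, __getattr__ on the class handles instance-level lookups.
class FakeAttrMeta(type):
    def __getattr__(cls, name):
        if name.startswith('fake'):
            return name
        raise AttributeError(name)

class MyClass(metaclass=FakeAttrMeta):
    def __getattr__(self, name):
        if name.startswith('fake'):
            return name
        raise AttributeError(name)

print(MyClass.fake)    # fake  (routed through the metaclass hook)
print(MyClass().fake)  # fake  (routed through the class hook)
```

Both hooks are only consulted after normal attribute lookup fails, which is why defining `field = 1` on the class behaves the same either way while `__getattr__` does not.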

Related

How to troubleshoot `super()` calls finding incorrect type and obj?

I have a decorator in my library which takes a user's class and creates a new version of it with a new metaclass; it is supposed to completely replace the original class. Everything works except for super() calls:
class NewMeta(type):
    pass

def deco(cls):
    cls_dict = dict(cls.__dict__)
    if "__dict__" in cls_dict:
        del cls_dict["__dict__"]
    if "__weakref__" in cls_dict:
        del cls_dict["__weakref__"]
    return NewMeta(cls.__name__, cls.__bases__, cls_dict)

@deco
class B:
    def x(self):
        print("Hi there")

@deco
class A(B):
    def x(self):
        super().x()
Using this code like so, yields an error:
>>> a = A()
>>> a.x()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 4, in x
TypeError: super(type, obj): obj must be an instance or subtype of type
Some terminology:
A: the source-code class, as produced by class A(B):.
A*: the produced class, as created by NewMeta(cls.__name__, cls.__bases__, cls_dict).
A is established by Python to be the type when using super inside the methods of A*. How can I correct this?
There are some suboptimal solutions, like calling super(type(self), self).x, or passing cls.__mro__ instead of cls.__bases__ into the NewMeta call (so that obj=self always inherits from the incorrect type=A). The first is unsustainable for end users; the second pollutes the inheritance chain and is confusing, as the class seems to inherit from itself.
Python seems to introspect the source code, or perhaps stores some information to automatically establish the type, and in this case it is failing to do so. How can I make sure that inside the methods of A*, A* is established as the type argument of argumentless super calls?
The argument-free super uses the __class__ cell, which is a regular function closure.
Data Model: Creating the class object
__class__ is an implicit closure reference created by the compiler if any methods in a class body refer to either __class__ or super.
>>> class E:
...     def x(self):
...         return __class__  # return the __class__ cell
...
>>> E().x()
<class '__main__.E'>
>>> # The cell is stored in __closure__
>>> E.x.__closure__[0].cell_contents is E().x() is E
True
Like any other closure, this is a lexical relation: it refers to class scope in which the method was literally defined. Replacing the class with a decorator still has the methods refer to the original class.
The simplest fix is to explicitly refer to the name of the class, which gets rebound to the newly created class by the decorator.
@deco
class A(B):
    def x(self):
        super(A, self).x()
Alternatively, one can change the content of the __class__ cell to point to the new class:
def deco(cls):
    cls_dict = dict(cls.__dict__)
    cls_dict.pop("__dict__", None)
    cls_dict.pop("__weakref__", None)
    new_cls = NewMeta(cls.__name__, cls.__bases__, cls_dict)
    for method in new_cls.__dict__.values():
        # cell_contents is writable since Python 3.7
        if getattr(method, "__closure__", None) and method.__closure__[0].cell_contents is cls:
            method.__closure__[0].cell_contents = new_cls
    return new_cls
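Putting the pieces together, a self-contained sketch of the cell-rewriting fix (assuming Python 3.7+, where cell_contents became writable; the methods return strings here only to make the result easy to check):

```python
class NewMeta(type):
    pass

def deco(cls):
    cls_dict = dict(cls.__dict__)
    cls_dict.pop("__dict__", None)
    cls_dict.pop("__weakref__", None)
    new_cls = NewMeta(cls.__name__, cls.__bases__, cls_dict)
    # Repoint each method's __class__ cell from the original class
    # to the replacement so argumentless super() resolves correctly.
    for method in new_cls.__dict__.values():
        closure = getattr(method, "__closure__", None)
        if closure and closure[0].cell_contents is cls:
            closure[0].cell_contents = new_cls
    return new_cls

@deco
class B:
    def x(self):
        return "Hi there"

@deco
class A(B):
    def x(self):
        return super().x()  # no more TypeError from super()

print(A().x())  # Hi there
```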

Python ABC inheritance with collections.namedtuple suppressing implementation error [duplicate]

Consider the following code example
import abc

class ABCtest(abc.ABC):
    @abc.abstractmethod
    def foo(self):
        raise RuntimeError("Abstract method was called, this should be impossible")

class ABCtest_B(ABCtest):
    pass

test = ABCtest_B()
This correctly raises the error:
Traceback (most recent call last):
File "/.../test.py", line 10, in <module>
test = ABCtest_B()
TypeError: Can't instantiate abstract class ABCtest_B with abstract methods foo
However when the subclass of ABCtest also inherits from a built in type like str or list there is no error and test.foo() calls the abstract method:
class ABCtest_C(ABCtest, str):
    pass
>>> test = ABCtest_C()
>>> test.foo()
Traceback (most recent call last):
File "<pyshell#0>", line 1, in <module>
test.foo()
File "/.../test.py", line 5, in foo
raise RuntimeError("Abstract method was called, this should be impossible")
RuntimeError: Abstract method was called, this should be impossible
This seems to happen when inheriting from any class defined in C, including itertools.chain and numpy.ndarray, but errors are still correctly raised with classes defined in Python. Why would inheriting from a built-in type break the functionality of abstract classes?
Surprisingly, the test that prevents instantiating abstract classes happens in object.__new__, rather than anything defined by the abc module itself:
static PyObject *
object_new(PyTypeObject *type, PyObject *args, PyObject *kwds)
{
    ...
    if (type->tp_flags & Py_TPFLAGS_IS_ABSTRACT) {
        ...
        PyErr_Format(PyExc_TypeError,
                     "Can't instantiate abstract class %s "
                     "with abstract methods %U",
                     type->tp_name,
                     joined);
(Almost?) all built-in types that aren't object supply a different __new__ that overrides object.__new__ and does not call object.__new__. When you multiple-inherit from a non-object built-in type, you inherit its __new__ method, bypassing the abstract method check.
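A quick check confirms this: built-in types ship their own __new__, while a plain Python class inherits object.__new__, which is where the abstract-method check lives:

```python
# Built-in types override object.__new__, so inheriting their __new__
# bypasses the abstract-class check performed in object.__new__.
print(str.__new__ is object.__new__)    # False
print(tuple.__new__ is object.__new__)  # False

class Plain:
    pass

# A plain Python class inherits object.__new__ unchanged.
print(Plain.__new__ is object.__new__)  # True
```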
I don't see anything about __new__ or multiple inheritance from built-in types in the abc documentation. The documentation could use enhancement here.
It seems kind of strange that they'd use a metaclass for the ABC implementation, making it a mess to use other metaclasses with abstract classes, and then put the crucial check in core language code that has nothing to do with abc and runs for both abstract and non-abstract classes.
There's a report for this issue on the issue tracker that's been languishing since 2009.
I asked a similar question, and based on the bug report linked in user2357112 supports Monica's answer, I came up with this workaround (based on the suggestion from Xiang Zhang):
from abc import ABC, abstractmethod

class Base(ABC):
    @abstractmethod
    def foo(self):
        pass

    @abstractmethod
    def bar(self):
        pass

    def __new__(cls, *args, **kwargs):
        abstractmethods = getattr(cls, '__abstractmethods__', None)
        if abstractmethods:
            msg = "Can't instantiate abstract class {name} with abstract method{suffix} {methods}"
            suffix = 's' if len(abstractmethods) > 1 else ''
            raise TypeError(msg.format(name=cls.__name__, suffix=suffix,
                                       methods=', '.join(abstractmethods)))
        return super().__new__(cls, *args, **kwargs)

class Derived(Base, tuple):
    pass

Derived()
This raises TypeError: Can't instantiate abstract class Derived with abstract methods bar, foo, which is the original behaviour.

Python3 metaclass: use to validate constructor argument

I was planning to use a metaclass to validate the constructor argument in Python 3, but it seems the __new__ method has no access to the variable val, because the class A has not been instantiated yet.
So what's the correct way to do it?
class MyMeta(type):
    def __new__(cls, clsname, superclasses, attributedict):
        print("clsname: ", clsname)
        print("superclasses: ", superclasses)
        print("attributedict: ", attributedict)
        return type.__new__(cls, clsname, superclasses, attributedict)

class A(metaclass=MyMeta):
    def __init__(self, val):
        self.val = val

A(123)
... it seems the __new__ method has no access to the variable val, because the class A() has not been instantiated yet.
Exactly.
So what's the correct way to do it?
Not with a metaclass.
Metaclasses are for fiddling with the creation of the class object itself, and what you want to do is related to instances of the class.
Best practice: don't type-check the val at all. Pythonic code is duck-typed. Simply document that you expect a string-like argument, and users who put garbage in get garbage out.
wim is absolutely correct that this isn't a good use of metaclasses, but it's certainly possible (and easy, too).
Consider how you would create a new instance of your class. You do this:
A(123)
In other words: you create an instance by calling the class. And Python allows us to create custom callable objects by defining a __call__ method. So all we have to do is implement a suitable __call__ method in our metaclass:
class MyMeta(type):
    def __call__(self, val):
        if not isinstance(val, str):
            raise TypeError('val must be a string')
        return super().__call__(val)

class A(metaclass=MyMeta):
    def __init__(self, val):
        self.val = val
And that's it. Simple, right?
>>> A('foo')
<__main__.A object at 0x007886B0>
>>> A(123)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "untitled.py", line 5, in __call__
raise TypeError('val must be a string')
TypeError: val must be a string
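For comparison, here is a minimal sketch of the metaclass-free approach wim recommends: validating in __init__ is usually enough unless you specifically need to intercept construction itself.

```python
class A:
    def __init__(self, val):
        # Plain validation at instance-initialization time,
        # no metaclass machinery involved.
        if not isinstance(val, str):
            raise TypeError('val must be a string')
        self.val = val

print(A('foo').val)  # foo
# A(123) raises TypeError: val must be a string
```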

Python: Predefined class variable access

In Python, I am able to access non-predefined class variables both from the class as well as from instances. However, I am not able to access predefined class variables (such as __name__) from object instances. What am I missing? Thanks.
Here is a test program that I wrote.
class Test:
    '''
    This is a test class to understand why we can't access predefined class variables
    like __name__, __module__ etc. from an instance of the class, while still being
    able to access the non-predefined class variables from instances
    '''
    PI_VALUE = 3.14  # This is a non-predefined class variable

    # the constructor of the class
    def __init__(self, arg1):
        self.value = arg1

    def print_value(self):
        print self.value

an_object = Test("Hello")
an_object.print_value()

print Test.PI_VALUE       # print the class variable PI_VALUE from the class
print an_object.PI_VALUE  # print the class variable PI_VALUE from an instance of the class
print Test.__name__       # print the pre-defined class variable __name__ from the class
print an_object.__name__  # print the pre-defined class variable __name__ from an instance of the class
That's normal. Instances of a class look in that class's __dict__ for attribute resolution, as well as the __dict__s of all ancestors, but not all attributes of a class come from its __dict__.
In particular, Test's __name__ is held in a field in the C struct representing the class, rather than in the class's __dict__, and the attribute is found through a __name__ descriptor in type.__dict__. Instances of Test don't look at this for attribute lookup.
I don't have a great answer for "why". But here's how you can get to them, using __class__:
>>> class Foo(object): pass
...
>>> foo = Foo()
>>> foo.__name__
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'Foo' object has no attribute '__name__'
>>> foo.__class__.__name__
'Foo'
>>>
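You can also inspect where __name__ actually lives, confirming the descriptor explanation above: it is served by a descriptor on type, not stored in the class's own __dict__.

```python
class Foo(object):
    pass

# __name__ is not an entry in the class's own namespace...
print('__name__' in Foo.__dict__)  # False

# ...it is provided by a descriptor defined on type itself, which is
# consulted for attribute lookup on classes but not on their instances.
print(type.__dict__['__name__'].__get__(Foo, type))  # Foo
```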

Can I prevent class definition unless a method is implemented?

I'm trying to figure out how to provide a base class to plugin writers so that they must provide definitions for several static methods.
A plugin class is a collection of static methods which will never be instantiated.
I know how to use ABC to prevent instantiating a class that is missing method implementations, but since plugin classes are never instantiated, that will not provide the safety I would like. Is there a pattern that prevents the class definition itself?
You can do it by writing your own metaclass similar to ABCMeta, which checks for abstract methods at class-definition time and raises an error if it finds any. Here's an example:
import abc

class ClassABC(type):
    def __init__(cls, name, bases, attrs):
        abstracts = set()
        for base in bases:
            abstracts.update(getattr(base, '__abstractclassmethods__', set()))
        for abstract in abstracts:
            if getattr(getattr(cls, abstract), '__isabstractmethod__', False):
                raise TypeError("Your class doesn't define {0}".format(abstract))
        for attr in attrs:
            if getattr(attrs[attr], '__isabstractmethod__', False):
                abstracts.add(attr)
        cls.__abstractclassmethods__ = abstracts

class BaseClass(object):
    __metaclass__ = ClassABC

    @abc.abstractmethod
    def foo(self):
        print("I am base foo")
Then:
>>> class Derived(BaseClass):
...     pass
Traceback (most recent call last):
File "<pyshell#10>", line 1, in <module>
class Derived(BaseClass):
File "<pyshell#8>", line 8, in __init__
raise TypeError("Your class doesn't define {0}".format(abstract))
TypeError: Your class doesn't define foo
My example is fairly quick and dirty and only lightly tested, and you might want to refine it to handle various edge cases. I raise the error only when an abstract method wasn't redefined in the class currently being defined. (Otherwise an error would be raised because the abstract base class doesn't define concrete implementations of its own abstract methods.)
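On Python 3.6+, a similar definition-time check can be written without a custom metaclass by using the __init_subclass__ hook. This is a different technique from the metaclass above, sketched here under the assumption that abstract methods are marked with abc.abstractmethod:

```python
import abc

class BaseClass:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Fail at class-definition time if any inherited abstract
        # method is still unimplemented in the subclass.
        for name in dir(cls):
            if getattr(getattr(cls, name), '__isabstractmethod__', False):
                raise TypeError("Your class doesn't define {0}".format(name))

    @abc.abstractmethod
    def foo(self):
        print("I am base foo")

class Good(BaseClass):  # defines foo, so it is accepted
    def foo(self):
        print("I am good foo")

# class Bad(BaseClass): pass  # would raise TypeError at definition time
```

Since __init_subclass__ runs for subclasses but not for the base itself, the base class can keep its abstract methods without tripping the check.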
