In the example below, the superclass's __dict__ contains a '__dict__' key, while the subclass's does not.
>>> class Super(object):
...     def hello(self):
...         self.data1 = "hello"
...
>>>
>>> class Sub(Super):
...     def hola(self):
...         self.data2 = "hola"
...
>>>
>>> Super.__dict__
<dictproxy object at 0x108794868>
>>> Super.__dict__.keys()
['__dict__', '__module__', '__weakref__', 'hello', '__doc__'] # note __dict__
>>> Sub.__dict__.keys()
['__module__', '__doc__', 'hola'] #__dict__ absent here
>>> Sub.__dict__
<dictproxy object at 0x108794868>
Q1: The comments above show where '__dict__' is present. Why does the superclass have it but not the subclass?
While trying to find the answer to this, I came across this post, and it confused me further.
>>> class Foo(object):
...     __slots__ = ('bar',)
...     bar = "spam"
...
>>> f = Foo()
>>> f.__dict__
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'Foo' object has no attribute '__dict__'
>>> class A(object):
...     pass
...
>>> b = A()
>>> b.__dict__
{}
Q2: Why does the instance of Foo raise an AttributeError, while the instance of A has an empty __dict__?
A class with __slots__ has no per-instance __dict__, which is why f.__dict__ raises AttributeError. There is also a conflict here between the class attribute 'bar' and the slot of the same name; delete the class attribute 'bar' and the class will work fine.
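For example, here is a sketch of the suggested fix (FixedFoo is my name for it): drop the conflicting class attribute and keep only the slot. Instances can then assign bar, but they still have no __dict__.
class FixedFoo(object):
    __slots__ = ('bar',)   # only the slot, no class attribute of the same name

f = FixedFoo()
f.bar = "spam"                   # assigning to the declared slot works
print(f.bar)                     # spam
print(hasattr(f, '__dict__'))    # False: still no per-instance __dict__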
Related
I always thought one should inherit from abc.ABC when one does not want the class to be instantiated. But I've just realized that if a class has an @abstractmethod, then one also cannot instantiate it.
Is there any other reason to inherit from ABC?
Unless you use abc.ABCMeta as the metaclass for your class (either explicitly or by inheriting from abc.ABC), using abstractmethod doesn't really do anything.
>>> from abc import abstractmethod, ABC
>>> class Foo:
...     @abstractmethod
...     def bar(self):
...         pass
...
>>> f = Foo()
>>>
Likewise, using ABCMeta doesn't mean much unless you mark at least one method as abstract:
>>> class Bar(ABC):
...     pass
...
>>> b = Bar()
>>>
It's the combination of the two that allows a class to be (nominally) uninstantiable:
>>> class Baz(ABC):
...     @abstractmethod
...     def m(self):
...         pass
...
>>> b = Baz()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class Baz with abstract methods m
>>>
(Even then, note that all @abstractmethod does is add the decorated method to a set which the metaclass machinery consults when trying to instantiate the class. It is trivial to defeat that machinery:
>>> Baz.__abstractmethods__
frozenset({'m'})
>>> Baz.__abstractmethods__ = set()
>>> b = Baz()
>>>
)
Note that ABC itself is a trivial class that uses ABCMeta as its metaclass, which makes any of its descendants use it as well.
# Docstring omitted; see
# https://github.com/python/cpython/blob/3.7/Lib/abc.py#L166
# for the original
class ABC(metaclass=ABCMeta):
__slots__ = ()
What chepner said, and also readability. Inheriting from ABC makes it clear to your readers what you're up to.
>>> from abc import ABC, abstractmethod
>>>
>>> class Foo:
...     @abstractmethod
...     def f(self):
...         pass
...
>>> class Bar(Foo):
...     pass
...
>>> Bar().f()
>>>
>>> class Baz(ABC):
...     @abstractmethod
...     def f(self):
...         pass
...
>>> class Quux(Baz):
...     pass
...
>>> Quux().f()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class Quux with abstract methods f
This question already has answers here:
Can't set attributes on instance of "object" class
(Written in Python shell)
>>> o = object()
>>> o.test = 1
Traceback (most recent call last):
File "<pyshell#45>", line 1, in <module>
o.test = 1
AttributeError: 'object' object has no attribute 'test'
>>> class test1:
pass
>>> t = test1()
>>> t.test
Traceback (most recent call last):
File "<pyshell#50>", line 1, in <module>
t.test
AttributeError: test1 instance has no attribute 'test'
>>> t.test = 1
>>> t.test
1
>>> class test2(object):
pass
>>> t = test2()
>>> t.test = 1
>>> t.test
1
>>>
Why doesn't object allow you to add attributes to it?
Notice that an object instance has no __dict__ attribute:
>>> dir(object())
['__class__', '__delattr__', '__doc__', '__getattribute__', '__hash__', '__init__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__str__']
An example to illustrate this behavior in a derived class:
>>> class Foo(object):
...     __slots__ = {}
...
>>> f = Foo()
>>> f.bar = 42
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'Foo' object has no attribute 'bar'
Quoting from the docs on slots:
[...] The __slots__ declaration takes a sequence of instance variables and reserves just enough space in each instance to hold a value for each variable. Space is saved because __dict__ is not created for each instance.
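A small sketch of what that means in practice (Point is just an illustrative name): only the declared slot names can be assigned, and no __dict__ is created for instances.
class Point(object):
    __slots__ = ('x', 'y')     # space is reserved only for these two names

p = Point()
p.x = 1                        # fine: 'x' is a declared slot
p.y = 2                        # fine: 'y' is a declared slot
try:
    p.z = 3                    # no slot and no __dict__, so this fails
except AttributeError as e:
    print(e)                   # "'Point' object has no attribute 'z'" (wording varies by version)
print(hasattr(p, '__dict__'))  # False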
EDIT: To answer ThomasH from the comments, OP's test class is an "old-style" class. Try:
>>> class test: pass
...
>>> getattr(test(), '__dict__')
{}
>>> getattr(object(), '__dict__')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'object' object has no attribute '__dict__'
and you'll notice that the old-style instance does have a __dict__. The object class may not have __slots__ defined, but the result is the same: the lack of a __dict__ is what prevents dynamic assignment of an attribute. I've reorganized my answer to make this clearer (moved the second paragraph to the top).
Good question, my guess is that it has to do with the fact that object is a built-in/extension type.
>>> class test(object):
...     pass
...
>>> test.test = 1
>>> object.test = 1
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: can't set attributes of built-in/extension type 'object'
IIRC, this has to do with the presence of a __dict__ attribute or, more correctly, setattr() blowing up when the object doesn't have a __dict__ attribute.
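To make that concrete, here is a small comparison (WithDict is just an illustrative subclass): setattr() fails on a bare object() because there is no instance __dict__ to put the attribute into, while an ordinary subclass works because its instances do get one.
o = object()
try:
    setattr(o, 'test', 1)        # no instance __dict__, so this fails
except AttributeError as e:
    print(e)

class WithDict(object):          # ordinary subclass: instances get a __dict__
    pass

w = WithDict()
setattr(w, 'test', 1)            # works, the value ends up in w.__dict__
print(w.__dict__)                # {'test': 1}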
I'm adding callable objects to an instance of a class A at runtime using the __dict__ property. At some point, though, I want to remove all of the added objects from my instance. I thought about storing the initial __dict__ in a member _orgDict and then executing self.__dict__ = self._orgDict later. I'm wondering whether:
this works at all, and
whether the removed objects are really deleted or just no longer contained in my instance.
Do you mean the del statement?
del instance.attribute
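If you only want to drop the specific attributes you added at runtime, one option is to remember their names and delete them again. A minimal sketch (added_names is my own bookkeeping, nothing special):
class A(object):
    pass

a = A()
added_names = []

# add a couple of callables at runtime and remember their names
a.greet = lambda: "hello"
added_names.append('greet')
a.shout = lambda: "HELLO"
added_names.append('shout')

print(a.greet())                 # hello

# later: remove everything that was added
for name in added_names:
    delattr(a, name)             # same effect as `del a.greet`, etc.

print('greet' in a.__dict__)     # False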
A quick test shows that reassigning an instance __dict__ seems to work:
>>> class B(object):
pass
>>> b = B()
>>> b.b = 6
>>> b.b
6
>>> b.__dict__ = {}
>>> b.b
Traceback (most recent call last):
File "<pyshell#57>", line 1, in <module>
b.b
AttributeError: 'B' object has no attribute 'b'
However, I'm not sure whether this is guaranteed, or if it just happens to work. Especially in terms of supporting non-C Pythons, you may want to be careful.
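Here is a sketch of the _orgDict idea from the question. Note that you have to store a copy of __dict__; if you keep a reference to the live dict, later additions end up in your "snapshot" as well. As for the second part of the question: once the entry in __dict__ is gone and nothing else refers to the added callables, CPython's reference counting frees them; they are not kept alive by the instance.
class A(object):
    def __init__(self):
        self.x = 1

a = A()
org = dict(a.__dict__)           # snapshot of the initial attributes (a copy!)

a.added = lambda: "added at runtime"
print('added' in a.__dict__)     # True

a.__dict__ = dict(org)           # restore; copy again so org stays reusable
print('added' in a.__dict__)     # False
print(a.x)                       # 1: the original attribute is still there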
Yes, it is possible to override (delete) the objects by assignment. Here is an example.
>>> class callable_objects:
def __init__(self, name, fame=None):
self.name = name
self.fame = fame
def _name(self):
if self.name[0] in ["a","b","c","d","e"]:
self._fame("1")
else:
self._fame("2")
def _fame(self, ifame):
if ifame == "1":
print "Ur fame is bad"
else:
print "Ur fame is very bad"
>>> c = callable_objects("ameet")
>>> callable_objects.__dict__
{'__module__': '__main__', '_fame': <function _fame at 0x02B5C370>, '__doc__': None, '__init__': <function __init__ at 0x02B5C330>, '_name': <function _name at 0x02B5C2F0>}
>>> c.__dict__
{'name': 'ameet', 'fame': None}
>>> callable_objects.__dict__ = c.__dict__
>>> callable_objects.__dict__
{'name': 'ameet', 'fame': None}
The example below is from a REST database driver on Python 2.7.
In the __setattr__ method below, if I use the commented-out getattr() line, it reduces the object instantiation performance from 600 rps to 230.
Why is getattr() so much slower than self.__dict__.get() in this case?
class Element(object):
def __init__(self, client):
self._client = client
self._data = {}
self._initialized = True
def __setattr__(self, key, value):
#_initialized = getattr(self, "_initialized", False)
_initialized = self.__dict__.get("_initialized", False)
if key in self.__dict__ or _initialized is False:
# set the attribute normally
object.__setattr__(self, key, value)
else:
# set the attribute as a data property
self._data[key] = value
In short: because getattr(foo, 'bar') does the same thing as foo.bar, which is not the same thing as just looking the name up in one __dict__ (for a start, getattr has to find the right __dict__ along the class hierarchy, but there's a whole lot more going on).
An example for illustration:
>>> class A:
...     a = 1
...
>>> class B(A):
...     b = 2
...
>>> dir(B)
['__doc__', '__module__', 'a', 'b']
>>> B.a
1
>>> B.__dict__
{'__module__': '__main__', 'b': 2, '__doc__': None}
>>> B.__dict__['a']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
KeyError: 'a'
>>> B.__dict__.get('a')
>>>
The details are contained in, or linked from, http://docs.python.org/reference/datamodel.html (search for "getattr").
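A rough micro-benchmark sketch of the case that actually hits Element.__init__: for the first assignments, '_initialized' does not exist yet, so getattr() has to search the instance and the whole class hierarchy and then swallow an AttributeError before returning the default, while __dict__.get() is a single dictionary miss. (Thing and the setup are illustrative only; absolute numbers vary by machine and Python version.)
import timeit

class Thing(object):
    pass

t = Thing()
setup = "from __main__ import t"

# '_initialized' is missing, as it is during the first assignments in __init__
print(timeit.timeit("t.__dict__.get('_initialized', False)", setup=setup))
print(timeit.timeit("getattr(t, '_initialized', False)", setup=setup))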
# Foo is defined in a module named test
class Foo:
    pass

>>> import test
>>> f = test.Foo()
Let's look at the instance ...
>>> dir(f)
['__add__', '__class__', ...]
Oooh! Let's look at the instance's class ...
>>> dir(f.__class__)
['__add__', '__class__', ...]
Hmm ... I was expecting the attributes of __class__, but this returns the same attributes as f.
Trying trial and error ...
>>> dir(f.__class__.__class__)
['__abstractmethods__', '__base__', ...]
Hmm ... why is the second time the charm?
dir(f) and dir(f.__class__) are showing the attributes of two different things. It's just that your empty object has the same attributes as its own class. Try this:
>>> class Foo:
...     def __init__(self):
...         self.a = 17
...
>>> f = Foo()
>>> 'a' in dir(f)
True
>>> 'a' in dir(f.__class__)
False
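As for the "twice a charm" part: f.__class__ is Foo itself, so dir(f.__class__) shows (roughly) the same names as dir(f). But f.__class__.__class__ is type, the metaclass, which is a different object with its own attributes such as __abstractmethods__ and __base__; dir() of an ordinary class does not include those metaclass attributes. A small check:
class Foo(object):
    pass

f = Foo()
print(f.__class__ is Foo)                    # True: dir() listed the same names twice
print(f.__class__.__class__ is type)         # True: the metaclass
print('__abstractmethods__' in dir(type))    # True
print('__abstractmethods__' in dir(Foo))     # False: metaclass attributes are not shown here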