I have a class with __slots__:
class A:
    __slots__ = ('foo',)
If I create a subclass without specifying __slots__, the subclass will have a __dict__:
class B(A):
    pass

print('__dict__' in dir(B))  # True
Is there any way to prevent B from having a __dict__ without having to set __slots__ = ()?
The answer of @AKX is almost correct: __prepare__ and a metaclass are indeed the way this can be solved quite easily.
Just to recap:
If the namespace of the class contains a __slots__ key after the class body has been executed, then the class will use __slots__ instead of __dict__.
One can inject names into the namespace of the class before the class body is executed by using __prepare__.
So if we simply return a dictionary containing the key '__slots__' from __prepare__, then the class will use __slots__ instead of __dict__ (provided the '__slots__' key isn't removed again during the evaluation of the class body).
Because __prepare__ just provides the initial namespace, one can easily override the __slots__, or remove them again, in the class body.
So a metaclass that provides __slots__ by default would look like this:
class ForceSlots(type):
    @classmethod
    def __prepare__(metaclass, name, bases, **kwds):
        # Calling super is not strictly necessary because
        # type.__prepare__() simply returns an empty dict,
        # but if you plan to use metaclass mixins this is essential!
        super_prepared = super().__prepare__(metaclass, name, bases, **kwds)
        super_prepared['__slots__'] = ()
        return super_prepared
So every class and subclass with this metaclass will (by default) have an empty __slots__ in its namespace and thus create a "class with slots" (unless __slots__ is removed on purpose).
Just to illustrate how this would work:
class A(metaclass=ForceSlots):
    __slots__ = "a",

class B(A):  # no __dict__ even if slots are not defined explicitly
    pass

class C(A):  # no __dict__, but provides additional __slots__
    __slots__ = "c",

class D(A):  # creates a normal __dict__-based class because __slots__ was removed
    del __slots__

class E(A):  # has a __dict__ because we added it to __slots__
    __slots__ = "__dict__",
This passes the tests mentioned in AKX's answer:
assert "__dict__" not in dir(A)
assert "__dict__" not in dir(B)
assert "__dict__" not in dir(C)
assert "__dict__" in dir(D)
assert "__dict__" in dir(E)
And to verify that it works as expected:
# A has slots from A: a
a = A()
a.a = 1
a.b = 1 # AttributeError: 'A' object has no attribute 'b'
# B has slots from A: a
b = B()
b.a = 1
b.b = 1 # AttributeError: 'B' object has no attribute 'b'
# C has the slots from A and C: a and c
c = C()
c.a = 1
c.b = 1 # AttributeError: 'C' object has no attribute 'b'
c.c = 1
# D has a dict and allows any attribute name
d = D()
d.a = 1
d.b = 1
d.c = 1
# E has a dict and allows any attribute name
e = E()
e.a = 1
e.b = 1
e.c = 1
As pointed out in a comment by Aran-Fey, there is a difference between del __slots__ and adding __dict__ to __slots__:
"There's a minor difference between the two options: del __slots__ will give your class not only a __dict__, but also a __weakref__ slot."
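A small standalone sketch (class names are my own, not from the answer) makes that difference visible: a class whose __slots__ contains only "__dict__" gets a dict but remains non-weak-referenceable, while a plain class gets both.

```python
import weakref

class WithDictSlot:
    __slots__ = ("__dict__",)  # like class E: a __dict__, but no __weakref__ slot

class Plain:
    pass  # like class D: gets both __dict__ and __weakref__

assert "__weakref__" in dir(Plain)
assert "__weakref__" not in dir(WithDictSlot)

weakref.ref(Plain())  # works fine

try:
    weakref.ref(WithDictSlot())
except TypeError:
    # instances without a __weakref__ slot cannot be weakly referenced
    print("not weak-referenceable")
```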
How about a metaclass like this and the __prepare__() hook?
import sys

class InheritSlots(type):
    def __prepare__(name, bases, **kwds):
        # this could combine slots from bases, I guess, and walk the base hierarchy, etc.
        for base in bases:
            if base.__slots__:
                kwds["__slots__"] = base.__slots__
                break
        return kwds
class A(metaclass=InheritSlots):
    __slots__ = ("foo", "bar", "quux")

class B(A):
    pass

assert A.__slots__
assert B.__slots__ == A.__slots__
assert "__dict__" not in dir(A)
assert "__dict__" not in dir(B)

print(sys.getsizeof(A()))
print(sys.getsizeof(B()))
For some reason, this still prints 64 and 88 – maybe instances of an inherited class are always a little heavier than instances of the base class itself?
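A plausible explanation (my own sketch, not part of the answer): copying the parent's __slots__ names verbatim into the subclass allocates a second set of slot descriptors, so instances grow by one pointer per repeated name; an empty __slots__ = () in the subclass keeps the instance size identical.

```python
import sys

class Base:
    __slots__ = ("foo", "bar", "quux")

class Duplicated(Base):
    __slots__ = ("foo", "bar", "quux")  # same names again -> extra per-instance storage

class Empty(Base):
    __slots__ = ()  # adds nothing -> same instance size as Base

assert sys.getsizeof(Empty()) == sys.getsizeof(Base())
assert sys.getsizeof(Duplicated()) > sys.getsizeof(Base())
```

This suggests a metaclass that injects an empty __slots__ (as in the accepted answer) avoids the size overhead that copying the base's __slots__ incurs.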
I want to create a class B that inherits from another class A and is instantiated with an existing object of that class, so that the new object has the old object's attributes and methods along with the new class's attributes and methods.
example:
class A():
    attrs...
    methods...

class B(A):
    def __init__(self, a_obj):
        ...
        A_attrs + B_attrs...
        A_methods + B_methods...

a = A()
# assign some values to 'a'
b = B(a)
# a and b should have the same params and behaviors
Is there a way to implement such an alternative class and use the new object?
It is hard to tell what exactly you are trying to achieve, but once you know exactly which things you want to forward to a, it is easy to achieve.
No need to use metaclasses, but depending on the behavior you want, the special __getattribute__ and __setattr__ methods might help.
Keep in mind that as far as methods are concerned, the inheritance mechanism already does this: any method called on an instance of B will be forwarded to the method defined in A, unless there is an overriding implementation in B. In that case the overriding method has to explicitly run the method in A by using a super() call, or skip it altogether: it is up to the implementation.
Method overriding is independent of instances. If you want them to "see" a particular instance of A passed at B object instantiation, it is just the attributes in that instance that matter.
Now, if you want instances of B to proxy over to the attributes of a particular instance of A, the special methods I mentioned can build that bridge. We can implement them so that if access is attempted on an attribute that exists in the a instance, that one is used instead. Also, the special behavior can be implemented in a mixin class, so you are free to implement your business logic in B and defer all special attribute-handling mechanisms to the mixin instead.
_SENTINEL = object()

class ProxyAttrMixins:
    def __init__(self):
        # Do nothing: just prevent the proxied class' __init__ from being run
        pass

    def _inner_get(self, attrname):
        bound_getattr = super().__getattribute__
        try:
            proxied = bound_getattr("proxied")
        except AttributeError:
            # No associated object to proxy to!
            # Just pass and try to retrieve the attribute from `self`.
            pass
        else:  # no AttributeError: there is a proxied object
            associated_attr = getattr(proxied, attrname, _SENTINEL)
            if associated_attr is not _SENTINEL:
                # If the object is a callable, it is a method. A method in the
                # derived class should be called if it exists, and only
                # otherwise the one on the proxied object:
                if callable(associated_attr):
                    try:
                        own_method = bound_getattr(attrname)
                    except AttributeError:
                        pass
                    else:
                        return "own", own_method
                return "proxy", associated_attr
        # If there is no proxied object, or the proxied object does not have
        # the desired attribute, return the regular instance attribute:
        return "own", bound_getattr(attrname)

    def __getattribute__(self, attrname):
        bound_getattr = super().__getattribute__
        whose, attr = bound_getattr("_inner_get")(attrname)
        return attr

    def __setattr__(self, attrname, value):
        bound_getattr = super().__getattribute__
        try:
            whose, attr = bound_getattr("_inner_get")(attrname)
        except AttributeError:
            whose = "own"
        if whose != "own":
            proxied = bound_getattr("proxied")
            return setattr(proxied, attrname, value)
        super().__setattr__(attrname, value)

class A:
    def __init__(self, c: int):
        self.c = c

class B(ProxyAttrMixins, A):
    def __init__(self, a: A):
        self.proxied = a
        super().__init__()  # this ensures B can still take part in cooperative
                            # inheritance. The mixin's __init__ prevents A's
                            # __init__ from running and complaining about the
                            # missing `c` argument
And this code allows this kind of scenario:
In [18]: b = B(a:=A(5))
In [19]: b.c
Out[19]: 5
In [20]: b.c = 10
In [21]: a.c
Out[21]: 10
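For completeness, a much simpler copy-based alternative (my own sketch, not part of the answer above): if b only needs a snapshot of a's state rather than a live link back to it, updating __dict__ avoids the proxy machinery entirely.

```python
class A:
    def __init__(self, c: int):
        self.c = c

class B(A):
    def __init__(self, a_obj: A):
        # copy instead of proxy: B gets its own copies of a_obj's attributes
        self.__dict__.update(a_obj.__dict__)

a = A(5)
b = B(a)
assert b.c == 5

b.c = 10       # unlike the proxy version, this does NOT write back to `a`
assert a.c == 5
```

The trade-off is exactly that last line: with the proxy mixin, b.c = 10 changes a; with the copy, the two objects diverge after construction.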
class MyClass():
    def __init__(self):
        self.attribute_1 = "foo"
        self.attribute_2 = "bar"

    @property
    def attribute_1(self):
        return self._attribute_1

    @attribute_1.setter
    def attribute_1(self, s):
        self._attribute_1 = s

    @property
    def attribute_2(self):
        return self._attribute_2

    @attribute_2.setter
    def attribute_2(self, s):
        self._attribute_2 = s
>>> ob = MyClass()
>>> ob.attribute_1 = 'fizz'  # OK
>>> ob.atribute_1 = 'buzz'   # want an exception: this name has no setter or @property
I would like my class to complain if we try to set an attribute that has not been decorated with @property and a setter. I have tried using __slots__ but can't get it working with the property decorator; I get: "'attribute' in __slots__ conflicts with class variable".
Any thoughts?
__slots__ should contain all instance variables, in your case _attribute_1 and _attribute_2 (the ones with underscores used internally), so just do that:
class MyClass():
    __slots__ = ["_attribute_1", "_attribute_2"]
    pass  # rest of implementation
Note that if your property just forwards directly, you might as well put the public variable names in __slots__ and only write properties for fields that need more validation or other logic. A slot is effectively a property anyway:
>>> MyClass._attribute_1
<member '_attribute_1' of 'MyClass' objects>
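A quick sketch of the suggested fix in action (attribute names taken from the question, implementation trimmed to one property for brevity): with only the internal names in __slots__, a misspelled public name matches neither a slot nor a property and is rejected.

```python
class MyClass:
    __slots__ = ("_attribute_1",)  # only the internal names get slots

    @property
    def attribute_1(self):
        return self._attribute_1

    @attribute_1.setter
    def attribute_1(self, s):
        self._attribute_1 = s

ob = MyClass()
ob.attribute_1 = "fizz"  # goes through the property setter

try:
    ob.atribute_1 = "buzz"  # misspelled: neither a slot nor a property
except AttributeError:
    print("rejected")
```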
With a boring class, instance attributes shadow class attributes:
class C(object):
    a = "class_a"
    def __init__(self, a):
        self.a = a

c = C(a="obja")
print c.a  # obja
But if my class attributes are declared in a namedtuple base:
import collections

class C(collections.namedtuple("CBase", ['a', ])):
    a = "class_a"

c = C(a="obja")
print c.a  # class_a !!??!
... so, declaring my instance attribute through the namedtuple causes that attribute to be shadowed by the class attribute, which is not what you'd expect.
Why is this?
namedtuple "attributes" are implemented as descriptors (specifically, properties) on the class itself, not attributes in the traditional sense (all the actual data is stored in unnamed indices of the tuple). In this case, the namedtuple (roughly) defines:
@property
def a(self):
    return self[0]
Since the property is a class level attribute, when you define a on the subclass, it shadows equivalent definitions in the parent class.
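To make the mechanism concrete, a small sketch: the tuple data survives, it is only the attribute lookup that the subclass's class attribute wins.

```python
import collections

class C(collections.namedtuple("CBase", ["a"])):
    a = "class_a"  # shadows the inherited property-like descriptor

c = C(a="obja")
assert c[0] == "obja"       # the value is still stored in the tuple...
assert c.a == "class_a"     # ...but attribute access finds the class attribute first
```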
This is the setup I want:
A should be an abstract base class with a static & abstract method f(). B should inherit from A. Requirements:
1. You should not be able to instantiate A
2. You should not be able to instantiate B, unless it implements a static f()
Taking inspiration from this question, I've tried a couple of approaches. With these definitions:
import abc

class abstractstatic(staticmethod):
    __slots__ = ()
    def __init__(self, function):
        super(abstractstatic, self).__init__(function)
        function.__isabstractmethod__ = True
    __isabstractmethod__ = True

class A:
    __metaclass__ = abc.ABCMeta
    @abstractstatic
    def f():
        pass

class B(A):
    def f(self):
        print 'f'

class A2:
    __metaclass__ = abc.ABCMeta
    @staticmethod
    @abc.abstractmethod
    def f():
        pass

class B2(A2):
    def f(self):
        print 'f'
Here A2 and B2 are defined using the usual Python conventions, while A and B are defined in the way suggested in this answer. Following are some operations I tried and the undesired results they produced.
With classes A/B:
>>> B().f()
f
#This should have thrown, since B doesn't implement a static f()
With classes A2/B2:
>>> A2()
<__main__.A2 object at 0x105beea90>
#This should have thrown since A2 should be an uninstantiable abstract class
>>> B2().f()
f
#This should have thrown, since B2 doesn't implement a static f()
Since neither of these approaches give me the output I want, how do I achieve what I want?
You can't do what you want with just ABCMeta. ABC enforcement doesn't do any type checking, only the presence of an attribute with the correct name is enforced.
Take for example:
>>> from abc import ABCMeta, abstractmethod, abstractproperty
>>> class Abstract(object):
...     __metaclass__ = ABCMeta
...     @abstractmethod
...     def foo(self): pass
...     @abstractproperty
...     def bar(self): pass
...
>>> class Concrete(Abstract):
...     foo = 'bar'
...     bar = 'baz'
...
>>> Concrete()
<__main__.Concrete object at 0x104b4df90>
I was able to construct Concrete() even though both foo and bar are simple attributes.
The ABCMeta metaclass only tracks which names are left with the __isabstractmethod__ attribute being true; when creating a class from the metaclass (ABCMeta.__new__ is called), the cls.__abstractmethods__ attribute is then set to a frozenset object with all the names that are still abstract.
type.__new__ then tests for that frozenset and throws a TypeError if you try to create an instance.
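The bookkeeping described above is easy to observe (a small sketch in modern Python 3 spelling, rather than the Python 2 __metaclass__ style used in the question):

```python
from abc import ABC, abstractmethod

class Abstract(ABC):
    @abstractmethod
    def foo(self): ...

class Concrete(Abstract):
    foo = "bar"  # any attribute with the right name clears the abstract flag

# the frozenset of still-abstract names drives instantiation checks
assert Abstract.__abstractmethods__ == frozenset({"foo"})
assert Concrete.__abstractmethods__ == frozenset()

try:
    Abstract()  # non-empty __abstractmethods__ -> TypeError at instantiation
except TypeError:
    print("cannot instantiate")

Concrete()  # fine: the frozenset is empty
```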
You'd have to produce your own __new__ method here; subclass ABCMeta and add type checking in a new __new__ method. That method should look for __abstractmethods__ sets on the base classes, find the corresponding objects with the __isabstractmethod__ attribute in the MRO, and then do type checking on the current class attributes.
This'd mean that you'd throw the exception when defining the class, not an instance, however. For that to work you'd add a __call__ method to your ABCMeta subclass and have that throw the exception based on information gathered by your own __new__ method about what types were wrong; a similar two-stage process as what ABCMeta and type.__new__ do at the moment. Alternatively, update the __abstractmethods__ set on the class to add any names that were implemented but with the wrong type and leave it to type.__new__ to throw the exception.
The following implementation takes that last tack; add names back to __abstractmethods__ if the implemented type doesn't match (using a mapping):
from types import FunctionType

class ABCMetaTypeCheck(ABCMeta):
    _typemap = {  # map abstract type to expected implementation type
        abstractproperty: property,
        abstractstatic: staticmethod,
        # abstract methods return function objects
        FunctionType: FunctionType,
    }

    def __new__(mcls, name, bases, namespace):
        cls = super(ABCMetaTypeCheck, mcls).__new__(mcls, name, bases, namespace)
        wrong_type = set()
        seen = set()
        abstractmethods = cls.__abstractmethods__
        for base in bases:
            for name in getattr(base, "__abstractmethods__", set()):
                if name in seen or name in abstractmethods:
                    continue  # still abstract or later overridden
                value = base.__dict__.get(name)  # bypass descriptors
                if getattr(value, "__isabstractmethod__", False):
                    seen.add(name)
                    expected = mcls._typemap[type(value)]
                    if not isinstance(namespace[name], expected):
                        wrong_type.add(name)
        if wrong_type:
            cls.__abstractmethods__ = abstractmethods | frozenset(wrong_type)
        return cls
With this metaclass you get your expected output:
>>> class Abstract(object):
...     __metaclass__ = ABCMetaTypeCheck
...     @abstractmethod
...     def foo(self): pass
...     @abstractproperty
...     def bar(self): pass
...     @abstractstatic
...     def baz(): pass
...
>>> class ConcreteWrong(Abstract):
...     foo = 'bar'
...     bar = 'baz'
...     baz = 'spam'
...
>>> ConcreteWrong()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class ConcreteWrong with abstract methods bar, baz, foo
>>>
>>> class ConcreteCorrect(Abstract):
...     def foo(self): return 'bar'
...     @property
...     def bar(self): return 'baz'
...     @staticmethod
...     def baz(): return 'spam'
...
>>> ConcreteCorrect()
<__main__.ConcreteCorrect object at 0x104ce1d10>
From this answer to "what is a metaclass?" I got this:
You write class Foo(object) first, but the class object Foo is not created in memory yet.
Python will look for __metaclass__ in the class definition. If it finds it, it will use it to create the class object Foo. If it doesn't, it will use type to create the class.
Having tested it, it seems that the attributes of the class are instantiated before the constructor of the class is run. What am I misunderstanding?
Test code:
class meta(type):
    def __init__(cls, name, bases, dic):
        type.__init__(cls, name, bases, dic)
        print hasattr(cls, "a")
        cls.a = "1"

class A(object):
    a = "a"
    __metaclass__ = meta

class B(object):
    __metaclass__ = meta

class C(object):
    __metaclass__ = meta
    a = "a"

print A.a
print B.a
print C.a
Output:
True
False
True
1
1
1
The class body is run before the class is constructed, yes.
The body of the class provides a temporary namespace, and all local names in that namespace are given as a dictionary to construct the class object, together with the base classes and a name for the class.
You can do this with the type() constructor too:
>>> Foo = type('Foo', (), {'a': 1})
>>> Foo.a
1
The class body is basically executed as a function, with the local namespace of that function being used to create the class attributes, the 3rd argument to type() above.
In Python 3 you have a little more influence on that process with the __prepare__ hook on a metaclass. __prepare__ should be a classmethod that returns an initial namespace for the class body; use it to inject extra names into the class namespace before the class body is executed:
class MyMeta(type):
    @classmethod
    def __prepare__(mcl, name, bases):
        return {'a': 1}
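A hypothetical class using such a metaclass (my own example, continuing the sketch) already sees a bound while its body runs, and the injected name ends up as a class attribute unless the body removes it:

```python
class MyMeta(type):
    @classmethod
    def __prepare__(mcl, name, bases):
        # the dict returned here is the namespace the class body executes in
        return {'a': 1}

class Foo(metaclass=MyMeta):
    b = a + 1  # 'a' was injected by __prepare__ before this line ran

assert Foo.a == 1  # the injected name survives as a class attribute
assert Foo.b == 2
```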