I want to redefine a __metaclass__ but I want to fall back to the metaclass which would have been used if I hadn't redefined it.
class ComponentMetaClass(type):
    def __new__(cls, name, bases, dct):
        return <insert_prev_here>.__new__(cls, name, bases, dct)

class Component(OtherObjects):
    __metaclass__ = ComponentMetaClass
From what I understand, the __metaclass__ used by default is found by checking for a definition in the class body, then in the bases, and then in the global scope. Normally you would use type in the redefinition, and that is usually the global one. However, my OtherObjects may have redefined __metaclass__. So by using type, I would ignore their definition and it wouldn't run, right?
edit: note that I don't know what OtherObjects are until runtime
As @unutbu puts it: "Within one class hierarchy, metaclasses must be subclasses of each other. That is, the metaclass of Component must be a subclass of the metaclass of OtherObjects."
Which means your problem is a bit more complicated than you first thought: not only do you have to call the proper metaclass from the base classes, but your current metaclass has to properly inherit from them as well.
(hack some code, confront strange behavior, come back 90 min later)
It was tricky indeed: I had to create a class that receives the desired metaclass as a parameter, and whose __call__ method dynamically generates a new metaclass, modifying its bases and adding a __superclass attribute to it.
But this should do what you want and more: just inherit all your metaclasses from BaseComponableMeta and call the superclasses in the hierarchy through the metaclass's "__superclass" attribute:
from itertools import chain

class Meta1(type):
    def __new__(metacls, name, bases, dct):
        print name
        return type.__new__(metacls, name, bases, dct)

class BaseComponableMeta(type):
    def __new__(metacls, *args, **kw):
        return metacls.__superclass.__new__(metacls, *args, **kw)

class ComponentMeta(object):
    def __init__(self, metaclass):
        self.metaclass = metaclass

    def __call__(self, name, bases, dct):
        # retrieve the deepest previous metaclass in the object hierarchy
        bases_list = sorted((cls for cls in chain(*(base.mro() for base in bases))),
                            key=lambda s: len(type.mro(s.__class__)))
        previous_metaclass = bases_list[-1].__class__
        # Add the "__superclass" attribute to the metaclass, so that it can
        # call its bases:
        metaclass_dict = dict(self.metaclass.__dict__)
        new_metaclass_name = self.metaclass.__name__
        metaclass_dict["_%s__superclass" % new_metaclass_name] = previous_metaclass
        # dynamically generate a new metaclass for this class:
        new_metaclass = type(new_metaclass_name, (previous_metaclass,), metaclass_dict)
        return new_metaclass(name, bases, dct)
# From here on, example usage:

class ComponableMeta(BaseComponableMeta):
    pass

class NewComponableMeta_1(BaseComponableMeta):
    def __new__(metacls, *args):
        print "Overriding the previous metaclass part 1"
        return metacls.__superclass.__new__(metacls, *args)

class NewComponableMeta_2(BaseComponableMeta):
    def __new__(metacls, *args):
        print "Overriding the previous metaclass part 2"
        return metacls.__superclass.__new__(metacls, *args)

class A(object):
    __metaclass__ = Meta1

class B(A):
    __metaclass__ = ComponentMeta(ComponableMeta)

# trying multiple inheritance, and subclassing the metaclass once:
class C(B, A):
    __metaclass__ = ComponentMeta(NewComponableMeta_1)

# Adding a third metaclass to the chain:
class D(C):
    __metaclass__ = ComponentMeta(NewComponableMeta_2)

# class with a "do nothing" metaclass, which calls its bases' metaclasses:
class E(D):
    __metaclass__ = ComponentMeta(ComponableMeta)
Within one class hierarchy, metaclasses must be subclasses of each other. That is, the metaclass of Component must be a subclass of the metaclass of OtherObjects.
If you don't name a __metaclass__ for Component then the metaclass of OtherObjects will be used by default.
If ComponentMetaClass and OtherObjectsMeta both inherit (independently) from type:
class OtherObjectsMeta(type): pass
class ComponentMetaClass(type): pass

class OtherObjects(object):
    __metaclass__ = OtherObjectsMeta

class Component(OtherObjects):
    __metaclass__ = ComponentMetaClass
then you get this error:
TypeError: Error when calling the metaclass bases
metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases
but if you make ComponentMetaClass a subclass of OtherObjectsMeta
class ComponentMetaClass(OtherObjectsMeta): pass
then the error goes away.
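A minimal, self-contained demonstration of the fix (written with the Python 3 `metaclass=` keyword; the Python 2 `__metaclass__` attribute behaves the same way):

```python
class OtherObjectsMeta(type):
    pass

# The derived class's metaclass subclasses the base class's metaclass,
# so the "(non-strict) subclass" requirement is satisfied.
class ComponentMetaClass(OtherObjectsMeta):
    pass

class OtherObjects(metaclass=OtherObjectsMeta):
    pass

class Component(OtherObjects, metaclass=ComponentMetaClass):
    pass  # no TypeError raised

print(type(Component).__name__)  # ComponentMetaClass
```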
Perhaps I misread your question. If you want ComponentMetaClass.__new__ to call OtherObjectsMeta.__new__, then use super:
class OtherObjectsMeta(type):
    def __new__(meta, name, bases, dct):
        print('OtherObjectsMeta')
        return super(OtherObjectsMeta, meta).__new__(meta, name, bases, dct)

class ComponentMetaClass(OtherObjectsMeta):
    def __new__(meta, name, bases, dct):
        print('ComponentMetaClass')
        return super(ComponentMetaClass, meta).__new__(meta, name, bases, dct)
Regarding the alternative to using metaclasses mentioned in the comments: use super:
class Base(object):
    def method(self): pass

class Base1(Base):
    def method(self):
        print('Base1')
        super(Base1, self).method()

class Base2(Base):
    def method(self):
        print('Base2')
        super(Base2, self).method()

class Component(Base1, Base2):
    pass

c = Component()
c.method()
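The reason both implementations run is the MRO: each super() call hands off to the next class in Component's linearization. A quick sketch (the same classes in Python 3 spelling) makes the order visible:

```python
class Base(object):
    def method(self):
        pass

class Base1(Base):
    def method(self):
        print('Base1')
        super().method()

class Base2(Base):
    def method(self):
        print('Base2')
        super().method()

class Component(Base1, Base2):
    pass

# Cooperative super() calls follow this linearization, so Base1 runs,
# then Base2, then Base:
print([c.__name__ for c in Component.__mro__])
# ['Component', 'Base1', 'Base2', 'Base', 'object']
```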
Something I don't like about @abstractmethod is that it only produces an error when an instance is created. For example, this will not fail:
from abc import abstractmethod, ABC

class AbstractClass(ABC):
    @abstractmethod
    def func(self):
        pass

class RealClass(AbstractClass):
    pass
it will only fail if I create an instance:
r = RealClass()
I want to reimplement this mechanism so that it fails on class definition, not instantiation. For this, I created a metaclass:
from inspect import signature

class ABCEarlyFailMeta(type):
    direct_inheritors = {}

    def __new__(cls, clsname, bases, clsdict):
        klass = super().__new__(cls, clsname, bases, clsdict)
        class_path = clsdict['__module__'] + '.' + clsdict['__qualname__']
        if bases == ():
            # we get here when we create the base abstract class.
            # The registry will later be filled with abstract methods
            cls.direct_inheritors[class_path] = {}
            for name, value in clsdict.items():
                # adding abstract methods on the proper base abstract class
                if getattr(value, '__isabstractmethod__', None) is True:
                    cls.direct_inheritors[class_path][name] = signature(value)
        else:
            # we get here when creating inheritors.
            # Here, we need to extract the list of abstract methods
            base_class = bases[0].__module__ + '.' + bases[0].__qualname__
            abstract_method_names = cls.direct_inheritors[base_class]
            # here, we compare the current list of methods
            # with the list of abstract methods and fail if some are missing
            cls_dictkeys = set(clsdict.keys())
            missing_methods = set(abstract_method_names) - cls_dictkeys
            if missing_methods:
                raise Exception(
                    f'{clsname} must implement methods: {missing_methods}'
                )
        return klass
this will fail when class is created, not instantiated:
class ABCEarlyFail(metaclass=ABCEarlyFailMeta):
    @abstractmethod
    def func(self):
        pass

class Child(ABCEarlyFail):
    pass
Exception: Child must implement methods: {'func'}
My question is: how do I search for the proper base class in bases? In this example, I look at bases[0], but that will fail if the inheritor class has a mixin:
class Child(SomeMixin, ABCEarlyFail):
    pass
so, what is a nicer way?
Or maybe I am reinventing the wheel?
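One possible direction, sketched under the assumption that deriving from ABCMeta is acceptable: ABCMeta already computes __abstractmethods__ across the entire MRO, so there is no need to search bases by hand, and mixins are handled for free. The names EarlyFailABCMeta, SomeMixin, and Good below are illustrative:

```python
from abc import ABCMeta, abstractmethod

class EarlyFailABCMeta(ABCMeta):
    """Fail at class-definition time if abstract methods are unimplemented.

    ABCMeta collects __abstractmethods__ across the whole MRO, so there is
    no need to inspect `bases` manually.
    """
    def __new__(mcs, name, bases, ns, **kw):
        cls = super().__new__(mcs, name, bases, ns, **kw)
        # A class declaring its own abstract methods is allowed to stay abstract
        declares_abstract = any(
            getattr(v, '__isabstractmethod__', False) for v in ns.values()
        )
        if bases and cls.__abstractmethods__ and not declares_abstract:
            raise TypeError(
                f'{name} must implement methods: {set(cls.__abstractmethods__)}'
            )
        return cls

class ABCEarlyFail(metaclass=EarlyFailABCMeta):
    @abstractmethod
    def func(self):
        pass

class SomeMixin:
    pass

class Good(SomeMixin, ABCEarlyFail):  # mixins are fine
    def func(self):
        return 42
```

Defining `class Bad(SomeMixin, ABCEarlyFail): pass` would raise the TypeError at class-definition time, as desired.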
Use a Python metaclass A to create a new class B. When C inherits from B, why is A's __new__ method called?
class A(type):
    def __new__(cls, name, bases, attrs):
        print(" call A.__new__ ")
        return type.__new__(cls, name, bases, attrs)

B = A("B", (), {})

class C(B):
    pass
python test.py
call A.__new__
call A.__new__
Classes are instances of metaclasses, and the default metaclass, type, is derived from object. Metaclasses thus follow the regular rules for creating instances of object: __new__ constructs the instance, and __init__ may initialise it.
>>> class DemoClass(object):
...     def __new__(cls):
...         print('__new__ object of DemoClass')
...         return super().__new__(cls)
...
...     def __init__(self):
...         print('__init__ object of DemoClass')
...         return super().__init__()
...
>>> demo_instance = DemoClass()  # instantiate DemoClass
__new__ object of DemoClass
__init__ object of DemoClass
The same happens when our class is a metaclass - it is still an object and behaves as such.
>>> class DemoType(type):
...     def __new__(mcs, name, bases, attrs):
...         print('__new__ object %r of DemoType' % name)
...         return super().__new__(mcs, name, bases, attrs)
...
...     def __init__(self, name, bases, attrs):
...         print('__init__ object %r of DemoType' % name)
...         return super().__init__(name, bases, attrs)
...
>>> demo_class = DemoType('demo_class', (), {})  # instantiate DemoType
__new__ object 'demo_class' of DemoType
__init__ object 'demo_class' of DemoType
To reiterate, if a is an instance of A, then A.__new__ was used to create a. The same applies to classes and metaclasses, since the former are instances of the latter.
A class does not inherit __new__ from its metaclass. A class has a metaclass, and the metaclass' __new__ is used to create the class.
When inheriting from a class (an instance of a metaclass), the metaclass is inherited as well. This means a subclass is also an instance of the metaclass. Accordingly, both __new__ and __init__ of the metaclass are used to construct and initialise this instance.
>>> class DemoClass(metaclass=DemoType):
...     ...
...
__new__ object 'DemoClass' of DemoType
__init__ object 'DemoClass' of DemoType
>>> class DemoSubClass(DemoClass):
...     ...
...
__new__ object 'DemoSubClass' of DemoType
__init__ object 'DemoSubClass' of DemoType
>>> type(DemoClass) # classes are instances of their metaclass
__main__.DemoType
>>> type(DemoSubClass) # subclasses inherit metaclasses from base classes
__main__.DemoType
The purpose of this is that metaclasses exist to define how classes are created, and this includes subclasses. Calling __new__ for every subclass allows the metaclass to react to the new class body, additional bases, namespace, and keywords.
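As a sketch of that last point: in Python 3, the metaclass's __new__ runs for every subclass and can also receive per-class keyword arguments. KeywordAwareMeta and register are illustrative names invented for this example:

```python
class KeywordAwareMeta(type):
    def __new__(mcs, name, bases, ns, register=False):
        cls = super().__new__(mcs, name, bases, ns)
        cls.registered = register  # react to a per-class keyword
        return cls

class Base(metaclass=KeywordAwareMeta):
    pass

class Plugin(Base, register=True):  # the keyword reaches the metaclass
    pass

print(Base.registered, Plugin.registered)  # False True
```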
For example, if I create the class Foo, then later derive the subclass Bar, I want the myCode() method of Foo to run.
class Foo(object):
    x = 0

    def __init__(self):
        pass

    def myCode(self):
        if self.x == 0:
            raise Exception("nope")

class Bar(Foo):  # This is where I want myCode() to execute
    def baz(self):
        pass
This should happen any time a class is derived from the base class Foo. Is it possible to do this in Python? I'm using Python 3 if it matters.
Note: In my real code, Foo is actually an abstract base class.
Edit: I also need access to derived class member data and methods in myCode().
Use a metaclass:
class MetaClass(type):
    def __init__(cls, name, bases, dictionary):
        if name != 'Parent':
            print('Subclass created with name: %s' % name)
        super().__init__(name, bases, dictionary)

class Parent(metaclass=MetaClass):
    pass

class Subclass(Parent):
    pass
Output:
Subclass created with name: Subclass
Metaclasses control how classes themselves are created. Subclass inherits its metaclass from Parent, and thus that code gets run when it is defined.
Edit: As for your use case with an abstract base class, off the top of my head I think you'd just need to define your metaclass as a subclass of ABCMeta, but I didn't test that.
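A quick sketch of that (untested) suggestion, with AnnouncingMeta as an illustrative name: deriving the metaclass from ABCMeta keeps the abstract-method enforcement while still announcing each subclass:

```python
from abc import ABCMeta, abstractmethod

class AnnouncingMeta(ABCMeta):
    def __init__(cls, name, bases, dictionary):
        if bases:  # skip the abstract base class itself
            print('Subclass created with name: %s' % name)
        super().__init__(name, bases, dictionary)

class Parent(metaclass=AnnouncingMeta):
    @abstractmethod
    def run(self): ...

class Subclass(Parent):  # prints: Subclass created with name: Subclass
    def run(self):
        return 'ok'
```

Instantiating Parent still raises TypeError, since ABCMeta's machinery is intact.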
Maybe this code can help you:
class Foo:
    def myCode(self):
        print('myCode within Foo')

    def __init__(self):
        if type(self) != Foo:
            self.myCode()

class Bar(Foo):
    def __init__(self):
        super(Bar, self).__init__()

    def baz(self):
        pass
Test:
>>>
>>> f = Foo()
>>> b = Bar()
myCode within Foo
>>>
This works:
class MyMeta(type):
    def __new__(cls, name, parents, dct):
        if name != 'Foo':
            if 'x' not in dct:
                raise Exception("Nein!")
            elif dct['x'] == 0:
                raise Exception("Nope!")
        return super(MyMeta, cls).__new__(cls, name, parents, dct)
Output:
class Bar(Foo):
    x = 0

Exception: Nope!
This is my specific use case if anyone wants to comment on whether or not this is appropriate:
class MagmaMeta(type):
    def __new__(cls, name, parents, dct):
        # Check that Magma instances are valid.
        if name != 'Magma':
            if 'CAYLEY_TABLE' not in dct:
                raise Exception("Cannot create Magma instance without CAYLEY_TABLE")
            # Check for a square CAYLEY_TABLE
            for row in dct['CAYLEY_TABLE']:
                if not len(row) == len(dct['CAYLEY_TABLE']):
                    raise Exception("CAYLEY_TABLE must be a square array")
            # Create SET and ORDER from CAYLEY_TABLE
            dct['SET'] = set()
            for row in dct['CAYLEY_TABLE']:
                for x in row:
                    dct['SET'].add(x)
            dct['ORDER'] = len(dct['SET'])
        return super(MagmaMeta, cls).__new__(cls, name, parents, dct)
I have python class trees, each made up of an abstract base class and many deriving concrete classes. I want all concrete classes to be accessible through a base-class method, and I do not want to specify anything during child-class creation.
This is what my imagined solution looks like:
class BaseClassA(object):
    # <some magic code around here>
    @classmethod
    def getConcreteClasses(cls):
        # <some magic related code here>
        pass

class ConcreteClassA1(BaseClassA):
    pass  # no magic-related code here

class ConcreteClassA2(BaseClassA):
    pass  # no magic-related code here
As much as possible, I'd prefer to write the "magic" once as a sort of design pattern. I want to be able to apply it to different class trees in different scenarios (i.e. add a similar tree with "BaseClassB" and its concrete classes).
Thanks Internet!
You can use metaclasses for that:
class AutoRegister(type):
    def __new__(mcs, name, bases, classdict):
        new_cls = type.__new__(mcs, name, bases, classdict)
        # print mcs, name, bases, classdict
        for b in bases:
            if hasattr(b, 'register_subclass'):
                b.register_subclass(new_cls)
        return new_cls

class AbstractClassA(object):
    __metaclass__ = AutoRegister
    _subclasses = []

    @classmethod
    def register_subclass(klass, cls):
        klass._subclasses.append(cls)

    @classmethod
    def get_concrete_classes(klass):
        return klass._subclasses

class ConcreteClassA1(AbstractClassA):
    pass

class ConcreteClassA2(AbstractClassA):
    pass

class ConcreteClassA3(ConcreteClassA2):
    pass

print AbstractClassA.get_concrete_classes()
I'm personally very wary of this kind of magic. Don't put too much of it in your code.
Here is a simple solution using modern Python's (3.6+) __init_subclass__, defined in PEP 487. It allows you to avoid using a metaclass.
class BaseClassA(object):
    _subclasses = []

    @classmethod
    def get_concrete_classes(cls):
        return list(cls._subclasses)

    def __init_subclass__(cls):
        BaseClassA._subclasses.append(cls)

class ConcreteClassA1(BaseClassA):
    pass  # no magic-related code here

class ConcreteClassA2(BaseClassA):
    pass  # no magic-related code here

print(BaseClassA.get_concrete_classes())
You should know that part of the answer you're looking for is built-in. New-style classes automatically keep a weak reference to all of their child classes which can be accessed with the __subclasses__ method:
@classmethod
def getConcreteClasses(cls):
    return cls.__subclasses__()
This won't return sub-sub-classes. If you need those, you can create a recursive generator to get them all:
@classmethod
def getConcreteClasses(cls):
    for c in cls.__subclasses__():
        yield c
        for c2 in c.getConcreteClasses():
            yield c2
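A small usage sketch of the recursive variant (class names here are illustrative):

```python
class Base(object):
    @classmethod
    def getConcreteClasses(cls):
        for c in cls.__subclasses__():
            yield c
            for c2 in c.getConcreteClasses():
                yield c2

class A(Base):
    pass

class B(A):  # a sub-subclass: only found by the recursive version
    pass

print([c.__name__ for c in Base.getConcreteClasses()])  # ['A', 'B']
```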
Another way to do this is with a decorator; it works if your subclasses either don't define __init__ or call their parent's __init__:
def lister(cls):
    cls.classes = list()
    cls._init = cls.__init__

    def init(self, *args, **kwargs):
        cls = self.__class__
        if cls not in cls.classes:
            cls.classes.append(cls)
        cls._init(self, *args, **kwargs)
    cls.__init__ = init

    @classmethod
    def getclasses(cls):
        return cls.classes
    cls.getclasses = getclasses

    return cls

@lister
class A(object): pass

class B(A): pass

class C(A):
    def __init__(self):
        super(C, self).__init__()

b = B()
c = C()
c2 = C()
print 'Classes:', c.getclasses()
It will work whether or not the base class defines __init__.
Is it possible to chain metaclasses?
I have class Model which uses __metaclass__=ModelBase to process its namespace dict. I'm going to inherit from it and "bind" another metaclass so it won't shade the original one.
The first approach is to subclass it (class MyModelBase(ModelBase)) and use that:
class MyModel(Model):
    __metaclass__ = MyModelBase  # inherits from `ModelBase`
But is it possible just to chain them like mixins, without explicit subclassing? Something like
class MyModel(Model):
    __metaclass__ = (MyMixin, super(Model).__metaclass__)
... or even better: create a MixIn that will use __metaclass__ from the direct parent of the class that uses it:
class MyModel(Model):
    __metaclass__ = MyMetaMixin,  # Automagically uses `Model.__metaclass__`
The reason: For more flexibility in extending existing apps, I want to create a global mechanism for hooking into the process of Model, Form, ... definitions in Django so it can be changed at runtime.
A common mechanism would be much better than implementing multiple metaclasses with callback mixins.
With your help I finally managed to come up with a solution: the metaclass MetaProxy.
The idea is: create a metaclass that invokes a callback to modify the namespace of the class being created and then, with the help of __new__, mutates into a metaclass of one of the parents.
#!/usr/bin/env python
# -*- coding: utf-8 -*-

# Magical metaclass
class MetaProxy(type):
    """ Decorate the class being created & preserve the __metaclass__ of the parent

        It executes two callbacks: before & after creation of a class,
        which allows you to decorate them.

        Between the two callbacks, it tries to locate any `__metaclass__`
        in the parents (sorted in MRO).
        If found, it mutates to the found base metaclass with the help of
        `__new__`. If not found, it just instantiates the given class.
    """

    @classmethod
    def pre_new(cls, name, bases, attrs):
        """ Decorate a class before creation """
        return (name, bases, attrs)

    @classmethod
    def post_new(cls, newclass):
        """ Decorate a class after creation """
        return newclass

    @classmethod
    def _mrobases(cls, bases):
        """ Expand the tuple of base classes ``bases`` in MRO """
        mrobases = []
        for base in bases:
            if base is not None:  # We don't like `None` :)
                mrobases.extend(base.mro())
        return mrobases

    @classmethod
    def _find_parent_metaclass(cls, mrobases):
        """ Find any __metaclass__ callable in ``mrobases`` """
        for base in mrobases:
            if hasattr(base, '__metaclass__'):
                metacls = base.__metaclass__
                if metacls and not issubclass(metacls, cls):  # don't call self again
                    return metacls
        # Not found: use `type`
        return lambda name, bases, attrs: type.__new__(type, name, bases, attrs)

    def __new__(cls, name, bases, attrs):
        mrobases = cls._mrobases(bases)
        name, bases, attrs = cls.pre_new(name, bases, attrs)  # Decorate, pre-creation
        newclass = cls._find_parent_metaclass(mrobases)(name, bases, attrs)
        return cls.post_new(newclass)  # Decorate, post-creation
# Testing
if __name__ == '__main__':
    # Original classes. We won't touch them
    class ModelMeta(type):
        def __new__(cls, name, bases, attrs):
            attrs['parentmeta'] = name
            return super(ModelMeta, cls).__new__(cls, name, bases, attrs)

    class Model(object):
        # Try to subclass me, but don't forget about `ModelMeta`
        __metaclass__ = ModelMeta

    # Decorator metaclass
    class MyMeta(MetaProxy):
        """ Decorate a class

            Being a subclass of `MetaProxy`,
            it will call the base metaclasses after decorating
        """
        @classmethod
        def pre_new(cls, name, bases, attrs):
            """ Set `washere` to the class name """
            attrs['washere'] = name
            return super(MyMeta, cls).pre_new(name, bases, attrs)

        @classmethod
        def post_new(cls, newclass):
            """ Append '!' to `.washere` """
            newclass.washere += '!'
            return super(MyMeta, cls).post_new(newclass)

    # Here goes the inheritance...
    class MyModel(Model):
        __metaclass__ = MyMeta
        a = 1

    class MyNewModel(MyModel):
        __metaclass__ = MyMeta  # Still have to declare it: __metaclass__ is not inherited
        a = 2

    class MyNewNewModel(MyNewModel):
        # Will use the original ModelMeta
        a = 3

    class A(object):
        __metaclass__ = MyMeta  # No __metaclass__ in parents: just instantiate
        a = 4

    class B(A):
        pass  # MyMeta is not called unless specified explicitly

    # Make sure we did everything right
    assert MyModel.a == 1
    assert MyNewModel.a == 2
    assert MyNewNewModel.a == 3
    assert A.a == 4

    # Make sure the callbacks worked
    assert hasattr(MyModel, 'washere')
    assert hasattr(MyNewModel, 'washere')
    assert hasattr(MyNewNewModel, 'washere')  # inherited
    assert hasattr(A, 'washere')
    assert MyModel.washere == 'MyModel!'
    assert MyNewModel.washere == 'MyNewModel!'
    assert MyNewNewModel.washere == 'MyNewModel!'  # inherited, so unchanged
    assert A.washere == 'A!'
A type can have only one metaclass, because the metaclass simply states what the class statement does; having more than one would make no sense. For the same reason, "chaining" makes no sense: the first metaclass creates the type, so what is the second supposed to do?
You will have to merge the two metaclasses (just like with any other class). But that can be tricky, especially if you don't really know what they do.
class MyModelBase(type):
    def __new__(cls, name, bases, attr):
        attr['MyModelBase'] = 'was here'
        return type.__new__(cls, name, bases, attr)

class MyMixin(type):
    def __new__(cls, name, bases, attr):
        attr['MyMixin'] = 'was here'
        return type.__new__(cls, name, bases, attr)

class ChainedMeta(MyModelBase, MyMixin):
    def __init__(cls, name, bases, attr):
        # call both parents
        MyModelBase.__init__(cls, name, bases, attr)
        MyMixin.__init__(cls, name, bases, attr)

    def __new__(cls, name, bases, attr):
        # so, how is the new type supposed to look?
        # maybe create the first
        t1 = MyModelBase.__new__(cls, name, bases, attr)
        # and pass its data on to the next?
        name = t1.__name__
        bases = tuple(t1.mro())
        attr = t1.__dict__.copy()
        t2 = MyMixin.__new__(cls, name, bases, attr)
        return t2

class Model(object):
    __metaclass__ = MyModelBase

class MyModel(Model):
    __metaclass__ = ChainedMeta

print MyModel.MyModelBase
print MyModel.MyMixin
As you can see, this already involves some guesswork, since you don't really know what the other metaclasses do. If both metaclasses are really simple this might work, but I wouldn't have too much confidence in a solution like this.
Writing a metaclass for metaclasses that merges multiple bases is left as an exercise to the reader ;-P
I don't know of any way to "mix" metaclasses, but you can inherit from and override them just like you would normal classes.
Say I've got a BaseModel:
class BaseModel(object):
    __metaclass__ = Blah
and now you want to inherit this in a new class called MyModel, inserting some additional functionality into the metaclass while otherwise leaving the original functionality intact. To do that, you'd do something like:
class MyModelMetaClass(BaseModel.__metaclass__):
    def __init__(cls, *args, **kwargs):
        do_custom_stuff()
        super(MyModelMetaClass, cls).__init__(*args, **kwargs)
        do_more_custom_stuff()

class MyModel(BaseModel):
    __metaclass__ = MyModelMetaClass
I don't think you can chain them like that, and I don't know how that would work either.
You can, however, make new metaclasses at runtime and use them, though that's a horrid hack. :)
zope.interface does something similar: it has an advisor metaclass that just does some things to the class after construction. If there was a metaclass already, one of the things it will do is set that previous metaclass as the metaclass once it's finished.
(However, avoid doing these kinds of things unless you have to, or think it's fun.)
Adding to the answer by @jochenritzel, the following simplifies the combining step:
def combine_classes(*args):
    name = "".join(a.__name__ for a in args)
    return type(name, args, {})

class ABCSomething(object, metaclass=combine_classes(SomethingMeta, ABCMeta)):
    pass
Here, type(name, bases, dict) works like a dynamic class statement (see docs). Surprisingly, there doesn't seem to be a way to use the dict argument for setting the metaclass in the second step. Otherwise one could simplify the whole process down to a single function call.
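For completeness, here is a runnable sketch of the combination, where SomethingMeta is a stand-in metaclass invented for the example:

```python
from abc import ABCMeta, abstractmethod

def combine_classes(*args):
    # Build one metaclass that inherits from all the given metaclasses
    name = "".join(a.__name__ for a in args)
    return type(name, args, {})

class SomethingMeta(type):
    """Illustrative metaclass: stamps a `tag` attribute onto each class."""
    def __new__(mcs, name, bases, ns):
        ns.setdefault('tag', name.lower())
        return super().__new__(mcs, name, bases, ns)

class ABCSomething(metaclass=combine_classes(SomethingMeta, ABCMeta)):
    @abstractmethod
    def do(self): ...

print(type(ABCSomething).__name__)  # SomethingMetaABCMeta
print(ABCSomething.tag)             # abcsomething
```

Both behaviours survive the merge: SomethingMeta's __new__ runs (setting tag), and ABCMeta still blocks instantiation while `do` is abstract.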