How to recover the MRO of a class given its bases? - python

Suppose we are implementing a metaclass that needs to know the method resolution order before the class is instantiated.
class Meta(type):
    def __new__(cls, name, bases, namespace):
        mro = ...
Is there a built-in way to compute the MRO, that is, a way other than reimplementing the C3 algorithm?

The simplest thing is to just create a temporary class, extract its __mro__, compute your things, and then create the class for real:
class Meta(type):
    def __new__(metacls, name, bases, namespace):
        tmp_cls = super().__new__(metacls, name, bases, namespace)
        mro = tmp_cls.__mro__
        del tmp_cls  # Not actually needed, just to show you are done with it.
        ...
        # do stuff
        ...
        new_class = super().__new__(metacls, name, bases, namespace)
        ...
        return new_class
Supposing that can't be done, due to crazy side effects on the metaclasses of some of the superclasses in the hierarchy, the same idea applies, but with the classes in the bases cloned to "stub" classes first. Possibly, though, reimplementing the C3 algorithm is easier than that, and certainly more efficient: for each class you would otherwise create on the order of N ** 2 stub superclasses, where N is the depth of your class hierarchy (though the stubs can be cached, should you pick this route).
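If you do go the route of reimplementing C3, the merge step itself is short. A minimal sketch (it handles ordinary class hierarchies; the returned list excludes the class being created, matching the stub approach below):

```python
def c3_merge(sequences):
    # Merge per the C3 rules: repeatedly take the first head that does not
    # appear in the tail of any other sequence.
    result = []
    sequences = [list(s) for s in sequences if s]
    while sequences:
        for seq in sequences:
            head = seq[0]
            if not any(head in other[1:] for other in sequences if other is not seq):
                break
        else:
            raise TypeError("inconsistent hierarchy, cannot create a consistent MRO")
        result.append(head)
        sequences = [[c for c in s if c is not head] for s in sequences]
        sequences = [s for s in sequences if s]
    return result

def c3_mro(bases):
    # MRO of a not-yet-created class with the given bases (class itself excluded):
    # merge the linearizations of the bases plus the list of bases.
    return c3_merge([list(b.__mro__) for b in bases] + [list(bases)])
```

For a simple diamond this reproduces what Python itself computes for the real class.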
Anyway, the code for that could be something along these lines:
stub_cache = {object: object}

def get_stub_class(cls):
    # Yields an MRO-equivalent class with no metaclass side effects.
    if cls is object:
        return object
    if cls not in stub_cache:
        stub_bases = tuple(get_stub_class(base) for base in cls.__bases__)
        stub_cache[cls] = type(cls.__name__, stub_bases, {})
    return stub_cache[cls]
def get_future_mro(name, bases):
    stub_bases = tuple(get_stub_class(base) for base in bases)
    stub_cls = type(name, stub_bases, {})
    reversed_cache = {value: key for key, value in stub_cache.items()}
    return [reversed_cache[mro_base] for mro_base in stub_cls.__mro__[1:]]
class Meta(type):
    def __new__(metacls, name, bases, namespace):
        mro = get_future_mro(name, bases)
        print(mro)
        return super().__new__(metacls, name, bases, namespace)
(This thing works for the basic cases I've tried in interactive mode, but there may be complicated edge cases not covered, with multiple metaclasses and such.)

Related

Don't know meaning and reason of 'return type.__call__(cls, *args, **kwds)' in python metaclass example code

I'm trying to understand Python metaclasses. I almost understand what a metaclass does in Python, but I can't understand the meaning of this example code:
class MakeCalc(type):
    def __new__(metacls, name, bases, namespace):
        namespace['desc'] = 'calculation'
        namespace['add'] = lambda self, a, b: a + b
        return type.__new__(metacls, name, bases, namespace)
Why does this metaclass return type.__new__(metacls, name, bases, namespace)? What does it mean?
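In short: type.__new__ is what actually allocates the new class object. The metaclass's __new__ runs first, edits the namespace dict, and then delegates the real construction to type. A quick demonstration (Python 3 metaclass= syntax; the class name Calculator is made up for the example):

```python
class MakeCalc(type):
    def __new__(metacls, name, bases, namespace):
        # Inject attributes into the class namespace before the class exists...
        namespace['desc'] = 'calculation'
        namespace['add'] = lambda self, a, b: a + b
        # ...then let type build the class object from the modified namespace.
        return type.__new__(metacls, name, bases, namespace)

class Calculator(metaclass=MakeCalc):
    pass

c = Calculator()
print(c.desc)       # 'calculation'
print(c.add(2, 3))  # 5
```

Without the return of type.__new__, the class statement would be bound to None and no class would be created at all.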

Python metaclasses: using arguments of class.__init__ to modify class in metaclass

I am currently trying to wrap my head around metaclasses for a particular problem:
- a container class needs to expose some functions from its elements
- the container class cannot be modified afterwards
Metaclasses seem to be a good way to accomplish the above (e.g. __slots__ can be modified in __new__).
Example:
import inspect

class MyMeta(type):
    def __new__(cls, name, bases, attrs):
        ...
        # modify slots ?
        return super().__new__(cls, name, bases, attrs)

    def __call__(cls, *args, **kwargs):
        name = args[0]
        elements = args[1]
        tmp_class = super().__call__(*args, **kwargs)
        ds_methods = [m for m, _ in inspect.getmembers(tmp_class, predicate=inspect.ismethod)]
        e_methods = [m for m, _ in inspect.getmembers(elements[0], predicate=inspect.ismethod) if m not in ds_methods]
        attrs = {m: f for m, f in inspect.getmembers(cls, predicate=inspect.ismethod)}
        # for testing map to print
        new_attr = {m: print for m in e_methods}
        attrs.update(new_attr)
        ...  # call __new__?
        # this does not contain the new methods
        newclass = cls.__new__(cls, cls.__name__, [object], attrs)
class Container(metaclass=MyMeta):
    __slots__ = ['_name', '_elements']

    def __init__(self, name, elements):
        self._name = name
        self._elements = elements
In short: the only way I found to modify slots is in __new__ and the only way to intercept creation arguments is in __call__.
There is probably a much simpler way to accomplish this (the above does not work); I would be thankful for any pointers to help me better understand metaclasses.
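One pattern that does work, sketched here rather than taken from the asker's code: intercept the constructor arguments in the metaclass's __call__, build a one-off subclass with type() so the forwarded methods exist before any instance is created, and keep __slots__ fixed. The forwarding policy (each container method calls the same method on every element and collects the results in a list) is an assumption made for the demonstration:

```python
class MyMeta(type):
    def __call__(cls, name, elements, *args, **kwargs):
        # Names of public callables the elements have but the container lacks.
        elem_type = type(elements[0])
        e_methods = [m for m in dir(elem_type)
                     if not m.startswith('_')
                     and callable(getattr(elem_type, m))
                     and not hasattr(cls, m)]
        # Build forwarding methods in a fresh namespace; empty __slots__ keeps
        # the subclass compatible with the slotted base.
        namespace = {'__slots__': ()}
        for m in e_methods:
            namespace[m] = (lambda meth: lambda self, *a, **kw:
                            [getattr(e, meth)(*a, **kw) for e in self._elements])(m)
        # One-off subclass created *before* instantiation, then instantiated
        # via type.__call__ to avoid re-entering this method.
        subcls = type(cls.__name__, (cls,), namespace)
        return super(MyMeta, subcls).__call__(name, elements)

class Container(metaclass=MyMeta):
    __slots__ = ['_name', '_elements']

    def __init__(self, name, elements):
        self._name = name
        self._elements = elements
```

With an element class exposing, say, a double() method, Container("c", [e1, e2]).double() returns the list of each element's result. Note that every Container(...) call creates a fresh subclass; for many instances you would want to cache it per element type.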

Dynamic Class Variables - locals() vs __metaclass__

To create dynamic class variables you seem to have two options: using locals() or __metaclass__. I am not a fan of assigning to locals(), but it seems to work in a class definition and is more succinct than its __metaclass__ alternative.
Is it reliable to use locals() in this way? What are the advantages/drawbacks for either solution?
meta_attrs = ["alpha", "beta", "gamma"]

class C(object):
    local_context = locals()
    for attr in meta_attrs:
        local_context[attr] = attr.upper()

class D(object):
    class __metaclass__(type):
        def __new__(mcs, name, bases, attrs):
            for attr in meta_attrs:
                attrs[attr] = attr.upper()
            return type.__new__(mcs, name, bases, attrs)

if __name__ == '__main__':
    print C.alpha, C.beta, C.gamma
    print D.alpha, D.beta, D.gamma
As mentioned by kindall, a class decorator is a third option:
meta_attrs = ["alpha", "beta", "gamma"]

def decorate(klass):
    for attr in meta_attrs:
        setattr(klass, attr, attr.upper())
    return klass

@decorate
class C(object):
    pass
Here's a fourth option:
meta_attrs = ["alpha", "beta", "gamma"]

class C(object):
    pass

for attr in meta_attrs:
    setattr(C, attr, attr.upper())
I would go for a fifth option myself: Not having dynamic class attributes. I can't see the use case. If I really needed it, I'd probably use the fourth option, unless I needed it on more than one class, then I would use a decorator.
Metaclasses are often overly complicated, and the locals() hack really is just an ugly trick and should be avoided.
__metaclass__ can be any callable:
class D(object):
    def __metaclass__(name, bases, dct):
        for attr in meta_attrs:
            dct[attr] = attr.upper()
        return type(name, bases, dct)
This is almost as concise as the locals() method, and doesn't rely on CPython implementation details.
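For reference, the same callable-as-metaclass trick works in Python 3, where the callable is passed via the metaclass= keyword instead of a __metaclass__ attribute (the function name make_attrs is made up for the example):

```python
meta_attrs = ["alpha", "beta", "gamma"]

def make_attrs(name, bases, dct):
    # Any callable with the (name, bases, dct) signature can serve as a
    # metaclass; a plain type() call finishes the class creation.
    for attr in meta_attrs:
        dct[attr] = attr.upper()
    return type(name, bases, dct)

class D(metaclass=make_attrs):
    pass

print(D.alpha, D.beta, D.gamma)  # ALPHA BETA GAMMA
```

Since make_attrs returns an ordinary type() result, D is a plain class with no lingering custom metaclass.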

Finding __metaclass__ used before redefinition in Python

I want to redefine a __metaclass__ but I want to fall back to the metaclass which would have been used if I hadn't redefined.
class ComponentMetaClass(type):
    def __new__(cls, name, bases, dct):
        return <insert_prev_here>.__new__(cls, name, bases, dct)

class Component(OtherObjects):
    __metaclass__ = ComponentMetaClass
From what I understand, the __metaclass__ used by default is found by checking for a definition in the class body, then in the bases, and then in the global scope. Normally you would use type in the redefinition, and that is usually the global one; however, my OtherObjects may have redefined __metaclass__. So in using type, I would ignore their definition and it wouldn't run, right?
edit: note that I don't know what OtherObjects are until runtime
As #unutbu puts it: "Within one class hierarchy, metaclasses must be subclasses of each other. That is, the metaclass of Component must be a subclass of the metaclass of OtherObjects."
Which means your problem is a bit more complicated than you first thought: not only do you have to call the proper metaclass from the base classes, but your current metaclass has to properly inherit from them as well.
(hack some code, confront strange behavior, come back 90 min later)
It was tricky indeed - I had to create a class that receives the desired metaclass as a parameter, and whose __call__ method dynamically generates a new metaclass, modifying its bases and adding a __superclass attribute to it.
But this should do what you want and then some - you just have to inherit all your metaclasses from BaseComponableMeta and call the superclasses in the hierarchy through the metaclass's "__superclass" attribute:
from itertools import chain

class Meta1(type):
    def __new__(metacls, name, bases, dct):
        print name
        return type.__new__(metacls, name, bases, dct)

class BaseComponableMeta(type):
    def __new__(metacls, *args, **kw):
        return metacls.__superclass.__new__(metacls, *args, **kw)

class ComponentMeta(object):
    def __init__(self, metaclass):
        self.metaclass = metaclass

    def __call__(self, name, bases, dct):
        # Retrieves the deepest previous metaclass in the object hierarchy
        bases_list = sorted((cls for cls in chain(*(base.mro() for base in bases))),
                            key=lambda s: len(type.mro(s.__class__)))
        previous_metaclass = bases_list[-1].__class__
        # Adds the "__superclass" attribute to the metaclass, so that it can
        # call its bases:
        metaclass_dict = dict(self.metaclass.__dict__).copy()
        new_metaclass_name = self.metaclass.__name__
        metaclass_dict["_%s__superclass" % new_metaclass_name] = previous_metaclass
        # Dynamically generates a new metaclass for this class:
        new_metaclass = type(new_metaclass_name, (previous_metaclass,), metaclass_dict)
        return new_metaclass(name, bases, dct)
# From here on, example usage:
class ComponableMeta(BaseComponableMeta):
    pass

class NewComponableMeta_1(BaseComponableMeta):
    def __new__(metacls, *args):
        print "Overriding the previous metaclass part 1"
        return metacls.__superclass.__new__(metacls, *args)

class NewComponableMeta_2(BaseComponableMeta):
    def __new__(metacls, *args):
        print "Overriding the previous metaclass part 2"
        return metacls.__superclass.__new__(metacls, *args)

class A(object):
    __metaclass__ = Meta1

class B(A):
    __metaclass__ = ComponentMeta(ComponableMeta)

# Trying multiple inheritance, and subclassing the metaclass once:
class C(B, A):
    __metaclass__ = ComponentMeta(NewComponableMeta_1)

# Adding a third metaclass to the chain:
class D(C):
    __metaclass__ = ComponentMeta(NewComponableMeta_2)

# Class with a "do nothing" metaclass, which calls its bases' metaclasses:
class E(D):
    __metaclass__ = ComponentMeta(ComponableMeta)
Within one class hierarchy, metaclasses must be subclasses of each other. That is, the metaclass of Component must be a subclass of the metaclass of OtherObjects.
If you don't name a __metaclass__ for Component then the metaclass of OtherObjects will be used by default.
If ComponentMetaClass and OtherObjectsMeta both inherit (independently) from type:
class OtherObjectsMeta(type): pass
class ComponentMetaClass(type): pass

class OtherObjects(object):
    __metaclass__ = OtherObjectsMeta

class Component(OtherObjects):
    __metaclass__ = ComponentMetaClass
then you get this error:
TypeError: Error when calling the metaclass bases
metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases
but if you make ComponentMetaClass a subclass of OtherObjectsMeta
class ComponentMetaClass(OtherObjectsMeta): pass
then the error goes away.
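The inheritance-by-default behaviour is easy to verify directly (shown here in Python 3 syntax, where metaclass= replaces the __metaclass__ attribute):

```python
class OtherObjectsMeta(type):
    pass

class OtherObjects(metaclass=OtherObjectsMeta):
    pass

class Component(OtherObjects):   # no explicit metaclass given
    pass

# Component picks up the metaclass of its base automatically.
print(type(Component))
```

This is why the subclass-of-the-base's-metaclass rule matters: the interpreter already selects the most derived metaclass among the bases, and an explicit one must be compatible with it.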
Perhaps I misread your question. If you want ComponentMetaClass.__new__ to call OtherObjectsMeta.__new__, then use super:
class OtherObjectsMeta(type):
    def __new__(meta, name, bases, dct):
        print('OtherObjectsMeta')
        return super(OtherObjectsMeta, meta).__new__(meta, name, bases, dct)

class ComponentMetaClass(OtherObjectsMeta):
    def __new__(meta, name, bases, dct):
        print('ComponentMetaClass')
        return super(ComponentMetaClass, meta).__new__(meta, name, bases, dct)
Regarding an alternative to using metaclasses, mentioned in the comments. Use super:
class Base(object):
    def method(self): pass

class Base1(Base):
    def method(self):
        print('Base1')
        super(Base1, self).method()

class Base2(Base):
    def method(self):
        print('Base2')
        super(Base2, self).method()

class Component(Base1, Base2):
    pass

c = Component()
c.method()

Metaclass Mixin or Chaining?

Is it possible to chain metaclasses?
I have class Model which uses __metaclass__=ModelBase to process its namespace dict. I'm going to inherit from it and "bind" another metaclass so it won't shade the original one.
The first approach is to subclass:

class MyModelBase(ModelBase): ...

class MyModel(Model):
    __metaclass__ = MyModelBase  # inherits from `ModelBase`
But is it possible just to chain them like mixins, without explicit subclassing? Something like
class MyModel(Model):
    __metaclass__ = (MyMixin, super(Model).__metaclass__)
... or even better: create a MixIn that will use __metaclass__ from the direct parent of the class that uses it:
class MyModel(Model):
    __metaclass__ = MyMetaMixin,  # Automagically uses `Model.__metaclass__`
The reason: For more flexibility in extending existing apps, I want to create a global mechanism for hooking into the process of Model, Form, ... definitions in Django so it can be changed at runtime.
A common mechanism would be much better than implementing multiple metaclasses with callback mixins.
With your help I finally managed to come up with a solution: metaclass MetaProxy.
The idea is: create a metaclass that invokes a callback to modify the namespace of the class being created, and then, with the help of __new__, mutates into a metaclass of one of the parents.
#!/usr/bin/env python
#-*- coding: utf-8 -*-

# Magical metaclass
class MetaProxy(type):
    """ Decorate the class being created & preserve __metaclass__ of the parent

        It executes two callbacks: before & after creation of a class,
        that allows you to decorate them.

        Between two callbacks, it tries to locate any `__metaclass__`
        in the parents (sorted in MRO).
        If found — with the help of `__new__` method it
        mutates to the found base metaclass.
        If not found — it just instantiates the given class.
    """

    @classmethod
    def pre_new(cls, name, bases, attrs):
        """ Decorate a class before creation """
        return (name, bases, attrs)

    @classmethod
    def post_new(cls, newclass):
        """ Decorate a class after creation """
        return newclass

    @classmethod
    def _mrobases(cls, bases):
        """ Expand tuple of base-classes ``bases`` in MRO """
        mrobases = []
        for base in bases:
            if base is not None:  # We don't like `None` :)
                mrobases.extend(base.mro())
        return mrobases

    @classmethod
    def _find_parent_metaclass(cls, mrobases):
        """ Find any __metaclass__ callable in ``mrobases`` """
        for base in mrobases:
            if hasattr(base, '__metaclass__'):
                metacls = base.__metaclass__
                if metacls and not issubclass(metacls, cls):  # don't call self again
                    return metacls  # (name, bases, attrs)
        # Not found: use `type`
        return lambda name, bases, attrs: type.__new__(type, name, bases, attrs)

    def __new__(cls, name, bases, attrs):
        mrobases = cls._mrobases(bases)
        name, bases, attrs = cls.pre_new(name, bases, attrs)  # Decorate, pre-creation
        newclass = cls._find_parent_metaclass(mrobases)(name, bases, attrs)
        return cls.post_new(newclass)  # Decorate, post-creation
# Testing
if __name__ == '__main__':
    # Original classes. We won't touch them
    class ModelMeta(type):
        def __new__(cls, name, bases, attrs):
            attrs['parentmeta'] = name
            return super(ModelMeta, cls).__new__(cls, name, bases, attrs)

    class Model(object):
        __metaclass__ = ModelMeta
        # Try to subclass me but don't forget about `ModelMeta`

    # Decorator metaclass
    class MyMeta(MetaProxy):
        """ Decorate a class

            Being a subclass of `MetaProxyDecorator`,
            it will call base metaclasses after decorating
        """
        @classmethod
        def pre_new(cls, name, bases, attrs):
            """ Set `washere` to classname """
            attrs['washere'] = name
            return super(MyMeta, cls).pre_new(name, bases, attrs)

        @classmethod
        def post_new(cls, newclass):
            """ Append '!' to `.washere` """
            newclass.washere += '!'
            return super(MyMeta, cls).post_new(newclass)

    # Here goes the inheritance...
    class MyModel(Model):
        __metaclass__ = MyMeta
        a = 1

    class MyNewModel(MyModel):
        __metaclass__ = MyMeta  # Still have to declare it: __metaclass__ does not inherit
        a = 2

    class MyNewNewModel(MyNewModel):
        # Will use the original ModelMeta
        a = 3

    class A(object):
        __metaclass__ = MyMeta  # No __metaclass__ in parents: just instantiate
        a = 4

    class B(A):
        pass  # MyMeta is not called until specified explicitly

    # Make sure we did everything right
    assert MyModel.a == 1
    assert MyNewModel.a == 2
    assert MyNewNewModel.a == 3
    assert A.a == 4

    # Make sure callback() worked
    assert hasattr(MyModel, 'washere')
    assert hasattr(MyNewModel, 'washere')
    assert hasattr(MyNewNewModel, 'washere')  # inherited
    assert hasattr(A, 'washere')
    assert MyModel.washere == 'MyModel!'
    assert MyNewModel.washere == 'MyNewModel!'
    assert MyNewNewModel.washere == 'MyNewModel!'  # inherited, so unchanged
    assert A.washere == 'A!'
A type can have only one metaclass, because a metaclass simply states what the class statement does - having more than one would make no sense. For the same reason "chaining" makes no sense: the first metaclass creates the type, so what is the second supposed to do?
You will have to merge the two metaclasses (just like with any other class). But that can be tricky, especially if you don't really know what they do.
class MyModelBase(type):
    def __new__(cls, name, bases, attr):
        attr['MyModelBase'] = 'was here'
        return type.__new__(cls, name, bases, attr)

class MyMixin(type):
    def __new__(cls, name, bases, attr):
        attr['MyMixin'] = 'was here'
        return type.__new__(cls, name, bases, attr)

class ChainedMeta(MyModelBase, MyMixin):
    def __init__(cls, name, bases, attr):
        # call both parents
        MyModelBase.__init__(cls, name, bases, attr)
        MyMixin.__init__(cls, name, bases, attr)

    def __new__(cls, name, bases, attr):
        # so, how is the new type supposed to look?
        # maybe create the first
        t1 = MyModelBase.__new__(cls, name, bases, attr)
        # and pass its data on to the next?
        name = t1.__name__
        bases = tuple(t1.mro())
        attr = t1.__dict__.copy()
        t2 = MyMixin.__new__(cls, name, bases, attr)
        return t2

class Model(object):
    __metaclass__ = MyModelBase  # inherits from `ModelBase`

class MyModel(Model):
    __metaclass__ = ChainedMeta

print MyModel.MyModelBase
print MyModel.MyMixin
As you can see, this involves some guesswork already, since you don't really know what the other metaclasses do. If both metaclasses are really simple this might work, but I wouldn't have too much confidence in a solution like this.
Writing a metaclass for metaclasses that merges multiple bases is left as an exercise to the reader ;-P
I don't know any way to "mix" metaclasses, but you can inherit and override them just like you would normal classes.
Say I've got a BaseModel:
class BaseModel(object):
    __metaclass__ = Blah
and now you want to inherit this in a new class called MyModel, and you want to insert some additional functionality into the metaclass while otherwise leaving the original functionality intact. To do that, you'd do something like:
class MyModelMetaClass(BaseModel.__metaclass__):
    def __init__(cls, *args, **kwargs):
        do_custom_stuff()
        super(MyModelMetaClass, cls).__init__(*args, **kwargs)
        do_more_custom_stuff()

class MyModel(BaseModel):
    __metaclass__ = MyModelMetaClass
I don't think you can chain them like that, and I don't know how that would work either.
But you can make new metaclasses during runtime and use them. But that's a horrid hack. :)
zope.interface does something similar: it has an advisor metaclass that will just do some things to the class after construction. If there was a metaclass already, one of the things it will do is set that previous metaclass as the metaclass once it's finished.
(However, avoid doing these kinds of things unless you have to, or think it's fun.)
Adding to the answer by @jochenritzel, the following simplifies the combining step:
def combine_classes(*args):
    name = "".join(a.__name__ for a in args)
    return type(name, args, {})

class ABCSomething(object, metaclass=combine_classes(SomethingMeta, ABCMeta)):
    pass
Here, type(name, bases, dict) works like a dynamic class statement (see docs). Surprisingly, there doesn't seem to be a way to use the dict argument for setting the metaclass in the second step. Otherwise one could simplify the whole process down to a single function call.
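A self-contained version of the idea, with a stand-in SomethingMeta (ABCMeta is real, from the abc module; the tag attribute is invented for the demonstration):

```python
from abc import ABCMeta, abstractmethod

def combine_classes(*args):
    # Derive one metaclass from all the given ones, so the result is a
    # (non-strict) subclass of each -- satisfying the metaclass rule.
    name = "".join(a.__name__ for a in args)
    return type(name, args, {})

class SomethingMeta(type):
    def __new__(mcs, name, bases, ns):
        # Inject a default attribute, then continue along the combined MRO
        # (which reaches ABCMeta.__new__ before type.__new__).
        ns.setdefault('tag', 'something')
        return super().__new__(mcs, name, bases, ns)

class ABCSomething(metaclass=combine_classes(SomethingMeta, ABCMeta)):
    @abstractmethod
    def run(self): ...

class Concrete(ABCSomething):
    def run(self):
        return self.tag
```

Both behaviours coexist: ABCSomething stays uninstantiable (ABCMeta's doing), while every subclass gets the tag attribute (SomethingMeta's doing).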
