This question already has answers here:
What are metaclasses in Python?
(25 answers)
Closed 9 years ago.
Is there a way to trigger code when my class is subclassed?
class SuperClass:
    def triggered_routine(subclass):
        print("was subclassed by " + subclass.__name__)

    magically_register_triggered_routine()

print("foo")

class SubClass0(SuperClass):
    pass

print("bar")

class SubClass1(SuperClass):
    print("test")
Should output
foo
was subclassed by SubClass0
bar
test
was subclassed by SubClass1
Classes (by default) are instances of type.
Just as an instance of a class Foo is created by foo = Foo(...),
an instance of type (i.e. a class) is created by myclass = type(name, bases, clsdict).
If you want something special to happen at the moment of class-creation, then you have to modify the thing creating the class -- i.e. type. The way to do that is to define a subclass of type -- i.e. a metaclass.
A metaclass is to its class as a class is to its instance.
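For instance, the following two ways of creating a class are equivalent, which makes the analogy concrete:

```python
# A class created with the class statement...
class Foo:
    x = 1

# ...and the same class created by calling type directly.
Bar = type('Bar', (object,), {'x': 1})

print(type(Foo), type(Bar))  # both are <class 'type'>
print(Bar().x)               # 1
```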
In Python2 you would define the metaclass of a class with
class SuperClass:
    __metaclass__ = Watcher
where Watcher is a subclass of type.
In Python3 the syntax has been changed to
class SuperClass(metaclass=Watcher):
Both are equivalent to
Superclass = Watcher(name, bases, clsdict)
where in this case, name equals the string 'Superclass', and bases is the tuple (object, ). The clsdict is a dictionary of the class attributes defined in the body of the class definition.
Note the similarity to myclass = type(name, bases, clsdict).
So, just as you would use a class's __init__ to control events at the moment of an instance's creation, you can control events at the moment of a class's creation with a metaclass's __init__:
class Watcher(type):
    def __init__(cls, name, bases, clsdict):
        if len(cls.mro()) > 2:
            print("was subclassed by " + name)
        super(Watcher, cls).__init__(name, bases, clsdict)

class SuperClass:
    __metaclass__ = Watcher  # Python 2 syntax; in Python 3: class SuperClass(metaclass=Watcher)

print("foo")

class SubClass0(SuperClass):
    pass

print("bar")

class SubClass1(SuperClass):
    print("test")
prints
foo
was subclassed by SubClass0
bar
test
was subclassed by SubClass1
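For completeness (my addition, not part of the answer above): since Python 3.6 the same effect can be had without a metaclass via the __init_subclass__ hook, which later answers on this page also use. A sketch:

```python
class SuperClass:
    def __init_subclass__(cls, **kwargs):
        # Called automatically whenever SuperClass is subclassed.
        super().__init_subclass__(**kwargs)
        print("was subclassed by " + cls.__name__)

print("foo")

class SubClass0(SuperClass):
    pass

print("bar")

class SubClass1(SuperClass):
    print("test")
```

This produces the exact output the question asks for, with no metaclass machinery at all.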
Edit: My old post actually didn't work. Subclassing from classmethod doesn't work as expected.
First, we need some way to tell the metaclass that a particular method is supposed to get the special called-on-subclass behavior; we'll just set an attribute on the function we'd like called. As a convenience, we'll also turn the function into a classmethod, so that the real base class it was found in can be discovered too. We return the classmethod so that the whole thing can be used as a decorator, which is most convenient.
import types
import inspect

def subclass_hook(func):
    func.is_subclass_hook = True
    return classmethod(func)
We're also going to want a convenient way to see that the subclass_hook decorator was used. We know that classmethod has been used, so we'll check for that, and only then look for the is_subclass_hook attribute.
def test_subclass_hook(thing):
    x = (isinstance(thing, types.MethodType) and
         getattr(thing.im_func, 'is_subclass_hook', False))  # Python 3: thing.__func__
    return x
Finally, we need a metaclass that acts on the information: For most cases, the most interesting thing to do here is just check each of the supplied bases for hooks. In that way, super works in the least surprising way.
class MyMetaclass(type):
    def __init__(cls, name, bases, attrs):
        super(MyMetaclass, cls).__init__(name, bases, attrs)
        for base in bases:
            if base is object:
                continue
            for hook_name, hook in inspect.getmembers(base, test_subclass_hook):
                hook(cls)
and that should do it.
>>> class SuperClass:
...     __metaclass__ = MyMetaclass
...     @subclass_hook
...     def triggered_routine(cls, subclass):
...         print(cls.__name__ + " was subclassed by " + subclass.__name__)

>>> class SubClass0(SuperClass):
...     pass
SuperClass was subclassed by SubClass0

>>> class SubClass1(SuperClass):
...     print("test")
test
SuperClass was subclassed by SubClass1
Related
I know that inside a metaclass, I can do:
class MyMetaClass(type):
    def __new__(cls, name, bases, attrs):
        return type.__new__(cls, name, bases, attrs)
If I want to replace bases with my own base class MyClass[Generic[TypeVar("T")]] (i.e. MyClass is a generic class), and I just do:

my_class = MyClass[Generic[TypeVar("T")]]
return type.__new__(cls, name, (my_class,), attrs)
It gives me a
type() doesn't support MRO entry resolution; use types.new_class()
What does it mean? How do I specify my base class that inherits from Generic?
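A sketch of what the error means: a parameterized generic such as MyClass[int] is not itself a class, so plain type() refuses it as a base, while types.new_class() resolves it through the __mro_entries__ protocol (MyClass below is a stand-in for the asker's class):

```python
import types
from typing import Generic, TypeVar

T = TypeVar("T")

class MyClass(Generic[T]):
    pass

# type() rejects a parameterized generic as a base:
try:
    type("Sub", (MyClass[int],), {})
except TypeError as e:
    print(e)  # type() doesn't support MRO entry resolution; use types.new_class()

# types.new_class() resolves the base via __mro_entries__:
Sub = types.new_class("Sub", (MyClass[int],), {})
print(Sub.__mro__[1] is MyClass)  # True
```

So inside a metaclass, build the class with types.new_class() (or call types.resolve_bases() on the bases tuple first) when a base may be a parameterized generic.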
I defined a metaclass which adds a method named "test" to the classes it creates:
class FooMeta(type):
    def __new__(mcls, name, bases, attrs):
        def test(self):
            return super().test()
        attrs["test"] = test
        cls = type.__new__(mcls, name, bases, attrs)
        return cls
Then I create two classes using this Metaclass
class A(metaclass=FooMeta):
    pass

class B(A):
    pass
When I run
a = A()
a.test()
a TypeError is raised at super().test():
super(type, obj): obj must be an instance or subtype of type
Which means super() cannot infer the parent class correctly. If I change the super call into
def __new__(mcls, name, bases, attrs):
    def test(self):
        return super(cls, self).test()
    attrs["test"] = test
    cls = type.__new__(mcls, name, bases, attrs)
    return cls
then the raised error becomes:
AttributeError: 'super' object has no attribute 'test'
which is expected as the parent of A does not implement test method.
So my question is: what is the correct way to call super() in a dynamically added method? Must I always write super(cls, self) in this case? If so, it's rather ugly (for Python 3)!
Parameterless super() is very special in Python because it triggers some behavior at compilation time itself: Python creates an invisible __class__ variable, which is a reference to the "physical" class statement body where the super() call is embedded (the same happens if one makes direct use of the __class__ variable inside a class method).
In this case, the "physical" class where super() is called is the metaclass FooMeta itself, not the class it is creating.
The workaround for that is to use the version of super which takes 2 positional arguments: the class in which it will search the immediate superclass, and the instance itself.
In Python 2, and on other occasions where one prefers the parameterized form of super, it is normal to use the class name itself as the first argument: at runtime, that name is available as a global variable in the current module. That is, if class A were statically coded in the source file with a def test(...): method, you would use super(A, self).test(...) inside its body.
Here, however, the class name won't be available as a variable in the module defining the metaclass, but you still need to pass a reference to the class as the first argument to super. Since the test method receives self as a reference to the instance, its class is given by either self.__class__ or type(self).
TL;DR: just change the super call in your dynamic method to read:
class FooMeta(type):
    def __new__(mcls, name, bases, attrs):
        def test(self):
            return super(type(self), self).test()
        attrs["test"] = test
        cls = type.__new__(mcls, name, bases, attrs)
        return cls
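One caveat worth flagging (my note, not part of the answer above): super(type(self), self) restarts the lookup at type(self) on every call, so when the method is reached through an inherited class the same method keeps being found again and the call can recurse forever. A variant that closes over the freshly created class avoids this; Base and its test method below are made up for the demonstration:

```python
class FooMeta(type):
    def __new__(mcls, name, bases, attrs):
        # Create the class first, then attach the method so it can
        # close over cls, the exact class it was added to.
        cls = super().__new__(mcls, name, bases, attrs)

        def test(self):
            # Start the MRO search just above cls, not type(self),
            # so inherited calls don't restart at the same class.
            return super(cls, self).test()

        cls.test = test
        return cls

class Base:
    def test(self):
        return "Base.test"

class A(Base, metaclass=FooMeta):
    pass

class B(A):  # B gets its own test, closing over B
    pass

print(B().test())  # Base.test
```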
I read that it is considered bad practice to create a variable in the class namespace and then change its value in the class constructor.
(One of my sources: SoftwareEngineering SE: Is it a good practice to declare instance variables as None in a class in Python.)
Consider the following code:
# lib.py
class mixin:
    def __init_subclass__(cls, **kwargs):
        cls.check_mixin_subclass_validity(cls)
        super().__init_subclass__(**kwargs)

    def check_mixin_subclass_validity(subclass):
        assert hasattr(subclass, 'necessary_var'), \
            'Missing necessary_var'

    def method_used_by_subclass(self):
        return self.necessary_var * 3.14

# app.py
class my_subclass(mixin):
    necessary_var = None

    def __init__(self, some_value):
        self.necessary_var = some_value

    def run(self):
        # DO SOME STUFF
        self.necessary_var = self.method_used_by_subclass()
        # DO OTHER STUFF
To force its subclasses to declare the variable necessary_var, the class mixin uses the __init_subclass__ hook.
And the only way I know to make it work on the app.py side is to initialize necessary_var as a class variable.
Am I missing something, or is this the only way to do so?
Short answer
You should check that attributes and methods exist at instantiation of a class, not before. This is what the abc module does and it has good reasons to work like this.
Long answer
First, I would like to point out that it seems what you want to check is that an instance attribute exists.
Due to Python's dynamic nature, it is not possible to do so before an instance is created, that is, after the call to __init__. We could define Mixin.__init__, but we would then have to rely on the users of your API to have perfect hygiene and always call super().__init__.
One option is thus to create a metaclass and add a check in its __call__ method.
class MetaMixin(type):
    def __call__(self, *args, **kwargs):
        instance = super().__call__(*args, **kwargs)
        assert hasattr(instance, 'necessary_var')
        return instance

class Mixin(metaclass=MetaMixin):
    pass

class Foo(Mixin):
    def __init__(self):
        self.necessary_var = ...

Foo()  # Works fine

class Bar(Mixin):
    pass

Bar()  # AssertionError
To convince yourself that it is good practice to do this at instantiation, we can look toward the abc module which uses this behaviour.
from abc import abstractmethod, ABC

class AbstractMixin(ABC):
    @abstractmethod
    def foo(self):
        ...

class Foo(AbstractMixin):
    pass
# Right now, everything is still all good

Foo()  # TypeError: Can't instantiate abstract class Foo with abstract methods foo
As you can see, the TypeError was raised at instantiation of Foo(), not at class creation.
But why does it behave like this?
The reason is that not every class will be instantiated. Consider this example, where we inherit from Mixin to create a new mixin which checks for some more attributes.
class Mixin:
    def __init_subclass__(cls, **kwargs):
        assert hasattr(cls, 'necessary_var')
        super().__init_subclass__(**kwargs)

class MoreMixin(Mixin):
    def __init_subclass__(cls, **kwargs):
        assert hasattr(cls, 'other_necessary_var')
        super().__init_subclass__(**kwargs)

# AssertionError was raised at that point

class Foo(MoreMixin):
    necessary_var = ...
    other_necessary_var = ...
As you see, the AssertionError was raised at the creation of the MoreMixin class. This is clearly not the desired behaviour, since the Foo class is actually correctly built, and that is what our mixin was supposed to check.
In conclusion, checking that some attribute or method exists should be done at instantiation. Otherwise, you are preventing a whole lot of helpful inheritance techniques. This is why the abc module does it that way, and why we should too.
Here is a very simple Base class, which contains a static method and a class method:
class Base:
    @staticmethod
    def f():
        print("Base.f")

    @classmethod
    def g(cls):
        print("Base.g")

    def h(self):
        print("Base.h")
If a class derived from Base overrides either f or g, then the staticmethod and classmethod decorators need to be applied again to the overriding methods.
class A(Base):
    @staticmethod
    def f():
        print("A.f")

class B(Base):
    @classmethod
    def g(cls):
        print("B.g")
So, at first I thought I would create a metaclass that automatically makes f a staticmethod and g a classmethod.
class BaseMeta(type):
    def __init__(cls, name, bases, namespace):
        super().__init__(name, bases, namespace)
        if 'f' in namespace: cls.f = staticmethod(cls.f)
        if 'g' in namespace: cls.g = classmethod(cls.g)
Now the rest of the classes don't need to use staticmethod and classmethod explicitly.
class Base(metaclass=BaseMeta):
    def f():
        print("Base.f")

    def g(cls):
        print("Base.g")

    def h(self):
        print("Base.h")

class A(Base):
    def f():
        print("A.f")

class B(Base):
    def g(cls):
        print("B.g")
This works, but I don't like how it looks. Now, I realize that the staticmethod and classmethod decorators should be explicitly used (after all, explicit is better than implicit, isn't it?)
So I thought I could keep the metaclass, but this time instead of enforcing the decorators, I should just check whether or not they have been used and throw an exception if they haven't.
import inspect

class BaseMeta(type):
    def __init__(cls, name, bases, namespace):
        super().__init__(name, bases, namespace)
        # check if cls.f is a static method
        if not inspect.isfunction(cls.f):
            raise Exception("f should be a static method")
        # check if cls.g is a class method
        if not (inspect.ismethod(cls.g) and cls.g.__self__ == cls):
            raise Exception("g should be a class method")
Unfortunately, this doesn't work. It seems that in the metaclass's __init__, everything is considered to be just a function (just printing cls.f and cls.g makes this apparent).
Is there something I'm missing here?
This:
if not (inspect.ismethod(cls.g) and cls.g.__self__ == cls):
    raise Exception("g should be a class method")
works fine, but this:
if not inspect.isfunction(cls.f):
    raise Exception("f should be a static method")
doesn't, because on Python 3, cls.f will be the f function whether or not the staticmethod decorator was applied. (On Python 2, it would have been an unbound method object without the decorator.)
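A quick way to see the difference on Python 3: the descriptor protocol unwraps the staticmethod on attribute access, but the raw class namespace still holds the staticmethod object.

```python
class C:
    @staticmethod
    def f():
        pass

print(type(C.f).__name__)              # function (descriptor already unwrapped)
print(type(C.__dict__['f']).__name__)  # staticmethod (the raw namespace entry)
```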
Instead of accessing cls.f or cls.g and trying to work out what kind of descriptor you went through based on the results of the descriptor protocol, bypass the descriptor protocol and access the raw contents of the class definition's namespace:
if 'f' in namespace and not isinstance(namespace['f'], staticmethod):
    whatever()

if 'g' in namespace and not isinstance(namespace['g'], classmethod):
    whatever()
OK, so it seems that trying to check whether cls.f or cls.g are static or class methods in the metaclass is pointless, they don't seem to be bound yet.
However, having used the staticmethod or classmethod decorator on a method must surely have left its mark on it. I played around, and eventually found that what I originally wanted to do can be implemented as follows:
class BaseMeta(type):
    def __init__(cls, name, bases, namespace):
        super().__init__(name, bases, namespace)
        # check if cls.f is a static method
        if 'f' in namespace and not isinstance(namespace['f'], staticmethod):
            raise Exception(cls.__name__ + ".f should be a static method")
        # check if cls.g is a class method
        if 'g' in namespace and not isinstance(namespace['g'], classmethod):
            raise Exception(cls.__name__ + ".g should be a class method")
So, the answer to the original question is:
Checking whether a method has been decorated with staticmethod or classmethod is possible in a metaclass, by retrieving the method from the namespace and checking whether it is an instance of 'staticmethod' or 'classmethod'.
For Python 3.2+, you can use inspect.getattr_static:
Retrieve attributes without triggering dynamic lookup via the descriptor protocol, getattr() or getattribute().
Example:
import inspect

class A:
    @staticmethod
    def f():
        pass

    @classmethod
    def g(cls):
        pass

    def r(self):
        pass

a = A()

print(isinstance(inspect.getattr_static(a, "f"), staticmethod))
print(isinstance(inspect.getattr_static(A, "f"), staticmethod))
print(isinstance(inspect.getattr_static(a, "g"), classmethod))
print(isinstance(inspect.getattr_static(A, "g"), classmethod))
print(isinstance(inspect.getattr_static(a, "r"), classmethod))
print(isinstance(inspect.getattr_static(A, "r"), staticmethod))
Will output:
True
True
True
True
False
False
I want to register classes to a manager after each class is loaded, like HTTP handlers registering with a handler manager.
It can be done in other ways, such as defining the relationship in a map or calling a register function after the class definition.
But is there a better way to do this automatically?
update:
While Dunes's answer filled my need, I'm trying to improve the question and make it more useful for others who meet the same problem.
Here are the examples.
handler/__init__.py
handler/basehandler.py - classBase, HandlerManager
handler/handlerA.py - classA(classBase)
handler/handlerB.py - classB(classBase)

handlerA and handlerB contain classA and classB, which are subclasses of classBase.
classA handles requests from /a/, classB handles /b/.
I need to register them to HandlerManager automatically, the first time the handler module is imported.
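For the layout described above, a Python 3.6+ sketch using __init_subclass__ (the route keyword argument and the class names are made up for illustration):

```python
class HandlerManager:
    routes = {}

    @classmethod
    def register(cls, path, handler):
        cls.routes[path] = handler

class BaseHandler:
    def __init_subclass__(cls, route=None, **kwargs):
        # Runs automatically when a subclass is defined, i.e. at import time.
        super().__init_subclass__(**kwargs)
        if route is not None:
            HandlerManager.register(route, cls)

class ClassA(BaseHandler, route="/a/"):
    pass

class ClassB(BaseHandler, route="/b/"):
    pass

print(HandlerManager.routes)  # maps "/a/" to ClassA and "/b/" to ClassB
```

Simply importing the module that defines ClassA and ClassB fills the routing table; no decorator or explicit register call is needed.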
If "being loaded" here means "being imported" (for the first time), then a class decorator is a solution. The sample code below is copied from this page.
registry = {}

def register(cls):
    registry[cls.__clsid__] = cls
    return cls

@register
class Foo(object):
    __clsid__ = "123-456"

    def bar(self):
        pass
Seems like a possible use for metaclasses. It's rare to need a metaclass for anything -- they're overkill for pretty much everything, and most things achievable with a metaclass can be done more easily with decorators. However, this way you can ensure that any subclass of your base handler is automatically registered too (unless it asks not to be registered).
class HandlerManager:
    handlers = []

    @classmethod
    def register(cls, handler):
        print("registering", handler)
        cls.handlers.append(handler)

class HandlerRegisterer(type):
    def __init__(self, name, bases, attrs, register=True):
        super().__init__(name, bases, attrs)
        if register:
            HandlerManager.register(self)

    def __new__(metaclass, name, bases, attrs, register=True):
        return super().__new__(metaclass, name, bases, attrs)

class BaseHandler(metaclass=HandlerRegisterer, register=False):
    # not actually a real handler, so don't register this class
    pass

class MyHandler(BaseHandler):
    # only have to inherit from another handler to make sure this class
    # gets registered.
    pass

print(HandlerManager.handlers)
assert BaseHandler not in HandlerManager.handlers
assert MyHandler in HandlerManager.handlers
If you need to use abstract classes, then you will need to make your metaclass subclass ABCMeta. This is because abstract classes are implemented using a metaclass, and Python only allows a class to have one metaclass. By subclassing ABCMeta you make the two metaclasses compatible (there's no code in either one that conflicts with the other).
from abc import ABC, ABCMeta, abstractmethod

class HandlerRegisterer(ABCMeta):
    # subclass ABCMeta rather than type
    def __init__(self, name, bases, attrs, register=True):
        super().__init__(name, bases, attrs)
        if register:
            HandlerManager.register(self)

    def __new__(metaclass, name, bases, attrs, register=True):
        return super().__new__(metaclass, name, bases, attrs)

class AbstractSubHandler(MyHandler, ABC, register=False):
    # not strictly necessary to subclass ABC, but nice to know it won't
    # screw things up
    @abstractmethod
    def some_method(self):
        pass

try:
    AbstractSubHandler()
except TypeError:
    print("Can not instantiate abstract class")

print(HandlerManager.handlers)
assert AbstractSubHandler not in HandlerManager.handlers
I am not sure what you mean by "loaded"; classes are usually initialized or called.
In either case, the __init__ method or the __call__ method is used.
Both can be defined, and both can include calls that register with a manager.
Classes, and specifically the __init__ method, are described better here.
Small example:
class test:
    def __init__(self):
        print('I have started!')

>>> x = test()
I have started!
(Just replace the print with your registration code.)
Yes, there is a way to do this automatically.
As Inbar suggests, the __init__ method is the place to register an object creation.
Here is an example that you can use to effectively wrap existing classes rather than overwriting __init__. In this case I have made a wrapper for the list class. By calling super you can reuse the initialising code from the original class.
class nlist(list):
    """A list that announces its creation."""

    def __init__(self, *args, **kwargs):
        print('making a new list!')  # overwrite with a call to a manager
        super().__init__(*args)
How this looks:
>>> list('hello')
['h', 'e', 'l', 'l', 'o']
>>> nlist('hello')
making a new list!
['h', 'e', 'l', 'l', 'o']
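Wiring this idea into an actual manager might look like the following sketch (Manager, Tracked and Widget are hypothetical names, not from the answers above):

```python
class Manager:
    # class-level list shared by all tracked objects
    instances = []

class Tracked:
    def __init__(self):
        # register each new object with the manager
        Manager.instances.append(self)

class Widget(Tracked):
    pass

w = Widget()
print(w in Manager.instances)  # True
```

Note that, unlike the metaclass and __init_subclass__ approaches, this registers each *instance* at creation time rather than each *class* at definition time, so it only fires when objects are actually constructed.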