Inaccessible `__init__` - python

class MetaA(type):
    def __new__(self, clsname, bases, clsdict):
        print('meta __new__ called')
        return super().__new__(self, clsname, bases, clsdict)
    def __call__(self, *args, **kwargs):
        print('meta __call__ called')

class A(metaclass=MetaA):
    def __init__(self, x):
        print('__init__ called')
        self.x = x
# output: meta __new__ called

a = A(['1'])
# output: meta __call__ called

type(a)
# NoneType
It appears that specifying a __call__ in the metaclass blocks the __init__ of the main class.
How do I access the __init__ of the main class?

Your MetaA.__call__ method doesn't return anything, which is the same as returning None. That's why a is None at the end of your code.
The default behavior of type.__call__ (which you're overriding) is to create an instance and return it. If you still want that to happen, you need to call (and return the result of) super().__call__(*args, **kwargs), similar to what you do in MetaA.__new__.
def __call__(self, *args, **kwargs):
    print('meta __call__ called')
    return super().__call__(*args, **kwargs)
I suppose you could instead reimplement the behavior of type.__call__ (e.g. by calling the __new__ and __init__ methods of the class yourself), but unless you want to change some part of the default behavior, calling type.__call__ to do it via super() is easier.
It's unrelated to your issue, but the first argument name you're using in your metaclass methods may lead you astray at some point. The __new__ method is called as if it was a classmethod, with the metaclass as its first argument. The __call__ method on the other hand is a normal method, being called on an instance of the metaclass, which is a class. I'd use meta (or maybe mcls) and cls as the argument names, respectively, to avoid confusion. I reserve self for regular classes, to refer to their own instances.
I guess if you wanted to reimplement type.__call__, you could use self to name the instance after you've created it:
def __call__(cls, *args, **kwargs):
    print('meta __call__ called')
    self = cls.__new__(cls, *args, **kwargs)  # create the instance with __new__
    if isinstance(self, cls):                 # if __new__ didn't do something strange
        self.__init__(*args, **kwargs)        # call __init__ on the instance
    return self                               # then return it to the caller
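For completeness, here is a sketch of that reimplemented __call__ wired into the original example; a is now an actual A instance instead of None (the isinstance guard mirrors what type.__call__ does):

```python
class MetaA(type):
    def __call__(cls, *args, **kwargs):
        print('meta __call__ called')
        self = cls.__new__(cls, *args, **kwargs)  # create the instance
        if isinstance(self, cls):  # skip __init__ if __new__ returned something else
            self.__init__(*args, **kwargs)
        return self

class A(metaclass=MetaA):
    def __init__(self, x):
        self.x = x

a = A(['1'])
print(type(a).__name__)  # A, not NoneType
```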

Related

Decorator class and missing required positional arguments

I'm having problems with a wrapper class, and can't figure out what I'm doing wrong.
How do I go about getting that wrapper working with any class function with the 'self' argument?
This is for Python 3.7.3.
The thing is, I remember the wrapper working before, but it seems something has changed... maybe I'm just doing something wrong now that I wasn't before.
class SomeWrapper:
    def __init__(self, func):
        self.func = func
    def __call__(self, *args, **kwargs):
        # this fails because self is not passed
        # ERROR: __init__() missing 1 required positional argument: 'self'
        func_ret = self.func(*args, **kwargs)
        # this is also wrong, because that's the wrong "self"
        # ERROR: 'SomeWrapper' object has no attribute 'some_func'
        # func_ret = self.func(self, *args, **kwargs)
        return func_ret

class SomeClass:
    SOME_VAL = False
    def __init__(self):
        self.some_func()
        print("Success")
    @SomeWrapper
    def some_func(self):
        self.SOME_VAL = True
    def print_val(self):
        print(self.SOME_VAL)

SomeClass().print_val()
In Python 3, functions defined inside a class body work as methods through the "descriptor protocol".
To put it simply, an ordinary method is just a function until it is retrieved from an instance: since functions have a __get__ method, they are recognized as descriptors, and __get__ is responsible for returning the "bound method", a partial application of the function that inserts the self parameter when called. Because SomeWrapper has no __get__ method, its instance, when retrieved from an instance of SomeClass, has no information about that instance.
In short, if you are to use a class-based decorator for methods, you not only have to write __call__, but also a __get__ method. This should suffice:
from copy import copy

class SomeWrapper:
    def __init__(self, func):
        self.func = func
    def __call__(self, *args, **kwargs):
        func_ret = self.func(self.instance, *args, **kwargs)
        return func_ret
    def __get__(self, instance, owner):
        # "self" here is the instance of SomeWrapper,
        # and "instance" is the instance of the class where
        # the decorated method is.
        if instance is None:
            return self
        bound_callable = copy(self)
        bound_callable.instance = instance
        return bound_callable
Instead of copying the decorator instance, this would also work:
from functools import partial

class SomeWrapper:
    ...
    def __call__(self, instance, *args, **kw):
        func_ret = self.func(instance, *args, **kw)
        return func_ret
    def __get__(self, instance, owner):
        ...
        return partial(self, instance)
Both the partial object and the copy of self are callables that "know" which instance they were "__got__" from.
Simply setting the self.instance attribute on the decorator instance and returning self would also work, but only as long as a single bound method is in use at a time. In programs with some level of parallelism, or even in code that retrieves a method to call it lazily (such as a callback), it would fail in a spectacular and hard-to-debug way, as the method could receive another instance in its "self" parameter.
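Putting it together, a runnable sketch of the copy-based descriptor approach (note that the bound copy, not self, is what __get__ returns):

```python
from copy import copy

class SomeWrapper:
    def __init__(self, func):
        self.func = func
    def __call__(self, *args, **kwargs):
        # self.instance was attached by __get__ below
        return self.func(self.instance, *args, **kwargs)
    def __get__(self, instance, owner):
        if instance is None:
            return self              # accessed on the class, not an instance
        bound_callable = copy(self)  # per-access copy, safe for multiple instances
        bound_callable.instance = instance
        return bound_callable

class SomeClass:
    SOME_VAL = False
    def __init__(self):
        self.some_func()
    @SomeWrapper
    def some_func(self):
        self.SOME_VAL = True

obj = SomeClass()
print(obj.SOME_VAL)  # True: the wrapped method received the right "self"
```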

Decorating class methods by overriding __new__ doesn't work?

I want to decorate all the methods of my class. I have written a sample small decorator for illustration purpose here.
Decorator:
from functools import wraps

def debug(func):
    msg = func.__name__
    @wraps(func)
    def wrapper(*args, **kwargs):
        print(msg)
        return func(*args, **kwargs)
    return wrapper

def debugmethods(cls):
    for key, val in vars(cls).items():
        if callable(val):
            setattr(cls, key, debug(val))
    return cls
Now I want to decorate all the methods of my class. One simple way is to put the @debugmethods decorator on top of my class, but I am trying to understand two other approaches for doing so.
a) Overriding __new__
class Spam:
    def __new__(cls, *args, **kwargs):
        clsobj = super().__new__(cls)
        clsobj = debugmethods(clsobj)
        return clsobj
    def __init__(self):
        pass
    def foo(self):
        pass
    def bar(self):
        pass

spam = Spam()
spam.foo()
b) Writing metaclass
class debugmeta(type):
    def __new__(cls, clsname, bases, clsdict):
        clsobj = super().__new__(cls, clsname, bases, clsdict)
        clsobj = debugmethods(clsobj)
        return clsobj

class Spam(metaclass=debugmeta):
    def foo(self):
        pass
    def bar(self):
        pass

spam = Spam()
spam.foo()
I am not sure:
- Why a) overriding __new__ doesn't work?
- Why the signature of __new__ is different in a metaclass?
Can someone help me understand what I am missing here?
You appear to be confused between __new__ and metaclasses. __new__ is called to create a new object (an instance from a class, a class from a metaclass), it is not a 'class created' hook.
The normal pattern is:
Foo(...) is translated to type(Foo).__call__(Foo, ...), see special method lookups for why that is. The type() of a class is its metaclass.
The standard type.__call__ implementation used when Foo is a custom Python class will call __new__ to create a new instance, then call the __init__ method on that instance if the result is indeed an instance of the Foo class:
def __call__(cls, *args, **kwargs):                   # Foo(...) -> cls=Foo
    instance = cls.__new__(cls, *args, **kwargs)      # so Foo.__new__(Foo, ...)
    if isinstance(instance, cls):
        instance.__init__(*args, **kwargs)            # Foo.__init__(instance, ...)
    return instance
So Foo.__new__ is not called when the Foo class itself is created, only when instances of Foo are created.
You don't usually need to use __new__ in classes, because __init__ suffices to initialise the attributes of instances. But for immutable types, like int or tuple, you can only use __new__ to prepare the new instance state, as you can't alter the attributes of an immutable object once it is created. __new__ is also helpful when you want to change what kinds of instances ClassObj() produces (such as creating singletons or producing specialised subclasses instead).
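For instance, a sketch of the immutable case (Point is a hypothetical name): a tuple subclass can only fix its contents in __new__, because by the time __init__ runs the tuple is already built and frozen:

```python
class Point(tuple):
    def __new__(cls, x, y):
        # state must go through tuple.__new__; __init__ could not mutate it
        return super().__new__(cls, (x, y))
    @property
    def x(self):
        return self[0]
    @property
    def y(self):
        return self[1]

p = Point(2, 3)
print(p.x, p.y)  # 2 3
```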
The same __call__ -> __new__ and maybe __init__ process applies to metaclasses. A class Foo: ... statement is implemented by calling the metaclass to create a class object, passing in 3 arguments: the class name, the class bases, and the class body, as a dictionary usually. With class Spam(metaclass = debugmeta): ..., that means debugmeta('Spam', (), {...}) is called, which means debugmeta.__new__(debugmeta, 'Spam', (), {...}) is called.
Your first attempt (a), setting Spam.__new__, doesn't work because you are not creating a class object there. Instead, super().__new__(cls) creates an empty Spam() instance with no attributes, so vars() returns an empty dictionary and debugmethods() ends up doing nothing.
If you want to hook into class creation, then you want a metaclass.

__call__ from metaclass shadows signature of __init__

In the code below, I would like the hint shown when I type instance_of_A = A( to name the expected argument init_argumentA rather than *meta_args, **meta_kwargs. But unfortunately, the arguments of the __call__ method of the metaclass are shown instead.
class Meta(type):
    def __call__(cls, *meta_args, **meta_kwargs):
        # Something here
        return super().__call__(*meta_args, **meta_kwargs)

class A(metaclass=Meta):
    def __init__(self, init_argumentA):
        # something here
        pass

class B(metaclass=Meta):
    def __init__(self, init_argumentB):
        # something here
        pass
I have searched for a solution and found the questions How to dynamically change signatures of method in subclass? and Signature-changing decorator: properly documenting additional argument. But neither seems to be completely what I want. The first uses inspect to change the number of arguments given to a function, but I can't seem to make it work for my case, and I think there has to be a more obvious solution. The second isn't completely what I want, but something along those lines might be a good alternative.
Edit: I am working in Spyder. I want this because I have thousands of classes of the Meta type, and each class has different arguments, which is impossible to remember; the idea is that the user is reminded when the correct arguments show up.
Using the code you provided, you can change the Meta class
class Meta(type):
    def __call__(cls, *meta_args, **meta_kwargs):
        # Something here
        return super().__call__(*meta_args, **meta_kwargs)

class A(metaclass=Meta):
    def __init__(self, x):
        pass
to
import inspect

class Meta(type):
    def __call__(cls, *meta_args, **meta_kwargs):
        # Something here
        # Restore the signature of __init__
        sig = inspect.signature(cls.__init__)
        parameters = tuple(sig.parameters.values())
        cls.__signature__ = sig.replace(parameters=parameters[1:])
        return super().__call__(*meta_args, **meta_kwargs)
Now IPython or some IDE will show you the correct signature.
I found that the answer of @johnbaltis was 99% there, but not quite what was needed to ensure the signatures were in place.
If we use __init__ rather than __call__ as below, we get the desired behaviour:
import inspect

class Meta(type):
    def __init__(cls, clsname, bases, attrs):
        # Restore the signature
        sig = inspect.signature(cls.__init__)
        parameters = tuple(sig.parameters.values())
        cls.__signature__ = sig.replace(parameters=parameters[1:])
        return super().__init__(clsname, bases, attrs)

    def __call__(cls, *args, **kwargs):
        # remember to return the instance, or calling the class yields None
        instance = super().__call__(*args, **kwargs)
        print(f'Instanciated: {cls.__name__}')
        return instance

class A(metaclass=Meta):
    def __init__(self, x: int, y: str):
        pass
which will correctly give:
In [12]: A?
Init signature: A(x: int, y: str)
Docstring: <no docstring>
Type: Meta
Subclasses:
In [13]: A(0, 'y')
Instanciated: A
Ok - even though the reason you want this seems mistaken, as any "honest" Python inspecting tool should show the __init__ signature, what you ask requires generating a dynamic metaclass for each class, whose __call__ method has the same signature as the class's own __init__ method.
For faking the __init__ signature on __call__ we can simply use functools.wraps. (But you might want to check the answers at
https://stackoverflow.com/a/33112180/108205 )
And for dynamically creating an extra metaclass, that can be done in the metaclass's __new__ itself, with just some care to avoid infinite recursion on the __new__ method - threading.Lock can help with that in a more consistent way than a simple global flag.
from functools import wraps
from threading import Lock

creation_locks = {}

class M(type):
    def __new__(metacls, name, bases, namespace):
        lock = creation_locks.setdefault(name, Lock())
        if lock.locked():
            return super().__new__(metacls, name, bases, namespace)
        with lock:
            def __call__(cls, *args, **kwargs):
                return super().__call__(*args, **kwargs)
            new_metacls = type(metacls.__name__ + "_sigfix", (metacls,), {"__call__": __call__})
            cls = new_metacls(name, bases, namespace)
            wraps(cls.__init__)(__call__)
            del creation_locks[name]
            return cls
I initially thought of using a named parameter to the metaclass's __new__ to control recursion, but it would then be passed on to the created class's __init_subclass__ method (which would result in an error) - hence the Lock.
Not sure if this helps the author but in my case I needed to change inspect.signature(Klass) to inspect.signature(Klass.__init__) to get signature of class __init__ instead of metaclass __call__.
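To see that difference concretely (Klass here is a hypothetical stand-in), inspecting the class itself and its __init__ can give different answers when a metaclass defines __call__:

```python
import inspect

class Meta(type):
    def __call__(cls, *meta_args, **meta_kwargs):
        return super().__call__(*meta_args, **meta_kwargs)

class Klass(metaclass=Meta):
    def __init__(self, x, y=0):
        self.x, self.y = x, y

print(inspect.signature(Klass))           # usually shows the metaclass __call__ signature
print(inspect.signature(Klass.__init__))  # (self, x, y=0)
```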

Relationship of metaclass's "__call__" and instance's "__init__"?

Say I've got a metaclass and a class using it:
class Meta(type):
    def __call__(cls, *args):
        print "Meta: __call__ with", args

class ProductClass(object):
    __metaclass__ = Meta
    def __init__(self, *args):
        print "ProductClass: __init__ with", args

p = ProductClass(1)
Output as follows:
Meta: __call__ with (1,)
Question:
Why isn't ProductClass.__init__ triggered...just because of Meta.__call__?
UPDATE:
Now, I add __new__ for ProductClass:
class ProductClass(object):
    __metaclass__ = Meta
    def __new__(cls, *args):
        print "ProductClass: __new__ with", args
        return super(ProductClass, cls).__new__(cls, *args)
    def __init__(self, *args):
        print "ProductClass: __init__ with", args

p = ProductClass(1)
Is it Meta.__call__'s responsibility to call ProductClass's __new__ and __init__?
There is a difference in OOP between extending a method and overriding it. What you just did in your metaclass Meta is called overriding, because you defined your __call__ method without calling the parent's __call__. To get the behavior you want, you have to extend the __call__ method by calling the parent method:
class Meta(type):
    def __call__(cls, *args):
        print "Meta: __call__ with", args
        return super(Meta, cls).__call__(*args)
Yes - it's up to Meta.__call__ to call ProductClass.__init__ (or not, as the case may be).
To quote the documentation:
for example defining a custom __call__() method in the metaclass
allows custom behavior when the class is called, e.g. not always
creating a new instance.
That page also mentions a scenario where the metaclass's __call__ may return an instance of a different class (i.e. not ProductClass in your example). In this scenario it would clearly be inappropriate to call ProductClass.__init__ automatically.
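To make that concrete (Python 3 syntax; Substitute is a hypothetical stand-in), here is a metaclass whose __call__ returns an instance of an entirely different class, where automatically running ProductClass.__init__ would clearly be wrong:

```python
class Substitute:
    def __init__(self, requested_cls, args):
        self.requested_cls = requested_cls
        self.args = args

class Meta(type):
    def __call__(cls, *args):
        # deliberately return an instance of another class;
        # calling cls.__init__ on it would make no sense
        return Substitute(cls, args)

class ProductClass(metaclass=Meta):
    def __init__(self, *args):
        print("never reached")

p = ProductClass(1)
print(type(p).__name__)  # Substitute
```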

Metaclasses in Python: a couple of questions to clarify

After running into trouble with metaclasses, I delved into the topic of metaprogramming in Python, and I have a couple of questions that are, IMHO, not clearly answered in the available docs.
1. When using both __new__ and __init__ in a metaclass, must their arguments be defined the same?
2. What's the most efficient way to define the class __init__ in a metaclass?
3. Is there any way to refer to the class instance (normally self) in a metaclass?
When using both __new__ and __init__ in a metaclass, their arguments must be defined the same?
I think Alex Martelli explains it most succinctly:
class Name(Base1, Base2):
    __metaclass__ = suitable_metaclass
    <<body>>
means
Name = suitable_metaclass('Name', (Base1, Base2), <<dict-built-by-body>>)
So stop thinking about suitable_metaclass as a metaclass for a moment and just regard it as a class. Whenever you see
suitable_metaclass('Name', (Base1, Base2), <<dict-built-by-body>>)
it tells you that suitable_metaclass's __new__ method must have a signature something like
def __new__(metacls, name, bases, dct)
and an __init__ method like
def __init__(cls, name, bases, dct)
So the signatures are not exactly the same, but they differ only in the first argument.
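In Python 3 syntax, a minimal sketch of the two signatures side by side, identical except for the first argument:

```python
class Meta(type):
    def __new__(metacls, name, bases, dct):
        # first argument: the metaclass itself
        return super().__new__(metacls, name, bases, dct)
    def __init__(cls, name, bases, dct):
        # first argument: the freshly created class
        super().__init__(name, bases, dct)

class Name(metaclass=Meta):
    pass

print(type(Name).__name__)  # Meta
```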
What's the most efficient way to define class __init__ in a metaclass?
What do you mean by efficient? It is not necessary to define the __init__ unless you want to.
Is there any way to refer to class instance (normally self) in a metaclass?
No, and you should not need to. Anything that depends on the class instance should be dealt with in the class definition, rather than in the metaclass.
For 1: The __init__ and __new__ of any class have to accept the same arguments, because they are called with the same arguments. It's common for __new__ to take extra arguments that it ignores (e.g. object.__new__ takes any arguments and ignores them) so that __new__ doesn't have to be overridden during inheritance, but you usually only do that when you have no custom __new__ at all.
This isn't a problem here, because, as stated, metaclasses are always called with the same set of arguments, so you can't run into trouble. With the arguments, at least. But if you're modifying the arguments that are passed to the parent class, you need to modify them in both.
For 2: You usually don't define the class __init__ in a metaclass. You can write a wrapper and replace the __init__ of the class in either __new__ or __init__ of the metaclass, or you can redefine __call__ on the metaclass. The former would act weirdly if you use inheritance.
import functools

class A(type):
    def __call__(cls, *args, **kwargs):
        r = super(A, cls).__call__(*args, **kwargs)
        print "%s was instantiated" % (cls.__name__, )
        print "the new instance is %r" % (r, )
        return r

class B(type):
    def __init__(cls, name, bases, dct):
        super(B, cls).__init__(name, bases, dct)
        if '__init__' not in dct:
            return
        old_init = dct['__init__']
        @functools.wraps(old_init)
        def __init__(self, *args, **kwargs):
            old_init(self, *args, **kwargs)
            print "%s (%s) was instantiated" % (type(self).__name__, cls.__name__)
            print "the new instance is %r" % (self, )
        cls.__init__ = __init__

class T1:
    __metaclass__ = A

class T2:
    __metaclass__ = B
    def __init__(self):
        pass

class T3(T2):
    def __init__(self):
        super(T3, self).__init__()
And the result from calling it:
>>> T1()
T1 was instantiated
the new instance is <__main__.T1 object at 0x7f502c104290>
<__main__.T1 object at 0x7f502c104290>
>>> T2()
T2 (T2) was instantiated
the new instance is <__main__.T2 object at 0x7f502c0f7ed0>
<__main__.T2 object at 0x7f502c0f7ed0>
>>> T3()
T3 (T2) was instantiated
the new instance is <__main__.T3 object at 0x7f502c104290>
T3 (T3) was instantiated
the new instance is <__main__.T3 object at 0x7f502c104290>
<__main__.T3 object at 0x7f502c104290>
For 3: Yes, from __call__ as shown above.
