__new__ is not getting called in object creation - python

case 1:

    class Person:
        def __new__(cls, *args, **kwargs):
            print "called"
            return super(Person, cls).__new__(cls, *args, **kwargs)

    p = Person()

case 2:

    class Person(object):
        def __new__(cls, *args, **kwargs):
            print "called"
            return super(Person, cls).__new__(cls, *args, **kwargs)

    p = Person()
In the first case, the __new__() method is not called, but in the second case it is.
If __new__() doesn't get called, then how is the Person object being created?

I guess it is something related to new- and old-style classes in Python 2:
Old-style classes don't actually have a __new__ method, because for them __init__ is the constructor. So if we have:

    class A:
        def __new__(cls):
            print "A.__new__ is called"  # -> this is never called

    A()

the body of __new__ will never be executed, because __new__ plays no part in old-style class creation.
In Python 3 the behavior is the same whether or not you explicitly inherit from object:

    class Person1:
        def __new__(cls, *args, **kwargs):
            print("called")
            return super(Person1, cls).__new__(cls, *args, **kwargs)

    class Person2(object):
        def __new__(cls, *args, **kwargs):
            print("called")
            return super(Person2, cls).__new__(cls, *args, **kwargs)

    p1 = Person1()
    p2 = Person2()

Run under 3.x, this prints "called" twice, once for each class.

I was looking for the documentation, and finally found it here:
https://www.python.org/dev/peps/pep-0253/
The type object has a new slot, tp_new, which can act as a factory for
instances of the type. Types are now callable, because the tp_call
slot is set in PyType_Type (the metatype); the function looks for the
tp_new slot of the type that is being called.
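The quoted tp_new machinery can be observed from Python 3 directly: calling a class goes through the metatype's call slot (type.__call__), which invokes the class's __new__ and then __init__. A minimal sketch (class and messages are illustrative, not from the question):

```python
class Person:
    def __new__(cls, *args, **kwargs):
        # tp_new: allocate and return the new instance
        print("__new__ called")
        return super().__new__(cls)

    def __init__(self):
        # tp_init: initialize the instance that __new__ returned
        print("__init__ called")

# Person() goes through type.__call__, which runs __new__ then __init__.
p = Person()
```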
To add onto @devforfu's answer: in the old days, __new__ didn't exist; it was added together with new-style classes.

Related

Python 3: impossible to pass @property as decorator argument

I've implemented a decorator that can receive extra arguments, and I want to use it with class methods. I want to pass a @property as the decorator argument, but instead of the @property's result I get this:

    <property object at 0x7f50f5195230>
This is my decorator:

    class Decorator(object):
        def __init__(self, some_arg):
            self.func = None
            self.some_arg = some_arg

        def __get__(self, instance, owner):
            import functools
            return functools.partial(self.__call__, instance)

        def __call__(self, func):
            self.func = func
            def wrapper(*args, **kwargs):
                return self._process_sync(*args, **kwargs)
            return wrapper

        def _process_sync(self, *args, **kwargs):
            try:
                print(self.some_arg)
                return self.func(*args, **kwargs)
            except Exception as e:
                print(e)
                return None
My test class:

    class Test(object):
        @property
        def some_data(self):
            return {'key': 'value'}

        @Decorator(some_data)
        def some_method(self):
            print('method output')
            return None

Usage:

    test = Test()
    test.some_method()
Two questions:

1. How do I correctly pass the property so that I receive the @property's result instead of <property object at 0x7f50f5195230>?
2. Is it possible to pass class properties/methods to the decorator if they are defined further down in the class body?
A property object is a descriptor. To get a value out of it, you need to call its __get__ method with an appropriate instance. Figuring out when to do that in your current code is not easy, since your Decorator object has several different roles: it is both a decorator factory (getting initialized with an argument in the @Decorator(x) line) and the decorator itself (getting called with the function to be decorated). You've given it a __get__ method, but I don't expect that ever to be used, since an instance of Decorator never gets assigned to a class variable (only the wrapper function returned from __call__ does).
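To see the descriptor protocol in isolation, here is a minimal sketch (independent of the decorator) that pulls the raw property object out of the class dict and binds it by hand:

```python
class Test:
    @property
    def some_data(self):
        return {'key': 'value'}

t = Test()
prop = Test.__dict__['some_data']  # the raw property object, no binding
value = prop.__get__(t, Test)      # bind it to an instance to get the value
print(prop)    # something like <property object at 0x...>
print(value)   # {'key': 'value'}
```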
Anyway, here's a modified version where the Decorator handles almost all parts of the descriptor protocol itself:
    class Decorator:
        def __init__(self, arg):
            self.arg = arg  # this might be a descriptor, like a property or unbound method

        def __call__(self, func):
            self.func = func
            return self  # we still want to be the descriptor in the class

        def __get__(self, instance, owner):
            try:
                arg = self.arg.__get__(instance, owner)  # try to bind the arg to the instance
            except AttributeError:  # if that fails, self.arg is not a descriptor; that's OK
                arg = self.arg
            def wrapper(*args, **kwargs):  # this is our version of a bound method object
                print(arg)  # do something with the bound arg here
                return self.func.__get__(instance, owner)(*args, **kwargs)
            return wrapper
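A quick check of the modified version against the question's Test class (the Decorator definition from above is repeated so the snippet runs standalone):

```python
class Decorator:
    def __init__(self, arg):
        self.arg = arg
    def __call__(self, func):
        self.func = func
        return self
    def __get__(self, instance, owner):
        try:
            arg = self.arg.__get__(instance, owner)  # bind descriptor args
        except AttributeError:
            arg = self.arg                           # plain, non-descriptor arg
        def wrapper(*args, **kwargs):
            print(arg)
            return self.func.__get__(instance, owner)(*args, **kwargs)
        return wrapper

class Test:
    @property
    def some_data(self):
        return {'key': 'value'}

    @Decorator(some_data)  # some_data is visible here as a class-body local
    def some_method(self):
        print('method output')

Test().some_method()  # prints {'key': 'value'}, then 'method output'
```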

Python - Base class's constructor is overridden

As explained in How does Python's super() work with multiple inheritance?, super can be used with multiple inheritance as well, as it will look up the attribute in both parents. But which attribute? The subclass already includes a super call (see the code below). How do I specify the attribute I want? I want Error's constructor.
    class Error(object):
        def __init__(self, values):
            self.values = values

    class AddDialog(sized_controls.SizedDialog, Error):
        def __init__(self, *args, **kwargs):
            Error.__init__(self, *args)
            super(AddDialog, self).__init__(*args, **kwargs)
It is as easy as just trying it out:

    class Error(object):
        def __init__(self, values):
            self.values = values
            print('Error')

    class SizedDialog(object):
        def __init__(self, values):
            self.values = values
            print('SizedDialog')

    class AddDialog(SizedDialog, Error):
        def __init__(self, *args, **kwargs):
            print('AddDialog')
            Error.__init__(self, *args)
            super(AddDialog, self).__init__(*args, **kwargs)
Now, super() does nothing more than walk along the method resolution order (MRO), which you can inspect with mro():
>>> AddDialog.mro()
[__main__.AddDialog, __main__.SizedDialog, __main__.Error, object]
So, in your case you call the __init__() of Error explicitly first. Then super() will, in this specific case, find the __init__() of SizedDialog because it comes before Error in the MRO.
>>> a = AddDialog(10)
AddDialog
Error
SizedDialog
If you only use super() (no call to __init__() of Error), you get only the __init__() of SizedDialog:

    class AddDialog(SizedDialog, Error):
        def __init__(self, *args, **kwargs):
            print('AddDialog')
            super(AddDialog, self).__init__(*args, **kwargs)

>>> a = AddDialog(10)
AddDialog
SizedDialog
Finally, if you only call the __init__() of Error, it is the only __init__() that is called.

    class AddDialog(SizedDialog, Error):
        def __init__(self, *args, **kwargs):
            print('AddDialog')
            Error.__init__(self, *args)

>>> a = AddDialog(10)
AddDialog
Error
So your question:
But what attribute?
has the answer:
The one you call.
It does not matter whether you hard-wire the class, as done with Error, or let super() find the appropriate parent class, i.e. the next one in the MRO.
The only difference is that super() might call the __init__() of the grandparent class if the parent class does not have an __init__().
But this is the intended behavior of super().
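As a sketch of the cooperative alternative (simplified, argument-free versions of the same classes): if every class in the chain calls super().__init__() itself, each __init__() runs exactly once, in MRO order, with no hard-wired parent calls:

```python
class Error:
    def __init__(self):
        print('Error')
        super().__init__()  # continue along the MRO

class SizedDialog:
    def __init__(self):
        print('SizedDialog')
        super().__init__()  # next in MRO after SizedDialog is Error

class AddDialog(SizedDialog, Error):
    def __init__(self):
        print('AddDialog')
        super().__init__()

a = AddDialog()  # prints AddDialog, SizedDialog, Error
```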

decorate specific methods of a class

I read about decorators and I am trying to decorate all the methods of a class WITHOUT the static methods.
Right now I just apply the decorator I wrote to the specific functions that are not static, so I wonder if there is a way to decorate many methods at once while avoiding the static ones.
What I get with my decorator:

    TypeError: unbound method test() must be called with ClassTest instance as first argument (got nothing instead)
My decorator:

    def decorator(func):
        def wrapper(self, *args, **kwargs):
            print "test"
            return func(self, *args, **kwargs)
        return wrapper
First of all, decorating a class is pretty simple:
    def class_decorator(cls):
        # modify cls
        return cls
In order to add/remove/modify functionality to a method, you could call setattr with a decorated version of a method (or a variable):
    setattr(some_class, some_attribute, decorator(some_callable))
As to differentiating between the different types of methods, there are a couple of attributes you can use to determine whether a method is an instance/class/static method.
A full working example:
    def _is_instance_method(var):
        if not hasattr(var, '__call__'):  # it's not a callable
            return False
        if not hasattr(var, 'im_self'):   # it's a callable, but not a method
            return False
        if getattr(var, 'im_self') is not None:
            # At this point, a class method is already bound to the class,
            # while an instance method is still unbound:
            # return False if it's bound (i.e. a class method)
            return False
        return True  # a callable that is boundable but not yet bound -- an instance method!

    def func_decorator(func):
        def func_wrapper(self, *args, **kwargs):
            print "Inside %s!" % (func.__name__,)
            return func(self, *args, **kwargs)
        return func_wrapper

    def class_decorator(cls):
        for attr in cls.__dict__:
            var = getattr(cls, attr)
            if _is_instance_method(var):  # determine whether the attribute is an instance method
                setattr(cls, attr, func_decorator(var))  # replace the function with a decorated one
        return cls  # return the class with its new decorated instance methods

    @class_decorator
    class B(object):
        @staticmethod
        def static_method():
            return "static method"

        @classmethod
        def cls_method(cls):
            return "cls method"

        def instance_method(self):
            return "instance method"

    print B.static_method()
    print B.cls_method()
    b = B()
    print b.instance_method()
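The im_self attribute exists only on Python 2 methods. In Python 3, plain functions in the class __dict__ are the instance methods, while staticmethod and classmethod show up as their wrapper objects, so the check simplifies considerably. A sketch under those assumptions:

```python
import types

def func_decorator(func):
    def func_wrapper(self, *args, **kwargs):
        print("Inside %s!" % func.__name__)
        return func(self, *args, **kwargs)
    return func_wrapper

def class_decorator(cls):
    for attr, var in list(cls.__dict__.items()):
        # Plain functions in the class dict are the instance methods;
        # staticmethod/classmethod appear as wrapper objects here, so
        # the isinstance check skips them automatically.
        if isinstance(var, types.FunctionType) and not attr.startswith('__'):
            setattr(cls, attr, func_decorator(var))
    return cls

@class_decorator
class B:
    @staticmethod
    def static_method():
        return "static method"

    @classmethod
    def cls_method(cls):
        return "cls method"

    def instance_method(self):
        return "instance method"

print(B.static_method())      # not decorated
print(B.cls_method())         # not decorated
print(B().instance_method())  # "Inside instance_method!" is printed first
```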

Decorating a child class's __init__ method with super()

My class hierarchy is set up so that every child's __init__() must set self._init_has_run to False, call the parent's __init__(), run its own initialization, and finally set self._init_has_run to True. I have the following code:
    class Parent:
        def __init__(self, arg1, arg2):
            pass  # do stuff

        def init(cls, fun):
            def decorated_init(self, *args, **kwargs):
                self._init_has_run = False
                x = super()
                super().__init__(*args, **kwargs)
                fun(self, *args, **kwargs)
                self._init_has_run = True
            return decorated_init

    class Child(Parent):
        @Parent.init
        def __init__(self, arg1, arg2):
            pass  # do stuff
Since a number of subclasses follow the same general pattern for __init__(), and I can't figure out how to use metaclasses, I am using a decorator to consolidate the repetitive logic and then applying that decorator to all descendant __init__() methods.
Python is throwing the following:

    File "filename.py", line 82, in decorated_init
        super().__init__(*args, **kwargs)
    TypeError: object.__init__() takes no parameters
I confirmed through the debugger that the toggling of self._init_has_run works fine and super() is resolving to the Parent class, but when the decorator tries to call super().__init__(*args, **kwargs), why does Python try to call object.__init__() instead?
You can easily use metaclasses to do some pre/post-init stuff. Consider this example:
    class Meta(type):
        def __new__(meta, *args):
            # This is something like a 'class constructor':
            # it is called once for every new class definition.
            # It sets the default value of '_init_has_run', analogous to
            # `class Foo: _init_has_run = False`: new objects will all
            # have _init_has_run set to False by default.
            cls = super(Meta, meta).__new__(meta, *args)
            cls._init_has_run = False
            return cls

        def __call__(cls, *args, **kwargs):
            # This is called each time you create a new object.
            # It runs the new object's constructor
            # and then sets _init_has_run to True.
            obj = type.__call__(cls, *args, **kwargs)
            obj._init_has_run = True
            return obj

    class Child:
        __metaclass__ = Meta

        def __init__(self):
            print 'init:', self._init_has_run

        def foo(self):
            print 'foo:', self._init_has_run

    a = Child()
    a.foo()
    a = Child()
    a.foo()
Output:

    init: False
    foo: True
    init: False
    foo: True

Hope this helps!
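The example above uses Python 2's __metaclass__ attribute, which Python 3 ignores. A sketch of the same idea for Python 3, using the metaclass keyword (assuming nothing beyond the question's setup):

```python
class Meta(type):
    def __new__(meta, name, bases, namespace):
        cls = super().__new__(meta, name, bases, namespace)
        cls._init_has_run = False  # class-level default for all instances
        return cls

    def __call__(cls, *args, **kwargs):
        obj = super().__call__(*args, **kwargs)  # runs __init__
        obj._init_has_run = True                 # post-init flag flip
        return obj

class Child(metaclass=Meta):
    def __init__(self):
        print('init:', self._init_has_run)

    def foo(self):
        print('foo:', self._init_has_run)

a = Child()  # init: False
a.foo()      # foo: True
```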

TypeError: type() takes 1 or 3 arguments

I have a metaclass for creating classes:
    class MyMetaClass(type):
        def __new__(cls, *args, **kwargs):
            print('call __new__ from MyMetaClass.')
            return type(cls.__name__, *args, **kwargs)

but when I use it:

    Foo = MyMetaClass('Foo', (), {'name': 'pd'})

it raises:

    TypeError: type() takes 1 or 3 arguments
If I change it like this:

    class MyMetaClass(type):
        def __new__(cls, *args, **kwargs):
            print('call __new__ from MyMetaClass.')
            return type(cls.__name__, (), {})

it works okay. Where is the problem?
The __new__ method is passed 3 positional arguments in args: the class name, the base classes, and the class body. The cls argument is bound to the metaclass, so MyMetaClass here.
You are adding another name to that sequence; drop the name, or remove the first argument from args:
    class MyMetaClass(type):
        def __new__(cls, *args, **kwargs):
            print('call __new__ from MyMetaClass.')
            return type(*args, **kwargs)

or

    class MyMetaClass(type):
        def __new__(cls, *args, **kwargs):
            print('call __new__ from MyMetaClass.')
            return type(cls.__name__, *args[1:], **kwargs)
The cls argument is the metaclass object however, so unless you want all your classes to be called MyMetaClass I'd stick with the first option.
See the Customizing class creation section of the Python data model:
These steps will have to be performed in the metaclass’s __new__() method – type.__new__() can then be called from this method to create a class with different properties. This example adds a new element to the class dictionary before creating the class:
    class metacls(type):
        def __new__(mcs, name, bases, dict):
            dict['foo'] = 'metacls was here'
            return type.__new__(mcs, name, bases, dict)
and the object.__new__ documentation:
__new__() is a static method (special-cased so you need not declare it as such) that takes the class of which an instance was requested as its first argument. The remaining arguments are those passed to the object constructor expression (the call to the class).
where class of which an instance was requested is your metaclass (producing a class object).
Demo:
>>> class MyMetaClass(type):
... def __new__(cls, *args, **kwargs):
... print('call __new__ from MyMetaClass.')
... return type(*args, **kwargs)
...
>>> class Foo(object):
... __metaclass__ = MyMetaClass
...
call __new__ from MyMetaClass.
>>> Foo
<class '__main__.Foo'>
>>> class MyMetaClass(type):
... def __new__(cls, *args, **kwargs):
... print('call __new__ from MyMetaClass.')
... return type(cls.__name__, *args[1:], **kwargs)
...
>>> class Foo(object):
... __metaclass__ = MyMetaClass
...
call __new__ from MyMetaClass.
>>> Foo
<class '__main__.MyMetaClass'>
>>> # Note the ^^^^^^^^^^^^ class.__name__ attribute here
...
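The demo above relies on Python 2's __metaclass__ hook. Under Python 3 (metaclass keyword), a sketch reproducing the same naming pitfall:

```python
class MyMetaClass(type):
    def __new__(cls, *args, **kwargs):
        print('call __new__ from MyMetaClass.')
        # The pitfall: cls.__name__ is the metaclass's own name, so every
        # class created this way ends up named 'MyMetaClass'.
        return type(cls.__name__, *args[1:], **kwargs)

class Foo(metaclass=MyMetaClass):
    pass

print(Foo.__name__)  # MyMetaClass, not Foo
```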
