Python how to add __init__ param to subclass

I have a subclass sharing the __init__ of its base class:
class SubClass(BaseClass):
    def __init__(self, param, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.thing = param
The problem I have been having is that the subclass __init__ parameter "param" is being passed into super().__init__(*args, **kwargs) as an extra parameter. This usually gives me an error like:
TypeError: __init__() takes from 1 to 2 positional arguments but 3 were given
I don't want that. I only want "param" to be used for these subclass instances. How do I stop sending the extra param to the base class __init__ while still being able to use it in the subclass __init__? Example code to reproduce the issue:
from unittest import TestCase

class TestCaseSubClass(TestCase):
    def __init__(self, param, *args, **kwargs):
        super().__init__(*args, **kwargs)  # Just use whatever is in TestCase's init + our stuff
        self.thing = param
        print(self.thing)

class TestClass(TestCaseSubClass(param='bdfbdfb')):
    def test_stuff(self):
        print('test stuff here')
Or with just raw Python, no imports, why can't I do this? (Same error.)
class A(object):
    def __init__(self, athing='thing'):
        self.thing = athing
        print(self.thing)

class AB(A):
    def __init__(self, param, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.param = param
        print(self.param)

class ABC(AB(param='thh')):
    pass

ABCinstance = ABC()

I'm interpreting this question as "how can I provide a default parameter to a subclass without defining an __init__ for it?". One possible way is to define the default value as a class attribute, which you access in the parent class' __init__:
from unittest import TestCase

class TestCaseSubClass(TestCase):
    _default_param = None

    def __init__(self, *args, **kwargs):
        param = kwargs.pop("param", self._default_param)
        super().__init__(*args, **kwargs)  # Just use whatever is in TestCase's init + our stuff
        self.thing = param

class TestClass(TestCaseSubClass):
    _default_param = "bdfbdfb"

    def test_stuff(self):
        print('test stuff here')

x = TestClass()
print(x.thing)  # "bdfbdfb"

y = TestClass(param="foo")
print(y.thing)  # "foo"
This approach doesn't quite match the argument format in your question, since now param is a keyword-only argument, rather than a named positional argument. The principal practical difference is that you can't supply an argument for param unless you refer to it by name: z = TestClass("foo") won't do it, for example.
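For illustration, here is the same kwargs.pop pattern on a plain, hypothetical base class (Base and Sub are made-up names, not TestCase), showing that a positional argument never reaches param:

class Base:
    def __init__(self, value=0):
        self.value = value

class Sub(Base):
    _default_param = None

    def __init__(self, *args, **kwargs):
        # 'param' is keyword-only: positional arguments pass through to Base untouched
        param = kwargs.pop("param", self._default_param)
        super().__init__(*args, **kwargs)
        self.thing = param

print(Sub(1).thing)               # None -- the 1 became Base's 'value', not 'param'
print(Sub(1, param="foo").thing)  # foo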
Based on the edits and comments to this question, another possible interpretation may be "How can I provide a parameter to a subclass that gets passed to the parent class, by any means necessary?", which has no requirement regarding default values. If you're willing to make param a mandatory parameter, then you simply need to pass the value in when creating a TestClass instance:
from unittest import TestCase

class TestCaseSubClass(TestCase):
    def __init__(self, param, *args, **kwargs):
        super().__init__(*args, **kwargs)  # Just use whatever is in TestCase's init + our stuff
        self.thing = param

class TestClass(TestCaseSubClass):
    def test_stuff(self):
        print('test stuff here')

x = TestClass("bdfbdfb")
print(x.thing)  # "bdfbdfb"

Related

Extending behavior of the property decorator

I would like to extend the behavior of the built-in @property decorator. The desired usage is shown in the code below:
class A:
    def __init__(self):
        self.xy = 42

    @my_property(some_arg="some_value")
    def x(self):
        return self.xy

print(A().x)  # should print 42
First of all, the decorator should retain the property behavior so that no () is needed after the x. Next, I would like to be able to access the arguments a programmer passes to my decorator.
I started off with this:
class my_property(property):
    def __init__(self, fn):
        super().__init__(fn)

TypeError: __init__() got an unexpected keyword argument 'some_arg'
After adding **kwargs:
class my_property(property):
    def __init__(self, fn, **kwargs):
        super().__init__(fn)

TypeError: __init__() missing 1 required positional argument: 'fn'
OK, let's do *args instead:
class my_property(property):
    def __init__(self, *args, **kwargs):
        super().__init__(*args)

TypeError: 'my_property' object is not callable
Let's make it callable:
class my_property(property):
    def __init__(self, *args, **kwargs):
        super().__init__(*args)

    def __call__(self, *args, **kwargs):
        pass
No errors, but prints None instead of 42
And now I am lost. I have not even managed to access some_arg="some_value" yet, and the property behavior already seems to be gone. What is wrong and how do I fix it?
It's not clear how you intend to use some_arg, but to pass a parameter to a decorator you need to have "two layers" of decorators:
@my_decorator(arg)
def foo():
    return
Under the hood this translates to my_decorator(arg)(foo) (i.e. my_decorator(arg) must return another decorator that is then called with foo). The inner decorator in this case should be your custom implementation of property:
def my_property(some_arg):
    class inner(object):
        def __init__(self, func):
            print(some_arg)  # do something with some_arg
            self.func = func

        def __get__(self, obj, type_=None):
            return self.func(obj)
    return inner
Now you can use it like this:
class MyClass:
    def __init__(self, x):
        self.x = x

    @my_property('test!')
    def foo(self):
        return self.x

obj = MyClass(42)  # > test!
obj.foo            # > 42
Read more about descriptors in the Python documentation: https://docs.python.org/3/howto/descriptor.html
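As an alternative sketch that keeps the built-in property behavior instead of hand-writing a descriptor, my_property can be a plain factory function that accepts the decorator arguments and returns property applied to the wrapped function (the names and argument here are assumptions matching the question):

def my_property(some_arg="some_value"):
    def wrap(fn):
        print(some_arg)       # the decorator argument is available here
        return property(fn)   # normal read-only property behavior is preserved
    return wrap

class A:
    def __init__(self):
        self.xy = 42

    @my_property(some_arg="hello")
    def x(self):
        return self.xy

print(A().x)  # 42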

Override function decorator argument

I have a base class and a child class. The base class has a class variable which is passed to a decorator. Now, when I inherit Base into the child and change the variable's value, the decorator does not pick up the overridden class variable value.
Here's the code:
class Base():
    variable = None

    @decorator(variable=variable)
    def function(self):
        pass

class Child(Base):
    variable = 1
Without overriding the function again: how do I pass the child class variable to the decorator?
The comment from deceze already explained why this is not reflected in the subclasses.
One workaround is to build the logic into the decorator side, i.e. something like this:
def decorator(_func=None, *, variable):
    def decorator_func(func):
        def wrapper(self, *args, **kwargs):
            variable_value = getattr(self.__class__, variable)
            print(variable_value)
            # You can use this value to do the rest of the work.
            return func(self, *args, **kwargs)
        return wrapper
    if _func is None:
        return decorator_func
    else:
        return decorator_func(_func)
Also update the decorator syntax from @decorator(variable=variable) to @decorator(variable='variable'):
class Base:
    variable = None

    @decorator(variable='variable')
    def function(self):
        pass
Demo:

b = Base()
b.function()  # This will print `None`.

Let's try with the subclass:

b = Child()
b.function()  # This will print `1`.

Python - Base class' constructor is overridden

As explained in How does Python's super() work with multiple inheritance?, super can be used with multiple inheritance as well, as it will look for the attribute in both parents. But which attribute? The subclass already includes a super() call (if you look at the code below). How do I specify the attribute I want? I want Error's constructor.
class Error(object):
    def __init__(self, values):
        self.values = values

class AddDialog(sized_controls.SizedDialog, Error):
    def __init__(self, *args, **kwargs):
        Error.__init__(self, *args)
        super(AddDialog, self).__init__(*args, **kwargs)
It is as easy as just trying it out:
class Error(object):
    def __init__(self, values):
        self.values = values
        print('Error')

class SizedDialog(object):
    def __init__(self, values):
        self.values = values
        print('SizedDialog')

class AddDialog(SizedDialog, Error):
    def __init__(self, *args, **kwargs):
        print('AddDialog')
        Error.__init__(self, *args)
        super(AddDialog, self).__init__(*args, **kwargs)
Now, super() is nothing more than walking along the method resolution order (MRO), which you can get with mro():
>>> AddDialog.mro()
[__main__.AddDialog, __main__.SizedDialog, __main__.Error, object]
So, in your case you call the __init__() of Error explicitly first. Then super() will, in this specific case, find the __init__() of SizedDialog because it comes before Error in the MRO.
>>> a = AddDialog(10)
AddDialog
Error
SizedDialog
If you only use super() (no call to __init__() of Error), you get only the __init__() of SizedDialog:
class AddDialog(SizedDialog, Error):
    def __init__(self, *args, **kwargs):
        print('AddDialog')
        super(AddDialog, self).__init__(*args, **kwargs)
>>> a = AddDialog(10)
AddDialog
SizedDialog
Finally, if you only call the __init__() of Error, it is the only __init__() that is called.
class AddDialog(SizedDialog, Error):
    def __init__(self, *args, **kwargs):
        print('AddDialog')
        Error.__init__(self, *args)
>>> a = AddDialog(10)
AddDialog
Error
So your question:
But what attribute?
has the answer:
The one you call.
It does not matter if you hard-wire the class, as done with Error, or let super() find the appropriate parent class, i.e. the next one in the MRO.
The only difference is that super() might call the __init__() of the grandparent class if the parent class does not have an __init__().
But this is the intended behavior of super().
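To illustrate that last point with two hypothetical classes: when the parent defines no __init__() of its own, super() simply continues along the MRO to the grandparent:

class Grandparent(object):
    def __init__(self):
        print('Grandparent')

class Parent(Grandparent):
    pass  # no __init__() of its own

class Child(Parent):
    def __init__(self):
        print('Child')
        super(Child, self).__init__()  # resolves to Grandparent.__init__()

>>> a = Child()
Child
Grandparent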

How to do this Class inheritance in Python?

I have a Python/Tornado application that responds to HTTP requests with the following 3 classes:
import tornado.web

class MyClass1(tornado.web.RequestHandler):
    x = 1
    y = 2

    def my_method1(self):
        print "Hello World"

class MyClass2(MyClass1):
    @tornado.web.authenticated
    def get(self):
        # Do Something 1
        pass

    @tornado.web.authenticated
    def post(self):
        # Do Something 2
        pass

class MyClass3(MyClass2):
    pass
I would like all instances MyClass2 to have an instance variable m set to the integer 3. But any instances of MyClass3 should over-ride that and have m set to the integer 4. How can I do it?
I tried adding the following constructors to MyClass2 and MyClass3 respectively, but then when I try to create an instance of MyClass3, I get the following error: TypeError: __init__() takes exactly 1 argument (3 given)
MyClass2.__init__():

def __init__(self):
    self.m = 3  # This value will be overridden by a subclass

MyClass3.__init__():

def __init__(self):
    self.m = 4
The tornado.web.RequestHandler class already has an __init__ method, and Tornado expects it to take two arguments (plus the self argument of a bound method). Your overridden versions don't take these.
Update your __init__ methods to take arbitrary extra arguments and pass these on via super():
class MyClass2(MyClass1):
    def __init__(self, *args, **kwargs):
        super(MyClass2, self).__init__(*args, **kwargs)
        self.m = 3

    @tornado.web.authenticated
    def get(self):
        # Do Something 1
        pass

    @tornado.web.authenticated
    def post(self):
        # Do Something 2
        pass

class MyClass3(MyClass2):
    def __init__(self, *args, **kwargs):
        super(MyClass3, self).__init__(*args, **kwargs)
        self.m = 4
You could also use the RequestHandler.initialize() method to set up per-request instance variables; you may have to use super() again to pass the call on to the parent class if the parent class's initialize() does more work than just setting self.m.
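As a rough sketch of that initialize() route, reusing the class names from the question (whether you also need the super() call depends on what the parent's initialize() does):

class MyClass2(MyClass1):
    def initialize(self):
        self.m = 3

class MyClass3(MyClass2):
    def initialize(self):
        super(MyClass3, self).initialize()  # keep any setup done by MyClass2
        self.m = 4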
RequestHandler's constructor takes arguments:
class RequestHandler(object):
    ...
    def __init__(self, application, request, **kwargs):
        ...
When you inherit from RequestHandler you then either:
a) Do not override __init__ (i.e. do not provide your own constructor)
or
b) If you override __init__ (provide your own constructor), then your constructor should have the same signature, since the framework will call the constructor.

Decorating a child class's __init__ method with super()

My class hierarchy is set up so that every child's __init__() must set self._init_has_run to False, call the parent's __init__(), then do its own __init__(), and finally set self._init_has_run to True. I have the following code:
class Parent:
    def __init__(self, arg1, arg2):
        pass  # do stuff

    def init(cls, fun):
        def decorated_init(self, *args, **kwargs):
            self._init_has_run = False
            x = super()
            super().__init__(*args, **kwargs)
            fun(self, *args, **kwargs)
            self._init_has_run = True
        return decorated_init

class Child(Parent):
    @Parent.init
    def __init__(self, arg1, arg2):
        pass  # do stuff
Since there are a number of subclasses that follow the same general pattern for __init__(), and I can't figure out how to use metaclasses, I am using a decorator to consolidate the repetitive logic and then just applying that decorator to all descendant __init__() methods.
Python is throwing the following:
File "filename.py", line 82, in decorated_init
super().__init__(*args, **kwargs)
TypeError: object.__init__() takes no parameters
I confirmed through the debugger that the toggling of self._init_has_run works fine and super() is resolving to the Parent class, but when the decorator tries to call super().__init__(*args, **kwargs), why does Python try to call object.__init__() instead?
You can easily use metaclasses to do some pre/post-init stuff. Consider this example:
class Meta(type):
    def __new__(meta, *args):
        # This is something like a 'class constructor'.
        # It is called once for every new class definition.
        # It sets the default value of '_init_has_run' for all new objects.
        # This is analogous to `class Foo: _init_has_run = False`:
        # new objects will all have _init_has_run set to False by default.
        cls = super(Meta, meta).__new__(meta, *args)
        cls._init_has_run = False
        return cls

    def __call__(cls, *args, **kwargs):
        # This is called each time you create a new object.
        # It will run the new object's constructor
        # and then change _init_has_run to True.
        obj = type.__call__(cls, *args, **kwargs)
        obj._init_has_run = True
        return obj

class Child:
    __metaclass__ = Meta

    def __init__(self):
        print 'init:', self._init_has_run

    def foo(self):
        print 'foo:', self._init_has_run

a = Child()
a.foo()
a = Child()
a.foo()
Output:
init: False
foo: True
init: False
foo: True
Hope this helps!
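For reference, the example above is written for Python 2 (the __metaclass__ attribute and print statements). A roughly equivalent sketch in Python 3 syntax, under the same assumptions, would use the metaclass= keyword instead:

class Meta(type):
    def __new__(meta, *args):
        # Runs once per class definition: set the class-level default.
        cls = super().__new__(meta, *args)
        cls._init_has_run = False
        return cls

    def __call__(cls, *args, **kwargs):
        # Runs on every instantiation: flip the flag after __init__ finishes.
        obj = super().__call__(*args, **kwargs)
        obj._init_has_run = True
        return obj

class Child(metaclass=Meta):
    def __init__(self):
        print('init:', self._init_has_run)   # False

    def foo(self):
        print('foo:', self._init_has_run)    # True

Child().foo()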
