I have some basic setup/teardown code that I want to reuse in a whole bunch of unit tests. So I got the bright idea of creating some derived classes to avoid repeating code in every test class.
In so doing, I received two strange errors. One, I cannot solve. Here is the unsolvable one:
AttributeError: 'TestDesktopRootController' object has no attribute '_testMethodName'
Here is my base class:
import unittest
import twill
import cherrypy
from cherrypy._cpwsgi import CPWSGIApp

class BaseControllerTest(unittest.TestCase):

    def __init__(self):
        self.controller = None

    def setUp(self):
        app = cherrypy.Application(self.controller)
        wsgi = CPWSGIApp(app)
        twill.add_wsgi_intercept('localhost', 8080, lambda: wsgi)

    def tearDown(self):
        twill.remove_wsgi_intercept('localhost', 8080)
And here is my derived class:
import twill
from base_controller_test import BaseControllerTest

class TestMyController(BaseControllerTest):

    def __init__(self, args):
        self.controller = MyController()
        BaseControllerTest.__init__(self)

    def test_root(self):
        script = "find 'Contacts'"
        twill.execute_string(script, initial_url='http://localhost:8080/')
The other strange error is:
TypeError: __init__() takes exactly 1 argument (2 given)
The "solution" to that was to add the word "args" to my __init__ function in the derived class. Is there any way to avoid that?
Remember, I have two errors in this one.
It's because you're overriding __init__() incorrectly. Almost certainly, you don't want to override __init__() at all; you should do everything in setUp(). I've been using unittest for >10 years and I don't think I've ever overridden __init__().
However, if you really do need to override __init__(), remember that you don't control where your constructor is called -- the framework calls it for you. So you have to provide a signature that it can call. From the source code (unittest/case.py), that signature is:
def __init__(self, methodName='runTest'):
The safe way to do this is to accept any arguments and just pass 'em up to the base class. Here is a working implementation:
class BaseTest(unittest.TestCase):

    def __init__(self, *args, **kwargs):
        unittest.TestCase.__init__(self, *args, **kwargs)

    def setUp(self):
        print("Base.setUp()")

    def tearDown(self):
        print("Base.tearDown()")

class TestSomething(BaseTest):

    def __init__(self, *args, **kwargs):
        BaseTest.__init__(self, *args, **kwargs)
        self.controller = object()

    def test_silly(self):
        self.assertTrue(1 + 1 == 2)
In BaseControllerTest's __init__ you need to call unittest.TestCase's __init__, just like you did in TestMyController.
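For example, a minimal sketch of that change, reusing the catch-all signature shown above:

import unittest

class BaseControllerTest(unittest.TestCase):

    def __init__(self, *args, **kwargs):
        # forward unittest's methodName argument up to TestCase
        unittest.TestCase.__init__(self, *args, **kwargs)
        self.controller = None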
The call to construct a TestCase from the framework may be passing an argument. The best way to handle this for deriving classes is:
class my_subclass(parentclass):

    def __init__(self, *args, **kw):
        parentclass.__init__(self, *args, **kw)
        ...
I have searched around for an answer to this question but couldn't find anything. My apologies if this was already asked before.
Of the 3-4 methods I know for enforcing a given method on a child class from a parent class (editing the __new__ method of a metaclass, hooking into builtins.__build_class__, using __init_subclass__, or using abc.abstractmethod), I usually end up using __init_subclass__, mainly because of its ease of use and because, unlike abc.abstractmethod, the constraint on the child class is checked at class definition rather than at class instantiation. Example:
class Par():

    def __init_subclass__(self, *args, **kwargs):
        must_have = 'foo'
        if must_have not in list(self.__dict__.keys()):
            raise AttributeError(f"Must have {must_have}")

    def __init__(self):
        pass

class Chi(Par):

    def __init__(self):
        super().__init__()
This example code will obviously throw an error, since Chi does not have a foo method. Nevertheless, I just came across the fact that this constraint from the upstream class can be bypassed by using a simple class decorator:
def add_hello_world(Cls):
    class NewCls(object):
        def __init__(self, *args, **kwargs):
            self.instance = Cls(*args, **kwargs)
        def hello_world(self):
            print("hello world")
    return NewCls

@add_hello_world
class Par:

    def __init_subclass__(self, *args, **kwargs):
        must_have = "foo"
        if must_have not in list(self.__dict__.keys()):
            raise AttributeError(f"Must have {must_have}")

    def __init__(self):
        pass

class Chi(Par):

    def __init__(self):
        super().__init__()

c = Chi()
c.hello_world()
The above code runs without a problem. Now, disregarding the fact that the class I have decorated is Par (and, of course, if Par is library code I might not even have access to it as a user-code developer), I cannot really explain this behavior. It is obvious to me that one could use a decorator to add a method or functionality to an existing class, but I had never seen an unrelated decorator (it just prints hello world and doesn't even mess with class creation) disable a method already present in the class.
Is this intended Python behavior, or is this some kind of bug? To be honest, in my understanding, this might present some security concerns.
Does this happen only with __init_subclass__, or with other data-model methods too?
Remember, decorator syntax is just function application:
class Par:
    def __init_subclass__(...):
        ...

Par = add_hello_world(Par)
The class originally bound to Par defined __init_subclass__; the new class defined inside add_hello_world does not, and that's the class that the post-decoration name Par refers to, and the class that you are subclassing.
Incidentally, you can still reach the original class Par through the closure of the wrapper's __init__.
Calling the decorator explicitly:
class Par:

    def __init_subclass__(self, *args, **kwargs):
        must_have = "foo"
        if must_have not in list(self.__dict__.keys()):
            raise AttributeError(f"Must have {must_have}")

    def __init__(self):
        pass

Foo = Par  # keep a reference to the original class for confirmation
Par = add_hello_world(Par)
we can confirm that the closure keeps a reference to the original class:
>>> Par.__init__.__closure__[0].cell_contents
<class '__main__.Par'>
>>> Par.__init__.__closure__[0].cell_contents is Par
False
>>> Par.__init__.__closure__[0].cell_contents is Foo
True
And if you did try to subclass it, you would get the expected error:
>>> class Bar(Par.__init__.__closure__[0].cell_contents):
... pass
...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "tmp.py", line 16, in __init_subclass__
raise AttributeError(f"Must have {must_have}")
AttributeError: Must have foo
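If you control the decorator and want Par's __init_subclass__ check to survive decoration, one option (my own sketch, not part of the answer above) is to have the decorator mutate the class in place instead of wrapping it in a new one:

def add_hello_world(Cls):
    def hello_world(self):
        print("hello world")
    # attach the method to the original class and return the same object,
    # so any hooks defined on it (like __init_subclass__) keep working
    Cls.hello_world = hello_world
    return Cls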
1. My Requirements
The Decorator Class should use functools.wraps so it has proper introspection and organization for later.
Access to the decorated instance should be possible.
In the example below, I do it by passing a wrapped_self argument to the __call__ method.
As the title states, the Decorator Class must have parameters that you can tune for each method.
2. An Example of What It Would Look Like
The ideal situation should look something like this:
class A():

    def __init__(self):
        ...

    @LoggerDecorator(logger_name='test.log')
    def do_something(self):
        ...
with the Decorator Class being, so far (a basic logger decorator based on a recipe from David Beazley's Python Cookbook):
from functools import wraps
import types

class LoggerDecorator():

    def __init__(self, func, logger_name):
        wraps(func)(self)
        self.logger_name = logger_name

    def config_logger(self):
        ...  # for example, uses `self.logger_name` to configure the logger

    def __call__(self, wrapped_self, *args, **kwargs):
        self.config_logger()
        wrapped_self.logger = self.logger
        func_to_return = self.__wrapped__(wrapped_self, *args, **kwargs)
        return func_to_return

    def __get__(self, instance, cls):
        if instance is None:
            return self
        else:
            return types.MethodType(self, instance)
3. How Do I Fix It?
The error I'm getting apparently refers to __init__ not receiving the func argument:
TypeError: __init__() missing 1 required positional argument: 'func'
It has been suggested to me that I should be putting func in the __call__ method. However, if I put it there as a parameter, wrapped_self isn't properly read as a parameter and I get this error:
__call__() missing 1 required positional argument: 'wrapped_self'
I've tried many things to fix this issue, including putting wraps(func)(self) inside __call__, and many variations of a solution that comes very close but doesn't quite meet all of the requirements (the problem with it is that I can't seem to access wrapped_self anymore).
Since you're implementing a decorator that takes parameters, the __init__ method of LoggerDecorator should take only the parameters that configure the decorator, while the __call__ method should become the actual decorator, i.e. it receives the function and returns a wrapper:
from functools import wraps

class LoggerDecorator():

    def __init__(self, logger_name):
        self.logger_name = logger_name
        self.config_logger()  # assumed to set self.logger, as in the question's class

    def __call__(self, func):
        @wraps(func)
        def wrapper(wrapped_self, *args, **kwargs):
            wrapped_self.logger = self.logger
            func_to_return = func(wrapped_self, *args, **kwargs)
            return func_to_return
        return wrapper
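Because wrapper is a plain function, Python's normal descriptor machinery turns it into a bound method on attribute access, so the __get__ method from the question is no longer needed. Hypothetical usage (this assumes config_logger sets self.logger, as in the question's class):

class A():

    @LoggerDecorator(logger_name='test.log')
    def do_something(self):
        ...

a = A()
a.do_something()  # the wrapper injects a .logger attribute, then calls the original method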
from functools import wraps

class LoggerDecorator:

    def __init__(self, logger):
        self.logger = logger

    def __call__(self, func, *args, **kwargs):
        print(func, args, kwargs)
        # do processing
        return func

@LoggerDecorator('lala')
def a():
    print(1)
The above should work as expected. If you're planning to call the decorator using keyword arguments, you can remove logger from __init__ and use **kwargs, which will give you a dict of the passed keyword arguments.
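A sketch of that keyword-argument variant (the names here are illustrative, not from the answer above):

from functools import wraps

class LoggerDecorator:

    def __init__(self, **kwargs):
        self.options = kwargs  # dict of whatever keyword arguments were passed

    def __call__(self, func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            print(self.options)  # e.g. {'logger': 'lala'}
            return func(*args, **kwargs)
        return wrapper

@LoggerDecorator(logger='lala')
def a():
    print(1)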
I need to add another option to a class: a simple edit=False, without completely overriding __init__().
I found this piece of code written for kivy:
class TitleBox(BoxLayout):
    def __init__(self, **kwargs):
        # make sure we aren't overriding any important functionality
        super(TitleBox, self).__init__(**kwargs)
But when I adapt it for my purposes I receive: "TypeError: __init__() takes at most 2 arguments (3 given)"
class Person_Dialog(tkSimpleDialog.Dialog):
    def __init__(self, edit=False, **kwargs):
        super(Person_Dialog, self).__init__(**kwargs)
        self.edit = edit
Given an __init__ signature of:
def __init__(self, edit=False, **kwargs):
When you do this:
add = Person_Dialog(root, 'Add person')
Python creates an instance and assigns it to the self argument. Then it assigns root to the edit argument. Then it takes 'Add person' and finds no other positional argument to assign it to.
To fix this add another argument to __init__:
class Person_Dialog(tkSimpleDialog.Dialog):
    def __init__(self, parent, edit=False, **kwargs):  # added parent argument
        super(Person_Dialog, self).__init__(parent, **kwargs)
        self.edit = edit
Note that we also pass parent to the superclass, because tkSimpleDialog.Dialog has the signature __init__(self, parent, title=None).
Unfortunately, your code now fails with TypeError: must be type, not classobj because tkSimpleDialog.Dialog is an old style class and you can't use super() with old style classes. (Python 3 does away with old style classes, so you won't have this issue there.)
So to fix this replace the call to super() with a direct reference to the superclass:
class Person_Dialog(tkSimpleDialog.Dialog):
    def __init__(self, parent, edit=False, **kwargs):
        # referencing the superclass directly
        tkSimpleDialog.Dialog.__init__(self, parent, **kwargs)
        self.edit = edit
Now your code will work.
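One caveat (my own note, not from the answer): with edit sitting between parent and **kwargs, the title now has to be passed by keyword, otherwise it will be captured by edit:

add = Person_Dialog(root, title='Add person')              # edit defaults to False
edit = Person_Dialog(root, edit=True, title='Edit person')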
I had a class called CacheObject, and many classes extend from it.
Now I need to add something common to all the classes that extend from it, so I wrote this:
class CacheObject(object):
    def __init__(self):
        self.updatedict = dict()
But the child classes didn't obtain the updatedict attribute. I know calling the super __init__ function is optional in Python, but is there an easy way to force all of them to call it, rather than walking through all the classes and modifying them one by one?
I was in a situation where I wanted classes to always call their base classes' constructor in order before they call their own. The following is Python 3 code that should do what you want:
class meta(type):
    def __init__(cls, name, bases, dct):
        def auto__call__init__(self, *a, **kw):
            for base in cls.__bases__:
                base.__init__(self, *a, **kw)
            cls.__init__child_(self, *a, **kw)
        cls.__init__child_ = cls.__init__
        cls.__init__ = auto__call__init__

class A(metaclass=meta):
    def __init__(self):
        print("Parent")

class B(A):
    def __init__(self):
        print("Child")
To illustrate, it will behave as follows:
>>> B()
Parent
Child
<__main__.B object at 0x000001F8EF251F28>
>>> A()
Parent
<__main__.A object at 0x000001F8EF2BB2B0>
I suggest a non-code fix:
Document that super().__init__() should be called by your subclasses before they use any other methods defined in it.
This is not an uncommon restriction. See, for instance, the documentation for threading.Thread in the standard library, which says:
If the subclass overrides the constructor, it must make sure to invoke the base class constructor (Thread.__init__()) before doing anything else to the thread.
There are probably many other examples, I just happened to have that doc page open.
You can override __new__. As long as your base classes don't override __new__ without calling super().__new__(), you'll be fine.
class CacheObject(object):
    def __new__(cls, *args, **kwargs):
        instance = super().__new__(cls, *args, **kwargs)
        instance.updatedict = {}
        return instance

class Foo(CacheObject):
    def __init__(self):
        pass
However, as some commenters said, the motivation for this seems a little shady. You should perhaps just add the super calls instead.
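For comparison, the plain alternative the commenters had in mind is just:

class Foo(CacheObject):
    def __init__(self):
        super().__init__()  # gives every instance its updatedict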
This isn't what you asked for, but how about making updatedict a property, so that it doesn't need to be set in __init__:
class CacheObject(object):

    @property
    def updatedict(self):
        try:
            return self._updatedict
        except AttributeError:
            self._updatedict = dict()
            return self._updatedict
Hopefully this achieves the real goal, that you don't want to have to touch every subclass (other than to make sure none uses an attribute called updatedict for something else, of course).
There are some odd gotchas, though, because it is different from setting updatedict in __init__ as in your question. For example, the content of CacheObject().__dict__ is different. It has no key updatedict because I've put that key in the class, not in each instance.
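For example, a quick check of the difference described above:

obj = CacheObject()
print('updatedict' in obj.__dict__)   # False: nothing is set in __init__
obj.updatedict['key'] = 'value'       # first access creates _updatedict lazily
print('_updatedict' in obj.__dict__)  # True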
Regardless of motivation, another option is to use __init_subclass__() (Python 3.6+) to get this kind of behavior. (For example, I'm using it because I want users not familiar with the intricacies of Python to be able to inherit from a class to create specific engineering models, and I'm trying to keep the structure of the class they have to define very basic.)
In the case of your example,
from functools import wraps

class CacheObject:

    def __init__(self) -> None:
        self.updatedict = dict()

    def __init_subclass__(cls) -> None:
        orig_init = cls.__init__

        @wraps(orig_init)
        def __init__(self, *args, **kwargs):
            orig_init(self, *args, **kwargs)
            super(self.__class__, self).__init__()

        cls.__init__ = __init__
What this does: when any subclass of CacheObject is created, its __init__ function gets wrapped by the parent class. We replace it with a new function that calls the original and then calls super()'s (the parent's) __init__. So even if the child class overrides the parent's __init__, at instance-creation time its __init__ is wrapped by a function that calls it and then calls its parent's.
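A quick check of that behaviour (my own example): the subclass below never calls super().__init__() itself, yet its instances still get updatedict. (Note that super(self.__class__, self) recurses forever for grandchild classes, so this check only goes one level deep.)

class Child(CacheObject):
    def __init__(self):
        pass  # deliberately does not call super().__init__()

print(Child().updatedict)  # {} (the wrapper called CacheObject.__init__ for us)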
You can add a decorator to your classes:
def my_decorator(cls):
    old_init = cls.__init__
    def new_init(self):
        self.updatedict = dict()
        old_init(self)
    cls.__init__ = new_init
    return cls

@my_decorator
class SubClass(CacheObject):
    pass
If you want to add the decorator to all the subclasses automatically, use a metaclass:

class myMeta(type):
    def __new__(cls, name, parents, dct):
        return my_decorator(super().__new__(cls, name, parents, dct))

class CacheObject(object, metaclass=myMeta):
    pass
I get this error:
TypeError: object.__init__() takes no parameters
when running my code, I don't really see what I'm doing wrong here though:
class IRCReplyModule(object):

    activated = True
    moduleHandlerResultList = None
    moduleHandlerCommandlist = None
    modulename = ""

    def __init__(self, modulename):
        self.modulename = modulename

class SimpleHelloWorld(IRCReplyModule):

    def __init__(self):
        super(IRCReplyModule, self).__init__('hello world')
You are calling the wrong class name in your super() call:
class SimpleHelloWorld(IRCReplyModule):
    def __init__(self):
        # super(IRCReplyModule, self).__init__('hello world')
        super(SimpleHelloWorld, self).__init__('hello world')
Essentially, what you are resolving to is the __init__ of the object base class, which takes no parameters.
It's a bit redundant, I know, to have to specify the class that you are already inside of, which is why in Python 3 you can just do super().__init__().
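For example, the same fix in Python 3 style:

class SimpleHelloWorld(IRCReplyModule):
    def __init__(self):
        super().__init__('hello world')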
This has bitten me twice recently (I know I should have learned from my mistake the first time), and the accepted answer hasn't helped me either time, so while it is fresh in my mind I thought I would submit my own answer, just in case anybody else runs into this (or I need it again in the future).
In my case the issue was that I was passing a kwarg into the initialisation of the subclass, but in the superclass that keyword arg was then being passed through into the super() call.
I always think these types of things are best with an example:
class Foo(object):
    def __init__(self, required_param_1, *args, **kwargs):
        super(Foo, self).__init__(*args, **kwargs)
        self.required_param = required_param_1
        self.some_named_optional_param = kwargs.pop('named_optional_param', None)

    def some_other_method(self):
        raise NotImplementedError

class Bar(Foo):
    def some_other_method(self):
        print('Do some magic')

Bar(42)  # no error
Bar(42, named_optional_param={'xyz': 123})  # raises TypeError: object.__init__() takes no parameters
So to resolve this I just need to alter the order in which I do things in the Foo.__init__ method; e.g.:
class Foo(object):
    def __init__(self, required_param_1, *args, **kwargs):
        self.some_named_optional_param = kwargs.pop('named_optional_param', None)
        # call super only AFTER popping the kwargs
        super(Foo, self).__init__(*args, **kwargs)
        self.required_param = required_param_1
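With that reordering, both calls from the broken example now pass:

Bar(42)                                     # still fine
Bar(42, named_optional_param={'xyz': 123})  # no longer raises TypeError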