I have the following example code:
class AuxiliaryClass:
    @staticmethod
    def high_cost_method():
        "Do something"

class MyTestedClass:
    def do_something(self):
        something = AuxiliaryClass.high_cost_method()
        "do something else"
I want to test MyTestedClass. For this purpose I've created an AuxiliaryClassStub class to override high_cost_method(). I want my test to execute do_something() from MyTestedClass, but do_something() should use the stub instead of the real class.
How can I do that?
My real auxiliary class is quite big: it has a lot of methods and I will use it in many tests, so I don't want to patch single methods. I need to replace the whole class during tests.
Note that high_cost_method() is static, so mocking __init__() or __new__() will not help in this case.
Does it work if you use self.__class__.high_cost_method inside do_something()? That way you avoid a direct reference to the class name, which lets a subclass override the staticmethod, here with the one from AuxiliaryClass:
class MyTestedClass:
    def do_something(self):
        something = self.__class__.high_cost_method
        something()

    @staticmethod
    def high_cost_method():
        print("high cost MyTestedClass")

class AuxiliaryClass(MyTestedClass):
    @staticmethod
    def high_cost_method():
        print("high cost AuxiliaryClass")
Then you get

>>> test = AuxiliaryClass()
>>> test.high_cost_method()
high cost AuxiliaryClass

and otherwise

>>> test = MyTestedClass()
>>> test.high_cost_method()
high cost MyTestedClass
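If you would rather keep the two classes unrelated and really swap in the stub wholesale, a minimal sketch with unittest.mock.patch could look like the following; the module path mymodule, the test layout, and the import location of AuxiliaryClassStub are assumptions:

import unittest
from unittest.mock import patch

from mymodule import MyTestedClass           # hypothetical module holding the code under test
from tests.stubs import AuxiliaryClassStub   # hypothetical location of the stub

class MyTestedClassTest(unittest.TestCase):
    # Replace the name AuxiliaryClass in the module where do_something() looks it up,
    # so the whole class is swapped for the stub for the duration of the test.
    @patch('mymodule.AuxiliaryClass', new=AuxiliaryClassStub)
    def test_do_something(self):
        MyTestedClass().do_something()

Because patch targets the name where do_something() resolves it, the stub's static high_cost_method() is what actually runs, without subclassing or touching MyTestedClass.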
Say I have a class
class Base(object):
    def my_method(self, input):
        print(input)       # suppose this is many lines
        print("mymethod")  # so is this
and a subclass that has a method which does almost the same thing, except for an extra operation in the middle of the method, e.g.
class Sub(Base):
    def my_method(self, input):  # how do I properly define this?
        print(input)
        print("some other stuff")  # additional operation in the middle
        print("mymethod")
What is the proper way to override my_method?
Do I copy and paste the majority of Base.my_method()? (Probably not: that definitely violates DRY.)
Do I give Base.my_method() a conditional for the additional operation that only fires in the subclass case? (Probably not: the base class should stand alone, and this seems like a recipe for disaster.)
Can I somehow use super()? (Seemingly not: Sub's additional operation is in the middle of the method, not at the beginning or end.)
For such a simple example, I would most likely just copy these three small lines, even if it creates some repetition. Try to avoid over-engineering.
In the case where my_method() is actually more complex, you can divide the function into three steps and let the child classes override the part they want.
class Base(object):
    def my_method(self, input):
        self._preprocess(input)
        self._process()
        self._postprocess()

    def _preprocess(self, input):
        print(input)

    def _process(self):
        pass

    def _postprocess(self):
        print("mymethod")

class Sub(Base):
    def _process(self):
        print("some other stuff")
Of course you should use more meaningful method names.
That depends on where the extra work belongs. Usually, if you find yourself wanting to insert something between the operations of the base class's method, it means that method should really be split into several methods.
For instance:
class Base(object):
    def my_method(self, input):
        print(input)       # suppose this is many lines
        print("mymethod")  # so is this
could become:
class Base(object):
    def my_method(self, input):
        self.do_first_thing(input)
        self.do_second_thing("mymethod")

    def do_first_thing(self, input):
        print(input)

    def do_second_thing(self, data):
        print(data)
This lets subclasses redefine the whole process without having to re-implement each step. The concept is akin to a template method, but backwards.
(Normally the point of the template method pattern is to let subclasses redefine individual steps; here the same structure lets subclasses redefine the template itself.)
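For instance, with that split in place, the Sub from the question can redefine the template and slot its extra operation into the middle (a sketch reusing the strings from the question):

class Sub(Base):
    def my_method(self, input):
        self.do_first_thing(input)
        print("some other stuff")  # the extra operation in the middle
        self.do_second_thing("mymethod")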
Code first:
from abc import abstractmethod

class SomeInterfaceishClass():
    @abstractmethod
    @staticmethod
    def foo(bar):
        pass

class SomeClass(SomeInterfaceishClass):
    @staticmethod
    def foo(bar):
        print("implementation here")

class EventHandler():
    @staticmethod
    @Multitasking.threaded
    def foo(bar):
        pass
I'm writing a frameworkish thingy where the end user is forced to implement a few methods (only one here for simplicity). Therefore I have defined the SomeInterfaceishClass class with an abstract method. My problem is that I will not run the method from this class; instead, I'd like to run it from the EventHandler class. As you can see, it has a decorator to make it run asynchronously, and I think this is where the problem arises.
I know that I can call EventHandler.foo = getattr(self, 'foo') in the constructor of SomeInterfaceishClass, but when I do that the complete function is overridden (it loses its decorator).
I'd like to keep the end user's code as clean as possible and therefore don't really want to add the decorator in SomeClass.
Is there any way to accomplish this? For example, is there a way to add an implementation to a class method, rather than adding a method to a class?
Just to be clear: I want to add it to the EventHandler class, not to an instance of it.
Thank you all! :)
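One direction to explore, sketched under the assumption that Multitasking.threaded is an ordinary decorator returning a callable, and with register_implementation as a hypothetical helper: instead of assigning the end user's function directly onto EventHandler, re-apply the decorator at registration time so the async behaviour is preserved.

def register_implementation(impl):
    # impl is the end user's plain function (e.g. SomeClass.foo); wrap it again
    # so the attribute on EventHandler keeps the threaded behaviour.
    EventHandler.foo = staticmethod(Multitasking.threaded(impl))

register_implementation(SomeClass.foo)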
I am testing a class that inherits from another, very complex one, with DB connection methods and a mess of dependencies. I would like to mock its base class so that I can work comfortably with the methods defined in the subclass, but the moment I inherit from a mocked class, the object itself becomes a mock and loses all its methods.
How can I mock a superclass?
More or less, the situation can be summed up like this:
import mock

ClassMock = mock.MagicMock()

class RealClass(ClassMock):
    def lol(self):
        print('lol')

real = RealClass()
real.lol()   # Does not print lol, but returns another mock
print(real)  # prints <MagicMock id='...'>
This is a simplified case. What is actually happening is that RealClass extends AnotherClass, but I managed to intercept AnotherClass and replace it with a mock.
This is something I've been struggling with for a long time, but I think I've finally found a solution.
As you already noticed, if you try to replace the base class with a Mock, the class you're attempting to test simply becomes a mock, which defeats your ability to test it. The solution is to mock only the base class's methods rather than the entire base class itself, but that's easier said than done: it can be quite error prone to mock every single method one by one on a test-by-test basis.
What I've done instead is create a class that scans another class and assigns to itself Mock()s matching the methods on that other class. You can then use this class in place of the real base class in your testing.
Here is the fake class:
from mock import Mock  # or: from unittest.mock import Mock

class Fake(object):
    """Create Mock()ed methods that match another class's methods."""
    @classmethod
    def imitate(cls, *others):
        for other in others:
            for name in other.__dict__:
                try:
                    setattr(cls, name, Mock())
                except (TypeError, AttributeError):
                    pass
        return cls
So for example you might have some code like this (apologies, it's a little contrived; just assume that BaseClass and SecondClass are doing non-trivial work, contain many methods, and aren't necessarily even defined by you at all):
class BaseClass:
    def do_expensive_calculation(self):
        return 5 + 5

class SecondClass:
    def do_second_calculation(self):
        return 2 * 2

class MyClass(BaseClass, SecondClass):
    def my_calculation(self):
        return self.do_expensive_calculation(), self.do_second_calculation()
You would then be able to write some tests like this:
import unittest

class MyTestCase(unittest.TestCase):
    def setUp(self):
        MyClass.__bases__ = (Fake.imitate(BaseClass, SecondClass),)

    def test_my_methods_only(self):
        myclass = MyClass()
        self.assertEqual(myclass.my_calculation(), (
            myclass.do_expensive_calculation.return_value,
            myclass.do_second_calculation.return_value,
        ))
        myclass.do_expensive_calculation.assert_called_once_with()
        myclass.do_second_calculation.assert_called_once_with()
So the methods that exist on the base classes remain available as mocks you can interact with, but your class does not itself become a mock.
And I've been careful to ensure that this works in both Python 2 and Python 3.
This should work for you.
import mock

ClassMock = mock.MagicMock  # <-- note the removed parentheses '()'

class RealClass(ClassMock):
    def lol(self):
        print('lol')

real = RealClass()
real.lol()   # now prints 'lol' instead of returning another mock
print(real)  # real is a RealClass instance rather than a bare MagicMock
You shouldn't pass an instance of the class as you did; mock.MagicMock is itself a class, so pass it directly.
In [2]: inspect.isclass(mock.MagicMock)
Out[2]: True
I was facing a similar problem and was able to solve it via @patch.object. See the examples for the patch decorators in the official Python docs.
from unittest.mock import patch  # or: from mock import patch

class MyTest(unittest.TestCase):
    @patch.object(SomeClass, 'inherited_method')
    def test_something(self, mock_method):
        SomeClass.static_method()
        mock_method.assert_called_with()
Just exemplifying @Akash's answer, which was the one that actually solved my inheritance-mocking challenge:
@patch.object(SomeClassInheritingAnother, "inherited_method")
def test_should_test_something(self, mocked_inherited_method, mocker, caplog):
    # Mocking an HTTP result status code
    type(mocked_inherited_method.return_value).status_code = mocker.PropertyMock(return_value=200)

    # Calling the inherited method, that should end up using the mocked method
    SomeClassInheritingAnother.inherited_method()

    # Considering that the request result is being logged as 'Request result: {response.status_code}'
    assert "Request result: 200" in caplog.text
For example, I have a
class BaseHandler(object):
    def prepare(self):
        self.prepped = 1
I do not want everyone who subclasses BaseHandler and also wants to implement prepare to have to remember to call
super(SubBaseHandler, self).prepare()
Is there a way to ensure the superclass method is run even if the subclass also implements prepare?
I have solved this problem using a metaclass.
Using a metaclass allows the implementer of BaseHandler to be sure that all subclasses will call the superclass's prepare() with no adjustment to any existing code.
The metaclass looks for an implementation of prepare on both classes and then overwrites the subclass's prepare with one that calls superclass.prepare followed by subclass.prepare.
class MetaHandler(type):
    def __new__(cls, name, bases, attrs):
        instance = type.__new__(cls, name, bases, attrs)
        super_instance = super(instance, instance)
        if hasattr(super_instance, 'prepare') and hasattr(instance, 'prepare'):
            super_prepare = getattr(super_instance, 'prepare')
            sub_prepare = getattr(instance, 'prepare')
            def new_prepare(self):
                super_prepare(self)
                sub_prepare(self)
            setattr(instance, 'prepare', new_prepare)
        return instance

class BaseHandler(object, metaclass=MetaHandler):
    def prepare(self):
        print('BaseHandler.prepare')

class SubHandler(BaseHandler):
    def prepare(self):
        print('SubHandler.prepare')
Using it looks like this:
>>> sh = SubHandler()
>>> sh.prepare()
BaseHandler.prepare
SubHandler.prepare
Tell your developers to define prepare_hook instead of prepare, but tell the users to call prepare:
class BaseHandler(object):
    def prepare(self):
        self.prepped = 1
        self.prepare_hook()

    def prepare_hook(self):
        pass

class SubBaseHandler(BaseHandler):
    def prepare_hook(self):
        pass

foo = SubBaseHandler()
foo.prepare()
If you want more complex chaining of prepare calls from multiple subclasses, then your developers should really use super, as that's what it was intended for; see the sketch below.
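For reference, that cooperative chaining is just each override calling up the chain explicitly, shown here against the question's original BaseHandler (SubSubHandler and the *_prepped attributes are made up for illustration):

class SubBaseHandler(BaseHandler):
    def prepare(self):
        super(SubBaseHandler, self).prepare()  # BaseHandler.prepare runs first
        self.sub_prepped = 1

class SubSubHandler(SubBaseHandler):
    def prepare(self):
        super(SubSubHandler, self).prepare()   # chains through SubBaseHandler up to BaseHandler
        self.subsub_prepped = 1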
Just accept that you have to tell people subclassing your class to call the base method when overriding it. Every other solution either requires you to explain something else to them instead, or involves un-Pythonic hacks that could be circumvented anyway.
Python’s object inheritance model was designed to be open, and any attempt to go another way will just overcomplicate a problem that does not really exist in the first place. Just tell everybody using your stuff to either follow your “rules”, or the program will mess up.
One explicit solution without too much magic going on would be to maintain a list of prepare callbacks:
class BaseHandler(object):
    def __init__(self):
        self.prepare_callbacks = []

    def register_prepare_callback(self, callback):
        self.prepare_callbacks.append(callback)

    def prepare(self):
        # Do BaseHandler preparation
        for callback in self.prepare_callbacks:
            callback()

class MyHandler(BaseHandler):
    def __init__(self):
        BaseHandler.__init__(self)
        self.register_prepare_callback(self._prepare)

    def _prepare(self):
        pass  # whatever
In general you can try using __getattribute__ to achieve something like this (until the moment someone overrides that method too), but it goes against Python's ideas. There is a reason you are able to access private object members in Python; it is mentioned in import this.
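A rough sketch of what that __getattribute__ trick could look like, with exactly the fragility noted above (an illustration, not a recommendation):

class BaseHandler(object):
    def prepare(self):
        self.prepped = 1

    def __getattribute__(self, name):
        attr = object.__getattribute__(self, name)
        if name == 'prepare':
            def chained_prepare():
                BaseHandler.prepare(self)  # always run the base preparation
                if attr.__func__ is not BaseHandler.prepare:
                    attr()                 # then the subclass override, if any
            return chained_prepare
        return attr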
I have the decorator @login_testuser applied to the method test_1():
class TestCase(object):
    @login_testuser
    def test_1(self):
        print("test_1()")
Is there a way I can apply @login_testuser to every method of the class prefixed with "test_"?
In other words, the decorator would apply to the test_1() and test_2() methods below, but not to setUp().
class TestCase(object):
    def setUp(self):
        pass

    def test_1(self):
        print("test_1()")

    def test_2(self):
        print("test_2()")
In Python 2.6 and later, a class decorator is definitely the way to go; e.g., here's a pretty general one for this kind of task:
import inspect

def decallmethods(decorator, prefix='test_'):
    def dectheclass(cls):
        for name, m in inspect.getmembers(cls, inspect.isfunction):
            if name.startswith(prefix):
                setattr(cls, name, decorator(m))
        return cls
    return dectheclass

@decallmethods(login_testuser)
class TestCase(object):
    def setUp(self):
        pass

    def test_1(self):
        print("test_1()")

    def test_2(self):
        print("test_2()")
will get you what you desire. In Python 2.5 or earlier, the @decallmethods syntax doesn't work for class decoration, but with otherwise exactly the same code you can replace it with the following statement right after the end of the class TestCase statement:
TestCase = decallmethods(login_testuser)(TestCase)
Sure. Iterate over all attributes of the class, check each one for being a method whose name starts with "test_", and then replace it with the function returned by your decorator.
Something like:
from inspect import isfunction, getmembers

# In Python 3, functions defined on a class are plain functions until bound,
# so filter with isfunction rather than ismethod.
for name, obj in getmembers(TestCase, isfunction):
    if name.startswith("test_"):
        setattr(TestCase, name, login_testuser(obj))
Are you sure you wouldn't be better off putting login_testuser's code into setUp instead? That's what setUp is for: it runs before every test method.
Yes, you can loop over the class's dir()/__dict__, or have a metaclass that does so, identifying the attributes that start with "test". However, this creates less straightforward and explicit code than applying the decorator explicitly.
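For illustration, a minimal sketch of that metaclass variant, assuming login_testuser is an ordinary function decorator:

class DecorateTests(type):
    def __new__(mcls, name, bases, attrs):
        # Wrap every callable whose name starts with "test_" in login_testuser.
        for attr_name, attr in list(attrs.items()):
            if attr_name.startswith('test_') and callable(attr):
                attrs[attr_name] = login_testuser(attr)
        return super().__new__(mcls, name, bases, attrs)

class TestCase(object, metaclass=DecorateTests):
    def setUp(self):
        pass

    def test_1(self):
        print("test_1()")

    def test_2(self):
        print("test_2()")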