Mocking a module level function in pytest - python

I have a function that has a decorator. The decorator accepts arguments, and the value of one argument is derived from another function call.
example.py
from cachetools import cached
from cachetools import TTLCache
from other import get_value

@cached(cache=TTLCache(maxsize=1, ttl=get_value('cache_ttl')))
def my_func():
    return 'result'
other.py
def get_value(key):
    data = {
        'cache_ttl': 10,
    }
    # Let's assume this call also launches a shuttle into space.
    return data[key]
I'd like to mock the call to get_value(). I'm using the following in my test:
example_test.py
import mock
import pytest

from example import my_func


@pytest.fixture
def mock_get_value():
    with mock.patch(
        "example.get_value",
        autospec=True,
    ) as _mock:
        yield _mock


def test_my_func(mock_get_value):
    assert my_func() == 'result'
Here I'm injecting mock_get_value into test_my_func. However, since the decorator runs on the first import, get_value() gets called immediately. Is there a way, using pytest, to mock the call to get_value() before the module is imported?

Move the from example import my_func inside your test function, so it runs while the fixture's with block is active. Also patch the function where it really comes from: other.get_value. That may be all it takes.
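A minimal sketch of that change, assuming the files above (return_value=10 stands in for the real TTL; this only helps if example has not already been imported elsewhere, as discussed next):

# example_test.py (sketch): patch the real source, import inside the test
import mock
import pytest


@pytest.fixture
def mock_get_value():
    with mock.patch("other.get_value", autospec=True, return_value=10) as _mock:
        yield _mock


def test_my_func(mock_get_value):
    from example import my_func   # the decorator runs now, with the patch active
    assert my_func() == 'result'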
Python caches modules in sys.modules, so module-level code (like function definitions) only runs on the first import from anywhere. If this isn't the first time, you can force a re-import using either importlib.reload() or by deleting the appropriate key in sys.modules and importing again.
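For illustration, the two forcing mechanisms mentioned above might look like this (assuming example has already been imported somewhere):

import importlib
import sys

import example

importlib.reload(example)          # re-executes example.py, decorator and all
# or, equivalently for a later import statement:
sys.modules.pop("example", None)   # the next `import example` re-runs the module code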
Beware that re-importing a module may have side effects, and you may also want to re-import the module again after running the test to avoid interfering with other tests. If another module was using objects defined in the re-imported module, these don't just disappear, and may not be updated the way it expects. For example, re-importing a module may create a second instance of what was supposed to be a singleton.
A more robust approach is to save the original imported module object somewhere else, delete it from sys.modules, re-import it with the patch active for the duration of the test, and then put the original module back into sys.modules after the test. You can do this with an import inside a patch.dict() context on sys.modules.
import mock
import sys

import pytest


@pytest.fixture
def mock_get_value():
    with mock.patch(
        "other.get_value",
        autospec=True,
    ) as _mock, mock.patch.dict("sys.modules"):
        sys.modules.pop("example", None)
        yield _mock


def test_my_func(mock_get_value):
    from example import my_func
    assert my_func() == 'result'
Another possibility is to call the decorator yourself in the test, on the original function. If the decorator used functools.wraps()/functools.update_wrapper(), then the original function should be available as a __wrapped__ attribute. This may not be available, depending on how the decorator was implemented.
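A sketch of that idea, assuming the decorator exposes the original function via __wrapped__ (check your decorator's implementation; note that importing example still runs the original decorator once):

from cachetools import cached, TTLCache

import example


def test_my_func_redecorated():
    original = example.my_func.__wrapped__                  # the undecorated function
    rebuilt = cached(cache=TTLCache(maxsize=1, ttl=10))(original)
    assert rebuilt() == 'result'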

Related

Most pythonic way to call functions already available in global namespace

I'm currently setting up a test suite for a file called main.py. The test file is called test_main.py. Here's an example:
# main.py
def add(a, b):
    return a + b

# test_main.py
import pytest
from main import *

def test_add():
    assert add(1, 2) == 3
For reasons which are outside the scope of this question, I would like to dynamically load the function add in test_main.py as opposed to calling it directly. I'm already aware this is possible using the following
- globals or vars
- use of importlib
- use of eval
However, I'd like to see if there's another option. globals and vars are bad practice. eval is all right, but it doesn't return the function object, and I have to do some string manipulation to get the function call, including its arguments, right. importlib is by far the best option, but main.py happens to contain functions which I want to import the "normal" way. It feels wrong to import functions in test_main.py using both an import statement and the importlib module.
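For reference, sketches of the options the question already lists (not a recommendation):

# test_main.py (sketch of the options mentioned above)
import importlib

from main import *


def test_add_dynamic():
    add_ref = globals()['add']                                  # via globals()
    assert add_ref(1, 2) == 3

    add_ref = getattr(importlib.import_module('main'), 'add')   # via importlib
    assert add_ref(1, 2) == 3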
So, is there a better way? One which is more pythonic?

Mocking imported class in Python

I am trying to mock a class that is used in a module which imports it, and which I want to test.
# application.py
from my_module.my_submodule import MyClass

def my_function(var1):
    instance = MyClass()
    instance.some_function(var1)
and my testing file
# test_application.py
import mock

import application

def test_my_function():
    with mock.patch('my_module.my_submodule.MyClass') as MockClass:
        application.my_function(var1)
        MockClass.assert_called()
This gives an error saying MockClass was not called.
Now, by looking at this question: Why python mock patch doesn't work?, I was inspired to change the application.py import to this
# application.py
import my_module.my_submodule as mysub

def my_function(var1):
    instance = mysub.MyClass()
    instance.some_function(var1)
that is, I don't directly import the class that I want to mock in the test. Now it works.
My question is whether this is working as intended, or whether I am doing something wrong in the original version. Is it really necessary to always import modules like this if I want to mock a class used in a module I want to test?
Yes, it is working as intended, but you patched the wrong target. Try patching application.MyClass: application is not using my_module.my_submodule.MyClass anywhere, but its local MyClass alias instead.
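A sketch of the corrected patch target for the original from ... import style (the argument passed to my_function is a placeholder):

# test_application.py (sketch)
import mock

import application


def test_my_function():
    with mock.patch('application.MyClass') as MockClass:
        application.my_function('some value')   # placeholder argument
        MockClass.assert_called()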
No, you don't have to import modules in some specific way to be able to mock/patch some name. What you have to do is to look at how that name is used at runtime, when you want it patched.
If the test is in a separate module from the module being tested, and the tested module, application in this case, imports the name and uses it directly like in from a import MyClass, then in the test module you import application and patch application.MyClass. If instead application uses import a and then calls a.MyClass() you have to patch application.a.MyClass.
So you adapt the patch target to the concrete naming scenario in the application module. For example if your test is in the same application module, and the class is being used as MyClass, you need to patch __main__.MyClass.
It is however true, that writing code in certain ways can make it easier to patch when you are testing. A good example is when the entity to mock is a function parameter. Just call the function with a mock argument.
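A sketch of that dependency-injection style; the klass parameter is a hypothetical addition to the example's my_function:

# application.py (hypothetical, testability-friendly variant)
from my_module.my_submodule import MyClass

def my_function(var1, klass=MyClass):
    instance = klass()
    instance.some_function(var1)

# test_application.py: no patching needed
import mock
import application

def test_my_function():
    fake_class = mock.Mock()
    application.my_function('some value', klass=fake_class)
    fake_class.assert_called()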
If you find that patching is too convoluted or looks impossible, try to rewrite the code in the application so that it is more "testable".
For another reference, see the "Where to patch" section of the mock documentation.

Mocking a function in another file and a class in a second other file

I'm noticing that the class methods are being mocked out properly, but the function is bypassing the mock and running the actual function.
from module1 import myClass
from module2 import my_function
import unittest
from mock import patch


class TestStuff(unittest.TestCase):
    @patch('module2.my_function')
    @patch('module1.myClass.my_method')
    def test_my_function(self, mock_method, mock_function):
        test_obj = myClass()
        test_obj.my_method()
        assert mock_method.called
        my_function()
        assert mock_function.called
If I print out my_function and type(my_function), it does not show the mock but the real function. Does it matter that I'm importing a class and then mocking a method on it, while I'm importing the function directly?
I think I found the issue:
When I'm testing a function, and there's something I want to mock, I should not import it. Importing it causes it to get used, even if I've put the patch decorator.
I think this was general confusion about how mocking/testing works. What I wanted to do was to test a function without actually querying the database. Instead of mocking out the whole function, I could take the line that actually hits the db - query.all() - and make it its own function, then mock that out.
from module1 import myClass
from module2 import my_function
import unittest
from mock import patch


class TestStuff(unittest.TestCase):
    @patch('module2.db_query')
    @patch('module1.myClass.my_method')
    def test_my_function(self, mock_method, mock_db_query):
        test_obj = myClass()
        test_obj.my_method()
        assert mock_method.called
        my_function()  # my_function calls db_query(), now mocked out
        assert mock_db_query.called
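For context, a hypothetical module2 layout that matches the test above, with the database access split into its own function as described:

# module2.py (hypothetical)
def db_query():
    # the only line that really hits the database, e.g. query.all()
    ...

def my_function():
    rows = db_query()   # looked up on module2 at call time, so the patch takes effect
    return rows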
If I had wanted to actually mock out all of my_function, I could have just not imported it. At least that's how I'm understanding this at the moment.
I ran into a similar issue. Turns out I needed to give it the FULL path to my_function:
@patch('home.myuser.myprojects.mymodule.myfunc')

conditionally import module to shadow local implementation

I am writing a Python script where some of the core functionality can be provided by an existing library. Unfortunately, while that library has more features, it is also slower, so I'd like the user to be able to select at runtime whether to use that library or my own fast and simple implementation. Unfortunately I'm stuck at a point where I don't understand some of the workings of Python's module system.
Suppose that my main program is main.py, that the (optional) external module is in module_a.py, and that my own fast and simple implementation of module_a, together with the program code that uses one implementation or the other, is in module_x.py:
main.py:
import module_x
module_x.test(True)
module_x.test(False)
module_a.py:
class myclass():
    def __init__(self):
        print("i'm myclass in module_a")
module_x.py:
class myclass():
    def __init__(self):
        print("i'm myclass in module_x")


def test(enable_a):
    if enable_a:
        try:
            from module_a import myclass
        except ImportError:
            global myclass
            enable_a = False
    else:
        global myclass
    i = myclass()
When I now execute main.py I get:
$ python3 main.py
i'm myclass in module_a
i'm myclass in module_a
But why is this? If False is passed to test() then the import of the module_a implementation should never happen. Instead it should only see myclass from the local file. Why doesn't it? How do I make test() use the local definition of myclass conditionally?
My solution is supposed to run on Python 3, but I see the same effect with Python 2.7.
An import statement is permanent within the thread of execution unless it is explicitly undone. Furthermore, once the from ... import statement is executed in this case, it rebinds the variable myclass in the global scope (at which point the class it previously referenced, defined in the same file, is no longer referenced and can in theory be garbage collected).
So what is happening here is: the first time you run test(True), your myclass in module_x is effectively replaced with the myclass from module_a. All subsequent calls to test(False) then hit the global myclass statement, which is effectively a no-op, since the global name myclass now refers to the one imported from the other module (and besides, the global statement is unneeded when you are not rebinding the global variable from a local scope, as explained here).
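A minimal, self-contained illustration of that rebinding, independent of the modules above:

# rebinding a module-level name from inside a function, as in test(True)
value = "module-level"

def clobber():
    global value
    value = "imported"   # plays the role of `from module_a import myclass`

clobber()
print(value)   # -> "imported"; every later reader of the global sees the new binding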
To work around this, I would strongly suggest encapsulating the desired module-switching behavior in a class that is independent of either module you would like to switch. You can then charge that class with holding a reference to both modules and providing the rest of your client code with the correct one. E.g.
module_a_wrapper.py
import module_x
import module_a


class ModuleAWrapper(object):
    _target_module = module_x  # the default

    @classmethod
    def get_module(cls):
        return cls._target_module


def set_module(enable_a):
    if enable_a:
        ModuleAWrapper._target_module = module_a
    else:
        ModuleAWrapper._target_module = module_x


def get_module():
    return ModuleAWrapper.get_module()
main.py:
from module_a_wrapper import set_module, get_module
set_module(True)
get_module().myclass()
set_module(False)
get_module().myclass()
Running:
python main.py
# Outputs:
i'm myclass in module_a
i'm myclass in module_x
You can read more about the guts of the Python import system in the language reference.
The answer by lemonhead properly explains why this effect happens and gives a valid solution.
The general rule seems to be: wherever and however you import a module, it will always replace any variables of the same name from the global scope.
Funnily, when I use the import foo as bar construct, there must be neither a global variable named foo nor one named bar!
So while lemonhead's solution works, it adds a lot of complexity and makes my code much longer, because every time I want to get something from either module I have to prefix that call with the getter function.
This solution allows me to solve the problem with a minimal amount of changed code:
module_x.py:
class myclass_fast():
    def __init__(self):
        print("i'm myclass in module_x")


def test(enable_a):
    if enable_a:
        try:
            from module_a import myclass
        except ImportError:
            enable_a = False
            myclass = myclass_fast
    else:
        myclass = myclass_fast
    i = myclass()
So the only thing I changed was to rename the class I had in global scope from myclass to myclass_fast. This way it is no longer overwritten by the import of myclass from module_a. Then, on demand, I bind the local variable myclass to either the imported class or myclass_fast.

Can I "fake" a package (or at least a module) in python for testing purposes?

I want to fake a package in python. I want to define something so that the code can do
from somefakepackage.morefakestuff import somethingfake
And somefakepackage is defined in code, and so is everything below it. Is that possible? The reason for doing this is to trick my unit test into thinking there is a package (or, as the title says, a module) on the Python path, when it is actually just something mocked up for this unit test.
Sure. Define a class, put the stuff you need inside that, assign the class to sys.modules["classname"].
class fakemodule(object):
    @staticmethod
    def method(a, b):
        return a + b

import sys
sys.modules["package.module"] = fakemodule
You could also use a separate module (call it fakemodule.py):
import fakemodule, sys
sys.modules["package.module"] = fakemodule
Yes, you can make a fake module:
from types import ModuleType

m = ModuleType("fake_module")

import sys
sys.modules[m.__name__] = m

# Some scripts may expect a file.
# Even though this file doesn't exist,
# it may be used by Python in error messages or introspection.
m.__file__ = m.__name__ + ".py"

# Add a function
def my_function():
    return 10

m.my_function = my_function
Note: this example uses an actual module (an instance of ModuleType), since some Python code may expect modules rather than a dummy class.
This can be made into a utility function:
def new_module(name, doc=None):
    import sys
    from types import ModuleType

    m = ModuleType(name, doc)
    m.__file__ = name + '.py'
    sys.modules[name] = m
    return m

print(new_module("fake_module", doc="doc string"))
Now other scripts can run:
import fake_module
I took some of the ideas from the other answers and turned them into a Python decorator @modulize which converts a function into a module. This module can then be imported as usual. Here is an example.
@modulize('my_module')
def my_dummy_function(__name__):  # the function takes one parameter __name__
    # put module code here
    def my_function(s):
        print(s, 'bar')

    # the function must return locals()
    return locals()

# import the module as usual
from my_module import my_function
my_function('foo')  # foo bar
The code for the decorator is as follows
import sys
from types import ModuleType


class MockModule(ModuleType):
    def __init__(self, module_name, module_doc=None):
        ModuleType.__init__(self, module_name, module_doc)
        if '.' in module_name:
            package, module = module_name.rsplit('.', 1)
            get_mock_module(package).__path__ = []
            setattr(get_mock_module(package), module, self)

    def _initialize_(self, module_code):
        self.__dict__.update(module_code(self.__name__))
        self.__doc__ = module_code.__doc__


def get_mock_module(module_name):
    if module_name not in sys.modules:
        sys.modules[module_name] = MockModule(module_name)
    return sys.modules[module_name]


def modulize(module_name, dependencies=[]):
    for d in dependencies:
        get_mock_module(d)
    return get_mock_module(module_name)._initialize_
The project can be found here on GitHub. In particular, I created this for programming contests which only allow the contestant to submit a single .py file. This allows one to develop a project with multiple .py files and then combine them into one .py file at the end.
You could fake it with a class which behaves like somethingfake:
try:
    from somefakepackage.morefakestuff import somethingfake
except ImportError:
    class somethingfake(object):
        # define what you'd expect of somethingfake, e.g.:
        @staticmethod
        def somefunc():
            ...
        somefield = ...
TL;DR
Patch sys.modules using unittest.mock:
mock.patch.dict(
    sys.modules,
    {'somefakepackage': mock.Mock()},
)
Explanation
Other answers correctly recommend fixing sys.modules, but a proper way to do it is by patching it with mock.patch: replacing it temporarily (only while the tests run) with a fake object that optionally imitates the desired behaviour, and restoring it once the tests are finished so that other test cases are not affected.
The code in the TL;DR section simply makes your missing package not raise ImportError. To give the fake package contents and imitate the desired behaviour, instantiate mock.Mock(…) with appropriate arguments (e.g. add attributes via Mock's **kwargs).
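For example, a sketch of a fake package with some contents; the attribute names below are taken from the question's import line and the return value is made up:

import sys
from unittest import mock

fake_sub = mock.Mock()
fake_sub.somethingfake.return_value = 'fake result'

patcher = mock.patch.dict(sys.modules, {
    'somefakepackage': mock.Mock(morefakestuff=fake_sub),
    # registering the submodule lets `from somefakepackage.morefakestuff import somethingfake` resolve
    'somefakepackage.morefakestuff': fake_sub,
})
# activate with patcher.start()/patcher.stop(), or use `with patcher:` around the test body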
Full code example
The code below temporarily patches sys.modules so that it includes somefakepackage and makes it importable from the dependent modules without ImportError.
import sys
import unittest
from unittest import mock


class SomeTestCase(unittest.TestCase):

    def test_smth(self):
        # implement your testing logic, for example:
        self.assertEqual(
            123,
            somefakepackage_dependent.some_func(),
        )

    @classmethod
    def setUpClass(cls):  # called once before all the tests
        # define what to patch sys.modules with
        cls._modules_patcher = mock.patch.dict(
            sys.modules,
            {'somefakepackage': mock.Mock()},
        )
        # actually patch it
        cls._modules_patcher.start()
        # make the package globally visible and import it,
        # just as if you had imported it in the usual way
        # by placing the import statement at the top of the file,
        # but relying on a patched dependency
        global somefakepackage_dependent
        import somefakepackage_dependent

    @classmethod  # called once after all the tests
    def tearDownClass(cls):
        # restore the initial sys.modules state
        cls._modules_patcher.stop()
To read more about setUpClass/tearDownClass methods, see unittest docs.
unittest's built-in mock subpackage is actually a very powerful tool. It is worth diving deeper into its documentation to get a better understanding.
