I have a Python script that does this at the beginning:
import importlib

def initialize(module_name):
    # import a module given its name as a string
    return importlib.import_module(module_name)
I want to write a test that 'mocks' out the module name like so:
def test():
    # assemble module at run time
    module_obj = {'name': Object, 'another_name': AnotherObject}
    # inject it into the "import system"
    inject_mock_module('mymodule', module_obj)
    # assert that the import went correctly
    assert my_module_mock == initialize('mymodule')
How do I do this? Specifically, how do I create module_obj and how do I define inject_mock_module? This needs to work on both Python 2.7 and Python 3.3+.
Use the mock library to mock out the import_module() function. As of Python 3.3, you can import that module as unittest.mock:
try:
    # Python 3.3+
    from unittest.mock import patch
except ImportError:
    # external dependency on older versions
    from mock import patch
def test():
    module_obj = {'name': Object, 'another_name': AnotherObject}
    with patch('importlib.import_module', new=module_obj.get):
        assert initialize('name') is module_obj['name']
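If you want the inject_mock_module helper from the question literally, a minimal sketch is to build a real module object with types.ModuleType and register it in sys.modules, which both plain import statements and importlib consult first. inject_mock_module is the question's hypothetical name; everything here is stdlib only:

```python
import importlib
import sys
import types

def initialize(module_name):
    return importlib.import_module(module_name)

def inject_mock_module(module_name, attrs):
    # Build a real module object at run time and register it so that
    # importlib.import_module() (and plain `import`) can find it by name.
    module = types.ModuleType(module_name)
    module.__dict__.update(attrs)
    sys.modules[module_name] = module
    return module

def test():
    module_obj = {'name': object(), 'another_name': object()}
    mock_module = inject_mock_module('mymodule', module_obj)
    assert initialize('mymodule') is mock_module
    assert initialize('mymodule').name is module_obj['name']
    sys.modules.pop('mymodule', None)  # clean up so other tests are unaffected
```

Unlike patching importlib.import_module, this also works when the code under test uses a plain import mymodule statement.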
I've looked around but still don't understand how to mock a library that is used inside a function, and how to assert that it's been called properly.
a.py
import win32clipboard

def copy():
    win32clipboard.OpenClipboard()
    win32clipboard.EmptyClipboard()
    win32clipboard.SetClipboardText('dummy')
    win32clipboard.CloseClipboard()
test_a.py
import a
import pytest

def test_copy():
    # Mock win32clipboard somehow
    # Run a.copy()
    # assert mock win32clipboard.call_count == 4
    pass
There is a mistake in your approach.
win32clipboard is a library exposing a number of functions. You must mock every function from it that copy() uses (OpenClipboard, EmptyClipboard, SetClipboardText and CloseClipboard):
import a
import pytest
from unittest.mock import patch

@patch('win32clipboard.OpenClipboard')
@patch('win32clipboard.EmptyClipboard')
@patch('win32clipboard.SetClipboardText')
@patch('win32clipboard.CloseClipboard')
def test_copy(mock_close, mock_set, mock_empty, mock_open):
    # decorators apply bottom-up, so the first argument corresponds
    # to the decorator closest to the function
    a.copy()
    assert mock_close.called
    assert mock_set.called
    assert mock_empty.called
    assert mock_open.called
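If instead you only want the aggregate call count the question sketches, you can swap out the entire win32clipboard attribute of module a for a single MagicMock and count calls made through it. In this sketch the a module is assembled inline purely so the example runs without Windows installed; in a real test you would simply import a:

```python
import types
from unittest.mock import MagicMock, patch

# Inline stand-in for a.py so the example is self-contained.
a = types.ModuleType('a')
a.win32clipboard = MagicMock()
exec("""
def copy():
    win32clipboard.OpenClipboard()
    win32clipboard.EmptyClipboard()
    win32clipboard.SetClipboardText('dummy')
    win32clipboard.CloseClipboard()
""", a.__dict__)

def test_copy():
    # Replace the module attribute with one mock for the duration of the test.
    with patch.object(a, 'win32clipboard') as mock_clip:
        a.copy()
    mock_clip.SetClipboardText.assert_called_once_with('dummy')
    assert len(mock_clip.mock_calls) == 4  # four calls went through the mock
```

Replacing the whole module object with one mock means a single assertion covers the total call count, at the cost of not distinguishing which function was missed when the test fails.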
Folks,
I have a problem when importing file.py into test_file.py, namely:
file.py uses Robot library BuiltIn:
import os
from robot.libraries.BuiltIn import BuiltIn

DEFAULT_IPHY_TTI_TRACE_DIR = os.path.join(
    BuiltIn().get_variable_value('${OUTPUT_DIR}'), 'iphy_tti_trace')
And when I try to import file.py in my test_file.py:
import pytest
import file
I receive:
test_file.py:8: in <module>
/opt/ute/python/lib/python2.7/site-packages/robot/libraries/BuiltIn.py:1331: in get_variable_value
return self._variables[self._get_var_name(name)]
/opt/ute/python/lib/python2.7/site-packages/robot/libraries/BuiltIn.py:75: in _variables
return self._namespace.variables
/opt/ute/python/lib/python2.7/site-packages/robot/libraries/BuiltIn.py:71: in _namespace
return self._get_context().namespace
/opt/ute/python/lib/python2.7/site-packages/robot/libraries/BuiltIn.py:66: in _get_context
raise RobotNotRunningError('Cannot access execution context')
E RobotNotRunningError: Cannot access execution context
How can I mock this? Is this possible at all?
Sure, the issue is just that you can't mock the BuiltIn class where it is used (in file.py). You have to mock the class where it is declared (in robot.libraries.BuiltIn).
Using mocks:
from unittest.mock import patch

def test_default_iphy_tti_trace_dir():
    with patch('robot.libraries.BuiltIn.BuiltIn.get_variable_value', return_value='/foo/bar'):
        import file
        assert file.DEFAULT_IPHY_TTI_TRACE_DIR == '/foo/bar/iphy_tti_trace'
Using monkeypatch fixture:
def test_default_iphy_tti_trace_dir(monkeypatch):
    def mocked_get(self, name):
        return '/foo/bar'
    monkeypatch.setattr('robot.libraries.BuiltIn.BuiltIn.get_variable_value', mocked_get)
    import file
    assert file.DEFAULT_IPHY_TTI_TRACE_DIR == '/foo/bar/iphy_tti_trace'
Also note that the mocking is done for the scope of a single test only, so you can't import file at the top of the test module: BuiltIn would be unpatched there, raising the context error.
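The principle can be demonstrated without Robot Framework installed. Here provider and consumer are hypothetical stand-ins for robot.libraries.BuiltIn and file.py, assembled inline so the sketch runs anywhere; the point is that you patch the method where it is defined and only then (re-)import the consuming module:

```python
import sys
import types
from unittest.mock import patch

# 'provider' plays the role of robot.libraries.BuiltIn: outside a real run,
# calling it raises, just like RobotNotRunningError.
provider = types.ModuleType('provider')
exec("def get_value(name):\n"
     "    raise RuntimeError('Cannot access execution context')",
     provider.__dict__)
sys.modules['provider'] = provider

# 'consumer' plays the role of file.py: it reads the value at import time.
CONSUMER_SRC = ("import provider\n"
                "TRACE_DIR = provider.get_value('${OUTPUT_DIR}') + '/iphy_tti_trace'")

def import_consumer():
    sys.modules.pop('consumer', None)  # force the module-level code to re-run
    consumer = types.ModuleType('consumer')
    exec(CONSUMER_SRC, consumer.__dict__)
    sys.modules['consumer'] = consumer
    return consumer

def test_trace_dir():
    # Patch where the function is defined, then import under the patch.
    with patch('provider.get_value', return_value='/foo/bar'):
        consumer = import_consumer()
    assert consumer.TRACE_DIR == '/foo/bar/iphy_tti_trace'
```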
I'm trying to write my first module in Ansible, which is essentially a wrapper around another module. Here is my module:
#!/usr/bin/python
import ansible.runner
import sys

def main():
    module.exit_json(changed=False)

from ansible.module_utils.basic import *
main()
and here is the error it gives me (stripped from 'msg'):
ImportError: No module named ansible.runner
I am on ubuntu and installed ansible with aptitude, version is 1.9.1
Any ideas?
Modules have to essentially be standalone. The boilerplate gets injected at runtime (the text of the boilerplate replaces the import at the bottom), and the combined text of the module + boilerplate is squirted to the remote machine and run there. As such, you can't import things from ansible core like the runner (unless you install ansible on the remote machine- don't be that guy). "module" is one of the items that you have to create from stuff defined in the boilerplate. Here's a sample module skeleton I wrote:
#!/usr/bin/python
import json

def main():
    module = AnsibleModule(
        argument_spec=dict(
            state=dict(default='present', choices=['present', 'absent'])
        ),
        supports_check_mode=True
    )
    p = module.params
    changed = False
    state = p['state']
    if not module.check_mode:
        # do stuff
        pass
    # module.fail_json(msg='it broke')
    module.exit_json(changed=changed)

from ansible.module_utils.basic import *
main()
I just checked a module I wrote a while back and I don't have such an import line. The only import I have is from ansible.module_utils.basic import *. The module object I create myself in main:
module = AnsibleModule(
    argument_spec=dict(
        paramA=dict(required=True),
        paramB=dict(required=False),
        paramC=dict(required=False),
    ),
    add_file_common_args=True,
    supports_check_mode=True
)
Is there any way to make an implicit initializer for modules (not packages)?
Something like:
# file: mymodule.py
def __init__(val):
    global value
    value = val
And when you import it:
# file: mainmodule.py
import mymodule(5)
The import statement uses the builtin __import__ function, which gives you no way to pass arguments, so it's not possible to have a module __init__ that runs on import.
You'll have to call it yourself:
import mymodule
mymodule.__init__(5)
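Calling a dunder by hand is unusual, though; a common convention is an explicit initializer with an ordinary name. A minimal sketch of the pattern, with the hypothetical mymodule assembled inline only so the example is self-contained (normally its body would live in mymodule.py):

```python
import sys
import types

MYMODULE_SRC = """
value = None

def init(val):
    # explicit initializer that the importer calls once after importing
    global value
    value = val

def get_value():
    if value is None:
        raise RuntimeError('mymodule.init() was never called')
    return value
"""

mymodule = types.ModuleType('mymodule')
exec(MYMODULE_SRC, mymodule.__dict__)
sys.modules['mymodule'] = mymodule

# what mainmodule.py would do instead of the illegal `import mymodule(5)`:
import mymodule
mymodule.init(5)
assert mymodule.get_value() == 5
```

The guard in get_value turns "forgot to initialize" into an explicit error rather than silent use of a default.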
Questions like this often aren't closed as duplicates, so here's a really nice solution from Pass Variable On Import. TL;DR: use a config module, and configure that before importing your module.
[...] A cleaner way to do it which is very useful for multiple configuration
items in your project is to create a separate Configuration module
that is imported by your wrapping code first, and the items set at
runtime, before your functional module imports it. This pattern is
often used in other projects.
myconfig/__init__.py :
PATH_TO_R_SOURCE = '/default/R/source/path'
OTHER_CONFIG_ITEM = 'DEFAULT'
PI = 3.14
mymodule/__init__.py :
import myconfig

PATH_TO_R_SOURCE = myconfig.PATH_TO_R_SOURCE
robjects.r.source(PATH_TO_R_SOURCE, chdir=True)  # this takes time

class SomeClass:
    def __init__(self, aCurve):
        self._curve = aCurve

if getattr(myconfig, 'VERSION', None) is not None:
    version = myconfig.VERSION
else:
    version = "UNDEFINED"

two_pi = myconfig.PI * 2
And you can change the behaviour of your module at runtime from the
wrapper:
run.py :
import myconfig
myconfig.PATH_TO_R_SOURCE = 'actual/path/to/R/source'
myconfig.PI = 3.14159
# we can even add a new configuration item that isn't present in the original myconfig:
myconfig.VERSION="1.0"
import mymodule
print "Mymodule.two_pi = %r" % mymodule.two_pi
print "Mymodule.version is %s" % mymodule.version
Output:
> Mymodule.two_pi = 6.28318
> Mymodule.version is 1.0
I know that if I import a module by name (import moduleName), then I can reload it with reload(moduleName).
But, I am importing a bunch of modules with a Kleene star:
from proj import *
How can I reload them in this case?
I think there's a way to reload all Python modules. The code below is for Python 2.7. Instead of importing the math module with an asterisk, you can import whatever you need.
from math import *
from sys import *  # provides 'modules' (sys.modules)

# remember the names of everything imported so far, then forget them all
Alfa = modules.keys()
modules.clear()
for elem in Alfa:
    stmt = 'from ' + elem + ' import *'
    try:
        exec(stmt)
    except:
        pass
This is a complicated and confusing issue. The method below reloads the module and refreshes the variables in the given context. However, it will fall over if multiple modules use a starred import on the given module: the others will retain their original values instead of updating. In general, needing to reload a module at all is something you should avoid, except when working in a REPL; modules aren't meant to be dynamic. Consider other ways of providing the updates you need.
import sys

def reload_starred(module_name, context):
    # accept either a namespace dict or a module name for the context
    if not isinstance(context, dict):
        context = vars(sys.modules[context])
    module = sys.modules[module_name]
    # drop the module's old public names from the context
    for name in get_public_module_variables(module):
        try:
            del context[name]
        except KeyError:
            pass
    module = reload(module)
    context.update(get_public_module_variables(module))

def get_public_module_variables(module):
    if hasattr(module, "__all__"):
        return dict((k, v) for (k, v) in vars(module).items()
                    if k in module.__all__)
    else:
        return dict((k, v) for (k, v) in vars(module).items()
                    if not k.startswith("_"))
Usage:
reload_starred("my_module", __name__)
reload_starred("my_module", globals())
reload_starred("my_module", "another_module")

def function():
    from my_module import *
    ...
    reload_starred("my_module", locals())