I'm currently setting up a test suite for a file called main.py. The test file is called test_main.py. Here's an example:
# main.py
def add(a, b):
    return a + b
# test_main.py
import pytest
from main import *

def test_add():
    assert add(1, 2) == 3
For reasons which are outside the scope of this question, I would like to dynamically load the function add in test_main.py, as opposed to calling it directly. I'm already aware this is possible using the following:

- globals or vars
- use of importlib
- use of eval
However, I'd like to see if there's another option. globals and vars are bad practice. eval is all right, but it doesn't return the function object, and I have to do some string manipulation to get the function call, including its arguments, right. importlib is by far the best option, but main.py happens to contain functions which I want to import the "normal" way. It feels wrong to import functions in test_main.py using both an import statement and the importlib module.
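For reference, the importlib route I'm referring to looks roughly like this (a minimal sketch):

import importlib

main = importlib.import_module("main")  # load the module object by name at runtime
add = getattr(main, "add")              # look up the function object dynamically
assert add(1, 2) == 3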
So, is there a better way? One which is more Pythonic?
Related
I have a function that has a decorator. The decorator accepts arguments, and the value of those arguments is derived from another function call.
example.py
from cachetools import cached
from cachetools import TTLCache
from other import get_value

@cached(cache=TTLCache(maxsize=1, ttl=get_value('cache_ttl')))
def my_func():
    return 'result'
other.py
def get_value(key):
    data = {
        'cache_ttl': 10,
    }
    # Let's assume we also launch a shuttle into space here.
    return data[key]
I'd like to mock the call to get_value(). I'm using the following in my test:
example_test.py
import mock
import pytest

from example import my_func

@pytest.fixture
def mock_get_value():
    with mock.patch(
        "example.get_value",
        autospec=True,
    ) as _mock:
        yield _mock

def test_my_func(mock_get_value):
    assert my_func() == 'result'
Here I'm injecting mock_get_value into test_my_func. However, since my decorator is evaluated on the first import, get_value() gets called immediately. Any idea if there's a way to mock the call to get_value() before the module is imported, using pytest?
Move the from example import my_func inside the with in your test function. Also, patch get_value where it's really coming from: other.get_value. That may be all it takes.
Python caches modules in sys.modules, so module-level code (like function definitions) only runs on the first import from anywhere. If this isn't the first import, you can force a re-import either with importlib.reload() or by deleting the appropriate key in sys.modules and importing again.
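For example (a minimal sketch, using the example module from the question):

import importlib
import sys

import example

importlib.reload(example)          # option 1: re-execute example.py in place

sys.modules.pop("example", None)   # option 2: drop the cached module...
import example                     # ...so this import re-runs example.py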
Beware that re-importing a module may have side effects, and you may also want to re-import the module again after running the test to avoid interfering with other tests. If another module was using objects defined in the re-imported module, those objects don't just disappear, and may not be updated the way it expects. For example, re-importing a module may create a second instance of what was supposed to be a singleton.
A more robust approach would be to save the original imported module object somewhere else, delete it from sys.modules, re-import with the patch active for the duration of the test, and then put the original back into sys.modules after the test. You can do this with an import inside a patch.dict() context on sys.modules:
import mock
import sys

import pytest

@pytest.fixture
def mock_get_value():
    with mock.patch(
        "other.get_value",
        autospec=True,
    ) as _mock, mock.patch.dict("sys.modules"):
        sys.modules.pop("example", None)  # force the next import to re-run example.py
        yield _mock

def test_my_func(mock_get_value):
    from example import my_func  # the decorator now runs with get_value patched
    assert my_func() == 'result'
Another possibility is to call the decorator yourself in the test, on the original function. If the decorator used functools.wraps()/functools.update_wrapper(), then the original function should be available as a __wrapped__ attribute. This may not be available, depending on how the decorator was implemented.
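A sketch of that idea, assuming cachetools' cached() applies functools.update_wrapper() and therefore exposes __wrapped__:

from cachetools import cached, TTLCache
from example import my_func

def test_my_func_redecorated():
    # Re-apply the decorator to the unwrapped function with a TTL we control,
    # instead of the one computed from get_value() at import time.
    redecorated = cached(cache=TTLCache(maxsize=1, ttl=10))(my_func.__wrapped__)
    assert redecorated() == 'result'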
I think that this is quite a basic question, but I wasn't able to find anything. Sorry if this happens to be a duplicate.
I have a file with some functions defined, let's call this file main_functions.py.
In this file I rely on a function, which we can call foo(). For instance, in the file main_functions.py we can have something like this:
def bar():
    return foo()
foo() is defined in another file, called secondary_functions.py:
def foo():
    return 1
Now, in my main script, I would like to import a file where I can define foo(), and then do something like:
from secondary_functions import *  # Here I define foo()
from main_functions import *

bar()
If I do so, the function inside main_functions is not able to find the definitions that are present in secondary_functions, and I will get an error like:
NameError: name 'foo' is not defined
It is very important for me to solve this problem.
My aim is to be able to have different files called secondary_functions1.py, secondary_functions2.py, et cetera, each containing its own definition of foo().
To switch between them, I don't want to edit the file that depends on these definitions every time, for instance by inserting something like import secondary_functionsX each time; that would solve the problem, but I would like to change only the main script.
The foo name is imported in main.py. foo is not available in the main_functions.py module, because you have not imported it in that module. See Namespaces with Module Imports for more on why it works this way.
One way to do what you want is to supply foo as an argument to bar(). Example:
main.py:

from secondary_functions import foo
from main_functions import bar

bar(foo)

main_functions.py:

def bar(foo):
    return foo()

secondary_functions.py:

def foo():
    return 1
After the import statements, names like foo have become global variables in the scope of the main program. But global variables are not inherited by modules that get imported. Modules are supposed to be able to stand on their own as self-contained units that can be imported by any program; imagine what could go wrong if they started using variables from whatever imports them...
Thus, the only way to do this is to explicitly 'give' the values to your module, for instance as additional function arguments. You could put everything that's in main_functions.py in a class, and then have your main script pass the desired global variables as arguments to its __init__ constructor, so it can store them for use by bar() and other methods.
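A minimal sketch of that class-based approach (names are illustrative):

# main_functions.py
class MainFunctions:
    def __init__(self, foo):
        self.foo = foo  # store the injected dependency

    def bar(self):
        return self.foo()

# main.py
from secondary_functions import foo
from main_functions import MainFunctions

funcs = MainFunctions(foo)  # hand the desired foo() to the class
print(funcs.bar())          # -> 1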
It seems that the problem isn't how you're importing the files; it's that you're not referencing foo correctly. If foo really were a global variable in main_functions, I don't see why it wouldn't work. The only way I can think of to solve this is to qualify the name, as in file.foo.

And if you're going to have multiple files defining a name like foo, then it's best not to make them globals via star imports, and to qualify them individually as I just showed you.

Another thing that could be the problem is if you are defining foo inside a function, which makes it a local variable of that function only.

And the last problem I can think of is if you're using these names in main_functions without defining them there and without importing the files into main_functions, i.e.

# in main_functions.py
import secondary_functions

Without that, I don't think main_functions will be able to find the function and variable, short of making them arguments to the function you're using them in. Or, again, you can do something like file.foo.
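For completeness, a sketch of that qualified-access variant (note it hard-codes the module name inside main_functions.py, which the asker wanted to avoid):

# main_functions.py
import secondary_functions

def bar():
    return secondary_functions.foo()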
I am sorry if the question title is vague; I could not think of a better one.
I have a bunch of functions inside a module which I wish behaved differently when called locally versus when called from other modules.
Here is a toy example
moduleA.py
def func(arg1):
    # do something
    pass
moduleB.py
import moduleA

moduleA.func(arg1)
In moduleB, the call to func() needs to do:

initSomething()
moduleA.func(arg1)
doSomethingElse()
And when func() is called from moduleA, I still need the original behavior.
While the problem screams decorators at me, I am not sure how to write a decorator for func() that is triggered only when it is called from another module.
Sounds like you want to give the function calls a certain context. That's what context managers are for. You could do something like:
from contextlib import contextmanager

@contextmanager
def func_context():
    # init_something
    yield
    # do_something_else

with func_context():
    func(arg1)
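Since the question mentions decorators: a context manager produced by @contextmanager also works as a decorator (it inherits from contextlib.ContextDecorator), so every call to the wrapped function runs inside the context. A sketch, reusing func_context() from above; call_func is a hypothetical wrapper in moduleB:

import moduleA

@func_context()
def call_func(arg1):
    # init_something runs before moduleA.func, do_something_else after, on every call
    return moduleA.func(arg1)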
I have a module with the name my_module.py
Inside of this module there is a function myFunction():

def myFunction():
    print(my_variable)
Apparently, when this function is called it prints my_variable, which is not instantiated anywhere yet. So, calling myFunction() from inside the module itself will crash the execution.
Now, aside from my_module.py, I have another script with the name my_app.py residing in the same folder.
Inside of my_app.py I am importing my_module.py and instantiating my_variable under its namespace. After my_variable is instantiated, I call my_module.myFunction(), which picks up my_variable and prints its contents out:
import my_module

my_module.my_variable = 'this variable is instantiated inside of another script'
my_module.myFunction()
While this approach works, I wonder if it is designed properly. Is there another way to instantiate a variable outside the imported module to be used by this imported module?
import my_module

my_module.my_variable = 'this variable is instantiated inside of another script'
my_module.myFunction()

While this approach works, I wonder if it is designed properly.
No, this is not designed properly. One proper way is to pass the value to the function explicitly.
Is there another way to instantiate a variable outside the imported module to be used by this imported module?
Just have another module where you declare the variable(s). For example, my_vars.py:
my_variable = 'this variable is instantiated inside of another script'
Then in my_module.py:
import my_vars

def myFunction():
    print(my_vars.my_variable)
I'm not sure what you're trying to achieve, but it's generally best practice not to mutate "global" variables. Every time you want to use myFunction() in your code, you have to explicitly change my_variable first, which can trigger side effects in your code if other functions/methods depend on it. The best way would be to rewrite myFunction() so that it accepts my_variable as an argument.
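A sketch of that rewrite:

# my_module.py
def myFunction(my_variable):
    print(my_variable)

# my_app.py
import my_module

my_module.myFunction('passed in explicitly at the call site')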
I have created my own module X. At the beginning, I import functions from some other modules (e.g. from math import func). I have noticed that when I create documentation with:
pydoc -w X
the resulting file also contains the imported function func from the math module, which is undesirable (especially if I import many functions from several modules, which is what I do).
It can be avoided by instead using:

import math

but in this case I have to import the whole module and then call the functions as math.func instead of just func.
Is there another way to avoid populating my documentation with imported functions while still being able to import them with from?
Looking inside the source for pydoc, you can see the following comment:
if all is not None:
    # only document that which the programmer exported in __all__
    return name in all
meaning that pydoc will look for the __all__ module attribute, and, if defined, will only document the functions defined in it.
So, for your module X, you can define the functions to be exported in __all__ by specifying their names. Only those will get documented in the corresponding Functions section:
__all__ = ['myfunc1', 'myfunc2', ..., 'myfuncN']
Case in point:
Without __all__, the following simple file named mod.py:
from math import cos

def myfunc():
    """ documentation"""
    pass
Generates a mod.html file that contains documentation both for the user-defined myfunc() and for the imported built-in function cos().
By adding __all__ and specifying the function name(s) you'd want to export inside it:
__all__ = ['myfunc']  # visible names

from math import cos

def myfunc():
    """ documentation"""
    pass
You'll 'filter out' the cos() function and only have documentation for myfunc().
Note: __all__ can contain both function names and variable names used inside your script. pydoc will discriminate between these and segregate them into two different groups:

- functions in Functions
- variables in Data