I am working on developing unit tests for a project that has already been completed, but I am having a hard time running my unit tests without modifying the original code. The module I am trying to test has other dependencies in the same folder that will not import when the unit tests are run. Here is what my directory looks like:
root
|--main_folder
|  |--module1.py
|  |--module2.py
|--tests
|  |--test_module1.py
The original code in module1.py successfully imports module2.py on its own like this: from module2 import Practices where Practices is a function from module2.
The issue I am running into is that in order to run test_module1.py (which I am doing by calling python3 -m unittest from the root directory), I have to modify module1.py itself such that it says: from main_folder.module2 import Practices.
If I run the test file without modifying module1.py, I get the error ModuleNotFoundError: No module named 'module2'.
Ideally I should not have to modify the code in this way, and I am trying to find a way to make my tests work without touching the application itself. How should I go about this? module1.py runs normally when I run the application without modifying the file, but modifying it so that the tests work breaks the main application. What can I do to make my tests independent of the main application's code?
(For some more background, the test_module1.py file works by calling from main_folder.module1 import fun1 where fun1 is the function I am trying to test)
Try running your tests using one of the following commands (replacing the paths with your actual paths).
If your tests import the modules as "from main_folder import ...":
env PYTHONPATH=/root python3 -m unittest
Or, if your tests import directly with "import module1":
env PYTHONPATH=/root/main_folder python3 -m unittest
As a side note, you might need an existing
main_folder/__init__.py
file to get main_folder recognized as a package, depending on the Python version you're using. If you don't currently have such a file, try creating it (empty, no need to put any code inside) and check whether the issue persists.
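For the exact layout above, note that test_module1.py imports through the package (from main_folder.module1 import fun1) while module1.py itself does a bare from module2 import Practices, so both directories may need to be importable at the same time. A minimal sketch, assuming the project really does live at /root and fun1 is the function named in the question:

env PYTHONPATH=/root:/root/main_folder python3 -m unittest

# tests/test_module1.py -- minimal sketch
import unittest
from main_folder.module1 import fun1

class TestFun1(unittest.TestCase):
    def test_fun1_is_callable(self):
        self.assertTrue(callable(fun1))

if __name__ == "__main__":
    unittest.main()

PYTHONPATH accepts several directories separated by : (on Linux/macOS), so this combines both commands above without touching module1.py.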
Related
I've got a project running on a server with the structure
proj
    __init__.py
    module_a.py
    module_b.py
    main.py
And in the header of main.py, I import from other modules with the format
from .module_a import func1
from .module_b import func2
This runs fine on the server, but when I'm testing things on my local machine it raises the error:
ModuleNotFoundError: No module named '__main__.module_a'; '__main__' is not a package
There have been a lot of questions asked regarding this error and the accepted solution is almost always to replace the import statement with
from proj.module_a import func1
Is there something I can do to configure my local environment to allow this type of syntax without having a completely different set of import statements depending on whether the code is running locally or remotely?
Keep your imports relative, without using the package's full path, so that you keep the flexibility of renaming the package as you wish, as in
from .module_a import func1
Then in your local environment, change your current dir to the proj parent folder and run:
python -m proj.main
An alternative would be to rename main.py to __main__.py and then just writing
python -m proj
will do. But that may affect the behaviour on the server if you copy the files as is.
Packages are meant to be imported. This is a common problem when we start running arbitrary scripts located inside a package (in this case main.py). If the package is simply imported from outside, everything works.
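As a sketch of the __main__.py alternative (assuming func1 and func2 are really the entry points you want to call when the package is run):

# proj/__main__.py -- executed when you run "python -m proj"
from .module_a import func1
from .module_b import func2

if __name__ == "__main__":
    func1()
    func2()

Run it from the directory that contains proj/, so that the package itself is importable.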
I have a project I am working on, let's call it Project, which lives in a directory Project somewhere wholly unknown to me (really it lives both on my local system and on a couple of Docker build systems). In that project, I have some source files, source/module1.py and source/module2.py. I also have some example files, some test files, and an __init__.py. So my directory looks something like this:
Project
    __init__.py
    /source
        module1.py
        module2.py
    /test
        testRunner.py
    /examples
        awesomeExample.py
However, module1 needs some stuff from module2. My naive self thought this could be done by putting an import statement in module1:
import module2
# Do some other interesting stuff
And this works, but only when I am running or importing module1 from the source directory. If I am, for example, running some unit tests from test/testRunner.py, whether from the test directory or from the main Project directory, the import will fail. The same goes for running an example in the examples directory.
So here is my problem: in general, I don't know where the calling script lives. It might be in the examples directory, it might be in the test directory, or it might be in the main Project directory (for example, when trying to import things via the __init__.py). How do I ensure that module1 can always import module2 in each of these scenarios?
I am not looking for a solution like "add all those directories to your python path". Initially I just added Project to my python path on my local machine, and then did all my imports relative to that (import Project.source.module2), but this (predictably) caused my builds to fail on the Docker instances. I don't just want this to work on my local machine, but also on the Docker instances I'm using to build and test this software, and on any user's machine that subsequently installs it (i.e. by doing pip install Project). What is the most robust way to make sure this dependency is satisfied? How can I make sure module1 can import module2 regardless of where module1 itself is imported from? Any Python 3.x solution is welcome.
I figured out a way to do it (credit here) - it's a little inelegant, but extremely robust. It works on my local machine whether module1 is imported or a script is run directly, as well as on my build servers, which use GitHub Actions and Travis CI.
Basically, I added a file in the source directory, called context.py with the following contents:
import os
import sys
fileLocation = os.path.dirname(os.path.abspath(__file__))
sourceLocation = os.path.abspath(os.path.join(fileLocation, '..', 'source/'))
sys.path.insert(0, sourceLocation)
This finds the location of the file currently being executed by Python, and then uses it to add the source directory to the Python path. Then in my module1.py file, at the top, I have:
import context
import module2
Now, whenever module1 is imported, it successfully imports module2. More elegant answers or comments on why this works and in which cases it might fail are appreciated.
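One observation on the snippet: since context.py already lives in source, the '..' followed by 'source/' round trip resolves right back to its own directory, so a slightly simpler and more defensive variant could look like this (a sketch only):

# source/context.py -- same idea, but avoids growing sys.path on repeated imports
import os
import sys

_source_dir = os.path.dirname(os.path.abspath(__file__))
if _source_dir not in sys.path:
    sys.path.insert(0, _source_dir)

As for when it might fail: if module1 is imported as part of a package (say import Project.source.module1) while source/ is not already on sys.path, the bare import context line itself will raise before it gets a chance to fix the path.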
I have a directory structure like what follows:
package_root/lib/package_name/foo.py
In foo.py I have a function that creates a file (bar.py) containing a function (f). foo.py also has a function that then imports bar and runs bar.f().
When I write "import bar" within foo.py, it works: I have access to bar.f and it runs just fine.
However, this is not the case when running pytest. We run pytest from package_root, and it cannot find the module bar when it attempts to import it. I believe this is because, when running pytest, bar.py gets created in /package_root, which contains no __init__.py file. Since our tests run automatically in our CI/CD pipeline, I need the import to work properly when running pytest from package_root. Any suggestions?
As far as I understand from your question, you are facing an import issue in your pipeline (correct me if I am wrong). With pytest, your framework should normally contain a pytest.ini/tox.ini alongside all the test scripts; please refer to http://doc.pytest.org/en/latest/customize.html.
Create a pytest.ini/tox.ini file in the directory from which you run your code (in your case the package_root/ directory).
#pytest.ini
[pytest]
python_paths = . lib/<package-name>/
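As far as I know, python_paths is provided by the pytest-pythonpath plugin rather than by pytest itself; if you are on pytest 7.0 or newer, the built-in equivalent is spelled pythonpath, e.g. (sketch):

#pytest.ini
[pytest]
pythonpath = . lib/<package-name>/

Either way, the entries are resolved relative to the rootdir where the ini file lives (package_root/ in your case), which is what lets import bar succeed.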
I'm setting up some code for unittesting. My directory currently looks like this:
project/
    src/
        __init__.py
        sources.py
    test/
        __init__.py
        sources_test.py
In __init__.py for the test directory, I have these two lines:
import sys
sys.path.insert(0, '../')
In the test files, I have the line import src.sources.
When I use nose to run these tests from the project directory, everything works just fine. If I try to run the tests individually it gives me this error:
ImportError: No module named src.sources
I assume that this is because when I run the test from the command line it isn't using __init__.py. Is there a way I can make sure that it will use those lines even when I try to run the tests individually?
I could take the lines out of __init__.py and put them into my test files, but I'm trying to avoid doing that.
To run the tests individually I am running python sources_test.py
You're really trying to abuse packages here, and that isn't a good idea.
The simple solution is to not run the tests from within the test directory. Just cd up a level, then do python test/sources_test.py.
Of course that in itself isn't going to import test/__init__.py. For that, you really need to import the package. So python -m test.sources_test is probably a better idea… except, of course, that if your package is made to be run as a script but not to be imported, that won't work.
Alternatively, you could (on POSIX platforms, at least) do PYTHONPATH=.. python sources_test.py from within test. This is a bit hacky, but it should work.
Or, better, combine the above, and, from outside of test, do PYTHONPATH=. python test/sources_test.py.
A really hacky workaround is to explicitly import __init__. This should basically work for your simple use case, but everything ends up wrong: in particular, you end up with a module named __init__ instead of one named test, your main module isn't named test.sources_test, and in fact there is no test package at all. And if you accidentally re-import anything after modifying sys.path, you may end up with duplicates of the modules.
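To make those options concrete for the project/ layout above (a sketch):

# from the project/ directory:
python -m test.sources_test                # runs it as a module; test/__init__.py gets imported
PYTHONPATH=. python test/sources_test.py   # or keep running it as a script

# from inside test/:
PYTHONPATH=.. python sources_test.py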
If you write
import src.sources
the Python interpreter looks in the src directory for an __init__.py file. If it exists, you can use the directory as a package name. If you are not in your project directory, which is the case when you are in the src directory, then Python looks through the directories listed in the $PYTHONPATH environment variable (Windows uses the same variable name) for a directory src containing an __init__.py file.
Did you set your $PYTHONPATH?
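For completeness, setting the variable looks something like this (a sketch; substitute your real paths):

# Linux/macOS, from the project/ directory
export PYTHONPATH="$(pwd)"
python test/sources_test.py

:: Windows (cmd.exe)
set PYTHONPATH=C:\path\to\project
python test\sources_test.py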
I'm starting a project in Python, and the code structure is currently as below:
project/
    __init__.py
    a.py
    b.py
    mainA.py
    utilities/
        __init__.py
        mainB.py
        c.py
The __init__ files are all blank.
I want to run utilities/mainB.py as a program (using something like python mainB.py), and mainB needs to import a.py and b.py. So I tried from .. import a and some other approaches, but the import failed. The error information is:
ValueError: Attempted relative import in non-package
So here are my questions:
1. How do I fix mainB.py so it can be run as a main program? (mainA.py can already be run as a main program, and it also imports a.py and b.py, using import a and import b.)
2. I think the code structure may become more complex. Say, if mainA.py has to import a module from project/some/directory, how can I do that?
See this previous question. You have two options. One is to use the __package__ attribute as described in PEP 366 to set the relative name of your modules. The other is to execute your scripts as modules (using the -m flag to the interpreter) instead of running them directly as scripts.
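For reference, the __package__ route usually ends up as boilerplate along these lines at the top of utilities/mainB.py; treat it as a sketch of the PEP 366 pattern, not something tested against your exact tree:

# utilities/mainB.py -- PEP 366-style boilerplate (sketch)
if __name__ == "__main__" and not __package__:
    import os, sys
    # make the directory that contains "project" importable
    sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..")))
    __package__ = "project.utilities"

from .. import a, b

Most people find the -m approach described below cleaner.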
You could use Python's built-in module-running functionality (python -m <module>).
python -m project.utilities.mainB
This allows you to write mainB normally as part of the package, so relative and absolute imports will both work correctly.
For an in-depth discussion of this functionality, see PEP-338.
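With that invocation the imports inside mainB.py can stay in either form (a sketch, using the module names from the question):

# project/utilities/mainB.py -- started with "python -m project.utilities.mainB"
from .. import a, b           # relative form
# from project import a, b    # or the absolute form; the parent of "project" is on sys.path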
You should add the parent directory of 'project' to PYTHONPATH and then, in mainB.py:
from project import a
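A quick sketch of that approach, run from the directory that contains project/ (substitute your real path):

export PYTHONPATH="$(pwd)"
python project/utilities/mainB.py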