How do I import from a subdirectory? (Python)

This is a simplification of a problem that exists in a complex project.
In a folder called root_test, I have a folder called test containing tester.py. Inside test there is also a modules directory, which contains a lib directory holding logger.py. The directory structure is below.
|-root_test
|---test/
|-----tester.py
|-----__init__.py
|-----modules/
|-------__init__.py
|-------lib/
|---------__init__.py
|---------logger.py
alternatively, run from root_test:
$ ls
test
$ ls test/
__init__.py modules tester.py
$ ls test/modules/
__init__.py lib
$ ls test/modules/lib/
__init__.py logger.py
tester.py is as follows:
#!/usr/bin/env python
import sys, os
# allow running without installing
sys.path.append(os.path.join(os.path.dirname(os.path.realpath(__file__)), '..'))
import test.modules.lib.logger
but when I try and run it from root_test dir, I get the following error:
$ python test/tester.py
Traceback (most recent call last):
File "test/tester.py", line 8, in <module>
import test.modules.lib.logger
ImportError: cannot import name logger
This doesn't happen on my other laptop, and their $PYTHONPATHs are identical:
$ echo $PYTHONPATH
/usr/local/lib/python2.7/dist-packages/:/usr/local/lib/python2.7/site-packages/:

The solution was that a module named tester was already installed on the system. Even though I was explicitly running python ~/test/tester.py, Python still picked up the installed module instead. Removing that module fixed the problem.
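A quick way to diagnose this kind of shadowing is to ask Python where a name would resolve to before importing it. A minimal sketch, using json as a stand-in for the shadowed name (tester in this case):

```python
import importlib.util

# Ask the import system which file would satisfy an import, without
# actually importing it. Substitute the shadowed name for "json".
spec = importlib.util.find_spec("json")
print(spec.origin)  # the file that `import json` would actually load
```

If the printed path is not the file you expect, something earlier on sys.path is shadowing your module.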

Related

ModuleNotFoundError: No module named {directory}

When I call /Users/rgupta75/github/ece-prm/.venv/bin/python -s /Users/rgupta75/github/ece-prm/deploy/hcc_k8_deploy.py stage I receive an error:
ModuleNotFoundError: No module named 'ece_prm_agent' (raised at line 4 of the script)
This is my code snippet
import sys, os, requests, subprocess, json
from pathlib import Path # requires python 3.4+
os.environ["env"] = "local"
from ece_prm_agent.utils.cyberark.cyberark import cyberark
and below was an image of my folder structure (reproduced as a tree in the answer below).
Surprisingly, when run from IDE it works fine but not from terminal.
That error means that the module ece_prm_agent wasn't found in sys.path. It probably works from your IDE because the IDE runs the script with the current directory set to ece-prm itself, which, as documented, is one of the locations Python searches:
When a module named spam is imported, the interpreter first searches for a built-in module with that name. If not found, it then searches for a file named spam.py in a list of directories given by the variable sys.path. sys.path is initialized from these locations:
The directory containing the input script (or the current directory when no file is specified).
PYTHONPATH (a list of directory names, with the same syntax as the shell variable PATH).
The installation-dependent default.
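Those three sources can be inspected directly for any given invocation; a minimal sketch:

```python
import sys

# The effective module search path for this invocation; sys.path[0] is
# the directory of the script being run (or '' / the cwd for -c and
# interactive sessions), followed by PYTHONPATH entries and the
# installation defaults.
for entry in sys.path:
    print(entry)
```

Running this from both the IDE and the terminal usually makes the difference obvious.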
Assuming this is your file tree
$ tree
.
├── deploy
│   └── hcc_k8_deploy.py # Let's say this just calls cyberark.py::cyberark()
└── ece_prm_agent
    └── utils
        └── cyberark
            └── cyberark.py # Let's say this just prints "Success"
Running it from the directory itself would run successfully
$ pwd
/home/nponcian/Documents/ece-prm
$ python3 deploy/hcc_k8_deploy.py
Success!
While running it outside would fail
$ pwd
/home/nponcian
$ python3 /home/nponcian/Documents/ece-prm/deploy/hcc_k8_deploy.py
Traceback (most recent call last):
File "/home/nponcian/Documents/ece-prm/deploy/hcc_k8_deploy.py", line 1, in <module>
from ece_prm_agent.utils.cyberark.cyberark import cyberark
ModuleNotFoundError: No module named 'ece_prm_agent'
As stated in the docs above, you should set PYTHONPATH so that Python can find the imported modules relative to ece-prm:
$ export PYTHONPATH=/home/nponcian/Documents/ece-prm/
$ python3 /home/nponcian/Documents/ece-prm/deploy/hcc_k8_deploy.py
Success!
Or if you want to do it within the Python file ece-prm/deploy/hcc_k8_deploy.py
from pathlib import Path
import sys

# add the project root (ece-prm), the parent of this file's directory
sys.path.append(str(Path(__file__).parent.parent))
...

Running coverage with unittest from a separate folder gives import errors

I'm trying to run coverage with the unittest module on my python project. The project has the following structure:
project/
    src/
        __init__.py
        foo1.py
        foo2.py
    tests/
        __init__.py
        data.py
        test_foo1.py
        test_foo2.py
    .venv/
.venv is the virtual environment where the project dependencies are installed as well as the coverage module.
The file "data.py" has some of the datasets used in the testing and is imported into the test file, as well as the module being tested. These imports are done using the following code:
#test_foo1.py
import unittest
from .data import *
from src.foo1 import ClassFoo #Code to be tested
Just running the tests works fine with the following command (ubuntu terminal):
~/project $ sudo .venv/bin/python -m unittest tests/test_foo1.py
The problem is when I try run coverage with the command
~/project $ sudo .venv/bin/coverage run tests/test_foo1.py
It starts giving me some import errors such as:
Traceback (most recent call last):
File "tests/test_foo1.py", line 3, in <module>
from .data import *
ImportError: attempted relative import with no known parent package
When I take the '.' out from .data (which is how it works with just unittest) it seems to fix this problem and gives the following problem:
Traceback (most recent call last):
File "tests/test_foo1.py", line 4, in <module>
from src.foo1 import ClassFoo
ModuleNotFoundError: No module named 'src'
Can someone explain why these exceptions occur with coverage but not with unittest, and how I can make these tests run normally with both? Do I have to change the import syntax when using coverage?
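The difference comes down to how the file is executed: `python -m unittest tests/test_foo1.py` imports the file as the module tests.test_foo1 with the project root on sys.path, while `coverage run tests/test_foo1.py` executes it as a top-level script, so it has no parent package and sys.path[0] is tests/ rather than the project root. A minimal sketch of that distinction, using a throwaway package (all names are hypothetical stand-ins for the real project):

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as root:
    # Build tests/__init__.py, tests/data.py, tests/t.py
    pkg = os.path.join(root, "tests")
    os.makedirs(pkg)
    open(os.path.join(pkg, "__init__.py"), "w").close()
    with open(os.path.join(pkg, "data.py"), "w") as f:
        f.write("VALUE = 42\n")
    with open(os.path.join(pkg, "t.py"), "w") as f:
        f.write("from .data import VALUE\nprint(VALUE)\n")

    # By path: t.py runs as the top-level script, so `.data` has no
    # parent package and the relative import fails.
    by_path = subprocess.run(
        [sys.executable, os.path.join(pkg, "t.py")],
        capture_output=True, text=True)

    # As a module from the project root: the package context is kept.
    as_module = subprocess.run(
        [sys.executable, "-m", "tests.t"],
        capture_output=True, text=True, cwd=root)

print("by path:", by_path.returncode)          # non-zero: import failed
print("as module:", as_module.stdout.strip())  # 42
```

The usual fix is to give coverage the same module-based invocation, run from the project root: `coverage run -m unittest tests.test_foo1` (or `coverage run -m unittest discover`), which preserves the package context for both tools without changing the import syntax.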

How to split python script into several files using relative imports?

I have import.py script. I want to extract some code into a separate file, say, m1.py:
$ ls
import.py m1.py
$ cat import.py
from .m1 import a
a()
$ cat m1.py
def a():
    print('it works')
$ python import.py
Traceback (most recent call last):
File "import.py", line 1, in <module>
from .m1 import a
ModuleNotFoundError: No module named '__main__.m1'; '__main__' is not a package
When I switch to absolute import, it works. But I don't want accidentally importing other module. I want to be sure module from script's directory is imported. How do I make it work? Or what am I doing wrong?
Assuming you're not shadowing a built-in module: by default, Python looks first in the script's own directory for the file name you want to import. So if there is another script with the same name in another directory, the one sitting next to your script is the one that will be imported.
You can therefore use an absolute import:
from m1 import a
a()
You can check this post out for more information about importing in Python.
To make sure that the module you're importing isn't a built-in, you can create your own package in the current directory, for example my_package, and move your module m1 into it. Then you can import it with:
from my_package import m1
m1.a()
Add an __init__.py in the directory where m1.py is.
EDIT: run it as a package from the parent working directory: cd .. && python -m prev_dir.import
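If the worry is that import m1 might pick up a same-named module from somewhere else, you can pin a specific directory to the front of sys.path and then verify which file actually won via __file__. A minimal sketch, with a throwaway directory standing in for the script's directory:

```python
import importlib
import os
import sys
import tempfile

# A stand-in for the script's directory, containing m1.py.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "m1.py"), "w") as f:
    f.write("def a():\n    return 'it works'\n")

# Putting the directory first makes it win any name collision.
sys.path.insert(0, tmp)
m1 = importlib.import_module("m1")

print(m1.__file__)  # shows exactly which m1.py was loaded
print(m1.a())
```

In a real script you would insert os.path.dirname(os.path.abspath(__file__)) instead of a temporary directory; the __file__ check is the part that proves the right copy was imported.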

Running python unittest in the console

I have the following package structure
my-base-project
-> package1
     __init__.py
     MyScript.py
-> test
     __init__.py
     TestMyScript.py
I'd like to run TestMyScript.py from the console, so I cd into my-base-project/test and execute python TestMyScript.py. However, I'm getting the error:
user@computer:~/my-base-project/test$ python TestMyScript.py
Traceback (most recent call last):
File "TestMyScript.py", line 4, in <module>
from package1 import MyScript
ImportError: No module named package1
How do I run these tests?
From this SO question, consider adding the directory you need to the PYTHONPATH:
import sys
sys.path.append('your certain directory')
Maybe you want to add the parent directory
sys.path.append('..')
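Note that '..' is resolved against the current working directory, so that append only helps when the tests are launched from my-base-project/test. Anchoring the path on the test file itself makes it work from anywhere; a minimal sketch (the helper name and example path are hypothetical):

```python
import os
import sys

def add_parent_to_path(test_file):
    """Append the parent of test_file's directory to sys.path,
    resolved to an absolute path so it works from any cwd."""
    here = os.path.dirname(os.path.abspath(test_file))
    parent = os.path.abspath(os.path.join(here, os.pardir))
    if parent not in sys.path:
        sys.path.append(parent)
    return parent

# In TestMyScript.py you would call: add_parent_to_path(__file__)
parent = add_parent_to_path("my-base-project/test/TestMyScript.py")
print(parent)  # absolute path ending in my-base-project
```

With my-base-project on sys.path, `from package1 import MyScript` resolves regardless of where the tests are run from. Alternatively, run them from my-base-project as `python -m unittest discover`.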

How to import your package/modules from a script in bin folder in python

When organising python project, this structure seems to be a standard way of doing it:
myproject\
    bin\
        myscript
    mypackage\
        __init__.py
        core.py
    tests\
        __init__.py
        mypackage_tests.py
    setup.py
My question is, how do I import my core.py so I can use it in myscript?
both __init__.py files are empty.
Content of myscript:
#!/usr/bin/env python
from mypackage import core

if __name__ == '__main__':
    core.main()
Content of core.py
def main():
    print 'hello'
When I run myscript from inside myproject directory, I get the following error:
Traceback (most recent call last):
File "bin/myscript", line 2, in <module>
from mypackage import core
ImportError: No module named mypackage
What am I missing?
Usually, setup.py should install the package in a place where the Python interpreter can find it, so after installation import mypackage will work. To facilitate running the scripts in bin right from the development tree, I'd usually simply add a symlink to ../mypackage/ in the bin directory. Of course, this requires a filesystem supporting symlinks…
I'm not sure there is a single "best choice", but the following is my normal practice:
Put whatever script I want to run in bin/
Run "python -m bin.script" from the myproject directory
When importing in script.py, treat the directory in which script.py sits as the root, so:
from ..mypackage import core
If the system supports symlinks, that's the better choice.
I usually add my bin path to $PYTHONPATH; that enables Python to look for the requested module in the bin directory too.
export PYTHONPATH=/home/username/bin:$PYTHONPATH
$ python
import module_from_bin
I solved the issue by following the setuptools specifications.
In setup.py you can specify the packages as an argument to the setup() function:
packages = find_packages()
This finds all packages.
P.S. you have to import this function: from setuptools import setup, find_packages
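For reference, find_packages() walks a directory and returns every package it finds, i.e. every directory containing an __init__.py in the classic layout. A minimal sketch with a throwaway tree:

```python
import os
import tempfile

from setuptools import find_packages

# Build a tiny project tree: one real package, one plain directory.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "mypackage"))
open(os.path.join(root, "mypackage", "__init__.py"), "w").close()
os.makedirs(os.path.join(root, "not_a_package"))  # no __init__.py

print(find_packages(where=root))  # ['mypackage']
```

In setup.py you would then pass the result as packages=find_packages() inside the setup() call, so the package list stays correct as the project grows.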