I'm a beginner and I'm not understanding something in the project folder structure.
I have this project:
.
└── convertersProject/
    ├── conftest.py ----------------> best practice if you're not using a src structure
    ├── converters/ ----------------> module I'm trying to import
    │   ├── __init__.py
    │   ├── binconverter.py
    │   ├── importester1.py --------> import binconverter OR from converters import *
    │   └── submodule/
    │       ├── __init__.py --------> not needed if I don't want to use it as a package
    │       └── importester2.py ----> from converters import binconverter
    ├── outsidemodule/
    │   ├── importester3.py
    │   └── __init__.py
    └── test/
        └── test_converters.py
I always receive ModuleNotFoundError when trying to execute importester1/2/3.py directly from the project folder. I'm using a virtual environment created with python -m venv 'name' from a pyenv Python 3.8.5 selected with pyenv shell 3.8.5.
What I think I'm understanding:
I have to use absolute paths, as in from converters.binconverter import bin2dec (bin2dec being a function in binconverter). If I wanted to use relative imports, I would have to be inside the folder tree, as when executing importester2.py, because submodule is inside converters. So I couldn't use relative imports for outsidemodule.
PYTHONPATH is the current folder you're executing from, so if I'm executing from the project folder as python converters/submodule/importester2.py, I shouldn't have to append any value to the virtual environment's PYTHONPATH (indeed, I've read that appending to it is not good practice).
__init__.py allows you to use a module without appending values to the virtualenv's PYTHONPATH, so I could import the converters module into outsidemodule using absolute paths.
The tests are working using this logic. In fact, if I change something, the VSCode debugger detects import problems.
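For example, this is the kind of absolute import I'd expect to work everywhere (a hypothetical sketch of importester3.py, assuming bin2dec is defined in binconverter):
# outsidemodule/importester3.py -- hypothetical sketch
from converters.binconverter import bin2dec  # absolute import from the project root
print(bin2dec("1010"))  # should print 10, assuming bin2dec converts binary strings to decimal
Running python outsidemodule/importester3.py from the project folder is exactly what raises the ModuleNotFoundError.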
What the hell am I missing?
Related
I always have the same problem and I finally want to get rid of it. My folder structure looks like this
project
├── scripts
│   └── folder
│       └── file.py
└── submodules
    └── lab_devices
        └── optical_devices
            ├── __init__.py
            └── powermeter_driver.py
I now want to import powermeter_driver.py in file.py. So what I do in file.py is:
from submodules.lab_devices.optical_devices.powermeter_driver import PowermeterDriver
but this gives ModuleNotFoundError: No module named 'submodules'. I don't want to use
import sys
sys.path.insert(0, '../submodules')
Is there an easy workaround?
The imports will be resolved correctly if you run the script in the correct way, which is from the parent directory and using the -m switch. So you should cd into the parent folder and add __init__.py files as in:
project
├── scripts
│   ├── __init__.py
│   └── folder
│       ├── __init__.py
│       └── file.py
└── submodules
    ├── __init__.py
    └── lab_devices
        ├── __init__.py
        └── optical_devices
            ├── __init__.py
            └── powermeter_driver.py
so that Python knows these are packages, then run
python -m scripts.folder.file # note no .py
In file.py you can then use the absolute import as you are, because submodules will be detected as a package. You should indeed avoid hacking sys.path by all means.
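For reference, a minimal sketch of what file.py would contain under this layout (the constructor call is hypothetical):
# scripts/folder/file.py -- run as: python -m scripts.folder.file
from submodules.lab_devices.optical_devices.powermeter_driver import PowermeterDriver
driver = PowermeterDriver()  # hypothetical: the constructor arguments depend on your driver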
You need to consider that if you write from submodules... this is an absolute import. It means Python starts searching for submodules in all directories in sys.path. Python usually adds your current working directory as the first item of sys.path, so if you cd to your project directory and then run it as a module using python -m, it could work.
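If in doubt, a quick diagnostic (not part of the fix) is to print the search path at the top of file.py:
import sys
print(sys.path)  # when run with python -m from the project root, the root directory is the first entry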
Of course, absolute imports suck if your files live in a relative location to each other. I've had similar issues, so I've created an experimental, new import library, ultraimport, that allows you to do file-system-based imports. It could solve your issue if you are willing to add a new library for this.
Instead of:
from submodules.lab_devices.optical_devices.powermeter_driver import PowermeterDriver
In file.py you would then write:
import ultraimport
PowermeterDriver = ultraimport('__dir__/../../submodules/lab_devices/optical_devices/powermeter_driver.py', 'PowermeterDriver')
The file path is relative to file.py and thus this will always work, no matter how you run your code or what is in sys.path.
One caveat when importing scripts like this is if they contain further relative imports. ultraimport has a builtin preprocessor to rewrite subsequent relative imports so they continue to work.
ModuleNotFoundError running Python 3.8.x
I'm building a python package containing classes and functions each with verbose tests. Inside the package I'm trying to use these building blocks to provide end-to-end uses and examples by using absolute imports at the top of these files (like in my tests).
The project structure is as follows:
.
├── __init__.py
├── setup.py
├── examples
│   ├── __init__.py
│   └── end_to_end_1.py
├── tests
│   ├── __init__.py
│   └── utils
│       ├── __init__.py
│       ├── test_useful_one.py
│       └── test_useful_two.py
└── utils
    ├── __init__.py
    ├── useful_one.py
    └── useful_two.py
I'm running all tests from the package root using python -m unittest tests/**/*.py, and both test files contain absolute package imports from utils, like so:
from utils.useful_one import UsefulClass
This approach is successful in running the tests and importing the UsefulClass class into the test files.
My issue arises when trying to use the same import statement inside the examples/end_to_end_1.py module and executing the file (again from the root of the package) using
python examples/end_to_end_1.py
Now I get a runtime ModuleNotFoundError: No module named 'utils'.
In trying to follow the Python language guidelines, I'm using absolute imports where possible, but to no avail.
I have definitely misunderstood how the __init__.py files are supposed to tell the runtime where to resolve packages from. I don't think this use case is abnormal, since I see this same pattern inside node packages and ruby gems all the time.
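For what it's worth, the same import does resolve if I launch the example the way I launch the tests, as a module from the package root:
python -m examples.end_to_end_1 # works: the root directory lands on sys.path
python examples/end_to_end_1.py # fails: sys.path[0] is examples/, so utils is invisible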
Makeshift (Temporary) Solution
At the moment, to solve this I have applied the solution from Absolute import results in ModuleNotFoundError which, despite working, seems not too scalable once the repository is publicly available, and I feel like the Python ecosystem will have a solution for this issue. As Raymond Hettinger says, it feels like...
There must be a better way!
I am currently developing a package simultaneously with a few projects that use the package, and I'm struggling to figure out how to structure my directory tree and imports.
Ideally, I want something like this:
main_directory
├── shared_package
│   ├── __init__.py
│   ├── package_file1.py
│   └── package_file2.py
├── project1
│   ├── main.py
│   ├── module1.py
│   └── other_package
│       ├── __init__.py
│       └── other_package_file.py
└── project2
    └── ...
I can't figure out how to make the imports work cleanly for importing shared_package from python files in project1. Is there a preferred way to do this?
Any help would be appreciated!
shared_package will eventually be standalone. Other people will import and install it the normal way, and it'll be stored with the rest of the python modules in site-packages or wherever.
To replicate this, I recommend just updating your PYTHONPATH to point to main_directory (or wherever you put shared_package anyway) - that way,
import shared_package
will work just as it would if shared_package were installed normally, because it's on the Python path either way.
Note that PYTHONPATH is an environment variable, so the means of setting it varies by operating system. Regardless, a quick search for how to modify the variable permanently on your OS should turn up instructions.
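Once the variable is set, you can sanity-check it from any working directory (a minimal sketch, assuming PYTHONPATH now contains the path to main_directory):
import sys
print(sys.path)  # the PYTHONPATH entries appear here, right after the initial script directory
import shared_package  # resolves because main_directory is on the search path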
File tree:
/path/to/project/root/
├── .git/
├── Pipfile
├── Pipfile.lock
├── __init__.py
└── package1
    ├── __init__.py
    └── src
        ├── __init__.py
        ├── external_lib
        │   └── packageX
        │       ├── packageY
        │       └── packageZ
        ├── foo.py
        └── bar.py
Constraints:
foo.py imports bar.py
both foo.py and bar.py import some files from external_lib/packageX/packageZ
files in external_lib/packageX/packageZ require packageX and packageY to be in the PYTHONPATH for their own imports.
I don't have a single entry point: sometimes I want to launch foo.py, sometimes bar.py.
external_lib is external, it's a git submodule. I don't want to change its content.
This is a shared project, I'd like a solution that's included in the git repository...
What I'm doing right now
I'm following the advice given here (in French though), that claims to be good practice:
Before running any script, I run cd package1/src/external_lib/packageX; export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/packageY; cd ../../../..
All my imports are written like this: import package1.src.bar, import package1.src.external_lib.packageX.packageZ.some_module
I always call all scripts from the project root.
This gives something like this:
cd /path/to/project/root/
pipenv shell
cd package1/src/external_lib/packageX; export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/packageY; cd ../../../..
python package1/src/foo.py
python package1/src/bar.py
Question
What is the best practice, in terms of project architecture, Pipenv use, import writing, etc., to manage all these imports smoothly, with a minimum of manual actions and constraints (such as export PYTHONPATH=... commands before running scripts, or having to launch the scripts from the root), in the most Pythonic and Pipenv-friendly way?
Ideally I'd also like Pycharm to be able to get all the imports correctly without further manual settings, so that what I do in Pycharm reflects what would happen from a terminal.
One possible solution I've seen is to use Pipenv's .env file, adding the export PYTHONPATH=... to it and relying on its automatic loading. However, apparently this file is not meant for that and should not be committed.
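For reference, such a .env next to the Pipfile would just contain the variable assignment (hypothetical, reusing the paths from above):
PYTHONPATH=/path/to/project/root/package1/src/external_lib/packageX:/path/to/project/root/package1/src/external_lib/packageX/packageY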
I am trying to run tests in radish, a Behaviour Driven Development environment for Python, but I am failing to do even the easiest of things.
I have this structure:
.
├── features
│   └── my.feature
└── radish
    ├── __init__.py
    ├── harness
    │   ├── __init__.py
    │   └── main.py
    └── steps.py
When I do
python -c "import radish.harness"
from my working dir ".", things are fine.
When I do the same ("import radish.harness" or "import harness") in the file steps.py, I'm getting this when calling the command "radish features" from the same directory:
ModuleNotFoundError: No module named 'radish.harness'
or
ModuleNotFoundError: No module named 'harness'
The radish-bdd quick start guide says about this:
How does radish find my python modules? radish imports all python modules inside the basedir. Per default the basedir points to $PWD/radish, which in our case is perfectly fine.
Indeed, a file placed in the radish directory will be imported automatically, but I am unable to import anything from within these files (apart from system libraries).
Can anyone advise me on how to import modules? I'm lost. It seems that my Python knowledge of module imports isn't helping.
I suggest you move the 'harness' directory to the same level as the 'features' and 'radish' directories.
.
├── features
│   └── my.feature
├── radish
│   ├── __init__.py
│   └── steps.py
└── harness
    ├── __init__.py
    └── main.py
If you call radish from your working dir (".") like this:
radish -b radish features/my.feature
Then you can import your "harness" module from steps.py like this:
import harness
That will work because, in this case, Python will find your "harness" module in the current directory.
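So steps.py could look like this (a minimal sketch; the step decorator comes from radish as in the quick start guide, and the step text is made up):
# radish/steps.py
from radish import given  # radish's step decorator
import harness  # resolves because harness/ sits in the working directory
from harness import main  # importing the submodule works the same way

@given("the harness is ready")
def harness_ready(step):
    pass  # hypothetical step body; call into harness.main here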