Python module not found when calling from another - python

I have this project structure:
project_name/
├── main.py
├── __init__.py
└── modules/
    ├── __init__.py
    ├── module1.py
    └── module2.py
I've edited the question to add more information. After trying a lot of the recommended solutions to this problem, nothing works.
Environment
Windows
Conda virtual environment, Python 3.10
VS Code
Problem
When running main.py from VS Code:
from modules.module1 import *

if __name__ == "__main__":
    pass
this error is raised:
    from module1 import *
ModuleNotFoundError: No module named 'module2'
Modules
module1.py
from module2 import *
module2.py
def test():
    print("just testing")
So the problem always occurs when, from main.py, I import a module that itself imports another module. That second module, imported by the module I imported from main.py, is not found.
Still looking for the solution

You could try setting PYTHONPATH first. If you use VS Code for development, you can set PYTHONPATH through your settings.json (see the sketch below).
If module1.py and module2.py are in the same directory, you could try a relative import instead. Pay attention to circular imports as well.
from .module1 import Module1
It would also be better to move main.py into a src directory.
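For example, a minimal sketch of that VS Code setup (the paths and file contents here are illustrative, not taken from the question). A .env file in the project root:
PYTHONPATH=X:\path\to\project_name;X:\path\to\project_name\modules
and a .vscode/settings.json pointing the Python extension at it:
{
    "python.envFile": "${workspaceFolder}/.env"
}
With that, code launched through the VS Code Python extension/debugger gets those folders on sys.path, so the bare from module2 import * inside module1.py can resolve.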

PyCharm working directory
If you use PyCharm, you can set the run configuration's working directory to the root of your project.

Import your package in editable mode
Create a pyproject.toml file in the root of the project with this content:
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[project]
name = "package_name"
version = "0.0.1"
requires-python = ">=3.7"
Your package needs to follow this structure:
project_name/
└── src/
    └── package_name/
        ├── __init__.py
        └── example.py
To install it in your environment, run this from the root of the project:
user/projects/package_name$ pip install -e .
By doing that, you won't have to worry about PYTHONPATH, the working directory, or relative/absolute imports. You just import and use your package via the intended path you created; Python will know where to look thanks to the pip install command.
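For instance, a small sketch of what usage looks like afterwards, assuming the names from the layout above (example.py is the placeholder module shown there, and some_function is a hypothetical name inside it):
# main.py, anywhere in the environment where you ran `pip install -e .`
from package_name.example import some_function

some_function()
Because the package is installed in editable mode, this import works regardless of the current working directory.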

Run the file from the project_name folder
If you run a file from inside a folder, Python will only look for local modules relative to that folder. You need to run it from the root of the project, as a module:
user/projects/package_name$ python -m src.modules.module2

Thanks all. I finally solved it by including this line before importing my module:
sys.path.append("X:\\path\\root_folder\\")

Related

ImportError when using pytest in terminal [duplicate]

I used easy_install to install pytest on a Mac and started writing tests for a project with a file structure like so:
repo/
|-- app.py
|-- settings.py
|-- models.py
|-- tests/
    |-- test_app.py
Run py.test while in the repo directory, and everything behaves as you would expect.
But when I try that same thing on either Linux or Windows (both have pytest 2.2.3 on them), it barks whenever it hits its first import of something from my application path. For instance, from app import some_def_in_app.
Do I need to be editing my PATH to run py.test on these systems?
I'm not sure why py.test does not add the current directory to PYTHONPATH itself, but here's a workaround (to be executed from the root of your repository):
python -m pytest tests/
It works because Python adds the current directory to PYTHONPATH for you.
Recommended approach for pytest>=7: use the pythonpath setting
Recently, pytest has added a new core plugin that supports sys.path modifications via the pythonpath configuration value. The solution is thus much simpler now and doesn't require any workarounds anymore:
pyproject.toml example:
[tool.pytest.ini_options]
pythonpath = [
"."
]
pytest.ini example:
[pytest]
pythonpath = .
The path entries are calculated relative to the rootdir, thus . adds the repo directory to sys.path in this case.
Multiple path entries are also allowed: for a layout
repo/
├── src/
│   └── lib.py
├── app.py
└── tests
    ├── test_app.py
    └── test_lib.py
the configuration
[tool.pytest.ini_options]
pythonpath = [
".", "src",
]
or
[pytest]
pythonpath = . src
will add both app and lib modules to sys.path, so
import app
import lib
will both work.
Original answer (not recommended for recent pytest versions; use for pytest<7 only): conftest solution
The least invasive solution is adding an empty file named conftest.py in the repo/ directory:
$ touch repo/conftest.py
That's it. No need to write custom code for mangling the sys.path or remember to drag PYTHONPATH along, or placing __init__.py into dirs where it doesn't belong (using python -m pytest as suggested in Apteryx's answer is a good solution though!).
The project directory afterwards:
repo
├── conftest.py
├── app.py
├── settings.py
├── models.py
└── tests
    └── test_app.py
Explanation
pytest looks for the conftest modules on test collection to gather custom hooks and fixtures, and in order to import the custom objects from them, pytest adds the parent directory of the conftest.py to the sys.path (in this case the repo directory).
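For example, with repo/conftest.py in place, the plain absolute import from the question works unchanged in the test module (some_def_in_app is the name used in the question; the assertion is just a placeholder):
# tests/test_app.py
from app import some_def_in_app  # resolves because repo/ is now on sys.path

def test_import():
    assert some_def_in_app is not None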
Other project structures
If you have another project structure, place the conftest.py in the package root dir (the one that contains packages but is not a package itself, so it does not contain an __init__.py), for example:
repo
├── conftest.py
├── spam
│   ├── __init__.py
│   ├── bacon.py
│   └── egg.py
├── eggs
│   ├── __init__.py
│   └── sausage.py
└── tests
    ├── test_bacon.py
    └── test_egg.py
src layout
Although this approach can be used with the src layout (place conftest.py in the src dir):
repo
├── src
│   ├── conftest.py
│   ├── spam
│   │   ├── __init__.py
│   │   ├── bacon.py
│   │   └── egg.py
│   └── eggs
│       ├── __init__.py
│       └── sausage.py
└── tests
    ├── test_bacon.py
    └── test_egg.py
beware that adding src to PYTHONPATH defeats the purpose and benefits of the src layout! You will end up testing the code from the repository and not the installed package. If you need to do it, maybe you don't need the src dir at all.
Where to go from here
Of course, conftest modules are not just files to help with source code discovery; they're where all the project-specific enhancements of the pytest framework and the customization of your test suite happen. pytest has a lot of information on conftest modules scattered throughout their docs; start with conftest.py: local per-directory plugins.
Also, SO has an excellent question on conftest modules: In py.test, what is the use of conftest.py files?
I had the same problem. I fixed it by adding an empty __init__.py file to my tests directory.
Yes, the source folder is not in Python's path if you cd to the tests directory.
You have a few choices:
Add the path manually to the test files. Something like this:
import sys, os
myPath = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, myPath + '/../')
Run the tests with the env var PYTHONPATH=../.
Run pytest itself as a module with:
python -m pytest tests
This happens when the project hierarchy is, for example, package/src package/tests and in tests you import from src. Executing as a module will consider imports as absolute rather than relative to the execution location.
You can run with PYTHONPATH in project root
PYTHONPATH=. py.test
Or use pip install as editable import
pip install -e . # install package using setup.py in editable mode
I had the same problem in Flask.
When I added:
__init__.py
to the tests folder, the problem disappeared :)
Probably the application couldn't recognize folder tests as a module.
I created this as an answer to your question and my own confusion. I hope it helps. Pay attention to PYTHONPATH in both the py.test command line and in the tox.ini.
https://github.com/jeffmacdonald/pytest_test
Specifically: You have to tell py.test and tox where to find the modules you are including.
With py.test you can do this:
PYTHONPATH=. py.test
And with tox, add this to your tox.ini:
[testenv]
deps = -r{toxinidir}/requirements.txt
commands = py.test
setenv =
    PYTHONPATH = {toxinidir}
I fixed it by removing the top-level __init__.py in the parent folder of my sources.
I started getting weird ConftestImportFailure: ImportError('No module named ... errors when I had accidentally added __init__.py file to my src directory (which was not supposed to be a Python package, just a container of all source).
It is a bit of a shame that this is an issue in Python... But just adding this environment variable is the most comfortable way, IMO:
export PYTHONPATH=$PYTHONPATH:.
You can put this line in your .zshrc or .bashrc file.
I was having the same problem when following the Flask tutorial and I found the answer on the official Pytest documentation.
It's a little shift from the way I (and I think many others) are used to doing things.
You have to create a setup.py file in your project's root directory with at least the following two lines:
from setuptools import setup, find_packages
setup(name="PACKAGENAME", packages=find_packages())
where PACKAGENAME is your app's name. Then you have to install it with pip:
pip install -e .
The -e flag tells pip to install the package in editable or "develop" mode. So the next time you run pytest it should find your app in the standard PYTHONPATH.
I had a similar issue. pytest did not recognize a module installed in the environment I was working in.
I resolved it by also installing pytest into the same environment.
Also if you run pytest within your virtual environment make sure pytest module is installed within your virtual environment. Activate your virtual environment and run pip install pytest.
For me the problem was tests.py generated by Django along with tests directory. Removing tests.py solved the problem.
I got this error because I was using relative imports incorrectly. In the OP's example, test_app.py should import functions using e.g.
from repo.app import *
However liberally __init__.py files are scattered around the file structure, an import like
from app import *
does not work, and produces the kind of ImportError seen here, unless the files and test files are in the same directory.
Here's an example of what I had to do with one of my projects:
Here’s my project structure:
microbit/
microbit/activity_indicator/activity_indicator.py
microbit/tests/test_activity_indicator.py
To be able to access activity_indicator.py from test_activity_indicator.py I needed to:
start test_activity_indicator.py with the correct relative import:
from microbit.activity_indicator.activity_indicator import *
put __init__.py files throughout the project structure:
microbit/
microbit/__init__.py
microbit/activity_indicator/__init__.py
microbit/activity_indicator/activity_indicator.py
microbit/tests/__init__.py
microbit/tests/test_activity_indicator.py
According to a post on Medium by Dirk Avery (and supported by my personal experience) if you're using a virtual environment for your project then you can't use a system-wide install of pytest; you have to install it in the virtual environment and use that install.
In particular, if you have it installed in both places then simply running the pytest command won't work because it will be using the system install. As the other answers have described, one simple solution is to run python -m pytest instead of pytest; this works because it uses the environment's version of pytest. Alternatively, you can just uninstall the system's version of pytest; after reactivating the virtual environment the pytest command should work.
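A quick illustrative check (not part of the original answer), run with the interpreter you expect your tests to use:
import sys
import pytest

print(sys.executable)   # which Python is running
print(pytest.__file__)  # which pytest installation got imported
If importing pytest fails here, or pytest.__file__ points outside your virtual environment, that mismatch is the problem described above.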
I was getting this error due to something even simpler (you could even say trivial). I hadn't installed the pytest module. So a simple apt install python-pytest fixed it for me.
'pytest' would have been listed in setup.py as a test dependency. Make sure you install the test requirements as well.
Since no one has suggested it, you could also pass the path to the tests in your pytest.ini file:
[pytest]
...
testpaths = repo/tests
See documentation: https://docs.pytest.org/en/6.2.x/customize.html#pytest-ini
Side effect for Visual Studio Code: it should pick up the unit test in the UI.
We have fixed the issue by adding the following environment variable.
PYTHONPATH=${PYTHONPATH}:${PWD}/src:${PWD}/test
As pointed out by Luiz Lezcano Arialdi, the correct solution is to install your package as an editable package.
Since I am using Pipenv, I thought I would add to his answer a step-by-step guide on how to install the current path as an editable package with Pipenv, allowing you to run pytest without the need for any path-mangling code or extra files.
You will need to have the following minimal folder structure (documentation):
package/
├── package/
│   ├── __init__.py
│   └── module.py
├── tests/
│   └── module_test.py
└── setup.py
setup.py mostly has the following minimum code (documentation):
import setuptools

setuptools.setup(
    name='package',  # Change to your package name
    packages=setuptools.find_packages())
Then you just need to run pipenv install --dev -e . and Pipenv will install the current path as an editable package (the --dev flag is optional) (documentation).
Now you should be able to run pytest without problems.
If this pytest error appears not for your own package, but for a Git-installed package in your package's requirements.txt, the solution is to switch to editable installation mode.
For example, suppose your package's requirements.txt had the following line:
git+https://github.com/foo/bar.git
You would instead replace it with the following:
-e git+https://github.com/foo/bar.git#egg=bar
If nothing works, make sure your test_module.py sits under the directory in the tests folder that mirrors its source module.
Sometimes you will get ModuleNotFoundError not because modules are misplaced or because export PYTHONPATH="${PWD}:${PYTHONPATH}" is not working, but because test_module.py is placed in the wrong directory under the tests folder.
There should be a 1-to-1 mapping, recursively, between the source tree and the tests tree; the root folder should be named "tests" and each file containing test code should start with "test_".
For example:
./nlu_service/models/transformers.py
./tests/models/test_transformers.py
This was my experience.
Very often the tests were interrupted because a module could not be imported.
After research, I found out that the system was looking for the file in the wrong place, and that we can easily overcome the problem by copying the file containing the module into the stated folder, so that it is imported properly.
Another proposal would be to change the import declaration and show MutPy the correct path of the unit. However, since multiple units can have this dependency, meaning we would also need to commit changes to their declarations, we prefer to simply move the unit into the folder.
My solution:
Create the conftest.py file in the test directory containing:
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.realpath(__file__)) + "/relative/path/to/code/")
This will add the folder of interest to the Python interpreter path without modifying every test file, setting environment variable or messing with absolute/relative paths.

Absolute import in conda env

I met with a problem with absolute import when using conda env. Here is the structure of my project.
project/
└── package_1/
    ├── __init__.py
    ├── file_1.py
    └── subpackage_1/
        └── run.py
In package_1/subpackage_1/run.py there is an absolute import: import package_1.file_1. However, when I ran python package_1/subpackage_1/run.py from the project folder, I got an error:
ModuleNotFoundError: No module named 'package_1'. I tried printing sys.path; project/package_1/subpackage_1 is in sys.path, but the folder from which I ran the command, project, is not. I tried adding project to PATH or PYTHONPATH, but it doesn't work in a conda env. Does anyone know how to fix this? Thanks!!!
One of the ways to do this is to add the package_1 directory to your sys.path with this code at the top of run.py:
import os
import sys

# append the parent directory of this file's folder (i.e. project/package_1) to sys.path
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
And then change the line in run.py
import package_1.file_1
to
import file_1
Now Python can import file_1 directly, since its directory is on the path.
Summary
You can accomplish what you want with relative imports, or with absolute imports if you restructure your project. Modifying your sys.path or PYTHONPATH should not be your go-to solution. If you really want global availability, you could install your local package with conda.
Option 1: Relative Imports
If you want to be able to run a file inside a sub-module directly (i.e. python package_1/subpackage_1/run.py) then you should consider using relative imports.
Example:
project/
└── package_1/
    ├── __init__.py
    ├── file_1.py
    └── subpackage_1/
        ├── __init__.py
        └── run.py

# run.py
from .. import file_1
Option 2: Absolute Imports
If you want to use absolute imports, then your entry point (the script that you run) should be in the top level (package_1) instead of inside a sub-package.
Example:
project/
└── package_1/
    ├── __init__.py
    ├── run.py
    ├── file_1.py
    └── subpackage_1/
        ├── __init__.py
        └── stuff.py

# run.py
from package_1.subpackage_1 import stuff
stuff.run()

# stuff.py
import package_1.file_1
Option 3: Installing your local package with conda
Once you configure your package correctly you should be able to simply run
conda install .
Which will install your local package as if it were a published package. This is likely overkill for your needs.
Why not modify PYTHONPATH or sys.path?
If you rely on having your local package's path on PYTHONPATH, you will have to update it every time you move the project or copy it onto a new computer.
Appending entries to sys.path in code often accomplishes a similar effect to what you can do with relative imports, but the import statements that follow lose their meaning.

How do I permanently add paths to PYTHONPATH in a script? [duplicate]

I've tried reading through questions about sibling imports and even the
package documentation, but I've yet to find an answer.
With the following structure:
├── LICENSE.md
├── README.md
├── api
│   ├── __init__.py
│   ├── api.py
│   └── api_key.py
├── examples
│   ├── __init__.py
│   ├── example_one.py
│   └── example_two.py
└── tests
│   ├── __init__.py
│   └── test_one.py
How can the scripts in the examples and tests directories import from the
api module and be run from the commandline?
Also, I'd like to avoid the ugly sys.path.insert hack for every file. Surely
this can be done in Python, right?
Tired of sys.path hacks?
There are plenty of sys.path.append hacks available, but I found an alternative way of solving the problem at hand.
Summary
Wrap the code into one folder (e.g. packaged_stuff)
Create setup.py script where you use setuptools.setup(). (see minimal setup.py below)
Pip install the package in editable state with pip install -e <myproject_folder>
Import using from packaged_stuff.modulename import function_name
Setup
The starting point is the file structure you have provided, wrapped in a folder called myproject.
.
└── myproject
    ├── api
    │   ├── api_key.py
    │   ├── api.py
    │   └── __init__.py
    ├── examples
    │   ├── example_one.py
    │   ├── example_two.py
    │   └── __init__.py
    ├── LICENCE.md
    ├── README.md
    └── tests
        ├── __init__.py
        └── test_one.py
I will call . the root folder; in my example case it is located at C:\tmp\test_imports\.
api.py
As a test case, let's use the following ./api/api.py
def function_from_api():
    return 'I am the return value from api.api!'
test_one.py
from api.api import function_from_api

def test_function():
    print(function_from_api())

if __name__ == '__main__':
    test_function()
Try to run test_one:
PS C:\tmp\test_imports> python .\myproject\tests\test_one.py
Traceback (most recent call last):
File ".\myproject\tests\test_one.py", line 1, in <module>
from api.api import function_from_api
ModuleNotFoundError: No module named 'api'
Also, trying relative imports won't work:
Using from ..api.api import function_from_api would result in
PS C:\tmp\test_imports> python .\myproject\tests\test_one.py
Traceback (most recent call last):
File ".\tests\test_one.py", line 1, in <module>
from ..api.api import function_from_api
ValueError: attempted relative import beyond top-level package
Steps
Make a setup.py file in the root level directory
The contents of setup.py would be*
from setuptools import setup, find_packages
setup(name='myproject', version='1.0', packages=find_packages())
Use a virtual environment
If you are familiar with virtual environments, activate one and skip to the next step. Usage of virtual environments is not absolutely required, but it will really help you out in the long run (when you have more than one project ongoing). The most basic steps are (run in the root folder):
Create virtual env
python -m venv venv
Activate virtual env
source ./venv/bin/activate (Linux, macOS) or ./venv/Scripts/activate (Win)
To learn more about this, just Google out "python virtual env tutorial" or similar. You probably never need any other commands than creating, activating and deactivating.
Once you have made and activated a virtual environment, your console should give the name of the virtual environment in parentheses
PS C:\tmp\test_imports> python -m venv venv
PS C:\tmp\test_imports> .\venv\Scripts\activate
(venv) PS C:\tmp\test_imports>
and your folder tree should look like this**
.
├── myproject
│   ├── api
│   │   ├── api_key.py
│   │   ├── api.py
│   │   └── __init__.py
│   ├── examples
│   │   ├── example_one.py
│   │   ├── example_two.py
│   │   └── __init__.py
│   ├── LICENCE.md
│   ├── README.md
│   └── tests
│       ├── __init__.py
│       └── test_one.py
├── setup.py
└── venv
    ├── Include
    ├── Lib
    ├── pyvenv.cfg
    └── Scripts [87 entries exceeds filelimit, not opening dir]
pip install your project in editable state
Install your top level package myproject using pip. The trick is to use the -e flag when doing the install. This way it is installed in an editable state, and all the edits made to the .py files will be automatically included in the installed package.
In the root directory, run
pip install -e . (note the dot, it stands for "current directory")
You can also see that it is installed by using pip freeze
(venv) PS C:\tmp\test_imports> pip install -e .
Obtaining file:///C:/tmp/test_imports
Installing collected packages: myproject
Running setup.py develop for myproject
Successfully installed myproject
(venv) PS C:\tmp\test_imports> pip freeze
myproject==1.0
Add myproject. into your imports
Note that you will have to add myproject. only into imports that would not work otherwise. Imports that worked without the setup.py & pip install will still work fine. See an example below.
Test the solution
Now, let's test the solution using api.py defined above, and test_one.py defined below.
test_one.py
from myproject.api.api import function_from_api

def test_function():
    print(function_from_api())

if __name__ == '__main__':
    test_function()
running the test
(venv) PS C:\tmp\test_imports> python .\myproject\tests\test_one.py
I am the return value from api.api!
* See the setuptools docs for more verbose setup.py examples.
** In reality, you could put your virtual environment anywhere on your hard disk.
Seven years after
Since I wrote the answer below, modifying sys.path is still a quick-and-dirty trick that works well for private scripts, but there have been several improvements:
Installing the package (in a virtualenv or not) will give you what you want, though I would suggest using pip to do it rather than using setuptools directly (and using setup.cfg to store the metadata)
Using the -m flag and running as a package works too (but will turn out a bit awkward if you want to convert your working directory into an installable package).
For the tests, specifically, pytest is able to find the api package in this situation and takes care of the sys.path hacks for you
So it really depends on what you want to do. In your case, though, since it seems that your goal is to make a proper package at some point, installing through pip -e is probably your best bet, even if it is not perfect yet.
Old answer
As already stated elsewhere, the awful truth is that you have to do ugly hacks to allow imports from sibling modules or the parent package from a __main__ module. The issue is detailed in PEP 366. PEP 3122 attempted to handle imports in a more rational way, but Guido rejected it on the account of
The only use case seems to be running scripts that happen
to be living inside a module's directory, which I've always seen as an
antipattern.
(here)
Though, I use this pattern on a regular basis with
# Ugly hack to allow absolute import from the root folder
# whatever its name is. Please forgive the heresy.
if __name__ == "__main__" and __package__ is None:
from sys import path
from os.path import dirname as dir
path.append(dir(path[0]))
__package__ = "examples"
import api
Here path[0] is your running script's parent folder and dir(path[0]) your top level folder.
I have still not been able to use relative imports with this, though, but it does allow absolute imports from the top level (in your example api's parent folder).
Here is another alternative that I insert at top of the Python files in tests folder:
# Path hack.
import sys, os
sys.path.insert(0, os.path.abspath('..'))
You don't need and shouldn't hack sys.path unless it is necessary and in this case it is not. Use:
import api.api_key # in tests, examples
Run from the project directory: python -m tests.test_one.
You should probably move tests (if they are api's unittests) inside api and run python -m api.test to run all tests (assuming there is __main__.py) or python -m api.test.test_one to run test_one instead.
You could also remove __init__.py from examples (it is not a Python package) and run the examples in a virtualenv where api is installed e.g., pip install -e . in a virtualenv would install inplace api package if you have proper setup.py.
I don't yet have the comprehension of Pythonology necessary to see the intended way of sharing code amongst unrelated projects without a sibling/relative import hack. Until that day, this is my solution. For examples or tests to import stuff from ..\api, it would look like:
import sys
import os.path
# Import from sibling directory ..\api
sys.path.append(os.path.dirname(os.path.abspath(__file__)) + "/..")
import api.api
import api.api_key
For sibling package imports, you can use either the insert or the append method of the sys.path list:
if __name__ == '__main__' and __package__ is None:
    import sys
    from os import path
    sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))
    import api
This will work if you are launching your scripts as follows:
python examples/example_one.py
python tests/test_one.py
On the other hand, you can also use the relative import:
if __name__ == '__main__' and __package__ is not None:
    from ..api import api
In this case you will have to launch your script with the '-m' argument (note that, in this case, you must not give the '.py' extension):
python -m packageName.examples.example_one
python -m packageName.tests.test_one
Of course, you can mix the two approaches, so that your script will work no matter how it is called:
if __name__ == '__main__':
    if __package__ is None:
        import sys
        from os import path
        sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))
        import api
    else:
        from ..api import api
For readers in 2021: If you're not confident with pip install -e :
Consider this hierarchy, as recommended by an answer from Relative imports in Python 3:
MyProject
├── src
│   ├── bot
│   │   ├── __init__.py
│   │   ├── main.py
│   │   └── sib1.py
│   └── mod
│       ├── __init__.py
│       └── module1.py
└── main.py
The content of main.py, which is the starting point and we use absolute import (no leading dots) here:
from src.bot import main

if __name__ == '__main__':
    main.magic_tricks()
The content of bot/main.py, which takes advantage of explicit relative imports:
from .sib1 import my_drink  # Both are explicit relative imports.
from ..mod.module1 import relative_magic

def magic_tricks():
    # Using sub-magic
    relative_magic(["newbie", "pain"], advice="cheer_up")
    my_drink()
    # Do your work
    ...
Now here comes the reasoning:
When executing python MyProject/main.py, the path/to/MyProject is added into the sys.path.
The absolute import of src.bot is resolved from there.
The from ..mod part means it will go up one level to MyProject/src.
Can we see it? YES, since path/to/MyProject is added into the sys.path.
So the point is:
We should put the main script next to MyProject/src, so that when doing relative referencing we never go outside of src, and the absolute import src. provides the just-right scope for us: the src/ scope.
See also: ModuleNotFoundError: No module named 'sib1'
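A tiny diagnostic sketch of that reasoning (the print line is only for illustration): running python MyProject/main.py and inspecting sys.path shows why the absolute import resolves:
# MyProject/main.py
import sys

print(sys.path[0])        # .../MyProject -- the directory of the script being run
from src.bot import main  # resolvable precisely because of that entry

if __name__ == '__main__':
    main.magic_tricks()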
TLDR
This method does not require setuptools, path hacks, additional command line arguments, or specifying the top level of the package in every single file of your project.
Just make a script in the parent directory of whatever you are calling to be your __main__ and run everything from there. For further explanation, continue reading.
Explanation
This can be accomplished without hacking a new path together, extra command line args, or adding code to each of your programs to recognize its siblings.
The reason this fails, as I believe was mentioned before, is that the programs being called have their __name__ set to __main__. When this occurs, the script being called considers itself to be at the top level of the package and refuses to recognize scripts in sibling directories.
However, everything under the top level of the directory will still recognize ANYTHING ELSE under the top level. This means the ONLY thing you have to do to get files in sibling directories to recognize/utilize each other is to call them from a script in their parent directory.
Proof of Concept
In a dir with the following structure:
.
|__ Main.py
|
|__ Siblings
    |
    |__ sib1
    |   |
    |   |__ call.py
    |
    |__ sib2
        |
        |__ callsib.py
Main.py contains the following code:
import Siblings.sib1.call as call

def main():
    call.Call()

if __name__ == '__main__':
    main()
sib1/call.py contains:
import Siblings.sib2.callsib as callsib

def Call():
    callsib.CallSib()

if __name__ == '__main__':
    Call()
and sib2/callsib.py contains:
def CallSib():
    print("Got Called")

if __name__ == '__main__':
    CallSib()
If you reproduce this example you will notice that calling Main.py will result in "Got Called" being printed as is defined in sib2/callsib.py even though sib2/callsib.py got called through sib1/call.py. However if one were to directly call sib1/call.py (after making appropriate changes to the imports) it throws an exception. Even though it worked when called by the script in its parent directory, it will not work if it believes itself to be on the top level of the package.
You need to look to see how the import statements are written in the related code. If examples/example_one.py uses the following import statement:
import api.api
...then it expects the root directory of the project to be in the system path.
The easiest way to support this without any hacks (as you put it) would be to run the examples from the top level directory, like this:
PYTHONPATH=$PYTHONPATH:. python examples/example_one.py
Just in case someone using Pydev on Eclipse ends up here: you can add the sibling's parent path (and thus the calling module's parent) as an external library folder using Project->Properties and setting External Libraries under the left menu Pydev-PYTHONPATH. Then you can import from your sibling, e.g. from sibling import some_class.
I wanted to comment on the solution provided by np8 but I don't have enough reputation so I'll just mention that you can create a setup.py file exactly as they suggested, and then do pipenv install --dev -e . from the project root directory to turn it into an editable dependency. Then your absolute imports will work e.g. from api.api import foo and you don't have to mess around with system-wide installations.
Documentation
If you're using pytest then the pytest docs describe a method of how to reference source packages from a separate test package.
The suggested project directory structure is:
setup.py
src/
    mypkg/
        __init__.py
        app.py
        view.py
tests/
    __init__.py
    foo/
        __init__.py
        test_view.py
    bar/
        __init__.py
        test_view.py
Contents of the setup.py file:
from setuptools import setup, find_packages
setup(name="PACKAGENAME", packages=find_packages())
Install the packages in editable mode:
pip install -e .
The pytest article references this blog post by Ionel Cristian Mărieș.
I made a sample project to demonstrate how I handled this, which is indeed another sys.path hack as indicated above. Python Sibling Import Example, which relies on:
if __name__ == '__main__':
    import os
    import sys
    sys.path.append(os.getcwd())
This seems to be pretty effective so long as your working directory remains at the root of the Python project.
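If the working-directory restriction is a concern, a variant of the same hack (my sketch, not from the linked example) anchors the path to the file's own location instead of the current directory; it assumes the script lives at the project root:
import os
import sys

if __name__ == '__main__':
    sys.path.append(os.path.dirname(os.path.abspath(__file__)))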
In your main file, add this:
import sys
import os

mainScriptDepth = "../../"  # the depth of the main file from the root of the project
sys.path.append(os.path.abspath(os.path.join(__file__, mainScriptDepth)))
where mainScriptDepth is the depth of the main file from the root of the project; in your case mainScriptDepth = "../../". Then you can import by specifying the path from the root of your project (from api.api import *).
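An alternative sketch that avoids counting ../ segments by hand uses pathlib; the index into parents plays the role of mainScriptDepth (parents[1], i.e. two levels up, is just an example):
import sys
from pathlib import Path

project_root = Path(__file__).resolve().parents[1]
sys.path.append(str(project_root))

from api.api import *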
The problem:
You simply cannot get import mypackage to work in test.py. You will need either an editable install, a change to the path, or changes to __name__ and path.
demo
├── dev
│ └── test.py
└── src
└── mypackage
├── __init__.py
└── module_of_mypackage.py
--------------------------------------------------------------
ValueError: attempted relative import beyond top-level package
The solution:
import sys; sys.path += [sys.path[0][:-3]+"src"]
Put the above before attempting imports in test.py. That's it. You can now import mypackage.
This will work both on Windows and Linux. It will also not care from which path you run your script. It is short enough to slap it anywhere you might need it.
Why it works:
The sys.path list contains the places, in order, where Python looks for packages when attempting imports, if they are not found among the installed site packages. When you run test.py, the first item in sys.path will be something like /mnt/c/Users/username/Desktop/demo/dev, i.e. where you ran your file. The one-liner simply adds the sibling folder to the path and everything works. You will not have to worry about Windows vs Linux file paths, since we are only editing the last folder name and nothing else. If your project structure is already set in stone for your repository, we can also reasonably just use the magic number 3 to slice away dev and substitute src.
For the main question:
Import a sibling folder as a module:
from .. import siblingfolder
Import a_file.py from a sibling folder as a module:
from ..siblingfolder import a_file
Import a function inside a file in a sibling folder:
from ..siblingfolder.a_file import func_name_exists_in_a_file
The easiest way:
Go to your lib/site-packages folder.
If an 'easy_install.pth' file exists, just edit it and add the directory containing the scripts you want to make importable as modules.
If it does not exist, just create one... and put the folder you want there.
After you add it, Python will automatically treat that folder much like site-packages, and you can import every script from that folder or its subfolders as a module.
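A more concrete sketch of the same idea: any *.pth file inside site-packages is read by Python's site module at startup, and every line that names an existing directory is appended to sys.path. Creating one programmatically (file name and target path are illustrative):
import os
import site

pth_file = os.path.join(site.getsitepackages()[0], "my_project.pth")
with open(pth_file, "w") as f:
    f.write("/path/to/my_project\n")  # each line in a .pth file is a directory to add to sys.path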
First, you should avoid having files with the same name as the module itself. It may break other imports.
When you import a file, the interpreter first checks the current directory and then searches the global directories.
Inside examples or tests you can call:
from ..api import api
Project
├── User
│   ├── about.py
│   └── __init__.py
└── Tech
    ├── info.py
    └── __init__.py
Now, if you want to access the about.py module in the User package from the info.py module in the Tech package, then you have to open cmd (on Windows) at the Project path, i.e.
C:\Users\Personal\Desktop\Project>
as per the above package example. From this path you have to run python -m Package_name.module_name.
For example, for the above package we have to do:
C:\Users\Personal\Desktop\Project>python -m Tech.info
Important points
Don't use the .py extension after the info module, i.e. don't run python -m Tech.info.py.
Run this from the level where the sibling packages sit side by side.
-m is the flag; to learn more about it you can type python --help at the cmd prompt.
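To make that concrete, a sketch of what info.py could contain once it is run with -m from the Project directory (about is the module from the layout above; the print is just a placeholder):
# Tech/info.py
from User import about  # works because `python -m Tech.info`, run from Project, puts the cwd on sys.path

print(about)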

ModuleNotFoundError with pytest

I want my tests folder separate to my application code. My project structure is like so
myproject/
├── myproject/
│   ├── myproject.py
│   └── moduleone.py
└── tests/
    └── myproject_test.py
myproject.py
from moduleone import ModuleOne

class MyProject(object):
    ...
myproject_test.py
from myproject.myproject import MyProject
import pytest
...
I use myproject.myproject since I use the command
python -m pytest
from the project root directory ./myproject/
However, then the imports within those modules fail with
E ModuleNotFoundError: No module named 'moduleone'
I am running Python 3.7 and have read that since 3.3, empty __init__ files are no longer needed which means my project becomes an implicit namespace package
However, I have tried adding an __init__.py file in myproject/myproject/ and also tried adding a conftest.py file in myproject/ but neither works
I have read answers that say to mess with the paths and then upvoted comments in other questions saying not to.
What is the correct way and what am I missing?
EDIT:
Possibly related, I used a requirements.txt to install pytest using pip. Could this be related? And if so, what is the correct way to install pytest in this case?
EDIT 2:
One of the paths in sys.path is /usr/src/app/, which is a Docker volume linked to /my/local/path/myproject/.
Should the volume be /my/local/path/myproject/myproject/ instead?
Not sure if this solution was specific to my problem, but I simply add __init__.py to my tests folder and that solved the problem.
Solution: use the PYTHONPATH env. var
PYTHONPATH=. pytest
As mentioned by @J_H, you need to explicitly add the root directory of your project, since pytest only adds to sys.path the directories where test files are (which is why @Mak2006's answer worked).
Good practice: use a Makefile or some other automation tool
If you do not want to type that long command all the time, one option is to create a Makefile in your project's root dir with, e.g., the following:
.PHONY: test
test:
	PYTHONPATH=. pytest
Which allows you to simply run:
make test
Another common alternative is to use some standard testing tool, such as tox.
Be sure to include . dot in the $PYTHONPATH env var.
Use $ python -m site, or this code fragment to debug such issues:
import pprint
import sys
pprint.pprint(sys.path)
Your question managed to use myproject at three different levels. At least during debugging you might want to use three distinct names, to reduce possible confusion.
In my case I added a __init__.py to my test directory with this inside it:
import sys
sys.path.append('.')
My app code is at the same level as my test directory.
In my case it is because I installed pytest on the system level but not in my virtual environment.
You can test this by python -m pytest. If you see ModuleNotFoundError: No module named 'pytest' then your pytest is at the system level.
Installing pytest while the virtual environment is activated will fix this.
I kept everything the same and just added a blank test file in the root folder. Solved.
Here are the findings, this problem really bugged me for a while.
My folder structure was
mathapp/
    - server.py
    - configuration.py
    - __init__.py
    - static/
        - home.html
tests/
    - functional/
        - test_errors.py
    - unit/
        - test_add.py
and pytest would complain with the ModuleNotFoundError.
I introduced a mock test file at the same level as the mathapp and tests directories. The file contained nothing. Now pytest does not complain.
Result without the file
$ pytest
============================= test session starts =============================
platform win32 -- Python 3.8.2, pytest-5.4.2, py-1.8.1, pluggy-0.13.1
rootdir: C:\mak2006\workspace\0github\python-rest-app-cont
collected 1 item / 1 error
=================================== ERRORS ====================================
_______________ ERROR collecting tests/functional/test_func.py ________________
ImportError while importing test module 'C:\mainak\workspace\0github\python-rest-app-cont\tests\functional\test_func.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
tests\functional\test_func.py:4: in <module>
from mathapp.service import sum
E ModuleNotFoundError: No module named 'mathapp'
=========================== short test summary info ===========================
ERROR tests/functional/test_func.py
!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!
============================== 1 error in 0.24s ===============================
Results with the file
$ pytest
============================= test session starts =============================
platform win32 -- Python 3.8.2, pytest-5.4.2, py-1.8.1, pluggy-0.13.1
rootdir: C:\mak2006\workspace\0github\python-rest-app-cont
collected 2 items
tests\functional\test_func.py . [ 50%]
tests\unit\test_unit.py . [100%]
============================== 2 passed in 0.11s ==============================
Better Solution
Try adding a single __init__.py to your tests directory (a level up from your module) with this contents:
import sys
sys.path.append('.')
sys.path.append('./my_module')
Your file structure should look like this:
project
├── my_module
│   └── package.py
└── tests
    ├── __init__.py
    └── my_tests.py
The first append to sys.path will enable you to import from <your-module-name> and the second will enable your packages to import as normal.
In your tests you can import by using from my_module.package import function whereas in your module import using simply from package import function.
Edit: Seems like this solution is not universal (like the others).
I was able to solve this issue using help from this answer.
Add an __init__.py to your main module directory that contains
import pathlib, sys
sys.path.append(str(pathlib.Path(__file__).parent))
I also added another __init__.py to my tests directory (thanks to this answer) with
import sys
sys.path.append('.')
So it seems that the sys.path has to include the application directory rather than the project root folder containing the application directory and test directory.
So in my case /my/local/path/myproject/myproject/ had to be in sys.path rather than /my/local/path/myproject/.
Then I could run pytest in /my/local/path/myproject/ (didn't need python -m pytest). This meant that the modules within /myproject/myproject/ could find each other and the tests as well without any namespace nesting.
So my tests looked like
from moduleone import ModuleOne
import pytest

def test_fun():
    assert ModuleOne.example_func() == True
That said, there seem to be many gotchas, so I have no idea if this is correct..
I suggest you have a code structure like this:
myproject/
├── helpers/
│   ├── moduleone.py
│   └── moduletwo.py
├── tests/
│   └── myproject_test.py
└── conftest.py
And the content of conftest.py file is:
pytest_plugins = ['helpers']
Run pytest again.
Using poetry and pytest 5.4.3, I had the following structure (some folders / files have been removed for clarity):
project structure
.
├── my_app
│   ├── __init__.py
│   ├── main.py
│   ├── model.py
│   └── orm.py
├── poetry.lock
├── pyproject.toml
├── README.rst
└── tests
    ├── __init__.py
    ├── conftest.py
    ├── test_my_app.py
    └── utilities
        └── db_postgresql_inmemory.py
tests/conftest.py
pytest_plugins = [
"utilities.db_postgresql_inmemory",
]
which generated a module not found error for the fixture:
ImportError: Error importing plugin "utilities.db_postgresql_inmemory": No module named 'utilities'
None of the other answers have worked for me, as I have tried to add:
[me@linux ~/code/my_app] touch tests/utilities/__init__.py
[me@linux ~/code/my_app] touch ./test_blank.py
I could make the import from conftest.py work by REMOVING both __init__.py files:
[me@linux ~/code/my_app] rm tests/utilities/__init__.py tests/__init__.py
As of February 2023, according to the pytest documentation, you can simply add the following config to your pyproject.toml to solve this problem:
[tool.pytest.ini_options]
pythonpath = "src"
addopts = [
"--import-mode=importlib",
]
I ran into this issue as well and am using poetry for dependency management and direnv for my project specific environment variables. Please note, I am relatively new to Python so I don't know if this is the correct fix.
Here is my entire .envrc file:
layout_poetry() {
  if [[ ! -f pyproject.toml ]]; then
    log_error 'No pyproject.toml found. Use `poetry new` or `poetry init` to create one first.'
    exit 2
  fi

  local VENV=$(poetry env list --full-path | cut -d' ' -f1)
  if [[ -z $VENV || ! -d $VENV/bin ]]; then
    log_error 'No created poetry virtual environment found. Use `poetry install` to create one first.'
    exit 2
  fi

  VENV=$VENV/bin
  export VIRTUAL_ENV=$(echo "$VENV" | rev | cut -d'/' -f2- | rev)
  export POETRY_ACTIVE=1
  PATH_add "$VENV"
}

layout poetry

export PYTHONDONTWRITEBYTECODE=1
export PYTHONPATH="$PWD/project_name"
I don't know if I need layout poetry, because it is supposed to be creating virtual environments for us already, but this is what a coworker recommended so I went with it. layout poetry also didn't work without that function, and it didn't like it when I added the function to my zshenv, so I added it here.
For this specific question, the last line is the money maker.
ANOTHER SUGGESTION
See this answer: https://stackoverflow.com/a/69691436/595305
I was facing the same issue, which I resolved by:
Installing pytest at the root of my project using pip install pytest
Adding a blank __init__.py as a sibling of the test_file.py that I wanted to execute.
I have resolved it by adding export PYTHONPATH="your root dir/src"
i.e.
export PYTHONPATH="/builds/project/src"
poetry run pytest .....
The simplest solution I found was to manually add my target module to syspath. Lets say you have a structure like this:
flaskapp
    - src
        - app.py
        - utils
        - ...
    - tests
docs
venv
This makes my test folder a sibling to my module's src folder. If I start putting test_* files that need to import some of the module's code, I can simply:
import src.utils.calculator
And this would be fine until I try to import a file that imports another file from the module. The solution is simple: add a __init__.py to your tests folder, and put this line inside:
import sys, os
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '../src')))
And just modify the last part relative to your module location and folder name
For me, when I was checking my project structure, I found that a parent directory and a subdirectory had the same name. When I changed the directory name, I got it working. So:
# Did not work
- same_name_project/
    - same_name_project/
    - tests/

# Worked
- different_named_project/
    - a_unique_directory/
    - tests/

Python / Import and design of program

I'm looking for a solution for designing my program.
My program consists of 3 blocks:
Classes
Functions
Other utilities
I want to structure my program this way:
program_folder/
├── main.py
├── classes_folder/
│   ├── class_1.py
│   └── class_2.py
├── functions_folder/
│   ├── set_of_func_1.py
│   └── set_of_func_2.py
└── utilities_folder/
    ├── set_of_utilities_1.py
    └── set_of_utilities_2.py
I want:
any script in «classes_folder» to be able to import any of the scripts in «functions_folder»;
any script in «functions_folder» to be able to import any of the scripts in «utilities_folder»;
all scripts to be usable normally by main.py;
all scripts in «classes_folder», «functions_folder» and «utilities_folder» to be testable when run as «main» (if __name__ == "__main__": some tests);
«program_folder» to be able to live anywhere on my computer (there shouldn't be a dependency on the exact path to «program_folder»).
From all the above I thought I would have to:
change the import search path for all scripts in «classes_folder», «functions_folder» and «utilities_folder»;
set the current working directory to «program_folder» for all scripts?
Is there a way to do this?
Does my idea look good, or have I introduced some unexpected problems?
Does my idea look good or have I put there some unexpected problems?
You can create a skeleton project like the following:
/path/to/project/
├── setup.py
└── my_project/
    ├── __init__.py
    ├── a/
    │   └── __init__.py
    └── b/
        └── __init__.py
==> ./my_project/__init__.py <==
print('my_project/__init__.py')
==> ./my_project/a/__init__.py <==
import my_project
print('my_project/a/__init__.py')
==> ./my_project/b/__init__.py <==
import my_project.a
print('my_project/b/__init__.py')
==> ./setup.py <==
from distutils.core import setup

setup(name='my_project',
      version='1.0',
      description='my_project',
      author='author',
      packages=['my_project'])
Then you can install the project locally using pip install -e /path/to/project/ (the project folder is not copied, just gets registered; there's a dependency on the exact path, but this dependency is not hard-coded in project files themselves).
As the result, import my_project, import my_project.a etc. do that they mean:
$ python my_project/b/__init__.py
my_project/__init__.py
my_project/a/__init__.py
my_project/b/__init__.py
A common Python project structure could look like this:
project_name/
├── setup.py
├── requirements.txt
└── project_name/
    ├── __main__.py
    ├── classes/
    │   ├── __init__.py
    │   ├── class1.py
    │   └── class2.py
    ├── functions/
    │   ├── __init__.py
    │   └── functions.py
    └── utils/
        ├── __init__.py
        └── utils.py
Then, you could modify your imports from absolute to relative and run your package using something like:
$ /path/to/project_name> python -m project_name
Note that setup.py is only required if you want to install your package under some of your interpreters.
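If you do want it installable, a minimal setup.py sketch for that layout (the version number and the use of find_packages are illustrative):
from setuptools import setup, find_packages

setup(
    name='project_name',
    version='0.1',
    packages=find_packages(),
)
After pip install -e . from the outer project_name/ directory, python -m project_name works from anywhere in that environment.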
