How to package my Python project to be runnable from the command line?

I've coded my python project and have succeeded in publishing it to test pypi. However, now I can't figure out how to correctly configure it as a console script. Upon running my_project on the command line, I get the following stack trace:
Traceback (most recent call last):
File "/home/thatcoolcoder/.local/bin/my_project", line 5, in <module>
from my_project.__main__ import main
ModuleNotFoundError: No module named 'my_project'
Clearly, it's created a script to run but the script is then failing to import my actual code.
Folder structure:
pyproject.toml
setup.cfg
my_project
├── __init__.py (empty)
└── __main__.py
Relevant sections of setup.cfg:
[metadata]
name = my-project
version = 1.0.5
...
[options]
package_dir =
    = my_project
packages = find:
...
[options.packages.find]
where = my_project
[options.entry_points]
console_scripts =
    my_project = my_project.__main__:main
pyproject.toml (probably not relevant)
[build-system]
requires = [
    "setuptools>=42",
    "wheel"
]
__main__.py:
from my_project import foo
def main():
    foo.bar()

if __name__ == '__main__':
    main()
To build and upload, I'm running the following: (python is python 3.10)
python -m build
python -m twine upload --repository testpypi dist/*
Then to install and run:
pip install -i https://test.pypi.org/pypi/ --extra-index-url https://pypi.org/simple my-project --upgrade
my_project
How can I make this console script work?
Also, this current method of setting console_scripts only allows it to be run as my_project; is it possible to also make it work by python -m my_project? Or perhaps this will work once my main issue is fixed.

It's funny, but I had the same frustration when trying to install scripts on multiple platforms (posix and nt, as Python calls them).
So I wrote setup-py-script in 2020. It's up on GitHub now.
It installs scripts that use their own modules as a self-contained zip file. (This method was inspired by youtube-dl.) That means no more leftover files when you delete a script but forget to remove the module, et cetera.
It does not require root or administrator privileges; installation is done in user-accessible directories.
You might have to structure your project slightly differently; the script itself is not in the module directory. See the project README.

I finally got back to this problem today and it appears that I was using an incorrect source layout, which caused the pip module installation to not work. I switched to a directory structure like this one:
├── src
│   └── mypackage
│       ├── __init__.py
│       └── mod1.py
├── setup.py
└── setup.cfg
and modified the relevant parts of my setup.cfg:
[options]
package_dir=
    =src
packages=find:
[options.packages.find]
where=src
Then I can run it like python -m mypackage. This also made the console scripts work. It works on Linux but I presume it also works on other systems.
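For reference, a sketch of the relevant setup.cfg sections combining the src layout with a console script entry point (the package and function names here are placeholders, not taken from my actual project):
[options]
package_dir =
    = src
packages = find:

[options.packages.find]
where = src

[options.entry_points]
console_scripts =
    mypackage = mypackage.__main__:main
With this layout, pip installs mypackage from src, so the console script and python -m mypackage resolve the same installed package.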

Related

Python package raises ModuleNotFoundError on calling an entry-point except if installed in editable mode

I have a Python package that at first appears to install just fine, but when calling one of the entry points it raises a ModuleNotFoundError. The module is otherwise found just fine, both with import package from the interactive interpreter as well as with python -m package.etc. But if I try to call the entry point directly (like python -m package.etc.main) it raises an AttributeError saying that the module has no attribute __path__.
I can see the package if I do pip list.
The project is currently set up with the "template" pyproject.toml and only setup.cfg, but the behaviour is essentially the same (the traceback looks slightly different but the error is the same) when using setup.py over pyproject.toml, both with pip but also if I invoke setup.py directly. The structure of the project is:
package
├── __init__.py
├── cli
│   ├── __init__.py
│   └── entry.py
└── file.py
I get the same behaviour if doing this in a virtual environment as when I do it with a userspace (--user) install.
Modifying the environment variable ${PYTHONPATH} fixes the issue, and installing the package in editable mode works just fine.
Turns out that the issue was that I had something like:
[options]
packages = find:
[options.packages.find]
include =
    README.md
in my setup.cfg, and it appears that the include declaration is exclusive, which led to the package not being included in the installation. It still worked when installed in editable mode (presumably because editable mode only sets up some sort of links or appends the source code directories to some path).
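For illustration, a corrected sketch (package* is a placeholder for the actual package name): in [options.packages.find], include filters the packages that find: discovers, so it must match package names, not data files like README.md:
[options]
packages = find:

[options.packages.find]
include =
    package*
Alternatively, dropping the include option entirely lets find: pick up every package it discovers.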

ImportError when using pytest in terminal [duplicate]

I used easy_install to install pytest on a Mac and started writing tests for a project with a file structure likes so:
repo/
|--app.py
|--settings.py
|--models.py
|--tests/
   |--test_app.py
Running py.test while in the repo directory, everything behaves as you would expect.
But when I try that same thing on either Linux or Windows (both have pytest 2.2.3 on them), it barks whenever it hits its first import of something from my application path. For instance, from app import some_def_in_app.
Do I need to be editing my PATH to run py.test on these systems?
I'm not sure why py.test does not add the current directory to the PYTHONPATH itself, but here's a workaround (to be executed from the root of your repository):
python -m pytest tests/
It works because Python adds the current directory to the PYTHONPATH for you.
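You can see this for yourself (illustrative): the interpreter's -c and -m switches put the invoking directory on sys.path, while the bare pytest entry point does not:
python -c "import sys; print(sys.path[0])"
This prints '' (i.e. the current directory), which is why import app succeeds under python -m pytest but may fail under a plain pytest.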
Recommended approach for pytest>=7: use the pythonpath setting
Recently, pytest has added a new core plugin that supports sys.path modifications via the pythonpath configuration value. The solution is thus much simpler now and doesn't require any workarounds anymore:
pyproject.toml example:
[tool.pytest.ini_options]
pythonpath = [
    "."
]
pytest.ini example:
[pytest]
pythonpath = .
The path entries are calculated relative to the rootdir, thus . adds the repo directory to sys.path in this case.
Multiple path entries are also allowed: for a layout
repo/
├── src/
│   └── lib.py
├── app.py
└── tests/
    ├── test_app.py
    └── test_lib.py
the configuration
[tool.pytest.ini_options]
pythonpath = [
    ".", "src",
]
or
[pytest]
pythonpath = . src
will add both app and lib modules to sys.path, so
import app
import lib
will both work.
Original answer (not recommended for recent pytest versions; use for pytest<7 only): conftest solution
The least invasive solution is adding an empty file named conftest.py in the repo/ directory:
$ touch repo/conftest.py
That's it. No need to write custom code for mangling sys.path, to remember to drag PYTHONPATH along, or to place __init__.py into dirs where it doesn't belong (using python -m pytest as suggested in Apteryx's answer is a good solution, though!).
The project directory afterwards:
repo
├── conftest.py
├── app.py
├── settings.py
├── models.py
└── tests
    └── test_app.py
Explanation
pytest looks for the conftest modules on test collection to gather custom hooks and fixtures, and in order to import the custom objects from them, pytest adds the parent directory of the conftest.py to sys.path (in this case the repo directory).
Other project structures
If you have other project structure, place the conftest.py in the package root dir (the one that contains packages but is not a package itself, so does not contain an __init__.py), for example:
repo
├── conftest.py
├── spam
│   ├── __init__.py
│   ├── bacon.py
│   └── egg.py
├── eggs
│   ├── __init__.py
│   └── sausage.py
└── tests
    ├── test_bacon.py
    └── test_egg.py
src layout
Although this approach can be used with the src layout (place conftest.py in the src dir):
repo
├── src
│   ├── conftest.py
│   ├── spam
│   │   ├── __init__.py
│   │   ├── bacon.py
│   │   └── egg.py
│   └── eggs
│       ├── __init__.py
│       └── sausage.py
└── tests
    ├── test_bacon.py
    └── test_egg.py
beware that adding src to PYTHONPATH defeats the purpose and benefits of the src layout! You will end up testing the code from the repository and not the installed package. If you need to do it, maybe you don't need the src dir at all.
Where to go from here
Of course, conftest modules are not just some files to help the source code discovery; it's where all the project-specific enhancements of the pytest framework and the customization of your test suite happen. pytest has a lot of information on conftest modules scattered throughout their docs; start with conftest.py: local per-directory plugins
Also, SO has an excellent question on conftest modules: In py.test, what is the use of conftest.py files?
I had the same problem. I fixed it by adding an empty __init__.py file to my tests directory.
Yes, the source folder is not in Python's path if you cd to the tests directory.
You have two choices:
Add the path manually to the test files. Something like this:
import sys, os
myPath = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, myPath + '/../')
Run the tests with the env var PYTHONPATH=../.
Run pytest itself as a module with:
python -m pytest tests
This happens when the project hierarchy is, for example, package/src and package/tests, and in tests you import from src. Executing as a module will consider imports as absolute rather than relative to the execution location.
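An illustrative comparison, assuming the package/src and package/tests layout just described (run from the package/ directory):
python -m pytest tests   # the current directory is put on sys.path, so `import src...` works
pytest tests             # may raise ModuleNotFoundError, depending on layout and pytest version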
You can run with PYTHONPATH in project root
PYTHONPATH=. py.test
Or install the package in editable mode with pip:
pip install -e . # install package using setup.py in editable mode
I had the same problem in Flask.
When I added:
__init__.py
to the tests folder, the problem disappeared :)
Probably the application couldn't recognize the tests folder as a package.
I created this as an answer to your question and my own confusion. I hope it helps. Pay attention to PYTHONPATH in both the py.test command line and in the tox.ini.
https://github.com/jeffmacdonald/pytest_test
Specifically: You have to tell py.test and tox where to find the modules you are including.
With py.test you can do this:
PYTHONPATH=. py.test
And with tox, add this to your tox.ini:
[testenv]
deps = -r{toxinidir}/requirements.txt
commands = py.test
setenv =
    PYTHONPATH = {toxinidir}
I fixed it by removing the top-level __init__.py in the parent folder of my sources.
I started getting weird ConftestImportFailure: ImportError('No module named ... errors when I had accidentally added __init__.py file to my src directory (which was not supposed to be a Python package, just a container of all source).
It is a bit of a shame that this is an issue in Python... But just adding this environment variable is the most comfortable way, IMO:
export PYTHONPATH=$PYTHONPATH:.
You can put this line in your .zshrc or .bashrc file.
I was having the same problem when following the Flask tutorial and I found the answer on the official Pytest documentation.
It's a little shift from the way I (and I think many others) are used to doing things.
You have to create a setup.py file in your project's root directory with at least the following two lines:
from setuptools import setup, find_packages
setup(name="PACKAGENAME", packages=find_packages())
where PACKAGENAME is your app's name. Then you have to install it with pip:
pip install -e .
The -e flag tells pip to install the package in editable or "develop" mode. So the next time you run pytest it should find your app in the standard PYTHONPATH.
I had a similar issue: pytest did not recognize a module installed in the environment I was working in. I resolved it by installing pytest into that same environment.
So if you run pytest from within a virtual environment, make sure pytest is installed in that virtual environment: activate it and run pip install pytest.
For me the problem was tests.py generated by Django along with tests directory. Removing tests.py solved the problem.
I got this error as I used relative imports incorrectly. In the OP example, test_app.py should import functions using e.g.
from repo.app import *
No matter how liberally __init__.py files are scattered around the file structure, this does not work and creates the kind of ImportError seen, unless the files and test files are in the same directory, in which case the import is simply:
from app import *
Here's an example of what I had to do with one of my projects. My project structure:
microbit/
microbit/activity_indicator/activity_indicator.py
microbit/tests/test_activity_indicator.py
To be able to access activity_indicator.py from test_activity_indicator.py I needed to:
start test_activity_indicator.py with the correct relative import:
from microbit.activity_indicator.activity_indicator import *
put __init__.py files throughout the project structure:
microbit/
microbit/__init__.py
microbit/activity_indicator/__init__.py
microbit/activity_indicator/activity_indicator.py
microbit/tests/__init__.py
microbit/tests/test_activity_indicator.py
According to a post on Medium by Dirk Avery (and supported by my personal experience) if you're using a virtual environment for your project then you can't use a system-wide install of pytest; you have to install it in the virtual environment and use that install.
In particular, if you have it installed in both places then simply running the pytest command won't work because it will be using the system install. As the other answers have described, one simple solution is to run python -m pytest instead of pytest; this works because it uses the environment's version of pytest. Alternatively, you can just uninstall the system's version of pytest; after reactivating the virtual environment the pytest command should work.
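A quick way to check which pytest would actually run (output paths illustrative):
which pytest        # e.g. /usr/bin/pytest -> the system-wide install
python -m pytest    # always uses the active interpreter's own pytest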
I was getting this error due to something even simpler (you could even say trivial). I hadn't installed the pytest module. So a simple apt install python-pytest fixed it for me.
pytest should be listed in setup.py as a test dependency. Make sure you install the test requirements as well.
Since no one has suggested it, you could also pass the path to the tests in your pytest.ini file:
[pytest]
...
testpaths = repo/tests
See documentation: https://docs.pytest.org/en/6.2.x/customize.html#pytest-ini
Side effect for Visual Studio Code: it should pick up the unit test in the UI.
We have fixed the issue by adding the following environment variable.
PYTHONPATH=${PYTHONPATH}:${PWD}/src:${PWD}/test
As pointed out by Luiz Lezcano Arialdi, the correct solution is to install your package as an editable package.
Since I am using Pipenv, I thought I would add to his answer a step-by-step guide on how to install the current path as an editable package with Pipenv, allowing you to run pytest without any mangling code or loose files.
You will need to have the following minimal folder structure (documentation):
package/
    package/
        __init__.py
        module.py
    tests/
        module_test.py
    setup.py
setup.py mostly has the following minimum code (documentation):
import setuptools

setuptools.setup(
    name='package',  # Change to your package name
    packages=setuptools.find_packages()
)
Then you just need to run pipenv install --dev -e . and Pipenv will install the current path as an editable package (the --dev flag is optional) (documentation).
Now you should be able to run pytest without problems.
If this pytest error appears not for your own package, but for a Git-installed package in your package's requirements.txt, the solution is to switch to editable installation mode.
For example, suppose your package's requirements.txt had the following line:
git+https://github.com/foo/bar.git
You would instead replace it with the following:
-e git+https://github.com/foo/bar.git#egg=bar
If nothing works, make sure your test_module.py is placed under the correct directory in the tests tree.
Sometimes you will get ModuleNotFoundError not because modules are misplaced or because export PYTHONPATH="${PWD}:${PYTHONPATH}" is not working, but because test_module.py is placed in the wrong directory under the tests folder.
The mapping should be 1-to-1, recursively: the root folder should be named "tests", the tests tree should mirror the source tree, and the name of each file that contains test code should start with "test_".
For example:
./nlu_service/models/transformers.py
./tests/models/test_transformers.py
This was my experience: very often the tests were interrupted because a module could not be imported.
After research, I found out that the system was looking for the file in the wrong place, and that we can easily overcome the problem by copying the file containing the module into the same folder as stated, so that it is properly imported.
Another solution proposal would be to change the declaration of the import and show MutPy the correct path of the unit. However, because multiple units can have this dependency, meaning we would also need to commit changes in their declarations, we preferred to simply move the unit to the folder.
My solution:
Create the conftest.py file in the test directory containing:
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.realpath(__file__)) + "/relative/path/to/code/")
This will add the folder of interest to the Python interpreter path without modifying every test file, setting environment variable or messing with absolute/relative paths.

How to make setup.py for a standalone Python application in the right way?

I have read several similar topics but haven't succeeded yet. I feel I am missing or misunderstanding some fundamental thing, and that this is the reason for my failure.
I have an 'application' written in Python which I want to deploy with the help of a standard setup.py. Due to its complex functionality it consists of different Python modules, but there is no sense in releasing these modules separately, as they are too specific.
The expected result is to have the package installed in the system with the help of pip install and available from the OS command line via a simple app command.
Simplifying long story to reproducible example - I have following directory structure:
<root>
├─ app
│  ├─ aaa
│  │  └── module_a.py
│  ├─ bbb
│  │  └── module_b.py
│  └── app.py
├─ docs
│  └── .....
├─ tests
│  └── .....
└─ setup.py
Below is code of modules:
app.py
#!/usr/bin/python
from aaa.module_a import method1
from bbb.module_b import method2

def main():
    print("APP main executed")
    method1()
    method2()

if __name__ == '__main__':
    main()
module_a.py
def method1():
    print("A1 executed")
module_b.py
def method2():
    print("B2 executed")
When I run app.py from console it works fine and gives expected output:
APP main executed
A1 executed
B2 executed
So, this simple 'application' works fine and I want to distribute it with help of following
setup.py
from setuptools import setup

setup(
    name="app",
    version="1.0",
    packages=['app', 'app.aaa', 'app.bbb'],
    package_dir={'app': 'app'},
    entry_points={
        'console_scripts': ['app=app.app:main', ]
    }
)
Again, everything looks good and test installation looks good:
(venv) [user@test]$ pip install <root>
Processing /home/user/<root>
Using legacy 'setup.py install' for app, since package 'wheel' is not installed.
Installing collected packages: app
Running setup.py install for app ... done
Successfully installed app-1.0
(venv) [user@test]$
And now comes the problem. With the aforementioned entry_points from setup.py, I expect to be able to execute my application with the app command. Indeed it works, but the application itself fails with this error message:
File "/test/venv/lib/python3.9/site-packages/app/app.py", line 3, in <module>
from aaa.module_a import method1
ModuleNotFoundError: No module named 'aaa'
I understand the reason for the error: pip install put the directories aaa and bbb together with app.py in one directory, app. I.e. from this point of view, app.py should use import app.aaa instead of import aaa. But if I do so, then my app fails during development with:
ModuleNotFoundError: No module named 'app.aaa'; 'app' is not a package
which is also logical, as there is no app package available at that time... (it is under development and isn't installed in the system...)
Finally, the question is: what is the correct way to create the directory structure and setup.py for a standalone Python application that consists of several of its own modules?
UPD
The most promising result (though proved to be wrong after discussion in the comments) that I have now came after the following changes:
moved app.py from <root>/app into <root> itself
referenced it in setup.py by py_modules=['app']
changed imports from import aaa.method1 to import app.aaa.method1 etc.
This way the package works both in my development environment and after installation.
But I got a problem with entry_points: I see no way to configure an entry point to use main() from app.py, which is not a part of the app package but a separate module...
I.e. new structure is
<root>
├─ app
│  ├─ aaa
│  │  └── module_a.py
│  ├─ bbb
│  │  └── module_b.py
│  └── __init__.py
├─ docs
│  └── .....
├─ tests
│  └── .....
├─ app.py
└─ setup.py
I.e. the logic here is to have two separate entities:
An empty package app (consisting of __init__.py only) with subpackages aaa, bbb, etc.
A script app.py that uses functions from the subpackages app.aaa, app.bbb
But as I wrote, I see no way to define an entry point for app.py that allows running it from the OS command line directly.
With that directory (package) structure, in your app.py you should import in one of the following ways:
from app.aaa.module_a import method1
from .aaa.module_a import method1
Then make sure to call your application in one of the following ways:
app
(this should work thanks to the console entry point)
python -m app.app
(this should work even without the console entry point)
I tried to recreate the complete project here.
Directory structure:
.
├── app
│   ├── aaa
│   │   └── module_a.py
│   ├── app.py
│   └── bbb
│       └── module_b.py
└── setup.py
setup.py
import setuptools

setuptools.setup(
    name="app",
    version="1.0",
    packages=['app', 'app.aaa', 'app.bbb'],
    entry_points={
        'console_scripts': ['app=app.app:main', ]
    },
)
app/app.py
#!/usr/bin/python
from .aaa.module_a import method1
from .bbb.module_b import method2

def main():
    print("APP main executed")
    method1()
    method2()

if __name__ == '__main__':
    main()
app/aaa/module_a.py
def method1():
    print("A1 executed")
app/bbb/module_b.py
def method2():
    print("B2 executed")
Then I run following commands:
$ python3 -V
Python 3.6.9
$ python3 -m venv .venv
$ .venv/bin/python -m pip install -U pip setuptools wheel
# [...]
$ .venv/bin/python -m pip list
Package       Version
------------- -------------------
pip           20.3.3
pkg-resources 0.0.0
setuptools    51.1.0.post20201221
wheel         0.36.2
$ .venv/bin/python -m pip install .
# [...]
$ .venv/bin/python -m app.app
APP main executed
A1 executed
B2 executed
$ .venv/bin/app
APP main executed
A1 executed
B2 executed
$ .venv/bin/python -m pip uninstall app
# [...]
$ .venv/bin/python -m pip install --editable .
# [...]
$ .venv/bin/python -m app.app
APP main executed
A1 executed
B2 executed
$ .venv/bin/app
APP main executed
A1 executed
B2 executed
The answer from sinoroc is mostly right; he provided a correct example that highlights all the options and shows a correct way to structure a Python package. But the package should first be installed into a venv before any run. The pip install --editable option is then required in order to continue developing/debugging the package inside an IDE (it installs the package in the venv but keeps the source files in their original place).
After a long discussion in the comments I came to the page An Overview of Packaging for Python, which explains all the options and highlights that Python's native packaging is mostly built for distributing reusable code, called libraries. I assume this is the reason for my misunderstanding and the initial question.
As an alternative solution I plan to play with PEP 441 -- Improving Python ZIP Application Support, which looks like the correct approach for my case.
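For instance, a minimal zipapp invocation for the layout in this question might look like the following (an untested sketch: app is the package directory from above, main is the function defined in app/app.py, and the flags are the stdlib zipapp module's):
python -m zipapp app -m "app:main" -o app.pyz -p "/usr/bin/env python3"
./app.pyz
zipapp copies the contents of the app directory into the archive root, so app.py's original import style (from aaa.module_a import method1) works inside the archive.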

Python setup.py for unusual folder structure [duplicate]

I have a Git repository cloned into myproject, with an __init__.py at the root of the repository, making the whole thing an importable Python package.
I'm trying to write a setuptools setup.py for the package, which will also sit in the root of the repository, next to the __init__.py file. I want setup.py to install the directory it resides in as a package. It's fine if setup.py itself comes along as part of the installation, but it would be better if it didn't. Ideally this should work also in editable mode (pip install -e .)
Is this configuration at all supported? I can kind of make it work by passing a package_dir={"": ".."} argument to setup(), telling it to look for myproject in the directory above the current one. However, this requires the package to always be installed from a directory named myproject, which does not appear to be the case if, say, it's being installed through pip, or if someone is working out of a Git clone named myproject-dev, or in any number of other cases.
Another hack I'm contemplating is a symlink to . named mypackage inside of the repository. That ought to work, but I wanted to check if there was a better way first.
See also Create editable package setup.py in the same root folder as __init__.py
As far as I know this should work:
myproject-dev/
├── __init__.py
├── setup.py
└── submodule
    └── __init__.py
#!/usr/bin/env python3
import setuptools

setuptools.setup(
    name='MyProject',
    version='0.0.0.dev0',
    packages=['myproject', 'myproject.submodule'],
    package_dir={
        'myproject': '.',
    },
)
One way to make this work for editable or develop installations is to manually modify the easy-install.pth file.
Assuming:
the project lives in: /home/user/workspace/empty/project;
a virtual environment .venv is used;
the project is installed with python3 -m pip install -e . or python3 setup.py develop;
the Python version is 3.6.
Then:
the file is found at a location such as /home/user/workspace/empty/project/.venv/lib/python3.6/site-packages/easy-install.pth;
its content is: /home/user/workspace/empty/project.
In order to let the imports work as expected one can edit this line to read the following:
/home/user/workspace/empty
Note:
Everything in /home/user/workspace/empty that looks like a Python package is then susceptible to be imported, that is why it is a good idea to place the project in its own directory, in this case the directory empty contains nothing else but the directory project.
The module project.setup is also importable.
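Concretely, the edit looks like this (same paths as above):
# .venv/lib/python3.6/site-packages/easy-install.pth
# before the edit:
/home/user/workspace/empty/project
# after the edit:
/home/user/workspace/empty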

How do I permanently add paths to PYTHONPATH in a script? [duplicate]

I've tried reading through questions about sibling imports and even the
package documentation, but I've yet to find an answer.
With the following structure:
├── LICENSE.md
├── README.md
├── api
│   ├── __init__.py
│   ├── api.py
│   └── api_key.py
├── examples
│   ├── __init__.py
│   ├── example_one.py
│   └── example_two.py
└── tests
    ├── __init__.py
    └── test_one.py
How can the scripts in the examples and tests directories import from the
api module and be run from the command line?
Also, I'd like to avoid the ugly sys.path.insert hack for every file. Surely
this can be done in Python, right?
Tired of sys.path hacks?
There are plenty of sys.path.append hacks available, but I found an alternative way of solving the problem at hand.
Summary
Wrap the code into one folder (e.g. packaged_stuff)
Create setup.py script where you use setuptools.setup(). (see minimal setup.py below)
Pip install the package in editable state with pip install -e <myproject_folder>
Import using from packaged_stuff.modulename import function_name
Setup
The starting point is the file structure you have provided, wrapped in a folder called myproject.
.
└── myproject
    ├── api
    │   ├── api_key.py
    │   ├── api.py
    │   └── __init__.py
    ├── examples
    │   ├── example_one.py
    │   ├── example_two.py
    │   └── __init__.py
    ├── LICENCE.md
    ├── README.md
    └── tests
        ├── __init__.py
        └── test_one.py
I will call . the root folder; in my example case it is located at C:\tmp\test_imports\.
api.py
As a test case, let's use the following ./api/api.py
def function_from_api():
    return 'I am the return value from api.api!'
test_one.py
from api.api import function_from_api

def test_function():
    print(function_from_api())

if __name__ == '__main__':
    test_function()
Try to run test_one:
PS C:\tmp\test_imports> python .\myproject\tests\test_one.py
Traceback (most recent call last):
File ".\myproject\tests\test_one.py", line 1, in <module>
from api.api import function_from_api
ModuleNotFoundError: No module named 'api'
Trying relative imports won't work either: using from ..api.api import function_from_api would result in
PS C:\tmp\test_imports> python .\myproject\tests\test_one.py
Traceback (most recent call last):
File ".\tests\test_one.py", line 1, in <module>
from ..api.api import function_from_api
ValueError: attempted relative import beyond top-level package
Steps
Make a setup.py file in the root level directory
The contents for the setup.py would be*
from setuptools import setup, find_packages
setup(name='myproject', version='1.0', packages=find_packages())
Use a virtual environment
If you are familiar with virtual environments, activate one and skip to the next step. Usage of virtual environments is not absolutely required, but they will really help you out in the long run (when you have more than one project ongoing...). The most basic steps are (run in the root folder)
Create virtual env
python -m venv venv
Activate virtual env
source ./venv/bin/activate (Linux, macOS) or ./venv/Scripts/activate (Win)
To learn more about this, just Google "python virtual env tutorial" or similar. You probably never need any other commands than creating, activating and deactivating.
Once you have made and activated a virtual environment, your console should give the name of the virtual environment in parentheses
PS C:\tmp\test_imports> python -m venv venv
PS C:\tmp\test_imports> .\venv\Scripts\activate
(venv) PS C:\tmp\test_imports>
and your folder tree should look like this**
.
├── myproject
│   ├── api
│   │   ├── api_key.py
│   │   ├── api.py
│   │   └── __init__.py
│   ├── examples
│   │   ├── example_one.py
│   │   ├── example_two.py
│   │   └── __init__.py
│   ├── LICENCE.md
│   ├── README.md
│   └── tests
│       ├── __init__.py
│       └── test_one.py
├── setup.py
└── venv
    ├── Include
    ├── Lib
    ├── pyvenv.cfg
    └── Scripts [87 entries exceeds filelimit, not opening dir]
pip install your project in editable state
Install your top level package myproject using pip. The trick is to use the -e flag when doing the install. This way it is installed in an editable state, and all the edits made to the .py files will be automatically included in the installed package.
In the root directory, run
pip install -e . (note the dot, it stands for "current directory")
You can also see that it is installed by using pip freeze
(venv) PS C:\tmp\test_imports> pip install -e .
Obtaining file:///C:/tmp/test_imports
Installing collected packages: myproject
Running setup.py develop for myproject
Successfully installed myproject
(venv) PS C:\tmp\test_imports> pip freeze
myproject==1.0
Add myproject. into your imports
Note that you will have to add myproject. only to imports that would not work otherwise. Imports that worked without the setup.py & pip install will still work fine. See an example below.
Test the solution
Now, let's test the solution using api.py defined above, and test_one.py defined below.
test_one.py
from myproject.api.api import function_from_api

def test_function():
    print(function_from_api())

if __name__ == '__main__':
    test_function()
running the test
(venv) PS C:\tmp\test_imports> python .\myproject\tests\test_one.py
I am the return value from api.api!
* See the setuptools docs for more verbose setup.py examples.
** In reality, you could put your virtual environment anywhere on your hard disk.
Seven years after
Since I wrote the answer below, modifying sys.path is still a quick-and-dirty trick that works well for private scripts, but there have been several improvements:
Installing the package (in a virtualenv or not) will give you what you want, though I would suggest using pip to do it rather than using setuptools directly (and using setup.cfg to store the metadata)
Using the -m flag and running as a package works too (but will turn out a bit awkward if you want to convert your working directory into an installable package).
For the tests, specifically, pytest is able to find the api package in this situation and takes care of the sys.path hacks for you
So it really depends on what you want to do. In your case, though, since it seems that your goal is to make a proper package at some point, installing through pip install -e is probably your best bet, even if it is not perfect yet.
Old answer
As already stated elsewhere, the awful truth is that you have to do ugly hacks to allow imports from sibling modules or the parent package from a __main__ module. The issue is detailed in PEP 366. PEP 3122 attempted to handle imports in a more rational way, but Guido rejected it on the account of
The only use case seems to be running scripts that happen
to be living inside a module's directory, which I've always seen as an
antipattern.
(here)
Though, I use this pattern on a regular basis with
# Ugly hack to allow absolute import from the root folder
# whatever its name is. Please forgive the heresy.
if __name__ == "__main__" and __package__ is None:
    from sys import path
    from os.path import dirname as dir

    path.append(dir(path[0]))
    __package__ = "examples"

import api
Here path[0] is your running script's parent folder and dir(path[0]) your top level folder.
I have still not been able to use relative imports with this, though, but it does allow absolute imports from the top level (in your example api's parent folder).
Here is another alternative that I insert at top of the Python files in tests folder:
# Path hack.
import sys, os
sys.path.insert(0, os.path.abspath('..'))
You don't need and shouldn't hack sys.path unless it is necessary and in this case it is not. Use:
import api.api_key # in tests, examples
Run from the project directory: python -m tests.test_one.
You should probably move tests (if they are api's unittests) inside api and run python -m api.test to run all tests (assuming there is __main__.py) or python -m api.test.test_one to run test_one instead.
You could also remove __init__.py from examples (it is not a Python package) and run the examples in a virtualenv where api is installed e.g., pip install -e . in a virtualenv would install inplace api package if you have proper setup.py.
I don't yet have the comprehension of Pythonology necessary to see the intended way of sharing code amongst unrelated projects without a sibling/relative import hack. Until that day, this is my solution. For examples or tests to import stuff from ..\api, it would look like:
import sys
import os.path

# Import from sibling directory ..\api
sys.path.append(os.path.dirname(os.path.abspath(__file__)) + "/..")

import api.api
import api.api_key
For sibling package imports, you can use either the insert or the append method of the sys.path list:

if __name__ == '__main__' and __package__ is None:
    import sys
    from os import path
    sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))
    import api
This will work if you are launching your scripts as follows:
python examples/example_one.py
python tests/test_one.py
On the other hand, you can also use the relative import:
if __name__ == '__main__' and __package__ is not None:
    from ..api import api
In this case you will have to launch your script with the '-m' argument (note that, in this case, you must not give the '.py' extension):
python -m packageName.examples.example_one
python -m packageName.tests.test_one
Of course, you can mix the two approaches, so that your script will work no matter how it is called:
if __name__ == '__main__':
    if __package__ is None:
        import sys
        from os import path
        sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))
        import api
    else:
        from ..api import api
For readers in 2021: If you're not confident with pip install -e :
Consider this hierarchy, as recommended by an answer from Relative imports in Python 3:
MyProject
├── src
│   ├── bot
│   │   ├── __init__.py
│   │   ├── main.py
│   │   └── sib1.py
│   └── mod
│       ├── __init__.py
│       └── module1.py
└── main.py
The content of main.py, which is the starting point; we use an absolute import (no leading dots) here:
from src.bot import main

if __name__ == '__main__':
    main.magic_tricks()
The content of bot/main.py, which takes advantage of explicit relative imports:
from .sib1 import my_drink            # Both are explicit relative imports.
from ..mod.module1 import relative_magic

def magic_tricks():
    # Using sub-magic (keyword renamed here: `in` is a reserved word in Python)
    relative_magic(inputs=["newbie", "pain"], advice="cheer_up")
    my_drink()
    # Do your work
    ...
Now here comes the reasoning:
When executing python MyProject/main.py, path/to/MyProject is added to sys.path.
The absolute import from src.bot import main resolves against it.
The from ..mod part means it will go up one level to MyProject/src.
Can we see it? YES, since path/to/MyProject is on sys.path.
So the point is:
We should put the main script next to MyProject/src: when doing relative referencing we won't go outside of src, and the absolute import src. provides the just-fit scope for us: the src/ scope.
See also: ModuleNotFoundError: No module named 'sib1'
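A quick way to verify this reasoning (an illustrative check, not part of the original answer): temporarily add a print to MyProject/main.py and run it from the directory containing MyProject:
import sys
print(sys.path[0])   # .../MyProject -- the script's own directory
Since MyProject is the first sys.path entry, the src package inside it is importable.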
TLDR
This method does not require setuptools, path hacks, additional command line arguments, or specifying the top level of the package in every single file of your project.
Just make a script in the parent directory of whatever you are calling to be your __main__ and run everything from there. For further explanation, continue reading.
Explanation
This can be accomplished without hacking a new path together, extra command line args, or adding code to each of your programs to recognize its siblings.
The reason this fails, as I believe was mentioned before, is that the programs being called have their __name__ set to __main__. When this occurs, the script being called accepts itself to be on the top level of the package and refuses to recognize scripts in sibling directories.
However, everything under the top level of the directory will still recognize ANYTHING ELSE under the top level. This means the ONLY thing you have to do to get files in sibling directories to recognize/utilize each other is to call them from a script in their parent directory.
Proof of Concept
In a dir with the following structure:
.
|__Main.py
|
|__Siblings
   |
   |___sib1
   |   |
   |   |__call.py
   |
   |___sib2
       |
       |__callsib.py
Main.py contains the following code:
import Siblings.sib1.call as call

def main():
    call.Call()

if __name__ == '__main__':
    main()
sib1/call.py contains:
import Siblings.sib2.callsib as callsib

def Call():
    callsib.CallSib()

if __name__ == '__main__':
    Call()
and sib2/callsib.py contains:
def CallSib():
    print("Got Called")

if __name__ == '__main__':
    CallSib()
If you reproduce this example you will notice that calling Main.py will result in "Got Called" being printed as is defined in sib2/callsib.py even though sib2/callsib.py got called through sib1/call.py. However if one were to directly call sib1/call.py (after making appropriate changes to the imports) it throws an exception. Even though it worked when called by the script in its parent directory, it will not work if it believes itself to be on the top level of the package.
You need to look to see how the import statements are written in the related code. If examples/example_one.py uses the following import statement:
import api.api
...then it expects the root directory of the project to be in the system path.
The easiest way to support this without any hacks (as you put it) would be to run the examples from the top level directory, like this:
PYTHONPATH=$PYTHONPATH:. python examples/example_one.py
Just in case someone using Pydev on Eclipse ends up here: you can add the sibling's parent path (and thus the calling module's parent) as an external library folder using Project->Properties and setting External Libraries under the left menu Pydev-PYTHONPATH. Then you can import from your sibling, e.g. from sibling import some_class.
I wanted to comment on the solution provided by np8 but I don't have enough reputation so I'll just mention that you can create a setup.py file exactly as they suggested, and then do pipenv install --dev -e . from the project root directory to turn it into an editable dependency. Then your absolute imports will work e.g. from api.api import foo and you don't have to mess around with system-wide installations.
Documentation
If you're using pytest then the pytest docs describe a method of how to reference source packages from a separate test package.
The suggested project directory structure is:
setup.py
src/
    mypkg/
        __init__.py
        app.py
        view.py
tests/
    __init__.py
    foo/
        __init__.py
        test_view.py
    bar/
        __init__.py
        test_view.py
Contents of the setup.py file:
from setuptools import setup, find_packages
setup(name="PACKAGENAME", packages=find_packages())
Install the packages in editable mode:
pip install -e .
The pytest article references this blog post by Ionel Cristian Mărieș.
I made a sample project to demonstrate how I handled this, which is indeed another sys.path hack as indicated above: Python Sibling Import Example, which relies on:
if __name__ == '__main__':
    import os
    import sys
    sys.path.append(os.getcwd())
This seems to be pretty effective so long as your working directory remains at the root of the Python project.
In your main file, add this:

import sys
import os

mainScriptDepth = "../../"  # the depth of the main file from the root of the project
sys.path.append(os.path.abspath(os.path.join(__file__, mainScriptDepth)))

Here mainScriptDepth is the depth of the main file from the root of the project; in your case it is "../../". Then you can import by specifying the path from the root of your project (e.g. from api.api import *).
The problem:
You simply cannot get import mypackage to work in test.py. You will need either an editable install, a change to the path, or changes to __name__ and path.
demo
├── dev
│ └── test.py
└── src
└── mypackage
├── __init__.py
└── module_of_mypackage.py
--------------------------------------------------------------
ValueError: attempted relative import beyond top-level package
The solution:
import sys; sys.path += [sys.path[0][:-3]+"src"]
Put the above before attempting imports in test.py. That's it. You can now import mypackage.
This will work both on Windows and Linux. It will also not care from which path you run your script. It is short enough to slap it anywhere you might need it.
Why it works:
The sys.path contains the places, in order, where Python looks for packages when attempting imports, if they are not found in the installed site packages. When you run test.py, the first item in sys.path will be something like /mnt/c/Users/username/Desktop/demo/dev, i.e. where you ran your file. The one-liner simply adds the sibling folder to the path, and everything works. You will not have to worry about Windows vs. Linux file paths, since we are only editing the last folder name and nothing else. If your project structure is already set in stone for your repository, we can also reasonably just use the magic number 3 to slice away dev and substitute src.
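Unrolled, the one-liner does this (illustrative, using the demo layout above):
import sys
script_dir = sys.path[0]            # e.g. /mnt/c/Users/username/Desktop/demo/dev
src_dir = script_dir[:-3] + "src"   # slice away the 3 characters of "dev", append "src"
sys.path += [src_dir]               # now `import mypackage` resolves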
For the main question:

Import a sibling folder as a module:
from .. import siblingfolder

Import a_file.py from a sibling folder as a module:
from ..siblingfolder import a_file

Import a function inside a file in a sibling folder:
from ..siblingfolder.a_file import func_name_exists_in_a_file
The easiest way:
Go to your lib/site-packages folder.
If an easy-install.pth file exists, just edit it and add the directory containing the script that you want to make available as a module.
If it does not exist, create one and put your folder in it.
After you add it, Python will automatically treat that folder much like site-packages, and you can import every script in that folder or its subfolders as a module.
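For example (path and file name illustrative), any *.pth file in site-packages is processed at interpreter startup, not just easy-install.pth:
# .venv/lib/python3.10/site-packages/my_extra_paths.pth
/home/user/workspace/myproject
Each non-comment line names a directory that gets added to sys.path.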
First, you should avoid having files with the same name as the module itself. It may break other imports.
When you import a file, the interpreter first checks the current directory and then searches the global directories.
Inside examples or tests you can call:
from ..api import api
Project
├── User
│   ├── about.py
│   └── __init__.py
└── Tech
    ├── info.py
    └── __init__.py
Now, if you want to access the about.py module in the User package from the info.py module in the Tech package, you have to bring the cmd (in Windows) path to Project, i.e. C:\Users\Personal\Desktop\Project>, as per the above package example. From this path you have to enter python -m Package_name.module_name.
For example, for the above package we have to do:
C:\Users\Personal\Desktop\Project>python -m Tech.info
Important points:
Don't use the .py extension after the info module, i.e. not python -m Tech.info.py.
Do this where the sibling packages are at the same level.
-m is the flag for running a module as a script; to read about it you can type python --help from the cmd.
