Forgive me for another "relative imports" post but I've read every stack overflow post about them and I'm still confused.
I have a file structure that looks something like this:
|---MyProject
| |---constants.py
| |---MyScripts
| | |---script.py
Inside script.py I have: from .. import constants
And of course when I run python3 script.py, I get the error ImportError: attempted relative import with no known parent package.
From what I understand, this doesn't work because you can't use relative imports when you run a Python file directly. After hours of searching, the most popular solution seems to be to add "../" to sys.path and then do import constants, but this feels a little hacky. I've got to believe there is a better solution than this, right? Importing constants has got to be a fairly common task.
I've also seen some people recommend adding __init__.py to make the project into a package but I don't really understand how this solves anything?
Are there other solutions out there that I've either missed or just don't understand?
Your library code should live in a package. For your folder to be a package, it needs to have __init__.py.
Your scripts can live in the top directory in development. Here is how you would structure this layout:
|---MyProject
| |---myproject
| | |---__init__.py
| | |---constants.py
| |---script.py
If you were to productionize this, you could place scripts into a special folder, but then they would rely on the package being installed on your system. The layout could look like this:
|---MyProject
| |---myproject
| | |---__init__.py
| | |---constants.py
| |---scripts
| | |---script.py
In both cases your imports would be normal absolute (non-relative) imports like
from myproject.constants import ...
OR
from myproject import constants
You are correct that attempting a relative import or a path modification in a standalone script is hacky. If you want to use the more flexible second layout, make a setup.py install script in the root folder, and run python setup.py develop.
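A minimal setup.py for that layout could look something like this (a sketch; the name and version are placeholders):
from setuptools import setup, find_packages

setup(
    name="myproject",          # placeholder project name
    version="0.1",             # placeholder version
    packages=find_packages(),  # picks up the myproject package
)
After python setup.py develop (or, equivalently, pip install -e .), the myproject package is importable anywhere in that environment, so scripts/script.py can use the absolute imports shown above.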
|---MyProject
| |---__init__.py # File1
| |---constants.py
| |---MyScripts
| | |---__init__.py # File 2
| | |---script.py
And then the content of File1 __init__.py is:
from . import constants
from . import MyScripts
And then the content of File2 __init__.py is:
from . import script
By converting MyProject to a package, you would be able to import constants like:
from MyProject import constants
# Rest of your code
After that, you need to add the root of MyProject to your Python path. If you are using PyCharm, you do not need to do anything; by default the following lines are added to your Python Console or debugging sessions:
import sys
sys.path.extend(['path/to/MyProject'])
If you are not using PyCharm, one option is to add the lines above at the top of the code you run, or add them as code to run before every debugging session; alternatively, define a setup.py for your MyProject package and install it in your environment, and nothing else needs to change.
The syntax from .. import x is only intended to be used inside packages (thus your error message). If you don't want to make a package, the other way to accomplish what you want is to manipulate the path.
In absolute terms, you can do:
import sys
sys.path.append(r'c:\here\is\MyProject')
import constants
where "MyProject" is the folder you described.
If you want to use relative paths, you need to do a little more work:
import inspect
import os
import sys
parent_dir = os.path.split(
    os.path.dirname(inspect.getfile(inspect.currentframe())))[0]
sys.path.append(parent_dir)
import constants
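For what it's worth, a shorter equivalent that avoids inspect (assuming the file is run as a script, so __file__ is available):
import os
import sys

# append the directory one level above this file (i.e. MyProject) to sys.path
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import constants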
To put it concisely, the error message is telling you exactly what's happening; it's just not explaining why. .. isn't a package. It resolves to MyProject, but MyProject isn't a package. To make it a package, you must include an __init__.py file directly in the MyProject directory. It can even be an empty file if you're feeling lazy.
But given what you've shown us, it doesn't seem like your project is supposed to be a package. Try changing the import in script.py to import constants and running python3 -m MyScripts.script from your project directory; with -m, the current directory (MyProject) ends up on sys.path, so constants can be found.
I created a Python package that I use for brain preprocessing, named bpt. I would also like to use the same package as a sub-module in another project named brain_seg.
As a result, I am having problems with import statements, which result in errors either when running the package standalone or when it is nested.
The structure of the bpt package is as follows:
bpt
|--__init__.py
|--module1.py
|--module2.py
|--run.py
Import statements in the file run.py look something like:
import os
import module1
import module2
.
.
.
The structure of the brain_seg package is as follows:
brain_seg
|--bpt
| |--__init__.py
| |--module1.py
| |--module2.py
| |--run.py
|
|--package2
| |--__init__.py
| |--module1a.py
|
|--__init__.py
When running the run.py script as part of the standalone bpt package, everything runs as expected.
If I try to run the same script as part of the brain_seg package, the following error is raised:
ModuleNotFoundError: No module named 'module1'
I tried to use relative imports so the nested project will work as expected, resulting in the following run.py file:
import os
from . import module1
from . import module2
.
.
.
Following the change, execution of the brain_seg project worked as expected, but when I tried to call the run.py script from the standalone bpt package, the following error was raised:
ImportError: attempted relative import with no known parent package
How should I handle the imports such that both options will be supported?
Imports in bpt.run must be absolute or relative:
from bpt.module1 import ... # absolute
from .module1 import ... # relative
To run bpt.run from your repository root with the path set up correctly, use -m:
python bpt/run.py # bad
python -m bpt.run # good
If brain_seg is not truly meant to be a package itself, but just a project consisting of multiple packages and modules, get rid of brain_seg/__init__.py.
If you do mean to use it as a package, then move it below the repo root, i.e.
- README.md (etc.)
- .gitignore (etc.)
- brain_seg
  - __init__.py
  - bpt
    - __init__.py
    - run.py
and use e.g. python -m brain_seg.bpt.run
from the repository root.
If you also intend to use bpt as a standalone thing, I wouldn't recommend keeping it in the brain_seg hierarchy at all, but to make it pip installable (see https://packaging.python.org/tutorials/packaging-projects/) and then pip install -e ../somewhere/bpt to have pip set up an editable link between your projects.
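With that in place, run.py can keep the same absolute imports whether bpt is used standalone or nested inside brain_seg (a sketch; it assumes bpt has been installed, e.g. with pip install -e, or is run via -m from a directory where it can be found):
# bpt/run.py
from bpt import module1
from bpt import module2

# ... rest of run.py unchanged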
I have looked up solutions online for this issue, but none of them seem to address my specific situation.
I have a PyCharm project with multiple directories, subdirectories and files. When I invoke the main entry point from the command line, I get ModuleNotFoundError on the imports across ~20 files. The solutions I found online recommend modifying the PYTHONPATH. This is not a viable solution for my use case because:
I would have to add a sys.path.append call in all my files. That's the only way it seems to work.
I cannot use any third party libs.
I will be sharing the project as a zip file. Note: I cannot use GitHub to share the project, nor is my intention to create a distributable. So when someone else unzips the project on their PC, they should be able to run it from the command line without any issues. They should not have to modify their environment variables to run it.
What are my options?
EDIT:
Project structure:
Project
|
|--- mod
|    |--- data
|    |    |--- ClassA.py
|    |    |--- ClassB.py
|    |
|    |--- config
|    |    |--- ClassC.py
|    |    |--- ClassD.py
|
|--- ...
|--- main.py
Import statements look like this:
from mod.data.ClassA import ClassA
Error:
ModuleNotFoundError: No module named 'ClassA'
This error shows up for every import statement in my python files.
When I add sys.path.append it works, but I have to do it for every import statement, and it's a hard-coded absolute path that anyone with the zip would need to update.
If your modules are contained in the same directory as your executable, as you seem to indicate, it's this simple:
File structure:
f1.py
m1/
    __init__.py
    C1.py
C1.py
class C1:
    def test(self):
        print("Hi!")
f1.py
#!/usr/bin/env python
from m1.C1 import C1
C1().test()
Execution:
> python f1.py
Hi!
You do have __init__.py files in your module directories, right? If not, then that's probably why you're having trouble with this.
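Applied to the structure in your edit, that would mean something like this (a sketch; it assumes main.py stays at the project root and is the file you invoke):
Project
|--- main.py
|--- mod
|    |--- __init__.py
|    |--- data
|    |    |--- __init__.py
|    |    |--- ClassA.py
|    |    |--- ClassB.py
|    |--- config
|         |--- __init__.py
|         |--- ClassC.py
|         |--- ClassD.py
With those __init__.py files in place and imports written as from mod.data.ClassA import ClassA, running python main.py from the Project directory should find the modules without anyone touching PYTHONPATH.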
I'm trying to build a simple project in Python; however, I can't seem to wrap my head around how it's done, despite reading the documentation over and over and reading example code.
I'm semi-new to Python, having a background of only single-file scripting.
However I made numerous Node.js and Java projects with multiple folders and packages.
In Python, however, I can't seem to grasp the project structure and workflow mechanisms: __init__.py, virtualenvs, setuptools, etc. It all seems very alien and unintuitive to me, hurting my overall productivity (ironically, Python is supposed to be a language aimed at productivity, right?).
My project has this example structure:
package_test (package)
|
|------- __init__.py
|------- main.py (entry point)
|
|------- /lib (modules)
| |
| |----- __init__.py
| |----- lib.py
|
|------- /Tests
| |
| |----- __init__.py
| |----- test.py
in main.py:
from lib import lib
lib.libtest()
in lib.py:
def libtest():
    print("libtest")
in test.py:
from lib import lib
lib.libtest()
Running main.py works; however, when running test.py it cannot find the module. I have tried solutions such as appending to sys.path, putting '..' before lib, and more; none have worked.
This is just an example, but I wish to develop more complex projects in Python with multiple subfolders in the future(I think Python has some cool features and nice libraries), yet this problem keeps bothering me. I never had to think about these issues when developing in Java or in Node, not to mention stuff such as virtualenv etc.
Thank you in advance.
Your test.py needs to know that lib.py is one level above it, inside the lib folder. So all you should need to do is append the path of your lib folder to sys.path before importing lib.py in test.py. Something like
import sys
sys.path.append("..\\lib")
import lib
Please note that the answer above assumes you run on Windows. A more comprehensive approach would be to replace the Windows path with proper os.path syntax. Another assumption made by this code is that you will run test.py from its own folder, not via an absolute/relative path.
import sys
import os.path
sys.path.append(os.path.join(os.path.pardir, 'lib'))
import lib
If you want to run your test.py with a path, your code could look like this:
import sys
import os.path
# os.path.abspath(__file__) gets the absolute path of the current python script
# os.path.dirname gets the absolute path of whatever file you ask it to (minus the filename)
# The line below appends the full path of your lib folder to the current script's path
sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), os.path.pardir, 'lib'))
# This should allow test.py to import libtest regardless of where your run it from. Assuming your provided project structure.
from lib import libtest
libtest()
I've seen all sorts of examples and other similar questions, but I can't seem to find an example that exactly matches my scenario. I feel like a total goon asking this because there are so many similar questions, but I just can't seem to get this working "correctly." Here is my project:
user_management (package)
|
|------- __init__.py
|
|------- Modules/
| |
| |----- __init__.py
| |----- LDAPManager.py
| |----- PasswordManager.py
|
|------- Scripts/
| |
| |----- __init__.py
| |----- CreateUser.py
| |----- FindUser.py
If I move "CreateUser.py" to the main user_management directory, I can easily use "import Modules.LDAPManager" to import LDAPManager.py --- this works. What I can't do (and what I want to do) is keep CreateUser.py in the Scripts subfolder and import LDAPManager.py. I was hoping to accomplish this by using "import user_management.Modules.LDAPManager", but this doesn't work. In short, I can get Python files to easily look deeper in the hierarchy, but I can't get a Python script to reference up one directory and down into another.
Note that I am able to solve my problem using:
sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
import Modules.LDAPManager as LDAPManager
I've heard that this is bad practice and discouraged.
The files in Scripts are intended to be executed directly (is the __init__.py in Scripts even necessary?). I've read that in this case, I should be executing CreateUser.py with the -m flag. I've tried some variations on this and just can't seem to get CreateUser.py to recognize LDAPManager.py.
If I move CreateUser.py to the main user_management directory, I can easily use import Modules.LDAPManager to import LDAPManager.py --- this works.
Please, don't. In this way the LDAPManager module used by CreateUser will not be the same as the one imported via other imports. This can create problems when you have some global state in the module or during pickling/unpickling. Avoid imports that work only because the module happens to be in the same directory.
When you have a package structure you should either:
Use relative imports, i.e if the CreateUser.py is in Scripts/:
from ..Modules import LDAPManager
Note that this was (note the past tense) discouraged by PEP 8 only because old versions of python didn't support them very well, but this problem was solved years ago. The current version of PEP 8 does suggest them as an acceptable alternative to absolute imports. I actually like them inside packages.
Use absolute imports using the whole package name (CreateUser.py in Scripts/):
from user_management.Modules import LDAPManager
In order for the second one to work the package user_management should be installed inside the PYTHONPATH. During development you can configure the IDE so that this happens, without having to manually add calls to sys.path.append anywhere.
Also, I find it odd that Scripts/ is a subpackage, because in a real installation the user_management module would be installed under site-packages (or whichever directory your OS uses for installed libraries), while the scripts should be installed under a bin/ directory (or whichever directory contains executables on your OS).
In fact I believe Scripts/ shouldn't even be under user_management. It should be at the same level as user_management.
In this way you do not have to use -m, but you simply have to make sure the package can be found (this again is a matter of configuring the IDE, installing the package correctly or using PYTHONPATH=. python Scripts/CreateUser.py to launch the scripts with the correct path).
In summary, the hierarchy I would use is:
user_management (package)
|
|------- __init__.py
|
|------- Modules/
| |
| |----- __init__.py
| |----- LDAPManager.py
| |----- PasswordManager.py
|
Scripts/ (*not* a package)
|
|----- CreateUser.py
|----- FindUser.py
Then the code of CreateUser.py and FindUser.py should use absolute imports to import the modules:
from user_management.Modules import LDAPManager
During installation you make sure that user_management ends up somewhere in the PYTHONPATH, and the scripts inside the directory for executables so that they are able to find the modules. During development you either rely on IDE configuration, or you launch CreateUser.py adding the Scripts/ parent directory to the PYTHONPATH (I mean the directory that contains both user_management and Scripts):
PYTHONPATH=/the/parent/directory python Scripts/CreateUser.py
Or you can modify the PYTHONPATH globally so that you don't have to specify this each time. On unix OSes (linux, Mac OS X etc.) you can modify one of the shell scripts to define the PYTHONPATH external variable, on Windows you have to change the environmental variables settings.
Addendum: if you are using Python 2, I believe it's better to make sure you avoid implicit relative imports by putting:
from __future__ import absolute_import
at the top of your modules. In this way import X always means importing the top-level module X, and will never try to import an X.py file that's in the same directory (unless that directory is in the PYTHONPATH). The only way to do a relative import is then to use the explicit syntax (from . import X), which is better (explicit is better than implicit).
This will make sure you never happen to use the "bogus" implicit relative imports, since these would raise an ImportError clearly signalling that something is wrong. Otherwise you could use a module that's not what you think it is.
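For example, inside Modules/LDAPManager.py on Python 2 (a sketch, assuming there is no top-level module named PasswordManager installed):
from __future__ import absolute_import

# this now fails loudly with ImportError instead of silently importing the sibling file:
# import PasswordManager

# the explicit relative import still works inside the package:
from . import PasswordManager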
From Python 2.5 onwards, you can use
from ..Modules import LDAPManager
The leading period takes you "up" a level in your hierarchy.
See the Python docs on intra-package references for imports.
In the "root" __init__.py you can also do a
import sys
sys.path.insert(1, '.')
which should make both modules importable.
I faced the same issue. To solve it, I used export PYTHONPATH="$PWD". However, in this case you will need to adjust the imports in your Scripts dir depending on where you run from:
Case 1: If you run from inside the user_management dir, your scripts should use the style from Modules import LDAPManager to import modules.
Case 2: If you run from one level above user_management (e.g. from the project root), your scripts should use the style from user_management.Modules import LDAPManager to import modules.
I have the following setup of a library in python:
library
| - __init__.py
| - lib1.py
| - ...
| - tools
|   | - __init__.py
|   | - testlib1.py
so in other words, the directory library contains Python modules, and the directory tools, which is a subdirectory of library, contains e.g. one file, testlib1.py, to test the library lib1.py.
testlib1.py therefore needs to import lib1.py from the directory above to do some tests etc., just by calling python testlib1.py from somewhere on the computer, assuming the file is in the search path.
In addition, I only want ONE PYTHONPATH to be specified.
But we all know the following idea for testlib1.py does not work because the relative import does not work:
from .. import lib1
...
do something with lib1
I accept two kind of answers:
An answer which describes how it is still possible to call testlib1.py directly as the executing Python script.
An answer that explains a better conceptual setup of the modules etc., with the premise that everything has to be in the project directory and the tools have to be in a different directory than the actual libraries.
If something is not clear, please ask and I will update the question.
Try adding a __init__.py to the tools directory. The relative import should work.
You can't. If you plan to use relative imports then you can't run the module by itself.
Either drop the relative imports or drop the idea of running testlib1.py directly.
Also, I believe a test file should never use relative imports. Tests should check whether the library works, and thus the code should be as similar as possible to the code that users would actually write. And usually users do not add files to the library to use relative imports; they use absolute imports.
By the way, I think your file organization is too much "java-like": mixing source code and tests. If you want to do tests then why not have something like:
project/
|
+-- src/
| |
| +--library/
| | |
| | +- lib1.py
| | |
| | #...
| +--library2/ #etc.
|
+-- tests/
|
+--testlibrary/
| |
| +- testlib1.py
#etc
To run the tests, just use a tool like nosetests, which automatically looks for this kind of folder structure and provides tons of options to change the search/test settings.
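With that layout, testlib1.py just uses a normal absolute import, the same way a user of the library would (a sketch; it assumes src/ is on the path, e.g. because the package is installed or because the test runner is launched with PYTHONPATH=src):
# tests/testlibrary/testlib1.py
from library import lib1

def test_lib1_imports():
    # placeholder check; replace with real assertions about lib1
    assert lib1 is not None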
Actually, I found a solution which:
works with the current implementation,
does not require any changes to PYTHONPATH, and
does not need the top-level directory hard-coded.
In testlib1.py the following code will do (has been tested):
import os
import sys
# add the directory one level above this file to sys.path
parent_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.append(parent_dir)
import lib1
Not exactly sure this is a very clean or straightforward solution, but it allows importing any module from the directory one level up, which is what I need (as the test code or extra code or whatever is always located one level below the actual module files).