Python has a module named "os". It also has another module named "os.path", which lives under "os".
I can use "os.path" methods even if I only import the "os" module.
import os
print(os.path.join("sdfs","x"))
How can I define a sub-module like this?
That's the __init__.py 'magic' of the os module - it imports its submodule path into its own namespace, essentially giving you a way to access the latter even if you only import os.
os
├── path
│   └── __init__.py   # 2
└── __init__.py       # 1
The first __init__.py (#1) essentially has from . import path, so whenever you import just os, path is imported into its namespace, and therefore you can access it as os.path.
(NOTE: This is not exactly the case with the os module, but that's how to essentially achieve it)
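A minimal sketch with made-up names (mypkg instead of os, and a toy join), just to show the mechanism:
# mypkg/path.py
def join(*parts):
    return "/".join(parts)

# mypkg/__init__.py
from . import path   # pulls the submodule into mypkg's namespace

# anywhere else
import mypkg
print(mypkg.path.join("sdfs", "x"))   # works without an explicit "import mypkg.path"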
Use this structure:
Package
├── __init__.py
├── file.py
│
├── subpackage
│   ├── __init__.py
│   └── file.py
│
└── subpackage2
    ├── __init__.py
    └── file.py
Note each subpackage has its own __init__.py file. This makes Package.subpackage importable like os.path: even if you do not import .subpackage in Package's main __init__.py, you can still do import Package.subpackage; add that import to Package's __init__.py if you want the subpackage available after a bare import Package, as os does with path (see the sketch below).
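For the os.path-like behaviour specifically, a one-line sketch for Package/__init__.py and its effect (names taken from the structure above):
# Package/__init__.py
from . import subpackage          # exposes Package.subpackage on a bare "import Package"

# elsewhere
import Package
print(Package.subpackage)         # available without an explicit "import Package.subpackage"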
Related
I always have the same problem and I finally want to get rid of it. My folder structure looks like this
project
├── scripts
│   └── folder
│       └── file.py
└── submodules
    └── lab_devices
        └── optical_devices
            ├── __init__.py
            └── powermeter_driver.py
I now want to include the powermeter_driver.py in file.py. So what I do in the file.py is:
from submodules.lab_devices.optical_devices.powermeter_driver import PowermeterDriver
but this gives ModuleNotFoundError: No module named 'submodules'. I don't want to use
import sys
sys.path.insert(0, '../submodules')
Is there an easy workaround?
The imports will be resolved correctly if you run the script in the correct way, which is from the parent directory and using the -m switch. So you should cd into the parent folder and add __init__.py files as in:
project
├── scripts
│   ├── __init__.py
│   └── folder
│       ├── __init__.py
│       └── file.py
└── submodules
    ├── __init__.py
    └── lab_devices
        ├── __init__.py
        └── optical_devices
            ├── __init__.py
            └── powermeter_driver.py
so that Python knows these are packages, then run
python -m scripts.folder.file # note no .py
In file.py you can then use the absolute import as you already do, because submodules will be detected as a package. You should indeed avoid hacking sys.path by all means.
You need to consider that if you write from submodules...., this is an absolute import. It means Python starts searching for submodules in all directories in sys.path. Python usually adds your current working directory as the first item of sys.path, so if you cd into your project directory and then run your code as a module using python -m, it could work.
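A quick way to see what is searched (a sketch you can drop into file.py temporarily):
# scripts/folder/file.py (temporary check)
import sys
print(sys.path)   # run as "python -m scripts.folder.file" from project/ and the project directory should be listed first
from submodules.lab_devices.optical_devices.powermeter_driver import PowermeterDriver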
Of course absolute imports suck if you have files in a relative location to each other. I've had similar issues and I've created an experimental new import library, ultraimport, that allows you to do file-system-based imports. It could solve your issue if you are willing to add a new library for this.
Instead of:
from submodules.lab_devices.optical_devices.powermeter_driver import PowermeterDriver
In file.py you would then write:
import ultraimport
PowermeterDriver = ultraimport('__dir__/../../submodules/lab_devices/optical_devices/powermeter_driver.py', 'PowermeterDriver')
The file path is relative to file.py and thus this will always work, no matter how you run your code or what is in sys.path.
One caveat when importing scripts like this is if they contain further relative imports. ultraimport has a builtin preprocessor to rewrite subsequent relative imports so they continue to work.
I'm making a Python library and I want to be able to run the code I'm developing. In another folder I have a Python file, but I get the error: ModuleNotFoundError: No module named
this is my folder structure
.
└── project
└── library_directory
├── __init__.py
└── main.py
└── examples_directory
├── __init__.py
└── code_directory
├── __init__.py
└── test.py
__init__.py from library_directory
from library_directory.main import Class
test.py file
from library_directory import Class
When I run the test.py file it says: ModuleNotFoundError: No module named 'fpdf_table'
If I put test.py at project level, this configuration of __init__.py and test.py works, but I want to run test.py from code_directory because I will have a lot of files and don't want 15+ single files at project level:
.
└── project
    ├── library_directory
    │   ├── __init__.py
    │   └── main.py
    ├── examples_directory
    │   ├── __init__.py
    │   └── code_directory
    │       └── __init__.py
    └── test.py
I already tried absolute and relative imports but they don't work.
I'm not sure if this is the correct solution, but you can try this:
In your test module:
import sys
sys.path.insert(0, '/Your_project_root_path')
Now you can access the packages in the root directory.
I took this solution from here.
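If you do go the sys.path route, here is a sketch that avoids hard-coding an absolute path by deriving the project root from test.py's own location (two levels up in the layout above):
# code_directory/test.py (sketch)
import sys
from pathlib import Path

sys.path.insert(0, str(Path(__file__).resolve().parents[2]))   # .../project
from library_directory.main import Class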
Your library_directory folder is a Python package. Within the package, each .py file is a module.
In your library_directory/__init__.py file, insert the line from .main import Class. Now you have a package called "library_directory", which has a module called "main", and a class called "Class".
Now, in your environment variables, create a user variable called PYTHONPATH and add to it the path to your project directory.
When you import in your project file, you should import using the structure: from package.module import class, or in your case: from library_directory.main import Class. The Python interpreter will be able to find the package because its parent (project) directory is on your PYTHONPATH, and the __init__.py file means Python will recognise it as a package.
(You may wish to rename "library_directory" to be a bit more project specific)
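A quick sanity check from test.py once PYTHONPATH is set (a sketch):
# code_directory/test.py (sketch)
import sys
print(sys.path)                            # the project directory from PYTHONPATH should be listed
from library_directory.main import Class   # resolvable once that directory is on sys.path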
Let's say I have a project with the following structure:
├── modules
│   ├── modulea.py
│   └── moduleb.py
└── program.py
program.py is the main executable, which imports modulea like so
from modules import modulea
in modulea I would like to import something from moduleb. Now, logically, I should be doing import moduleb as it is in the same directory. But that would yield ModuleNotFoundError: No module named 'moduleb'
So in order for this chain import to work, I either have to do from modules import moduleb, which angers IDEs as there is no "modules" directory around, or from . import moduleb, which angers pylint.
While both of the methods above work, they feel confusing and unpythonic as one is assuming that imports are going to happen from one level above and one is using relative imports where it doesn't seem necessary.
Is there a way to handle this with more grace?
Thank you.
You can make modules a Python package by creating an __init__.py file in it.
├── modules
│   ├── __init__.py
│   ├── modulea.py
│   └── moduleb.py
└── program.py
Then use from modules.moduleb import some_object in modulea.
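A minimal sketch of the three files (some_object stands in for whatever moduleb really defines):
# modules/moduleb.py
some_object = "hello from moduleb"

# modules/modulea.py
from modules.moduleb import some_object

def describe():
    return some_object

# program.py (run from the top-level directory)
from modules import modulea
print(modulea.describe())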
My project structure is the following. Inside api.py I need some functions written at the upper level.
Project1
├── model.py
├── audio_utils.py
├── audio.py
└── backend
    ├── static
    │   ├── js
    │   └── img
    └── api.py
Why am I unable to import, inside api.py, the functions from the upper level?
When I try to do:
from audio_utils import *
I got the following:
No module named 'audio_utils'
Modules are imported from the path prefixes specified in sys.path. It usually contains '', which means that modules from the current working directory are going to be loaded.
(https://docs.python.org/3/tutorial/modules.html#packages)
I think you are starting your Python interpreter while being in the backend directory. Then I think there is no way to access the modules in the upper directory -- not even with the .. (https://realpython.com/absolute-vs-relative-python-imports/#syntax-and-practical-examples_1) -- unless you change sys.path, which would be a really messy solution.
I suggest you create __init__.py files to indicate that the directories containing them are Python packages:
Project1
├── model.py
├── audio_utils.py
├── audio.py
└── backend
    ├── __init__.py
    ├── static
    │   ├── js
    │   └── img
    └── api.py
And always start the interpreter from the Project1 dir. Doing so, you should be able to import any module like this:
import model
from backend import api
import audio_utils
no matter which module in Project1 you are writing this in. The interpreter's current directory will be tried.
Note there is also the PYTHONPATH env variable that you can use to your advantage.
Note that for publishing your project it is encouraged to put all the modules in a package (in other words: don't put the modules at the top level). This is to help prevent name collisions. I think this may help you to understand: https://realpython.com/pypi-publish-python-package/
You have __init__.py files in both directories right?
Try from ..audio_utils import *
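A caveat, sketched under the assumption that there are __init__.py files in both Project1 and backend: the relative form only works when api.py is executed as part of a package, for example with a -m run from the directory that contains Project1.
# backend/api.py
from ..audio_utils import *   # ".." resolves to the Project1 package

# run from the directory containing Project1/:
#   python -m Project1.backend.api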
If you create the dir structure this way:
$ tree
.
├── bar
│   ├── den.py
│   └── __init__.py   # This indicates that bar is a Python package.
└── baz.py
1 directory, 3 files
$ cat bar/den.py
import baz
Then in the dir containing bar/ and baz.py (the top level) you can start the Python interpreter and use the absolute imports:
In [1]: import bar.den
In [2]: import baz
In [3]: bar.den.baz
Out[3]: <module 'baz' from '/tmp/Project1/baz.py'>
As you can see, we were able to import bar.den, which could also import baz from the top level.
Let's say I have a directory structure as such
src\
    __init__.py
    notebooks\
        __init__.py
        foo.py
    utils\
        __init__.py
        db_connection.py
and in foo.py I have
from utils.db_connection import *
Why does this fail?
The reason is, you do not have the directory that contains "utils" in the search list sys.path. You have two solutions. The first is to move foo.py into the higher folder, like:
│   foo.py
│   __init__.py
│
├───notebooks
│       __init__.py
│
└───utils
        db_connection.py
        db_connection.pyc
        __init__.py
        __init__.pyc
Otherwise, you could add the directory to sys.path, like:
import sys
sys.path.append("..")  # note: ".." is resolved relative to the current working directory, not this file
import utils.db_connection
But the second one is really ugly~
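If you do need the second option, a slightly more robust sketch derives the path from foo.py's own location instead of relying on the current working directory:
# notebooks/foo.py (sketch)
import os, sys
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))   # .../src
import utils.db_connection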
You don't have src/ (or its full path) in sys.path.
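One way to confirm that, as a sketch: print sys.path at the top of foo.py and run it as a module from inside src, which puts src on the path.
# notebooks/foo.py (sketch)
import sys
print(sys.path)   # when run from src with "python -m notebooks.foo", src is on the path
from utils.db_connection import *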