Import a custom module from the parent folder in Python

Although I think this should be pretty simple, I still can't make it work.
I have the following folder structure:
├── apartment
│   ├── src
│   │   ├── train_model
│   │   │   ├── __init__.py
│   │   │   ├── train_model.py
│   │   │   └── utils.py
│   │   ├── interference.py
│   │   └── __init__.py
In utils.py I tried:
from src.interference import create_sample
Error: ModuleNotFoundError: No module named 'src'
from .interference import create_sample
Error: ImportError: attempted relative import with no known parent package
from interference import create_features_sample
ModuleNotFoundError: No module named 'interference'
What is the way to make it work? I am not a huge fan of non-Pythonic workarounds, as they look dirty.

A layout that starts with a src/ directory is aimed explicitly at preventing imports like from src.interference import ..., and you should not put an __init__.py file in the src/ folder.
Instead, following the nice explanation and examples here: https://blog.ionelmc.ro/2014/05/25/python-packaging/, here is what I recommend:
install the package:
add a setup.py file at the root of your folder (a minimal sketch is shown below; this is clearly not as hard as it seems)
optionally create a virtual environment
run pip install -e . (note the trailing dot!)
then simply import your package with from interference import ...
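For reference, here is a minimal setup.py sketch for the layout above (the distribution name and version are placeholders; the module and package names are taken from the question's tree):
# apartment/setup.py -- minimal sketch for a src/ layout
from setuptools import setup, find_packages

setup(
    name="apartment",                     # placeholder distribution name
    version="0.1.0",
    package_dir={"": "src"},              # code lives under src/
    packages=find_packages(where="src"),  # picks up train_model
    py_modules=["interference"],          # the top-level module src/interference.py
)
After pip install -e ., both import interference and from train_model import utils resolve from anywhere in that environment.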
To answer your original question more directly: you could add from .interference import create_sample to src/__init__.py to expose this function at a higher level, so that the chained import would work. However, I do not recommend this, as it makes everything very rigid.

You need to add the directory that contains interference.py to the module search path. The search path is the list of directories in sys.path (which PYTHONPATH also feeds into), so you can add the parent directory like this:
import sys
sys.path.insert(0, '..')
from interference import create_features_sample
Note that the previous code uses a relative path, so you must launch the script from that same location or it will likely not work.
To be able to launch it from anywhere, you can use Path from the pathlib module:
from pathlib import Path
import sys
path = str(Path(__file__).absolute().parent.parent)
sys.path.insert(0, path)
from interference import create_features_sample

If you want to add a custom path to your module search path permanently, go to the 'site-packages' folder of your current Python environment and add a file such as 'custompaths.pth', in which each line is a directory that will be added to sys.path and searched when you import modules.
Assuming 'src' is the module you want to import, you should add the following line to the .pth file:
your_preceding_path/apartment
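If you are unsure where the site-packages folder of your current environment is, the standard library site module can tell you:
import site
print(site.getsitepackages())      # site-packages directories of the interpreter / virtualenv
print(site.getusersitepackages())  # per-user site-packages directory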

The error occurs because Python cannot find the file or package. Python only searches the directories on sys.path (which includes the directory of the script being run), so for anything outside of those you need to add the path yourself, e.g. as an absolute path:
import sys, os
sys.path.append(os.path.abspath(os.path.join('../..', 'src')))
Please refer to "How to access a module from outside your file folder in Python?"

Have you tried from ..interference import create_sample? Or, for the whole module, from .. import interference.
I've realized that I'm running the script with another command, something that was missing from the question at the time of writing: I run python -m src.train_model.utils from the apartment folder.
Thanks to RMPR for helping me.
This problem is similar to this one.
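Putting the two pieces together, a minimal sketch of the combination that works here (the relative import suggested above plus the -m invocation from the question author):
# apartment/src/train_model/utils.py
from ..interference import create_sample   # resolves to apartment/src/interference.py

# run from the apartment folder, so the module is executed as part of the src package:
#   python -m src.train_model.utils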


Import top level file inside subdirectory

I have the following structure in my Python project:
project
├── subdir
│   ├── __init__.py
│   └── script_to_run.py
├── __init__.py
└── functions.py
In script_to_run.py file, I want to import a function from top-level functions.py file as
from functions import function_to_import
When I try to run the script from either root directory (project) or its subdirectory (subdir), I get the following error:
ModuleNotFoundError: No module named 'functions'
How do I import it then?
I've seen several approaches, including messing with your sys.path or installing your local package through pip, but that seems like too much work for something this simple, right?
Python packages are a great thing, but come at a cost: you cannot directly run a file lying deep in the package.
If the file is imported from the package, the correct way would be:
from ..functions import function_to_import
If the file is intended to be launched as a top level script, it should not be in the package. If you really want to go that way, the solution is to add the project directory to sys.path and use:
from functions import function_to_import
but it is an anti-pattern... You have been warned...
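For completeness, a sketch of that anti-pattern, assuming it sits at the top of subdir/script_to_run.py:
# subdir/script_to_run.py  (anti-pattern: the script adds the project directory to sys.path itself)
import sys
from pathlib import Path

sys.path.insert(0, str(Path(__file__).resolve().parent.parent))  # add project/ to the search path

from functions import function_to_import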

Python and the imports

I have a python project where I use grpc.
I create the files with python -m grpc_tools.protoc -I "pathToMyProtoFile" --python_out=. --grpc_python_out=. "pathToMyProtoFile\module.proto"
I want all the grpc-stuff to be in a python package. So I created a sub folder "my_package_folder" and added an empty __init__.py in it.
My problem: how do I access, and where do I place, the generated module_pb2.py and module_pb2_grpc.py?
If I place them in the root folder of my application, I cannot access them from my package with from .. import module_pb2_grpc: "attempted relative import beyond top-level package".
If I place them in my "my_package_folder", the two generated files do not find each other
(module_pb2_grpc.py contains import module_pb2 as module__pb2).
This import mechanism in python is so extremely confusing... I have no idea where to start to solve this problem.
My folder structure is just the main project folder and a sub folder "my_package_folder" for all the grpc stuff.
Let's say you have a folder structure like this. I'm just taking the example of one file.
├── module_pb2_grpc.py
├── my_package_folder
│   ├── __init__.py
Then, to resolve the "attempted relative import beyond top-level package" error, you can add the following to my_package_folder/__init__.py:
import os
import sys
SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
sys.path.append(os.path.dirname(SCRIPT_DIR))
from module_pb2_grpc import *
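Because __init__.py runs before any submodule of the package is imported, the root folder is already on sys.path by the time other modules in my_package_folder load, so they can import the generated files directly (client.py below is just a hypothetical example module):
# my_package_folder/client.py  (hypothetical module inside the package)
import module_pb2_grpc   # found because __init__.py added the root folder to sys.path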

Cannot run Python3 script from subfolder // relative import error [duplicate]

I have a simple project structure like this:
➜ (venv:evernote) evernote_bear_project git:(master) ✗ tree | grep -v pyc
.
├── README.md
...
(snip)
...
├── manage.py
├── sample
│   ├── EDAMTest.py   <==== here is an import that won't work
│   └── enlogo.png
└── util
    ├── __init__.py
    ├── files.py      <====== This is being imported
    └── test_files.py
Now I have a relative import in sample/EDAMTest.py:
from ..util.files import *
When I try to run python sample/EDAMTest.py from project root folder in command line, I get an error saying:
ValueError: attempted relative import beyond top-level package
I know this has been asked many times, but I still don't get it.
Since I'm running the script from the project root, in my understanding Python should be able to "know" that when I write from ..util.files import *, it should go up one directory, no?
EDIT
Thanks for all the answers.
So what I understand from the link above is this:
I was running the module sample/EDAMTest.py directly via python sample/EDAMTest.py, and that meant
that the __name__ of the module was __main__
and now the path (sample/) was my "root" so to speak.
So now Python searches only this path and any path below it for modules / packages. Hence the error message "attempted relative import beyond top-level package": it cannot go one more level up, because it is already at the top.
Also, Python cannot look one level "up" in the file system, since the syntax from ..util.files import * does not go up a level in the directory tree but up a level in the package hierarchy, which is built from the modules / packages found on the "import search path" (sys.path)?
Is that correct?
sys.path.append() is a tweak to use if your directory structure is fixed and there is nothing you can do about it.
Otherwise, you can try rearranging the folders. The easiest option is moving util under sample; another option is making both folders part of a larger package.
Also, import * is not endorsed.
The relative import syntax is for importing other modules from the same package, not from the file system.
Depending on what you want, you could...
Fix the package so that the relative import works
Put __init__.py files in the project root and sample directory and run the script from one level up (see the sketch below). This doesn't seem like what you want.
Tell python where to find the package
Set the PYTHONPATH environment variable so that python can find the package.
PYTHONPATH='.':$PYTHONPATH python sample/EDAMTest.py
Install util so that python can find it
Add a setup script and use it to install the util package and avoid setting PYTHONPATH.
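As a sketch of the first option (the exact invocation is an assumption on my part): with the project directory named evernote_bear_project and empty __init__.py files added to it and to sample/, you can run the script as a module from one level above the project:
cd ..    # one level above evernote_bear_project
python -m evernote_bear_project.sample.EDAMTest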
"The relative import syntax is for importing other modules from the same package, not from the file system.", This is right as stated by George G.
Put __init__.py in your subfolders, which will make them package.
__init__.py can be an empty file but it is often used to perform
setup needed for the package(import things, load things into path, etc).
However you can import File into your __init__.py to make it
available at the package level:
# in your __init__.py
from util import files
# now import File from util package
from util import files
of if you want to import some specific method or class, you can do
from util.files import some_function
Another thing you can do at the package level is make modules
available with the __all__ variable. When the interpreter sees
an __all__ variable defined in an __init__.py, it imports the
modules listed in the __all__ variable when you do:
from package import *
__all__ is a list containing the names of the modules that you want to be
imported with import *, so looking at our example again, if we
wanted to import the submodules in util, the __all__ variable
in util/__init__.py would be:
__all__ = ['files', 'test_files']
With the __all__ variable populated like that, when you perform
from util import *
it would import files and test_files.

How to import from a python package which is in a sibling directory?

I am new to Python and I am having some issues with packages and imports.
My structure is as follows:
src/
    base/
        packcore/
            __init__.py
            utils/
                __init__.py
                util_1.py
                util_2.py
            helpers/
                __init__.py
                helper_1.py
                helper_2.py
    some_script.py
    app.py
packcore is an external package that has been installed using pip with --target=base.
Some of the helpers in packcore use some of the utils.
From app.py I want to be able to import a helper/util.
But when I use from base.packcore.utils import some_util, I get an error saying "No module named 'packcore'" from inside the helper/util,
and if I do from packcore.utils import some_util, I get "No module named 'packcore'" from app.py.
Help would be much appreciated :)
If you add an __init__.py to base/, you make it a Python package that you can import from. You also need to make the parent directory (currently called src) a package, so that everything is part of one package tree and the folders are actually sibling packages, rather than many isolated modules.
From there, you can either do an absolute import from the main package:
from src.base.packcore.helpers import helper_1
Or relative (Assuming you are in some_script.py or app.py):
from .base.packcore.helpers import helper_1
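A sketch of the resulting layout (only the added __init__.py files are new; everything else is from the question):
src/
    __init__.py          <- added
    base/
        __init__.py      <- added
        packcore/
            ...
    some_script.py
    app.py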

How can I create imports that always work?

I am struggling a bit to set up a working structure in one of my projects. The problem is that I have a main package and a subpackage in a structure like this (I left out all unnecessary files):
code.py
mypackage/__init__.py
mypackage/work.py
mypackage/utils.py
The utils.py has some utility code that is normally only used in the mypackage package.
I normally have some test code in each module file that calls some methods of the current module and prints a few things to quickly check that everything is working correctly. This code is placed in an if __name__ == "__main__" block at the end of the file. So I include utils.py directly via import utils. E.g. mypackage/work.py looks like:
import utils

def someMethod():
    pass

if __name__ == "__main__":
    print(someMethod())
But now when I use this module in the parent package (e.g. code.py) and I import it like this
import mypackage.work
I get the following error:
ImportError: No module named 'utils'
After some research I found out that this can be fixed by adding the mypackage/ folder to the PYTHONPATH environment variable, but this feels strange to me. Isn't there another way to fix this? I have heard about relative imports, but the Python docs about modules mention this:
Note that relative imports are based on the name of the current module. Since the name of the main module is always "__main__", modules intended for use as the main module of a Python application must always use absolute imports.
Any suggestions how I can have a if __name__ == "__main__" section in the submodule and also can use this file from the parent package without messing up the imports?
EDIT: If I use a relative import in work.py, as suggested in an answer, to import utils:
from . import utils
I get the following error:
SystemError: Parent module '' not loaded, cannot perform relative import
Unfortunately relative imports and direct running of submodules don't mix.
Add the parent directory of mypackage to your PYTHONPATH or always cd into the parent directory when you want to run a submodule.
Then you have two possibilities:
Use absolute (from mypackage import utils) instead of relative imports (from . import utils) and run them directly as before. The drawback with that solution is that you'll always need to write the fully qualified path, making it more work to rename mypackage later, among other things.
or
Run python3 -m mypackage.utils etc. to run your submodules instead of running python3 mypackage/utils.py.
This may take some time to adapt to, but it's the more correct way (a module in a package isn't the same as a standalone script) and you can continue to use relative imports.
There are more "magical" solutions involving __package__ and sys.path but they all require extra code at the top of every file with relative imports you want to run directly. I wouldn't recommend these.
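A small sketch of the second option, using the layout from the question:
# mypackage/work.py
from . import utils

def someMethod():
    pass

if __name__ == "__main__":
    print(someMethod())

# run from the directory that contains mypackage/:
#   python3 -m mypackage.work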
You should create a structure like this:
flammi88
├── flammi88
│   ├── __init__.py
│   ├── code.py
│   └── mypackage
│       ├── __init__.py
│       ├── utils.py
│       └── work.py
└── setup.py
then put at least this in the setup.py:
import setuptools
from distutils.core import setup

setup(
    name='flammi88',
    packages=['flammi88'],
)
now, from the directory containing setup.py, run
pip install -e .
This will make the flammi88 package available in development mode. Now you can say:
from flammi88.mypackage import utils
everywhere. This is the normal way to develop packages in Python, and it solves all of your relative import problems. That being said, Guido frowns upon standalone scripts in sub-packages. With this structure I would move the tests inside flammi88/tests/... (and run them with e.g. py.test), but some people like to keep the tests next to the code.
Update:
an extended setup.py that describes external requirements and creates executables of the sub-packages you want to run can look like:
import setuptools
from distutils.core import setup

setup(
    name='flammi88',
    packages=['flammi88'],
    install_requires=[
        'markdown',
    ],
    entry_points={
        'console_scripts': [
            'work = flammi88.mypackage.work:someMethod',
        ]
    }
)
now, after pip installing your package, you can just type work at the command line.
Import utils inside work.py as follows:
import mypackage.utils
or, if you want to use a shorter name:
from mypackage import utils
EDIT: If you need to run work.py outside of the package, then try:
try:
    from mypackage import utils
except ImportError:
    import utils
Use:
from . import utils
as suggested by Peter
In your code.py you should use:
from mypackage import work
