Clashing imports between Python modules - python

I have the following directory structure:
.
├── main.py
├── package
│   ├── module.py
│   └── utils.py
└── utils.py
Inside package, I have (lots of) code in which all imports are relative to package, e.g. package/module.py contains import utils, and it expects to import package/utils.py (not utils.py).
All the code outside package expects imports to be relative to the root directory (.).
This is causing an issue for me because if main.py contains import package.module and I have PYTHONPATH=., then package/module.py ends up importing utils.py instead of the desired package/utils.py (since it contains import utils).
How do I resolve this without having to rename scripts? I would like to install the code in package in a way so that I can import it in main.py without its imports clashing with my other files.
What I tried: I added a minimal setup.py file inside package and ran pip install -e . but that didn't resolve the issue.
Thanks a lot for the help!

Have you tried a relative import for the submodules?
In general you would use
import utils          # absolute import: the top-level ./utils.py
from . import utils   # relative import: the local ./package/utils.py
That way, modules under package always import their local utils.py.
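A minimal sketch of how that plays out with the layout from the question (this assumes package/ also contains an __init__.py so that it is a regular package; the comments show which file each import resolves to):
# package/module.py
from . import utils       # always resolves to package/utils.py

# main.py at the project root, run with PYTHONPATH=. as before
import utils              # resolves to the top-level ./utils.py
import package.module     # package/module.py still gets its own utils.py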

Related

How can I use relative imports in Python to import a function in another directory

I have a directory structure with 2 basic python files inside separate directories:
├── package
│   ├── subpackage1
│   │   └── module1.py
│   └── subpackage2
│       └── module2.py
module1.py:
def module1():
    print('hello world')
module2.py:
from ..subpackage1.module1 import module1
module1()
When running python3 module2.py I get the error: ImportError: attempted relative import with no known parent package
However, when I run it with the imports changed to use sys.path.append(), it runs successfully:
import sys
sys.path.append('../subpackage1/')
from module1 import module1
module1()
Can anyone help me understand why this is and how to correct my code so that I can do this with relative imports?
To be considered a package, a Python directory has to include an __init__.py file. Since your module2.py file is not below a directory that contains an __init__.py file, it isn't considered to be part of a package. Relative imports only work inside packages.
UPDATE:
I only gave part of the answer you needed. Sorry about that. This business of running a file inside a package as a script is a bit of a can of worms. It's discussed pretty well in this SO question:
Relative imports in Python 3
The main take-away is that you're better off (and you're doing what Guido wants you to) if you don't do this at all, but rather move directly executable code outside of any module. You can usually do this by adding an extra file next to your package root dir that just imports the module you want to run.
Here's how to do that with your setup:
.
├── package
│   ├── __init__.py
│   ├── subpackage1
│   │   └── module1.py
│   └── subpackage2
│       └── module2.py
└── test.py
test.py:
import package.subpackage2.module2
You then run test.py directly. Because the directory containing the executed script is included in sys.path, this will work regardless of what the working directory is when you run the script.
You can also do basically this same thing without changing any code (you don't need test.py) by running the "script" as a module.
python3 -m package.subpackage2.module2
If you have to make what you're trying to do work, I think I'd take this approach:
import os, sys
sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
from subpackage1.module1 import module1
module1()
So you compute in a relative way where the root of the enclosing package is in the filesystem, you add that to the Python path, and then you use an absolute import rather than a relative import.
There are other solutions that involve extra tools and/or installation steps. I can't think of a reason you'd prefer those solutions to the last one I show.
By default, Python just considers a directory with code in it to be a directory with code in it, not a package/subpackage. In order to make it into a package, you'll need to add an __init__.py file to each subdirectory, as well as an __init__.py file in the main package directory.
Adding the __init__.py files alone won't be enough, but you should still add them. You should also create a setup.py file next to your package directory. Your file tree would look like this:
├── setup.py
└── package
    ├── __init__.py
    ├── subpackage1
    │   ├── __init__.py
    │   └── module1.py
    └── subpackage2
        ├── __init__.py
        └── module2.py
This setup.py file could start off like this:
from setuptools import setup
setup(
    name='package',
    packages=['package'],
)
These configurations are enough to get you started. Then, from the root of your project (the parent folder of package, next to setup.py), run pip install -e . in your terminal to install your package, named package, in development mode. Then you'll be able to navigate to package/subpackage2/ and execute python module2.py with the expected result. You could even execute python package/subpackage2/module2.py and it works.
The thing is, modules and packages don't work the same way they do in other programming languages. Without creating setup.py, if you were to create a program in your root directory, named main.py for example, then you could import modules from inside the package folder tree. But if you want to execute package/subpackage2/module2.py directly, you need the install step above.
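To illustrate, here is a minimal sketch of what package/subpackage2/module2.py could look like after the editable install; note that it assumes you also switch the relative import to an absolute one rooted at package:
# package/subpackage2/module2.py
# assumes `pip install -e .` was run from the directory containing setup.py,
# so that `package` is importable from anywhere
from package.subpackage1.module1 import module1

if __name__ == "__main__":
    module1()   # prints 'hello world'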
If you want relative imports without changing your directory structure and without adding a lot of boilerplate you could use my import library: ultraimport
It gives the programmer more control over their imports and lets you do file system based relative or absolute imports.
Your module2.py could then look like this:
import ultraimport
module1 = ultraimport('__dir__/../subpackage1/module1.py')
This will always work, no matter how you run your code, whether or not you have any __init__ files, and independently of sys.path.

Importing a Python module from subfolder of another folder using relative path

I have the following folder structure,
└── project
    ├── A
    │   ├── main.py
    │   └── __init__.py
    └── B
        ├── __init__.py
        └── C
            ├── __init__.py
            └── module_x.py
I want to import all the methods in module_x.py into main.py. I have tried
from ..B.C.module_x import *
But I get the following error:
ImportError: attempted relative import with no known parent package
I wonder what am I doing wrong? How can this be done using relative import?
There are two ways you might write this import:
from ..B.C.module_x import foo
from project.B.C.module_x import foo
However, relative imports are only meant to work within one package. If project is a package, then you can use relative imports here. If project is not a package, you cannot.
However, if you're running a script in / and doing something like import project.A.main, then that relative import will succeed because project is now a package. In that case, the following two would be equivalent:
from ..B.C.module_x import foo
from project.B.C.module_x import foo
You must use the -m switch to run python modules as scripts:
$ cd <directory containing project>
$ python -m project.A.main  # note no .py
This tells Python that A.main is a module inside the project package. Python scans the current working directory, detects the project package (including subpackage B), and the relative import ..B.C.module_x then resolves to project.B.C.module_x and works correctly.
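For concreteness, a minimal sketch of main.py under that layout (it assumes project, A, B and C each contain an __init__.py; foo stands in for whatever module_x.py actually defines):
# project/A/main.py
# run as a module from the directory that contains project/:
#   python -m project.A.main
from ..B.C.module_x import foo   # resolves to project.B.C.module_x

if __name__ == "__main__":
    foo()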

Not able to import module from other directory in Python 3.7.1

I have a package structured as:
Classes in those packages are named exactly like the file names. Also, __init__.py has the following code:
from tableau_util import tableau_util
from sftp_util import sftp_util
from s3_util import s3_util
I have another file e.g. test.py which is outside this folder 'utils'. I want to import those classes into test.py so my code is
from utils.tableau_util import tableau_util
from utils.sftp_util import sftp_util
from utils.s3_util import s3_util
I am still getting the error:
ModuleNotFoundError: No module named 'tableau_util'
What can I try to resolve this?
Without knowing everything I would guess that you are trying to run your test.py as a normal python script. Given this folder structure
.
├── __init__.py
├── test
│   ├── __init__.py
│   └── test.py
└── utils
    ├── __init__.py
    ├── s3_util.py
    └── tableau_util.py
with these files:
test.py:
from utils.s3_util import s3_util
from utils.tableau_util import tableau_util
s3_util()
tableau_util()
import sys
print(sys.path)
s3_util.py:
def s3_util():
    print('Im a s3 util!')
tableau_util.py:
def tableau_util():
    print('Im a tableau util!')
If you just try to run python test/test.py from the main folder, it will give you the ModuleNotFoundError. That's because Python puts the ./test folder (the script's directory) on sys.path, so it can't see the utils folder and therefore can't import it. However, if you run it as python -m test.test (note the lack of .py; you don't need it when you run it as a module), Python loads it as a module and it runs correctly with this output:
Im a s3 util!
Im a tableau util!
If you don't want to put test.py in another folder, you can simply keep it in the parent folder of utils and run it in the traditional way with python test.py and get the same results. Error while finding spec for 'fibo.py' (<class 'AttributeError'>: 'module' object has no attribute '__path__') has some more reading on the matter.
For the record, all my __init__.py files are empty and don't import anything; this is normally how they are set up, unless you want to specify certain names that should be imported automatically when the package itself is imported.
I used PyCharm's create package option to create the folders and files again and it is working now. Here is my new (working) folder structure:
My main script has following lines of code to import those classes:
from utils_pkg import tableau_util
from utils_pkg import s3_util
from utils_pkg import sftp_util
First, inside __init__.py (or in any sub-module that tries to import one of its siblings from the same package) you should add a "relative import" dot to the beginning of the module name, so that it reads:
from .tableau_util import tableau_util
# ^right here
Second, make sure your current working directory is not utils. A good place to start, for testing, might be to cd to the parent directory of utils instead.
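Putting both points together, a minimal sketch using the module and class names from the question (the leading dot makes each import relative to the utils package itself):
# utils/__init__.py
from .tableau_util import tableau_util
from .sftp_util import sftp_util
from .s3_util import s3_util

# test.py, sitting next to the utils/ folder and run from that directory;
# either form now works:
#   from utils.tableau_util import tableau_util
#   from utils import tableau_util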

Module not found error when importing

I am trying to import * from a file classes.py. My directory is as follows
mypkg
├── main.py
└── classes.py
When I try
from classes import *
It does not recognise classes. Looking it up I saw that I should use the explicit import
from .classes import *
which does recognise classes but gives the error below when I try to run it.
ModuleNotFoundError: No module named '__main__.classes'; '__main__' is not a package
Any advice on what to do and why this is happening would be hugely appreciated.
For python to recognize a folder as a package, you need an __init__.py file in it:
mypkg
├── __init__.py
├── main.py
└── classes.py
The directory where python is invoked is also important (running from inside a package folder is different to running from outside), and there's the PYTHONPATH environment variable as well.
The Python documentation has a section on imports and the package system, with good explanations.
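As a hedged illustration, here is a minimal sketch of one way to reconcile the relative import with how the script is launched, assuming the __init__.py shown above is in place; what you actually call from classes.py is up to you:
# mypkg/main.py
# keep the relative import and run main as a module of the package,
# from the directory that contains mypkg/:   python -m mypkg.main
from .classes import *   # works because mypkg is now a package

if __name__ == "__main__":
    # use whatever classes.py defines here
    print("running inside the mypkg package")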

How can I create imports that always work?

I am struggling a bit to set up a working structure in one of my projects. The problem is that I have a main package and a subpackage in a structure like this (I left out all unnecessary files):
code.py
mypackage/__init__.py
mypackage/work.py
mypackage/utils.py
The utils.py has some utility code that is normally only used in the mypackage package.
I normally have some test code in each module file that calls some methods of the current module and prints a few things to quickly check that everything is working correctly. This code is placed in an if __name__ == "__main__" block at the end of the file. So I import utils.py directly via import utils. E.g. mypackage/work.py looks like:
import utils
def someMethod():
    pass

if __name__ == "__main__":
    print(someMethod())
But now when I use this module in the parent package (e.g. code.py) and I import it like this
import mypackage.work
I get the following error:
ImportError: No module named 'utils'
After some research I found out that this can be fixed by adding the mypackage/ folder to the PYTHONPATH environment variable, but this feels strange to me. Isn't there any other way to fix this? I have heard about relative imports, but this is mentioned in the python docs about modules:
Note that relative imports are based on the name of the current module. Since the name of the main module is always "__main__", modules intended for use as the main module of a Python application must always use absolute imports.
Any suggestions how I can have a if __name__ == "__main__" section in the submodule and also can use this file from the parent package without messing up the imports?
EDIT: If I use a relative import in work.py as suggested in a answer to import utils:
from . import utils
I get the following error:
SystemError: Parent module '' not loaded, cannot perform relative import
Unfortunately relative imports and direct running of submodules don't mix.
Add the parent directory of mypackage to your PYTHONPATH or always cd into the parent directory when you want to run a submodule.
Then you have two possibilities:
Use absolute (from mypackage import utils) instead of relative imports (from . import utils) and run them directly as before. The drawback with that solution is that you'll always need to write the fully qualified path, making it more work to rename mypackage later, among other things.
or
Run python3 -m mypackage.utils etc. to run your submodules instead of running python3 mypackage/utils.py.
This may take some time to adapt to, but it's the more correct way (a module in a package isn't the same as a standalone script) and you can continue to use relative imports.
There are more "magical" solutions involving __package__ and sys.path but they all require extra code at the top of every file with relative imports you want to run directly. I wouldn't recommend these.
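To make the two possibilities above concrete, here is a minimal sketch of mypackage/work.py under each option; the PYTHONPATH invocation shown for the first option is just one way to put the parent directory of mypackage on the path:
# mypackage/work.py
# option 1: absolute import, run from the parent directory of mypackage, e.g.
#   PYTHONPATH=. python3 mypackage/work.py
from mypackage import utils

# option 2: relative import, run as a module from the same directory:
#   python3 -m mypackage.work
# from . import utils

def someMethod():
    pass

if __name__ == "__main__":
    print(someMethod())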
You should create a structure like this:
flammi88
├── flammi88
│   ├── __init__.py
│   ├── code.py
│   └── mypackage
│       ├── __init__.py
│       ├── utils.py
│       └── work.py
└── setup.py
then put at least this in the setup.py:
import setuptools
from distutils.core import setup
setup(
    name='flammi88',
    packages=['flammi88'],
)
now, from the directory containing setup.py, run
pip install -e .
This will make the flammi88 package available in development mode. Now you can say:
from flammi88.mypackage import utils
everywhere. This is the normal way to develop packages in Python, and it solves all of your relative import problems. That being said, Guido frowns upon standalone scripts in sub-packages. With this structure I would move the tests inside flammi88/tests/... (and run them with e.g. py.test), but some people like to keep the tests next to the code.
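For example, a minimal sketch of such a test, assuming pytest and a hypothetical file flammi88/tests/test_work.py:
# flammi88/tests/test_work.py
# runnable with `py.test` (or `pytest`) from the project root
# after `pip install -e .`
from flammi88.mypackage.work import someMethod

def test_someMethod_runs():
    # someMethod currently just returns None; this only checks it is callable
    assert someMethod() is None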
Update:
an extended setup.py that describes external requirements and creates executables of the sub-packages you want to run can look like:
import setuptools
from distutils.core import setup
setup(
    name='flammi88',
    packages=['flammi88'],
    install_requires=[
        'markdown',
    ],
    entry_points={
        'console_scripts': [
            'work = flammi88.mypackage.work:someMethod',
        ]
    }
)
now, after pip installing your package, you can just type work at the command line.
Import utils inside the work.py as follows:
import mypackage.utils
or if you want to use shorter name:
from mypackage import utils
EDIT: If you need to run work.py outside of the package, then try:
try:
    from mypackage import utils
except ImportError:
    import utils
Use:
from . import utils
as suggested by Peter
In your code.py you should use:
from mypackage import work
