Nightmare with relative imports: how does PEP 366 work? (Python)

I have a "canonical file structure" like this (I'm using sensible names to ease reading):
mainpack/
    __main__.py
    __init__.py
    helpers/
        __init__.py
        path.py
    network/
        __init__.py
        clientlib.py
        server.py
    gui/
        __init__.py
        mainwindow.py
        controllers.py
In this structure, modules contained in each package may want to access the helpers utilities through relative imports, for example:
# network/clientlib.py
from ..helpers.path import create_dir
The program is run "as a script" using the __main__.py file in this way:
python mainpack/
Trying to follow PEP 366, I've put these lines in __main__.py:
___package___ = "mainpack"
from .network.clientlib import helloclient
But when running:
$ python mainpack
Traceback (most recent call last):
  File "/usr/lib/python2.6/runpy.py", line 122, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.6/runpy.py", line 34, in _run_code
    exec code in run_globals
  File "path/mainpack/__main__.py", line 2, in <module>
    from .network.clientlib import helloclient
SystemError: Parent module 'mainpack' not loaded, cannot perform relative import
What's wrong? What is the correct way to handle and effectively use relative imports?
I've also tried adding the current directory to the PYTHONPATH; nothing changes.

The "boilerplate" given in PEP 366 seems incomplete. Although it sets the __package__ variable, it doesn't actually import the package, which is also needed to allow relative imports to work. extraneon's solution is on the right track.
Note that it is not enough to simply have the directory containing the module in sys.path, the corresponding package needs to be explicitly imported. The following seems like a better boilerplate than what was given in PEP 366 for ensuring that a python module can be executed regardless of how it is invoked (through a regular import, or with python -m, or with python, from any location):
# boilerplate to allow running as script directly
if __name__ == "__main__" and __package__ is None:
    import sys, os
    # The following assumes the script is in the top level of the package
    # directory.  We use dirname() to help get the parent directory to add to
    # sys.path, so that we can import the current package.  This is necessary
    # since when invoked directly, the 'current' package is not automatically
    # imported.
    parent_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
    sys.path.insert(1, parent_dir)
    import mypackage
    __package__ = str("mypackage")
    del sys, os
# now you can use relative imports here that will work regardless of how this
# python file was accessed (either through 'import', through 'python -m', or
# directly).
If the script is not at the top level of the package directory and you need to import a module below the top level, then the os.path.dirname() call has to be repeated until parent_dir is the directory containing the top level, as sketched below.
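For example, a minimal sketch for a hypothetical script at mypackage/scripts/tool.py (one level below the package top; the names here are assumptions for illustration, not from the answer above) repeats dirname() once more and imports the sub-package:

# boilerplate for mypackage/scripts/tool.py (hypothetical layout)
if __name__ == "__main__" and __package__ is None:
    import sys, os
    # three dirname() calls: strip tool.py, then scripts/, then mypackage/,
    # leaving the directory that contains the top-level package
    parent_dir = os.path.dirname(
        os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
    sys.path.insert(1, parent_dir)
    import mypackage.scripts   # puts both mypackage and mypackage.scripts into sys.modules
    __package__ = str("mypackage.scripts")
    del sys, os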

The loading code seems to be something like this:
try:
    return sys.modules[pkgname]
except KeyError:
    if level < 1:
        warn("Parent module '%s' not found while handling "
             "absolute import" % pkgname, RuntimeWarning, 1)
        return None
    else:
        raise SystemError, ("Parent module '%s' not loaded, cannot "
                            "perform relative import" % pkgname)
which makes me think that maybe your module is not on sys.path. If you start Python (normally) and just type "import mainpack" at the prompt, what does it do? It should be able to find it.
I have tried it myself and got the same error. After reading a bit I found the following solution:
# foo/__main__.py
import sys
mod = __import__('foo')
sys.modules["foo"]=mod
__package__='foo'
from .bar import hello
hello()
It seems a bit hackish to me but it does work. The trick seems to be making sure package foo is loaded so the import can be relative.
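For completeness, the snippet above assumes a sibling module foo/bar.py that defines hello(); a minimal hypothetical version would be:

# foo/bar.py (hypothetical module assumed by the __main__.py above)
def hello():
    print("hello from foo.bar")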

Inspired by extraneon's and taherh's answers, here is some code that runs up the file tree until it runs out of __init__.py files, in order to build the full package name. This is definitely hacky, but it does seem to work regardless of the depth of the file in your directory tree. It seems absolute imports are heavily encouraged.
import os, sys

if __name__ == "__main__" and __package__ is None:
    d, f = os.path.split(os.path.abspath(__file__))
    f = os.path.splitext(f)[0]
    __package__ = [f]  # __package__ will be a reversed list of package name parts
    while os.path.exists(os.path.join(d, '__init__.py')):  # go up until we run out of __init__.py files
        d, name = os.path.split(d)  # pull off the lowest-level directory name
        __package__.append(name)    # add it to the package parts list
    __package__ = ".".join(reversed(__package__))  # create the full package name
    mod = __import__(__package__)       # this assumes the top-level package is in your $PYTHONPATH
    sys.modules[__package__] = mod      # add to sys.modules

This is a minimal setup based on most of the other answers, tested on Python 2.7 with the package layout below. It also has the advantage that you can call the runme.py script from anywhere and it seems to do the right thing; I haven't yet tested it in a more complex setup, so caveat emptor... etc.
This is basically Brad's answer above, with the sys.path insertion others have described.
packagetest/
    __init__.py         # Empty
    mylib/
        __init__.py     # Empty
        utils.py        # def times2(x): return x*2
    scripts/
        __init__.py     # Empty
        runme.py        # See below (executable)
runme.py looks like this:
#!/usr/bin/env python
if __name__ == '__main__' and __package__ is None:
    from os import sys, path
    d = path.dirname(path.abspath(__file__))
    __package__ = []
    while path.exists(path.join(d, '__init__.py')):
        d, name = path.split(d)
        __package__.append(name)
    __package__ = ".".join(reversed(__package__))
    sys.path.insert(1, d)
    mod = __import__(__package__)
    sys.modules[__package__] = mod

from ..mylib.utils import times2

print times2(4)


Python file needs to import a class from another Python file that is in the parent folder

I'm struggling with something that feels like it should be simple.
My current directory looks like this:
root/
├─ __init__.py (tried with it and without)
├─ file_with_class.py
└─ tests_folder/
   ├─ __init__.py (tried with it and without)
   └─ unittest_for_class.py
unittest_for_class.py needs to import the class from file_with_class to test it. I tried to import it in various ways I found online, but I just keep getting errors like these:
(The class name is the same as the file name; let's say it's called file_with_class.)
File "tests_folder/unittest_for_class.py", line 3, in <module>
from ..file_with_class import file_with_class
ValueError: Attempted relative import in non-package
File "tests_folder/unittest_for_class.py", line 3, in <module>
from file_with_class import file_with_class
ImportError: No module named file_with_class
and others.
What is the correct way to import a class from a .py file that is in the parent folder?
As a short explanation:
from ..parent import * works only if your program was started at the parent level.
You import submodules, which can have cross-relations to other submodules or files in the package; these relations are only relative inside the package, not in the OS directory structure.
Option 1
You actually enter via a script in your parent folder and import your file as a submodule. This is the nicest, cleanest and intended way, but then your file is not standalone.
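For example, a minimal sketch of Option 1 with the layout above (run_tests.py is a hypothetical entry point placed in the directory that contains root/, and both __init__.py files are present):

# run_tests.py (hypothetical entry point, one level above root/)
# Because root is imported as a package here, the relative import
# 'from ..file_with_class import file_with_class' inside
# unittest_for_class.py resolves to root.file_with_class.
from root.tests_folder import unittest_for_class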
Option 2 - Add the parent directory to your path
sys.path.append('/path/to/parent')
import parent
This is a little bit dirty, as you now have an extra entry on your import path, but it is still one of the easiest options without much trickery.
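Applied to the layout above, a hypothetical sketch of Option 2 inside unittest_for_class.py could look like this (computing the parent path from __file__ is an assumption, not part of the answer):

# tests_folder/unittest_for_class.py (hypothetical sketch)
import os
import sys

# append root/ (the parent of tests_folder/) to the import path
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from file_with_class import file_with_class  # now found via root/ on sys.path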
Further Options and theory
There are quite a few posts here covering this topic; the question on relative imports covers quite a few good definitions and concepts in its answers.
Option 3 - importlib.find_loader (deprecated and not future-proof)
import os
import importlib

current = os.getcwd()                     # remember, for rollback
os.chdir("..")                            # change to an arbitrary path
loader = importlib.find_loader("parent")  # find the loader for the module name
assert loader
parent = loader.load_module()             # now this is your module
assert parent
os.chdir(current)                         # change back to the working directory
(Half of an) Option 4
When working with an IDE this might work; Spyder allows the following code. The standard Python console does NOT.
import os
current = os.getcwd()
os.chdir("..")
import parent
os.chdir(current)
Following up on @Daraan's answer:
You import submodules, which can have cross-relations to other submodules or files in the package; these relations are only relative inside the package, not in the OS directory structure.
I've written an experimental, new import library: ultraimport, which allows you to do just that: relative imports from anywhere in the file system. It will give you more control over your imports.
You could then write in your unittest_for_class.py:
import ultraimport
MyClass = ultraimport("__dir__/../file_with_class.py", "MyClass")
# or to import the whole module
file_with_class = ultraimport("__dir__/../file_with_class.py")
The advantage is that this will always work, independent of sys.path, no matter how you run your script and all the other things that were mentioned.
You can add the parent folder to the search path with sys.path.append() like so:
import sys
sys.path.append('/path/to/parentdir')
from file_with_class import file_with_class
...
See also the tutorial for how Python modules and packages are handled.
Just keep from file_with_class import file_with_class. Then run python -m tests_folder.unittest_for_class. This supports running the script as a module.
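A hypothetical sketch of the test file with that plain import (the class and test names are placeholders for illustration); run it from root/ as python -m tests_folder.unittest_for_class:

# tests_folder/unittest_for_class.py (hypothetical sketch)
import unittest

from file_with_class import file_with_class  # works because root/ is on sys.path when run with -m


class TestFileWithClass(unittest.TestCase):
    def test_can_instantiate(self):
        self.assertIsNotNone(file_with_class())


if __name__ == '__main__':
    unittest.main()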

Getting "ImportError: attempted relative import with no known parent package" when running from Python Interpreter

I'm creating a modular app using Flask's blueprints feature. As a result, my directory structure is like this:
project/
    __init__.py
    config.py
    mould.py
    modules/
        __init__.py
        core/
            __init__.py
            core.py
            db.py
            models.py
The modules directory here is not to be confused with Python modules; it is for giving a modular structure to my project (core module, foo module, bar module, etc.). Now each folder in the modules directory (and the module inside it with the same name, such as core.core) is dynamically imported in my main Flask app (mould.py) by doing this:
for item in os.listdir("modules"):
    if not os.path.isfile("modules" + os.sep + item) and not item.startswith("__"):
        ppath = "modules" + "." + item
        fullpath = "modules" + "." + item + "." + item
        module = importlib.import_module(fullpath)
        app.register_blueprint(module.app)
        print("Registered: " + ppath)
As a result of this, I'm unable to do this in the module scripts like db.py:
import models
It gives a path error, since the entire module is being executed at the project level, so I had to do this instead:
from . import models
This solves the issue and I'm able to successfully import all modules. However, when I go to the core module's directory for some troubleshooting and start the Python interpreter, it doesn't allow me to import the db module:
>>> import db
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "db.py", line 7, in <module>
    from . import models
ImportError: attempted relative import with no known parent package
Is there a way around this, so that I can import the db module successfully both in the code and in the interpreter?
I know I'm late to the party, but I think I've found a solution to this problem. Hopefully this will be useful to someone else working on a large Python project.
The trick is to try one import format and fall back to the other format if the first fails.
Approach 1
db.py
try:
    # Assume we're a sub-module in a package.
    from . import models
except ImportError:
    # Apparently no higher-level package has been imported, fall back to a local import.
    import models
On the plus side, this approach is pretty simple, but doesn't scale well (module names are duplicated). Scaling can be improved by importing programmatically.
Approach 2 (not recommended)
db.py
import importlib

root = 'project.modules.core'
my_modules = ['core', 'models']

for m in my_modules:
    try:
        globals()[m] = importlib.import_module(root + '.' + m)
    except ImportError:
        globals()[m] = importlib.import_module(m)
globals() is the global symbol table.
Of course, now this functionality needs to be duplicated in every module. I'm not sure that's actually an improvement over the first approach. However, you can separate this logic out into its own independent package that lives somewhere on pythonpath.
Approach 3
package_importer.py
import importlib

def import_module(global_vars, root, modules):
    for m in modules:
        try:
            global_vars[m] = importlib.import_module(root + '.' + m)
        except ImportError:
            global_vars[m] = importlib.import_module(m)
db.py
import package_importer
root = 'project.modules.core'
my_modules = ['core', 'models']
package_importer.import_module(globals(), root, my_modules)
This may be a bit outdated, but maybe someone else will benefit from my answer. Since Python 2 and Python 3 have different default import behavior, you have to distinguish between the two versions.
Python 2.X
The default behavior for import models is to look up the relative path first and then the absolute search path. Therefore it should work.
However, in Python 3.X the default behavior for import models is to look for the module only on the absolute paths (called absolute imports). The current package core gets skipped, and since the module models cannot be found anywhere else on the sys.path search path, it throws an error. To resolve this issue you have to use the import statement with dots, from . import models, to make clear that you are trying to import from a relative directory.
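As an aside (this is a standard __future__ feature, not something from the answer above): on Python 2 you can opt in to the Python 3 behavior per file, which forces the explicit relative form and keeps the code consistent across both versions:

# top of db.py, Python 2 only
from __future__ import absolute_import

# 'import models' is now treated as an absolute import on Python 2 as well,
# so the explicit relative form is required, exactly as on Python 3:
from . import models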
If you are interested in learning more about importing Python modules, I suggest you start your research with the following keywords: module search path, Python package import, and relative package imports.

Python 3.5+: How to dynamically import a module given the full file path (in the presence of implicit sibling imports)?

Question
The standard library clearly documents how to import source files directly (given the absolute file path to the source file), but this approach does not work if that source file uses implicit sibling imports as described in the example below.
How could that example be adapted to work in the presence of implicit sibling imports?
I already checked out other Stack Overflow questions on the topic, but they do not address implicit sibling imports within the file being imported by hand.
Setup/Example
Here's an illustrative example
Directory structure:
root/
    directory/
        app.py
    folder/
        implicit_sibling_import.py
        lib.py
app.py:
import os
import importlib.util

# construct absolute paths
root = os.path.abspath(os.path.dirname(os.path.dirname(os.path.realpath(__file__))))
isi_path = os.path.join(root, 'folder', 'implicit_sibling_import.py')

def path_import(absolute_path):
    '''implementation taken from https://docs.python.org/3/library/importlib.html#importing-a-source-file-directly'''
    spec = importlib.util.spec_from_file_location(absolute_path, absolute_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

isi = path_import(isi_path)
print(isi.hello_wrapper())
lib.py:
def hello():
    return 'world'
implicit_sibling_import.py:
import lib  # this is the implicit sibling import. grabs root/folder/lib.py

def hello_wrapper():
    return "ISI says: " + lib.hello()

# if __name__ == '__main__':
#     print(hello_wrapper())
Running python folder/implicit_sibling_import.py with the if __name__ == '__main__': block commented out yields ISI says: world in Python 3.6.
But running python directory/app.py yields:
Traceback (most recent call last):
  File "directory/app.py", line 10, in <module>
    spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
  File "/Users/pedro/test/folder/implicit_sibling_import.py", line 1, in <module>
    import lib
ModuleNotFoundError: No module named 'lib'
Workaround
If I add import sys; sys.path.insert(0, os.path.dirname(isi_path)) to app.py, python app.py yields world as intended, but I would like to avoid munging the sys.path if possible.
Answer requirements
I'd like python app.py to print ISI says: world and I'd like to accomplish this by modifying the path_import function.
I'm not sure of the implications of mangling sys.path. E.g., if there were a directory/requests.py and I added the path to directory to sys.path, I wouldn't want import requests to start importing directory/requests.py instead of the requests library that I installed with pip install requests.
The solution MUST be implemented as a python function that accepts the absolute file path to the desired module and returns the module object.
Ideally, the solution should not introduce side-effects (eg. if it does modify sys.path, it should return sys.path to its original state). If the solution does introduce side-effects, it should explain why a solution cannot be achieved without introducing side-effects.
PYTHONPATH
If I have multiple projects doing this, I don't want to have to remember to set PYTHONPATH every time I switch between them. The user should just be able to pip install my project and run it without any additional setup.
-m
The -m flag is the recommended/pythonic approach, but the standard library also clearly documents how to import source files directly. I'd like to know how I can adapt that approach to cope with implicit sibling imports. Clearly, Python's internals must do this, so how do the internals differ from the "import source files directly" documentation?
The easiest solution I could come up with is to temporarily modify sys.path in the function doing the import:
import os
import importlib.util
from contextlib import contextmanager

@contextmanager
def add_to_path(p):
    import sys
    old_path = sys.path
    sys.path = sys.path[:]
    sys.path.insert(0, p)
    try:
        yield
    finally:
        sys.path = old_path

def path_import(absolute_path):
    '''implementation taken from https://docs.python.org/3/library/importlib.html#importing-a-source-file-directly'''
    with add_to_path(os.path.dirname(absolute_path)):
        spec = importlib.util.spec_from_file_location(absolute_path, absolute_path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return module
This should not cause any problems unless you do imports in another thread concurrently. Otherwise, since sys.path is restored to its previous state, there should be no unwanted side effects.
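For example, used with the paths from the question's app.py, the call site stays the same and the sibling import now resolves (this just repeats the question's setup for context):

# app.py, unchanged apart from using the path_import defined above
import os

root = os.path.abspath(os.path.dirname(os.path.dirname(os.path.realpath(__file__))))
isi_path = os.path.join(root, 'folder', 'implicit_sibling_import.py')

isi = path_import(isi_path)
print(isi.hello_wrapper())  # prints "ISI says: world"; folder/ was on sys.path during the import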
Edit:
I realize that my answer is somewhat unsatisfactory, but digging into the code reveals that the line spec.loader.exec_module(module) basically results in exec(spec.loader.get_code(module.__name__), module.__dict__) getting called. Here spec.loader.get_code(module.__name__) is simply the code contained in lib.py.
Thus a better answer to the question would have to find a way to make the import statement behave differently by injecting one or more global variables through the second argument of the exec statement. However, "whatever you do to make the import machinery look in that file's folder, it'll have to linger beyond the duration of the initial import, since functions from that file might perform further imports when you call them", as stated by @user2357112 in the question comments.
Unfortunately, the only way to change the behavior of the import statement seems to be to change sys.path or a package's __path__. module.__dict__ already contains __path__, so that doesn't seem to work, which leaves sys.path. (Or one could try to figure out why exec does not treat the code as a package even though it has __path__ and __package__; I don't know where to start. Maybe it has something to do with there being no __init__.py file.)
Furthermore this issue does not seem to be specific to importlib but rather a general problem with sibling imports.
Edit 2: If you don't want the module to end up in sys.modules, the following should work (note that any modules added to sys.modules during the import are removed):
from contextlib import contextmanager

@contextmanager
def add_to_path(p):
    import sys
    old_path = sys.path
    old_modules = sys.modules
    sys.modules = old_modules.copy()
    sys.path = sys.path[:]
    sys.path.insert(0, p)
    try:
        yield
    finally:
        sys.path = old_path
        sys.modules = old_modules
Add the path your application is in to the PYTHONPATH environment variable:
Augment the default search path for module files. The format is the same as the shell’s PATH: one or more directory pathnames
separated by os.pathsep (e.g. colons on Unix or semicolons on
Windows). Non-existent directories are silently ignored.
On bash it's like this:
export PYTHONPATH="./folder/:${PYTHONPATH}"
or run directly:
PYTHONPATH="./folder/:${PYTHONPATH}" python directory/app.py
The OP's idea is great. This works for this example by adding the sibling modules, under their proper names, to sys.modules; I would say it is the SAME as adding to PYTHONPATH. Tested and working with version 3.5.1.
import os
import sys
import importlib.util

class PathImport(object):

    def get_module_name(self, absolute_path):
        module_name = os.path.basename(absolute_path)
        module_name = module_name.replace('.py', '')
        return module_name

    def add_sibling_modules(self, sibling_dirname):
        for current, subdir, files in os.walk(sibling_dirname):
            for file_py in files:
                if not file_py.endswith('.py'):
                    continue
                if file_py == '__init__.py':
                    continue
                python_file = os.path.join(current, file_py)
                (module, spec) = self.path_import(python_file)
                sys.modules[spec.name] = module

    def path_import(self, absolute_path):
        module_name = self.get_module_name(absolute_path)
        spec = importlib.util.spec_from_file_location(module_name, absolute_path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return (module, spec)

def main():
    pathImport = PathImport()
    root = os.path.abspath(os.path.dirname(os.path.dirname(os.path.realpath(__file__))))
    isi_path = os.path.join(root, 'folder', 'implicit_sibling_import.py')
    sibling_dirname = os.path.dirname(isi_path)
    pathImport.add_sibling_modules(sibling_dirname)
    (lib, spec) = pathImport.path_import(isi_path)
    print(lib.hello())

if __name__ == '__main__':
    main()
Make sure your root is in a folder that is explicitly searched in the PYTHONPATH. Use an absolute import:
from root.folder import implicit_sibling_import #called from app.py

Value Changes in python with Context Changes

I have a project with folder structure like this:
MainFolder/
    __init__.py
    Global.py
    main.py
    Drivers/
        __init__.py
        a.py
        b.py
In Global.py I have declared like this:
#in Global.py file
global_value=''
Now when I tried the below script:
#in main.py
import Global
from Drivers import a
Global.global_value=5
a.print_value()
In a.py:
from MainFolder import Global

def print_value():
    print Global.global_value
The output is supposed to be this:
5
But all I am getting is :
''
Does anyone have a solution, or an explanation of what happens here when the context changes?
In my opinion you should not do that. To have some form of common value, write the value to a file/db and then fetch the value from that file.
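For instance, a minimal hypothetical sketch of that file-based approach (the module name, JSON file and helper functions are assumptions, not part of the original project):

# shared_state.py (hypothetical sketch)
import json
import os

_STATE_FILE = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'state.json')

def set_value(value):
    # persist the shared value so any module/process can read it later
    with open(_STATE_FILE, 'w') as f:
        json.dump({'global_value': value}, f)

def get_value():
    with open(_STATE_FILE) as f:
        return json.load(f)['global_value']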
If that doesn't suit your needs, here are some resources I found that might help you out.
I've not tested this, but this one should work (fetched from Import a module from a relative path):
import os, sys, inspect

# realpath() will make your script run, even if you symlink it :)
cmd_folder = os.path.realpath(os.path.abspath(os.path.split(inspect.getfile(inspect.currentframe()))[0]))
if cmd_folder not in sys.path:
    sys.path.insert(0, cmd_folder)

# use this if you want to include modules from a subfolder
cmd_subfolder = os.path.realpath(os.path.abspath(os.path.join(os.path.split(inspect.getfile(inspect.currentframe()))[0], "subfolder")))
if cmd_subfolder not in sys.path:
    sys.path.insert(0, cmd_subfolder)

# Info:
# cmd_folder = os.path.dirname(os.path.abspath(__file__))  # DO NOT USE __file__ !!!
# __file__ fails if the script is called in different ways on Windows
# __file__ fails if someone does os.chdir() before
# sys.argv[0] also fails because it doesn't always contain the path
More:
Importing Python modules from different working directory
Python accessing modules from package that is distributed over different directories

How to import members of all modules within a package?

I am developing a package that has a file structure similar to the following:
test.py
package/
    __init__.py
    foo_module.py
    example_module.py
If I call import package in test.py, I want the package module to appear similar to this:
>>> vars(package)
mappingproxy({'foo': <function foo at 0x…>, 'example': <function example at 0x…>, …})
In other words, I want the members of all modules in package to be in package's namespace, and I do not want the modules themselves to be in the namespace. package is not a sub-package.
Let's say my files look like this:
foo_module.py:
def foo(bar):
    return bar
example_module.py:
def example(arg):
    return foo(arg)
test.py:
print(example('derp'))
How do I structure the import statements in test.py, example_module.py, and __init__.py to work from outside the package directory (i.e. test.py) and within the package itself (i.e. foo_module.py and example_module.py)? Everything I try gives Parent module '' not loaded, cannot perform relative import or ImportError: No module named 'module_name'.
Also, as a side-note (as per PEP 8): "Relative imports for intra-package imports are highly discouraged. Always use the absolute package path for all imports. Even now that PEP 328 is fully implemented in Python 2.5, its style of explicit relative imports is actively discouraged; absolute imports are more portable and usually more readable."
I am using Python 3.3.
I want the members of all modules in package to be in package's
namespace, and I do not want the modules themselves to be in the
namespace.
I was able to do that by adapting something I've used in Python 2 to automatically import plug-ins to also work in Python 3.
In a nutshell, here's how it works:
The package's __init__.py file imports all the other Python files in the same package directory except for those whose names start with an '_' (underscore) character.
It then adds any names in the imported module's namespace to that of the __init__ module (which is also the package's namespace). Note I had to make the example_module module explicitly import foo from .foo_module.
One important aspect of doing things this way is realizing that it's dynamic and doesn't require the package module names to be hardcoded into the __init__.py file. Of course this requires more code to accomplish, but also makes it very generic and able to work with just about any (single-level) package — since it will automatically import new modules when they're added and no longer attempt to import any removed from the directory.
test.py:
from package import *
print(example('derp'))
__init__.py:
def _import_all_modules():
    """ Dynamically imports all modules in this package. """
    import traceback
    import os
    global __all__
    __all__ = []
    globals_, locals_ = globals(), locals()

    # Dynamically import all the package modules in this file's directory.
    for filename in os.listdir(__name__):
        # Process all python files in directory that don't start
        # with underscore (which also prevents this module from
        # importing itself).
        if filename[0] != '_' and filename.split('.')[-1] in ('py', 'pyw'):
            modulename = filename.split('.')[0]  # Filename sans extension.
            package_module = '.'.join([__name__, modulename])
            try:
                module = __import__(package_module, globals_, locals_, [modulename])
            except:
                traceback.print_exc()
                raise
            for name in module.__dict__:
                if not name.startswith('_'):
                    globals_[name] = module.__dict__[name]
                    __all__.append(name)

_import_all_modules()
foo_module.py:
def foo(bar):
    return bar
example_module.py:
from .foo_module import foo  # added

def example(arg):
    return foo(arg)
I think you can get the values you need without cluttering up your namespace, by using from module import name style imports. I think these imports will work for what you are asking for:
Imports for example_module.py:
from package.foo_module import foo
Imports for __init__.py:
from package.foo_module import foo
from package.example_module import example
__all__ = ['foo', 'example']  # not strictly necessary, but makes clear what is public
Imports for test.py:
from package import example
Note that this only works if you're running test.py (or something else at the same level of the package hierarchy). Otherwise you'd need to make sure the folder containing package is on the Python module search path (either by installing the package somewhere Python will look for it, or by adding the appropriate folder to sys.path), as sketched below.
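For example, a hypothetical sketch of that last option, putting the folder that contains package onto sys.path before importing it (the path shown is a placeholder):

# some_other_script.py (hypothetical), located outside the folder that contains package/
import sys

sys.path.insert(0, '/path/to/folder/containing/package')  # make 'package' importable

from package import example

print(example('derp'))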
