Relative import of package __init__.py - python

Suppose I have a package containing two submodules and also a substantial amount of code in __init__.py itself:
pkg/__init__.py
pkg/foo.py
pkg/bar.py
and, to make planned future refactorings easier, I want components of the package to exclusively use relative imports to refer to each other. In particular, import pkg should never appear.
From foo.py I can do
from __future__ import absolute_import
from . import bar
to get access to the bar.py module, and vice versa.
The question is, what do I write to import __init__.py in this manner? I want exactly the same effect as import pkg as local_name, only without having to specify the absolute name pkg.
#import pkg as local_name
from . import ??? as local_name
UPDATE: Inspired by maxymoo's answer, I tried
from . import __init__ as local_name
This does not set local_name to the module defined by __init__.py; instead it gets what appears to be a bound method wrapper for the __init__ method of that module. I suppose I could do
from . import __init__ as local_name
local_name = local_name.__self__
to get the thing I want, but (a) yuck, and (b) this makes me worry that the module hasn't been fully initialized.
Answers need to work on both Python 2.7 and Python 3.4+.
Yes, it would probably be better to hollow out __init__.py and just have it reexport stuff from the submodules, but that can't happen just yet.

There's nothing special about the dunders (they're just discouraged when writing your own module/function names); you should just be able to do
from .__init__ import my_function as local_name

python2 and python3 (uses the discouraged __import__):
from a 1st-level module (pkg.foo, pkg.bar, ...):
local_name = __import__("", globals(), locals(), [], 1)
from module in subpackage (pkg.subpkg.foo, ...):
local_name = __import__("", globals(), locals(), [], 2)
python3 only*:
From pkg.foo or pkg.bar:
import importlib
local_name = importlib.import_module("..", __name__)
From pkg.subpkg.baz:
import importlib
local_name = importlib.import_module("...", __name__)
*On Python 2, import_module unfortunately tries to load "pkg." (with a trailing dot) in this case.
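As a quick sanity check (a sketch that should work on both Python 2.7 and 3.x): the object returned by the __import__ form above is the package itself, so sibling submodules hang off it once imported:
# pkg/foo.py
from __future__ import absolute_import
from . import bar

local_name = __import__("", globals(), locals(), [], 1)
assert local_name.bar is bar  # local_name is the pkg package object itself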

Related

Module 'mpi4py' has no attribute 'MPI' [duplicate]

Having already used flat packages, I was not expecting the issue I encountered with nested packages. Here is the situation:
Directory layout
dir
|
+-- test.py
|
+-- package
    |
    +-- __init__.py
    |
    +-- subpackage
        |
        +-- __init__.py
        |
        +-- module.py
Content of __init__.py
Both package/__init__.py and package/subpackage/__init__.py are empty.
Content of module.py
# file `package/subpackage/module.py`
attribute1 = "value 1"
attribute2 = "value 2"
attribute3 = "value 3"
# and as many more as you want...
Content of test.py (3 versions)
Version 1
# file test.py
from package.subpackage.module import *
print attribute1 # OK
That's the bad and unsafe way of importing things (importing everything in bulk), but it works.
Version 2
# file test.py
import package.subpackage.module
from package.subpackage import module # Alternative
from module import attribute1
A safer way to import, item by item, but it fails: Python doesn't want this and fails with the message "No module named module". However …
# file test.py
import package.subpackage.module
from package.subpackage import module # Alternative
print module # Surprise here
… says <module 'package.subpackage.module' from '...'>. So that's a module, but that's not a module /-P 8-O ... uh
Version 3
# file test.py v3
from package.subpackage.module import attribute1
print attribute1 # OK
This one works. So you are either forced to use the overlong prefix all the time, or to use the unsafe bulk import of version #1, while Python forbids the safe, handy middle way? The better way, which is safe and avoids unnecessarily long prefixes, is the only one Python rejects? Is this because it loves import * or because it loves overlong prefixes (which does nothing to encourage the practice)?
Sorry for the hard words, but I have spent two days trying to work around this seemingly silly behavior. Unless I am totally wrong somewhere, this leaves me with the feeling that something is really broken in Python's model of packages and sub-packages.
Notes
I don't want to rely on sys.path, to avoid global side effects, nor on *.pth files, which are just another way to play with sys.path with the same global effects. For the solution to be clean, it has to be local only. Either Python is able to handle subpackages or it is not, but it should not require playing with global configuration just to handle local stuff.
I also tried using imports in package/subpackage/__init__.py, but it solved nothing; it does the same thing and complains that subpackage is not a known module, while print subpackage says it is a module (weird behavior, again).
Maybe I'm entirely wrong, though (the option I would prefer), but this makes me feel quite disappointed in Python.
Any other known way beside of the three I tried? Something I don't know about?
(sigh)
----- %< ----- edit ----- >% -----
Conclusion so far (after people's comments)
There is no such thing as a real sub-package in Python: all package references go to a single global dictionary, which means there is no local dictionary and therefore no way to manage local package references.
You have to use either a full prefix, a short prefix, or an alias. As in:
Full prefix version
from package.subpackage.module import attribute1
# And repeat it again and again
# But after that, you can simply:
use_of (attribute1)
Short prefix version (but repeated prefix)
from package.subpackage import module
# Short but then you have to do:
use_of (module.attribute1)
# and repeat the prefix at every use place
Or else, a variation of the above.
from package.subpackage import module as m
use_of (m.attribute1)
# `m` is a shorter prefix, but you could as well
# define a more meaningful name after the context
Factorized version
If you don't mind importing multiple entities all at once in a batch, you can:
from package.subpackage.module import attribute1, attribute2
# and etc.
Not my favorite style (I prefer to have one import statement per imported entity), but it may be the one I personally end up favoring.
Update (2012-09-14):
In practice this finally turns out to be OK, apart from a note about the layout. Instead of the above, I used:
from package.subpackage.module import (
    attribute1,
    attribute2,
    attribute3,
    ...)  # and etc.
You seem to be misunderstanding how import searches for modules. When you use an import statement it always searches the actual module path (and/or sys.modules); it doesn't make use of module objects in the local namespace that exist because of previous imports. When you do:
import package.subpackage.module
from package.subpackage import module
from module import attribute1
The second line looks for a package called package.subpackage and imports module from that package. This line has no effect on the third line. The third line just looks for a module called module and doesn't find one. It doesn't "re-use" the object called module that you got from the line above.
In other words from someModule import ... doesn't mean "from the module called someModule that I imported earlier..." it means "from the module named someModule that you find on sys.path...". There is no way to "incrementally" build up a module's path by importing the packages that lead to it. You always have to refer to the entire module name when importing.
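You can see this directly in sys.modules; here is a quick sketch using the question's layout:
import sys
from package.subpackage import module

# The fully qualified name is registered, but the bare name "module" is not,
# which is why `from module import attribute1` finds nothing.
print('package.subpackage.module' in sys.modules)  # True
print('module' in sys.modules)                     # False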
It's not clear what you're trying to achieve. If you only want to import the particular object attribute1, just do from package.subpackage.module import attribute1 and be done with it. You need never worry about the long package.subpackage.module once you've imported the name you want from it.
If you do want to have access to the module to access other names later, then you can do from package.subpackage import module and, as you've seen you can then do module.attribute1 and so on as much as you like.
If you want both --- that is, if you want attribute1 directly accessible and you want module accessible, just do both of the above:
from package.subpackage import module
from package.subpackage.module import attribute1
attribute1 # works
module.someOtherAttribute # also works
If you don't like typing package.subpackage even twice, you can just manually create a local reference to attribute1:
from package.subpackage import module
attribute1 = module.attribute1
attribute1 # works
module.someOtherAttribute #also works
The reason version 2 fails is that sys.modules['module'] does not exist (the import routine has its own scope and cannot see the module local name), and there is no module or package named module on disk. Note that you can separate multiple imported names with commas.
from package.subpackage.module import attribute1, attribute2, attribute3
Also:
from package.subpackage import module
print module.attribute1
If all you're trying to do is to get attribute1 into your global namespace, version 3 seems just fine. Why is it an overkill prefix?
In version 2, instead of
from module import attribute1
you can do
attribute1 = module.attribute1

Skip directory name in import path by importing subpackage in __init__.py

I'm baffled by the importing dynamics in __init__.py.
Say I have this structure:
package
├── __init__.py
└── subpackage
    ├── __init__.py
    └── dostuff.py
I would like to import things from dostuff.py. I could do it like this: from package.subpackage.dostuff import thefunction, but I would like to remove the subpackage level from the import statement, so it would look like this:
from package.dostuff import thefunction
I tried putting this in package/__init__.py:
from .subpackage import dostuff
And what I don't understand is this:
# doing this works:
from package import dostuff
dostuff.thefunction()
# but this doesn't work:
from package.dostuff import thefunction
# ModuleNotFoundError: No module named 'package.dostuff'
Why is that, and how can I make from package.dostuff import thefunction work?
The only way I see to achieve what you intend would be to actually create a package/dostuff.py module and import everything you need into it, as in from .subpackage.dostuff import thefunction.
The point is that when you use from .subpackage import dostuff in package/__init__.py, you do not rename the original module.
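For illustration, such a shim module could be as small as this (a sketch):
# package/dostuff.py -- re-export what callers need from the real module
from .subpackage.dostuff import thefunction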
To be more explicit, here is an example of use with both your import and a package/dostuff.py file:
# We import the dostuff link from package
>>> from package import dostuff
>>> dostuff
<module 'package.subpackage.dostuff' from '/tmp/test/package/subpackage/dostuff.py'>
# We use our custom package.dostuff
>>> from package.dostuff import thefunction
>>> package.dostuff
<module 'package.dostuff' from '/tmp/test/package/dostuff.py'>
>>> from package import dostuff
>>> dostuff
<module 'package.dostuff' from '/tmp/test/package/dostuff.py'>
# The loaded function is the same
>>> dostuff.thefunction
<function thefunction at 0x7f95403d2730>
>>> package.dostuff.thefunction
<function thefunction at 0x7f95403d2730>
A clearer way of putting this is:
from X import Y only works when X is an actual module path.
Y, on the other hand, can be any item defined or imported in that module.
This also applies to packages, with anything declared in their __init__.py. Here you declare the module package.subpackage.dostuff in package, hence you can import it and use it.
But if you try to use the module path for a direct import, it has to exist on the filesystem.
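Put in terms of this question's files (a sketch):
from package import dostuff              # OK: package is a real package and
                                         # dostuff is a name bound in its __init__.py
from package.subpackage import dostuff   # OK: a real module path on disk
# from package.dostuff import thefunction  # fails: there is no package/dostuff.py on disk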
Resources:
Python documentation about module management in the import system:
https://docs.python.org/3/reference/import.html#submodules.
Python import system search behavior:
https://docs.python.org/3/reference/import.html#searching
https://docs.python.org/3/glossary.html#term-qualified-name
https://docs.python.org/2.0/ref/import.html
I hope that makes it clearer.
You can in fact fake this quite easily by fiddling with Python's sys.modules dict. The question is whether you do really need this or whether it might be good to spend a second thought on your package structure.
Personally, I would consider this bad style, because it applies magic to the module and package names and people who might use and extend your package will have a hard time figuring out what's going on there.
Following your structure above, add the following code to your package/__init__.py:
import sys
from .subpackage import dostuff

# This will be package.dostuff; just avoiding to hard-code it.
_pkg_name = f"{__name__}.{dostuff.__name__.rsplit('.', 1)[1]}"

if _pkg_name not in sys.modules.keys():
    dostuff.__name__ = _pkg_name  # Will have no effect; see below
    sys.modules[_pkg_name] = dostuff
This imports the dostuff module from your subpackage to the scope of package, changes its module path and adds it to the imported modules. Essentially, this just copies the binding of your module to another import path where member memory addresses remain the same. You just duplicate the references:
import package
print(package.dostuff)
print(package.subpackage.dostuff)
print(package.dostuff.something_to_do)
print(package.subpackage.dostuff.something_to_do)
... yields
<module 'package.subpackage.dostuff' from '/path/package/subpackage/dostuff.py'>
<module 'package.subpackage.dostuff' from '/path/package/subpackage/dostuff.py'>
<function something_to_do at 0x1029b8ae8>
<function something_to_do at 0x1029b8ae8>
Note that
The module name package.subpackage.dostuff has not changed even though being updated in package/__init__.py
The function reference is the same: 0x1029b8ae8
Now, you can also go
from package.dostuff import something_to_do
something_to_do()
However, be cautious. Changing the imported modules during the import of a module might have unintended side effects (the order of updating sys.modules and importing other subpackages or submodules from within package might also matter). Usually, you buy extra work and extra complexity by applying this kind of "improvement". Better to set up a proper package structure and stick to it.

Import local packages in python

I've run through many posts about this, but it still doesn't seem to work. The setup is pretty clear-cut. I've got the following hierarchy:
main.py
DirA/
    __init__.py
    hello.py
DirB/
    __init__.py
    foo.py
    bla.py
    lol.py
The __init__.py at DirA is empty. The one at DirB just lists the foo module:
__all__ = ["foo"]
The main.py has the following code
import DirA
import DirB
hey() #Def written at hello.py
foolish1() #Def written at foo.py
foolish2() #Def written at foo.py
Long story short, I got NameError: name 'foo' is not defined. Any ideas? Thanks in advance.
You only get what you import. Therefore, in your main, you only get DirA and DirB. You would use them in one of these ways:
import DirA
DirA.something_in_init_py()
# Importing hello:
import DirA.hello
DirA.hello.something_in_hello_py()
# Using a named import:
from DirA.hello import something_in_hello_py
something_in_hello_py()
And in DirB, just make the __init__.py empty as well. The only use of __all__ is for when you want to import *, which you don't want because, as they say, explicit is better than implicit.
But in case you are curious, it would work this way:
from DirB import *
something_in_dirb()
By default the import * will import everything it can find that does not start with an underscore. Specifying __all__ restricts what is imported to the names listed in __all__. See this question for more details.
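For completeness, here is a sketch of what the question's setup actually gives you, assuming DirB/__init__.py contains __all__ = ["foo"]:
# main.py
from DirB import *   # the star-import honours __all__, so only the name "foo" is bound
foo.foolish1()       # works: foo is the DirB.foo submodule
# foolish1()         # NameError: the function itself was never exported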
Edit: about __init__.py.
The __init__.py is not really connected to the importing stuff. It is just a special file with the following properties:
Its existence means the directory is a python package, with several modules in it. If it does not exist, python will refuse to import anything from the directory.
It will always be loaded before loading anything else in the directory.
Its content will be available as the package itself.
Just try it: put this in DirA/__init__.py:
foo = 42
Now, in your main:
from DirA import foo
print(foo) # 42
It can be useful, because you can import some of your submodules in the __init__.py to hide the inner structure of your package. Suppose you build an application with classes Author, Book and Review. To make it easier to read, you give each class its own file in a package. Now in your main, you have to import the full path:
from myapp.author import Author
from myapp.book import Book
from myapp.review import Review
Clearly not optimal. Now suppose you put those exact lines in your __init__.py; you can then simplify your main like this:
from myapp import Author, Book, Review
Python will load the __init__.py, which will in turn load all submodules and import the classes, making them available on the package. Now your main does not need to know where the classes are actually implemented.
Have you tried something like this:
One way
from DirA import hello
Another way
from DirA.hello import hey
If those don't work, then append a new entry to the system path, as sketched below.
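A sketch of that fallback, assuming main.py sits in the directory that contains DirA:
# main.py
import os
import sys

# Make sure the folder holding DirA is importable, then import normally.
sys.path.append(os.path.dirname(os.path.abspath(__file__)))

from DirA.hello import hey
hey()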
You need to import the function itself:
How to call a function from another file in Python?
In your case:
from DirA import foolish1, foolish2

python import function from other file using __init__

I got the following files
1 ./run.py
2 ./code/util.py
3 ./code/__init__.py
and inside util.py I have
def funA():
    print 'Hello World !'
and inside ./code/__init__.py I have
__all__=['util'];
from util import *
now I open python prompt (actually, ipython) in the current directory then I type
from code import *
and all I've got is the util module
util module <module 'code.util' from 'code/util.pyc'>
and I have to include the package name in order to use funA().
I expected that funA would now be in my namespace and that I could use it directly, without the package prefix as in code.funA. However, this is not the case and I am wondering where the problem is.
I guess I am still somewhat confused about how exactly __init__.py should be used.
The purpose of __all__ (as documented) is to indicate that you want only the names listed there to be available via from mymodule import *. By specifying only 'util', you are explicitly telling your package to not make anything but util available. If you remove that __all__, then everything you import from within util will also be available in code, and so if you do from code import *, then everything from util will also be available.
Whether this is a good idea is another matter. Importing * often leads to confusion.
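As a sketch of the fix described above, code/__init__.py could simply be:
# code/__init__.py -- with the restrictive __all__ removed, every public name
# pulled in from util (including funA) is re-exported by `from code import *`.
from util import *    # implicit relative import, as in the question (Python 2)
# from .util import *  # the explicit-relative / Python 3 equivalent
After that, from code import * at the prompt should bring funA directly into your namespace.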

How to import members of all modules within a package?

I am developing a package that has a file structure similar to the following:
test.py
package/
    __init__.py
    foo_module.py
    example_module.py
If I call import package in test.py, I want the package module to appear similar to this:
>>> vars(package)
mapping_proxy({foo: <function foo at 0x…>, example: <function example at 0x…>})
In other words, I want the members of all modules in package to be in package's namespace, and I do not want the modules themselves to be in the namespace. package is not a sub-package.
Let's say my files look like this:
foo_module.py:
def foo(bar):
    return bar
example_module.py:
def example(arg):
    return foo(arg)
test.py:
print(example('derp'))
How do I structure the import statements in test.py, example_module.py, and __init__.py to work from outside the package directory (i.e. test.py) and within the package itself (i.e. foo_module.py and example_module.py)? Everything I try gives Parent module '' not loaded, cannot perform relative import or ImportError: No module named 'module_name'.
Also, as a side-note (as per PEP 8): "Relative imports for intra-package imports are highly discouraged. Always use the absolute package path for all imports. Even now that PEP 328 is fully implemented in Python 2.5, its style of explicit relative imports is actively discouraged; absolute imports are more portable and usually more readable."
I am using Python 3.3.
I want the members of all modules in package to be in package's
namespace, and I do not want the modules themselves to be in the
namespace.
I was able to do that by adapting something I've used in Python 2 to automatically import plug-ins to also work in Python 3.
In a nutshell, here's how it works:
The package's __init__.py file imports all the other Python files in the same package directory except for those whose names start with an '_' (underscore) character.
It then adds any names in the imported module's namespace to that of the __init__ module (which is also the package's namespace). Note that I had to make example_module explicitly import foo from .foo_module.
One important aspect of doing things this way is realizing that it's dynamic and doesn't require the package module names to be hardcoded into the __init__.py file. Of course this requires more code to accomplish, but also makes it very generic and able to work with just about any (single-level) package — since it will automatically import new modules when they're added and no longer attempt to import any removed from the directory.
test.py:
from package import *
print(example('derp'))
__init__.py:
def _import_all_modules():
    """ Dynamically imports all modules in this package. """
    import traceback
    import os
    global __all__
    __all__ = []
    globals_, locals_ = globals(), locals()

    # Dynamically import all the package modules in this file's directory.
    for filename in os.listdir(__name__):
        # Process all python files in directory that don't start
        # with underscore (which also prevents this module from
        # importing itself).
        if filename[0] != '_' and filename.split('.')[-1] in ('py', 'pyw'):
            modulename = filename.split('.')[0]  # Filename sans extension.
            package_module = '.'.join([__name__, modulename])
            try:
                module = __import__(package_module, globals_, locals_, [modulename])
            except:
                traceback.print_exc()
                raise
            for name in module.__dict__:
                if not name.startswith('_'):
                    globals_[name] = module.__dict__[name]
                    __all__.append(name)

_import_all_modules()
foo_module.py:
def foo(bar):
    return bar
example_module.py:
from .foo_module import foo  # added

def example(arg):
    return foo(arg)
I think you can get the values you need without cluttering up your namespace, by using from module import name style imports. I think these imports will work for what you are asking for:
Imports for example_module.py:
from package.foo_module import foo
Imports for __init__.py:
from package.foo_module import foo
from package.example_module import example
__all__ = ['foo', 'example']  # not strictly necessary, but makes clear what is public
Imports for test.py:
from package import example
Note that this only works if you're running test.py (or something else at the same level of the package hierarchy). Otherwise you'd need to make sure the folder containing package is in the python module search path (either by installing the package somewhere Python will look for it, or by adding the appropriate folder to sys.path).
