I have made a package with the following structure:
test.py
package1/
    __init__.py
    module1.py
    module2.py
In the test.py file, with the code
from package1 import *
what I want it to do is the equivalent of
from numpy import *
from module1 import *
from module2 import *
What should I write in __init__.py file to achieve this?
Currently in my __init__.py file I have
from numpy import *
__all__ = ['module1','module2']
but this doesn't give me what I want. This way, numpy wasn't imported at all, and the modules were imported as
import module1
rather than
from module1 import *
If you want this, your __init__.py should contain just what you want (in Python 3 the module imports need a leading dot to make them relative):
from numpy import *
from .module1 import *
from .module2 import *
When you do from package import *, it imports all names defined in the package's __init__.py.
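As a minimal sketch of the effect, test.py then sees everything in one flat namespace; array here is numpy's function, pulled in through package1's __init__.py:
# test.py
from package1 import *

print(array([1, 2, 3]))   # numpy's array(), re-exported via "from numpy import *"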
Note that this could become awkward if there are name clashes among the modules you import. If you just want convenient access to the functions in those modules, I would suggest using instead something like:
import numpy as np
from package1 import module1 as m1
from package1 import module2 as m2
That is, import the modules (not their contents), but under shorter names. You can then still access numpy stuff with something like np.add, which adds only three characters of typing but guards against name clashes among different modules.
I second BrenBarn's suggestion, but be warned: importing everything into a single namespace with from x import * is generally a bad idea unless you know for certain that there won't be any conflicting names.
I think it's still safer to use import package.module, though it does take extra keystrokes.
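To illustrate the risk with a hypothetical clash: if module1 and module2 both happened to define a function named process, the second wildcard import would silently shadow the first:
# module1.py defines:  def process(data): return data.upper()
# module2.py defines:  def process(data): return data.lower()

from module1 import *
from module2 import *

process("Hi")   # -> "hi"; module1's process has been silently replaced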
Related
I have a Python project with several Git submodules structured like so:
myproject/
    __init__.py
    src/
        __init__.py
        main.py
    submodules/
        __init__.py
        submodule1/
            __init__.py
            utils.py
        submodule2/
            ...
I want to be able to import code in main.py from my submodules with the following constraints:
The import statement must not contain submodules
IntelliSense must function. This rules out modifying sys.path, e.g. via sys.path.append().
The submodules must not be installed using a package manager.
I am hoping for an import statement in main.py that looks something like from submodule1 import utils or from submodule1.utils import myUtilFunction.
If this is impossible, any explanation or resources to help me understand why would be appreciated. I have been reading the docs about the import function, modules, and namespaces, but I do not understand how to achieve this or if it's even possible.
I've tried various combinations of import statements and __all__ declarations in my __init__.py files, like so:
myproject/__init__.py:
__all__ = ['submodule1', 'submodules/submodule1']
from .submodules.submodule1 import *
from .submodules import *
import .submodules.submodule1
myproject/submodules/__init__.py:
__all__ = ['submodule1']
from .submodule1 import *
myproject/src/main.py:
from submodule1.utils import myUtilFunction
But so far nothing has allowed from submodule1.utils import myUtilFunction to work in main.py.
I am writing a library in Python. It has one package and three modules, like this:
mypackage/
    __init__.py
    utils.py
    fileio.py
    math.py
If I just leave the __init__.py empty, my users must figure out which functions are in which modules, which is annoying. The fact that I have three modules is an implementation detail that is not important.
Maybe I should import the main functions into the __init__.py, like this
from .utils import create_table
from .fileio import save_rows, load_rows
from .math import matrix_inverse
so that my users can just do
import mypackage as myp
rows = myp.load_rows()
Is that best practice?
What about the alternative to put ALL symbols into the __init__.py, such as
from .utils import *
from .fileio import *
from .math import *
And if there are any functions that I don't want to expose, I will prefix them with an underscore. Is that better? It certainly is easier for me.
What if the fileio.py needs to call some functions in the utils.py? I could put
from .utils import *
into the fileio.py, but won't that create a circular or redundant reference? What's the best way to handle this?
Maybe I should import the main functions into the __init__.py, like this [...] Is that best practice?
I wouldn't say there is a "best practice"; it depends on the specific case, but this is surely pretty common: you define a bunch of stuff in each module and import the relevant ones in __init__.py. This is an easy way to save users from having to remember which submodule holds which function; however, it can get pretty annoying if you have a lot of functions to import from each module.
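For example, sticking with the function names from the question, the package's __init__.py could simply re-export the public API explicitly (the __all__ list is optional, but it keeps a wildcard import of the package itself under control):
# mypackage/__init__.py
from .utils import create_table
from .fileio import save_rows, load_rows
from .math import matrix_inverse

__all__ = ['create_table', 'save_rows', 'load_rows', 'matrix_inverse']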
What about the alternative to put ALL symbols into the __init__.py, such as
from .utils import *
from .fileio import *
from .math import *
You most likely don't want to do this. It will import everything into user scripts, including other imported modules and internal functions. You should avoid it.
What if the fileio.py needs to call some functions in the utils.py? [...] won't that create a circular or redundant reference?
Yeah, that is something that can happen and you usually want to avoid it at all costs. If you need some functions from utils.py in fileio.py, you should import them explicitly as from .utils import x, y, z. Remember to also always use relative imports when importing things between modules of the same package (i.e. use from .utils import x, not from package.utils import x).
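For instance, if fileio.py needs create_table from utils.py (names borrowed from the question), an explicit relative import keeps the dependency obvious and avoids dragging in everything else; a minimal sketch:
# fileio.py
from .utils import create_table   # explicit, relative, no wildcard

def save_rows(rows):
    table = create_table(rows)
    ...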
A good compromise between these two options you mention which solves most of the above problems (although not circular imports, you would have to avoid those yourself) would be to define an __all__ list inside each one of your modules to specify which functions should be exported when using from x import *, like this:
# utils.py
import sys

__all__ = ['user_api_one', 'user_api_two']

def user_api_one():
    ...

def user_api_two():
    ...

def internal_function():
    ...
If you properly define an __all__ list in all your modules, then in your __init__.py you will be able to safely do:
from .utils import *
from .fileio import *
from .math import *
This will only import relevant functions (for example user_api_one and user_api_two for utils, and not internal_function nor sys).
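From the user's point of view, everything then hangs directly off the package, for example:
import mypackage as myp

myp.user_api_one()        # re-exported from utils via its __all__
rows = myp.load_rows()    # assuming fileio.py lists load_rows in its own __all__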
So, this is a set of questions about how to use __init__.py in packages/sub-packages. I have searched, and surprisingly not found a decent answer to this.
If I have the following structure (which is just a simplified example obviously):
my_package/
    __init__.py
    module1.py
    my_sub_package/
        __init__.py
        module2.py
The contents of module1.py are
my_string = 'Hello'
and the contents of module2.py are
my_int = 42
First question: importing multiple modules from a package by default
What should be in the __init__.py files?
I can leave them empty, in which case import my_package does nothing much (it imports the package, but the package effectively contains nothing). That is fine, and what should happen in most cases.
What I'd like in this case though is for import my_package to allow me to use my_package.module1.my_string and my_package.my_sub_package.module2.my_int.
I can add __all__ = [ 'module1' ] to my_package/__init__.py and __all__ = [ 'module2' ] to my_package/my_sub_package/__init__.py, but this only affects imports using a wildcard as I understand it (so only from my_package import * and from my_package.my_sub_package import *).
I can achieve this by putting
import my_package.module1
import my_package.my_sub_package
in my_package/__init__.py and
import my_package.my_sub_package.module2
in my_package/my_sub_package/__init__.py, but is this a bad idea? It creates a (seemingly) infinite series of my_package.my_package.my_package.... when I do this in the Python interpreter (3.5.5).
Separate, but highly related, question: using modules to keep files reasonably sized
If I wanted instead to be able to do the following
import my_package
print(my_package.my_string)
print(str(my_package.my_sub_package.my_int))
i.e. I wanted to use module1 and module2 purely for separating code into smaller more readable files if I actually had lots of modules in each package (which obviously doesn't apply in this trivial example, but can easily)
is doing from my_package.module1 import * in my_package/__init__.py and from my_package.my_sub_package.module2 import * in my_package/my_sub_package/__init__.py a reasonable way to do that? I don't like the use of the wildcard import, but it seems like it would be impractically verbose to import everything defined in a (real) module, listing them all.
Third (also highly related) question: avoiding writing the package names in multiple places
Is there a way I can achieve the above without having to put the names of the packages into the source code in them? I ask because I'd like to avoid having to change it in multiple places if I renamed the package (again, simple in this trivial example, can be done by an IDE or script in reality, but would be nice to know how to avoid).
In my_package/__init__.py, use
from . import my_sub_package
etc.
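Spelled out for the structure in the question, a minimal sketch would be:
# my_package/__init__.py
from . import module1
from . import my_sub_package

# my_package/my_sub_package/__init__.py
from . import module2

Because the imports are relative, the package name never appears inside the package itself, which also addresses the third question about renaming.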
See for example NumPy's __init__.py, which has from . import random, and allows
import numpy as np
np.random.random
Wildcard imports inside a single package tend to be common, provided you have __all__ defined in the modules and subpackages you import from.
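For the second question in the post, that pattern would look roughly like this (a sketch; defining __all__ in module1.py and module2.py is optional but recommended):
# my_package/__init__.py
from . import my_sub_package
from .module1 import *     # makes my_package.my_string available

# my_package/my_sub_package/__init__.py
from .module2 import *     # makes my_package.my_sub_package.my_int available

import my_package
print(my_package.my_string)
print(my_package.my_sub_package.my_int)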
Again an example from NumPy's __init__.py, which has several wildcard imports.
Here's part of that __init__.py:
from . import core
from .core import *
from . import compat
from . import lib
from .lib import *
from . import linalg
from . import fft
from . import polynomial
from . import random
from . import ctypeslib
from . import ma
from . import matrixlib as _mat
from .matrixlib import *
from .compat import long
Notice also the two core import lines. Both numpy.core and the core definitions (functions, classes, etc.) are then available.
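In other words, at least for the (older) NumPy version quoted above, both spellings work after the package is imported:
import numpy as np

np.core              # the submodule object, thanks to "from . import core"
np.array([1, 2, 3])  # a definition from core, re-exported by "from .core import *"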
When in doubt how to do something, or whether something is good practice, have a look at a few well-known libraries or packages. That can help gaining some valuable insights.
I am having a lot of trouble understanding the python module import system.
I am trying to create a simple folder structure as follows.
SomeModule/
    __init__.py
    AnotherModule/
        AnotherModule.py
        __init__.py
    Utils/
        Utils.py
        __init__.py
To use SomeModule I can do:
SomeModule.Foo()
Now inside AnotherModule.py I would like to import my Utils directory.
How come I have to do
import SomeModule.AnotherModule.Utils.Foo
Why can't I just do
import Utils.Foo
To shorten up the actual function name that you'll have to call in your code, you can always do:
from SomeModule.AnotherModule.Utils import *
While this still won't let you get away with a shorter import statement at the top of your script, you'll be able to access all of the functions within Utils just by calling their function names (e.g. foo(x) instead of SomeModule.AnotherModule.Utils.foo(x)).
Part of the reason for the lengthy import statement comes down to the comment from @wim. Have a look by typing import this in a Python interpreter.
Put

import sys
import SomeModule.AnotherModule

# Register an alias so that "import AnotherModule" resolves to the subpackage.
sys.modules['AnotherModule'] = SomeModule.AnotherModule

in SomeModule's __init__.py.
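With that alias registered, anything that runs after SomeModule has been loaded can refer to the subpackage by its short name, for example:
import SomeModule      # runs SomeModule/__init__.py, which registers the alias
import AnotherModule   # now resolves to SomeModule.AnotherModule via sys.modules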
I have a package whose structure is like this:
/backends/
    __init__.py
    abc.py
    def.py
    ghi.py
    xyz.py
    common.py
The modules abc.py, def.py, ghi.py and xyz.py contain some common functions e.g. func0() and func1().
In the common.py module I am importing * from all modules like this:
from abc import *
from def import *
from ghi import *
from xyz import *
I don't think this is a Pythonic way to do this. Think about a few tens of such modules.
What I want is to have a single line statement which imports * from all the modules in the package. Something like this:
from backends import *
I tried this link, but couldn't get what I wanted. I created a variable __all__ in the __init__.py and assigned the list of all modules to it. Then I put this import line in the common.py module:
from . import *
And then I tried to access the function func0(), which is present in every module in the package (except __init__.py and common.py). But it raised an error which reads:
ValueError: Attempted relative import in non-package
I need a detailed answer.
Here is a solution I tried myself, and it worked.
I'll presume that you will be working with common.py as your main module, importing the rest of the modules into it, so:
1 - In the __init__.py file, add:
import os
import glob

# Export every module in this package, excluding common.py and __init__.py itself.
modules = glob.glob(os.path.join(os.path.dirname(__file__), "*.py"))
__all__ = [os.path.basename(f)[:-3] for f in modules
           if os.path.basename(f)[:-3] not in ('common', '__init__')]
2 - In common.py add:
import sys
from os import path

# Make the parent of the backends directory importable, then import the package.
sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))
# print(sys.path)  # for debugging
import backends
# from backends import *  # up to you, but the previous line is better
Voila! Found a solution which I am completely satisfied with. :-)
I left the __init__.py module untouched, and instead put the following code at the beginning of the common.py module.
import os

pwd = os.path.dirname(os.path.realpath(__file__))
file_names = os.listdir(pwd)
for name in file_names:
    if ".pyc" not in name and "__init__" not in name and "common" not in name and ".py" in name:
        exec "from " + name[:-3] + " import *"
It worked like a charm, though I don't know whether this is the best solution. In Python 3.x, the exec statement becomes a function call:
exec("from "+name[:-3]+" import *")