I have a package whose structure is like this:
/backends/
__init__.py
abc.py
def.py
ghi.py
xyz.py
common.py
The modules abc.py, def.py, ghi.py and xyz.py contain some common functions e.g. func0() and func1().
In the common.py module I am importing * from all modules like this:
from abc import *
from def import *
from ghi import *
from xyz import *
I don't think this is a Pythonic way to do it; imagine a few tens of such modules.
What I want is to have a single line statement which imports * from all the modules in the package. Something like this:
from backends import *
I tried this link, but couldn't get what I wanted. I created a variable __all__ in __init__.py and assigned it the list of all modules. Then I put this import line in the common.py module:
from . import *
And then I tried to access the function func0() which is present in any module (except in __init__.py and common.py) in the package. But it raised an error which reads
ValueError: Attempted relative import in non-package
I need a detailed answer.
Here is a solution I tried myself, and it worked.
I'll presume that you will be working with common.py as your main module, where you'll be importing the rest of the modules, so:
1 - In the __init__.py file, add:
import os
import glob
modules = glob.glob(os.path.dirname(__file__)+"/*.py")
__all__ = [ os.path.basename(f)[:-3] for f in modules if os.path.basename(f)[:-3] != 'common']
2 - In common.py add:
import sys
from os import path
sys.path.append( path.dirname(path.dirname(path.abspath(__file__))))
#print sys.path #for debugging
import backends
#from backends import * #up to you, but previous is better
Voila! Found a solution which I am completely satisfied with. :-)
I left the __init__.py module untouched, and instead put the following code at the beginning of the common.py module.
import os
pwd = os.path.dirname(os.path.realpath(__file__))
file_names = os.listdir(pwd)
for name in file_names:
    if ".pyc" not in name and "__init__" not in name and "common" not in name and ".py" in name:
        exec "from "+name[:-3]+" import *"
It worked like a charm, though I don't know whether this is the best solution. I guess in Python 3.x, the exec statement would look like the following:
exec("from "+name[:-3]+" import *")
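For what it's worth, the same effect can be had without exec by using importlib. A sketch, which builds a throwaway package in a temp directory so it is self-contained; in the real project the loop would live in common.py and the hard-coded module names and bodies would come from your actual modules:

```python
import importlib
import os
import sys
import tempfile

# Build a throwaway "backends" package (stand-in for the real one).
root = tempfile.mkdtemp()
pkg = os.path.join(root, "backends")
os.mkdir(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "mod_a.py"), "w") as f:
    f.write("def func0():\n    return 'a'\n")
with open(os.path.join(pkg, "mod_b.py"), "w") as f:
    f.write("def func1():\n    return 'b'\n")
sys.path.insert(0, root)

# The technique: import each sibling module and pull its public
# names into one namespace, mimicking "from <module> import *".
namespace = {}
for name in sorted(os.listdir(pkg)):
    stem = name[:-3]
    if not name.endswith(".py") or stem in ("__init__", "common"):
        continue
    module = importlib.import_module("backends." + stem)
    public = getattr(module, "__all__",
                     [n for n in vars(module) if not n.startswith("_")])
    namespace.update({n: getattr(module, n) for n in public})

print(namespace["func0"](), namespace["func1"]())  # a b
```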
Related
I have a Python project with several Git submodules structured like so:
myproject/
    __init__.py
    src/
        __init__.py
        main.py
    submodules/
        __init__.py
        submodule1/
            __init__.py
            utils.py
        submodule2/
            ...
I want to be able to import code in main.py from my submodules with the following constraints:
The import statement must not contain submodules
IntelliSense must function. This rules out modifying the PATH, e.g. via sys.path.append().
The submodules must not be installed using a package manager.
I am hoping for an import statement in main.py that looks something like from submodule1 import utils or from submodule1.utils import myUtilFunction.
If this is impossible, any explanation or resources to help me understand why would be appreciated. I have been reading the docs about the import function, modules, and namespaces, but I do not understand how to achieve this or if it's even possible.
I've tried using various combinations of import statements and __all__ declarations in my __init__.py and files like so:
myproject/__init__.py:
__all__ = ['submodule1', 'submodules/submodule1']
from .submodules.submodule1 import *
from .submodules import *
import .submodules.submodule1
myproject/submodules/__init__.py:
__all__ = ['submodule1']
from .submodule1 import *
myproject/src/main.py:
from submodule1.utils import myUtilFunction
But so far nothing has allowed from submodule1.utils import myUtilFunction to work in main.py.
I have imports.py containing:
import os as exported_os
and foo.py containing:
from imports import exported_os
print(exported_os.path.devnull) # works
from imports.exported_os.path import devnull # doesn't
Is there a way to make the second import work? I tried adding __path__ to imports.py and fiddling with it but couldn't get anything.
Actual use case: os is some_library_version_n and exported_os is some_library_version (I'm trying to avoid having many instances of some_library_version_n across different files).
One approach
Directory structure:
__init__.py
foo.py
imports/
├ __init__.py
└ exported_os/
├ __init__.py
└ path.py
imports/exported_os/__init__.py:
from . import path
from os import * # not necessary for the question
# but it makes `exported_os` more like `os`
# e.g., `exported_os.listdir` will be callable
imports/exported_os/path.py:
from os.path import *
In this way, you can use exported_os as if it were os with a path submodule. Note that unlike import, from accepts not only modules but also attributes such as functions and classes.
Another approach
imports.py:
import os
import sys
ms = []
for m in sys.modules:
    if m.startswith('os'):
        ms.append(m)
for m in ms:
    sys.modules['imports.exported_os' + m[2:]] = sys.modules[m]
Alternatively, by explicitly extending sys.modules, you can use exported_os as if it were os, together with its submodules.
Why you cannot simply change the name of os
If you open .../lib/python3.9/os.py you can find the following line:
sys.modules['os.path'] = path
So even if you copy .../lib/python3.9/os.py to .../lib/python3.9/exported_os.py, the following does not work:
from exported_os.path import devnull
But if you change the line sys.modules['os.path'] to sys.modules['exported_os.path'] it works.
The error you are getting would be something like:
ModuleNotFoundError: No module named 'imports.exported_os'; 'imports' is not a package
When you code from imports import exported_os, then imports can refer to a module implemented by file imports.py. But when the name imports is part of a hierarchy as in from imports.exported_os.path import devnull, then imports must be a package implemented as a directory in a directory structure such as the following:
__init__.py
imports/
    __init__.py
    exported_os/
        __init__.py
        path.py
where the directory containing the top-most __init__.py must be on the sys.path search path.
So, unless you want to rearrange your directory structure to something like the above, the syntax (and selective importing) you want to use is really not available to you without getting into the internals of Python's module system.
Although this is not a solution to your wanting to be able to do a from ... import ... due to your unique versioning issue, let me suggest an alternate way of handling the versioning. In your situation you could do the following. Create a package, my_imports (give it any name you want):
my_imports/
    __init__.py
The contents of __init__.py are:
import some_library_version_n as some_library_version
Then in foo.py and in any other file that needs this module:
from my_imports import *
This is another method of putting the versioning dependency in one file. If you had other similar dependencies, you would, of course, add them to this file, and you could import from my_imports just the names you are interested in. You still have the issue that you are importing the entire module some_library_version.
However, we could take this one step further. Suppose the various versions of your library had components A, B and C that you might be interested in importing individually or all together. Then you could do the following. Let's instead name the package some_library_version, since it will only be dealing with this one versioning issue:
some_library_version/__init__.py
from some_library_version_n import A
from some_library_version_n import B
from some_library_version_n import C
foo.py
from some_library_version import A, C
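A self-contained sketch of this pattern, using types.ModuleType to fake the versioned library so the snippet runs anywhere (some_library_version_n and its components A, B, C are the hypothetical names from the question):

```python
import sys
import types

# Fake the versioned library so the example is runnable.
lib = types.ModuleType("some_library_version_n")
lib.A, lib.B, lib.C = "component A", "component B", "component C"
sys.modules["some_library_version_n"] = lib

# Stand-in for some_library_version/__init__.py: re-export the parts.
pkg = types.ModuleType("some_library_version")
exec(
    "from some_library_version_n import A\n"
    "from some_library_version_n import B\n"
    "from some_library_version_n import C\n",
    pkg.__dict__,
)
sys.modules["some_library_version"] = pkg

# Consumers now import only the indirection layer.
from some_library_version import A, C
print(A, C)  # component A component C
```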
Most of the answers here are accurate but don't explain why imports work this way. GyuHyeon explains it well; in short, import is a fancy file-include system that searches the standard library first, then the installed packages, and finally the context provided, where the context comes from the location of the calling file and from the from clause.
This example shows the various ways of importing a specific function, dirname(). The os library is just a folder; if os were in your working folder, the import path would be the same (or './os'). Python looks for os/__init__.py, so if your library doesn't have an __init__.py importing its subdirectories will have no effect.
from os.path import dirname as my_function  # (A_2)
from os import path as my_lib               # (B_2)
from os.path import dirname                 # (C_1)
from os import path                         # (B_1)
import os                                   # (A_1)

if __name__ == '__main__':
    print(os.path.dirname(__file__))  # (A_1)
    print(path.dirname(__file__))     # (B_1)
    print(dirname(__file__))          # (C_1)
    print(my_lib.dirname(__file__))   # (B_2)
    print(my_function(__file__))      # (A_2)
You could try to go with sys.path.append(...), e.g.:
import sys
sys.path.append(<your path to devnull goes here>)
Maybe not so nice, but you could use the pathlib library and path joins to construct the path (though some assumptions on file structure unfortunately have to be made if the files are in separate folder structures):
import sys
from pathlib import Path
from os import path

sys.path.append(path.join(str(Path(__file__).parents[<integer that tells how many folders to go up>]), <path to devnull>))
Instead of pathlib you could also use the dirname function from os.path.
After appending to the system path, you could just use:
import devnull
I am having a lot of trouble understanding the python module import system.
I am trying to create a simple folder structure as follows.
SomeModule/
    __init__.py
    AnotherModule/
        AnotherModule.py
        __init__.py
        Utils/
            Utils.py
            __init__.py
To use SomeModule I can do:
SomeModule.Foo()
Now inside AnotherModule.py I would like to import my Utils directory.
How come I have to do
import SomeModule.AnotherModule.Utils.Foo
why cannot I just do
import Utils.Foo
To shorten up the actual function name that you'll have to call in your code, you can always do:
from SomeModule.AnotherModule.Utils import *
While this still won't allow you to get away with a shorter import statement at the top of your script, you'll be able to access all of the functions within .Utils just by calling their function name (i.e. foo(x) instead of SomeModule.AnotherModule.Utils.foo(x)).
Part of the reason for the lengthy import statement goes to the comment from @wim. Have a look by typing import this in a Python interpreter.
put
import sys
import SomeModule.AnotherModule
sys.modules['AnotherModule'] = SomeModule.AnotherModule
in SomeModule's __init__.py
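A runnable sketch of that trick, with types.ModuleType standing in for the real SomeModule package (the greet function is a hypothetical placeholder):

```python
import sys
import types

# Hypothetical stand-ins for SomeModule and SomeModule.AnotherModule.
some_module = types.ModuleType("SomeModule")
another = types.ModuleType("SomeModule.AnotherModule")
another.greet = lambda: "hello"
sys.modules["SomeModule"] = some_module
sys.modules["SomeModule.AnotherModule"] = another

# The line from the answer: expose the submodule under a short name.
sys.modules["AnotherModule"] = sys.modules["SomeModule.AnotherModule"]

import AnotherModule
print(AnotherModule.greet())  # hello
```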
I am using someone else's project to add some functionality to mine, and there is a Python script I want to import. The problem comes with the import structure of their directories. I have placed their project directory in a subfolder under my main project (it needs to stay there so I can keep their project out of my version control). It looks like this:
myproject/
    myscript.py
    theirproject/
        __init__.py
        baz.py
        secondlayer/
            __init__.py
            all.py
            foo.py
            bar.py
all.py is simply a list of import statements which import additional scripts from the secondlayer directory like this:
from secondlayer.foo import *
from secondlayer.bar import * #etc
I would like to import:
from theirproject.secondlayer.all import *
but that fails when Python complains "no module named secondlayer.foo".
I have also tried the following:
from theirproject.secondlayer import all
I can get it to work when I place my script in theirproject/ and import all without the "theirproject" prefix, but I really can't have it be like that. I can get further through the import process by importing foo, bar, etc. individually like this:
from theirproject.secondlayer import foo
from theirproject.secondlayer import bar #etc
But then those scripts fail to import more stuff from still other scripts (like baz.py) at the same level as secondlayer, so I'm stuck.
What's the right way to do this in Python 2.7.6?
If you change
from secondlayer.foo import *
from secondlayer.bar import *
to use relative imports like this
from .foo import *
from .bar import *
or like this
from foo import *
from bar import *
it works.
Plus you could do these imports in the __init__.py at secondlayer level so that the import from myscript.py becomes
from theirproject.secondlayer.all import *
Check that you have the necessary permissions to import the package from your directory and its corresponding subdirectories.
For reference, you may like to see this and its corresponding linked questions:
Python Imports do not work
I ended up solving my problem by adding theirproject/ to my PYTHONPATH. I upvoted junnytony's answer - it helped point me in the right direction, so thanks!
I have made a package in the following structure:
test.py
package1/
    __init__.py
    module1.py
    module2.py
In the test.py file, with the code
from package1 import *
what I want it to do is to
from numpy import *
from module1 import *
from module2 import *
What should I write in __init__.py file to achieve this?
Currently in my __init__.py file I have
from numpy import *
__all__ = ['module1','module2']
and this doesn't give me what I wanted. This way numpy wasn't imported at all, and the modules are imported as
import module1
rather than
from module1 import *
If you want this, your __init__.py should contain just what you want:
from numpy import *
from module1 import *
from module2 import *
When you do from package import *, it imports all names defined in the package's __init__.py.
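A small self-contained demonstration of that mechanism, built in a temp directory (note that in Python 3 the intra-package import needs the relative form from .module1 import *):

```python
import os
import sys
import tempfile

# Build a throwaway package whose __init__.py does the importing.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "package1")
os.mkdir(pkg)
with open(os.path.join(pkg, "module1.py"), "w") as f:
    f.write("def helper():\n    return 42\n")
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from math import *\nfrom .module1 import *\n")
sys.path.insert(0, root)

# "from package1 import *" hands out exactly the names that
# __init__.py pulled in: math's names plus module1's helper.
ns = {}
exec("from package1 import *", ns)
print(round(ns["pi"], 2), ns["helper"]())  # 3.14 42
```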
Note that this could become awkward if there are name clashes among the modules you import. If you just want convenient access to the functions in those modules, I would suggest using instead something like:
import numpy as np
import module1 as m1
import module2 as m2
That is, import the modules (not their contents), but under shorter names. You can then still access numpy stuff with something like np.add, which adds only three characters of typing but guards against name clashes among different modules.
I second BrenBarn's suggestion, but be warned: importing everything into one single namespace using from x import * is generally a bad idea, unless you know for certain that there won't be any conflicting names.
I think it's still safer to use import package.module, though it does take extra keystrokes.