I have imports.py containing:
import os as exported_os
and foo.py containing:
from imports import exported_os
print(exported_os.path.devnull) # works
from imports.exported_os.path import devnull # doesn't
Is there a way to make the second import work? I tried adding __path__ to imports.py and fiddling with it but couldn't get anything.
Actual use case: os is some_library_version_n and exported_os is some_library_version (I'm trying to avoid having many instances of some_library_version_n across different files).
One approach
Directory structure:
__init__.py
foo.py
imports/
├ __init__.py
└ exported_os/
  ├ __init__.py
  └ path.py
imports/exported_os/__init__.py:
from . import path
from os import * # not necessary for the question
# but it makes `exported_os` more like `os`
# e.g., `exported_os.listdir` will be callable
imports/exported_os/path.py:
from os.path import *
In this way, you can use exported_os as if it were os with a submodule path. This works because, unlike import, from can bring in both modules and ordinary names (classes, functions, constants).
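Under that layout, the import from the question should work directly. A minimal usage sketch for foo.py (assuming foo.py sits next to the imports/ package, on the search path):

```python
# foo.py – usage sketch for the package layout above
from imports.exported_os.path import devnull   # resolves via imports/exported_os/path.py
from imports import exported_os                # still works as before

print(devnull)
print(exported_os.listdir('.'))  # available because of `from os import *` in __init__.py
```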
Another approach
imports.py:
import os
import sys

ms = []
for m in sys.modules:
    if m.startswith('os'):
        ms.append(m)
for m in ms:
    sys.modules['imports.exported_os' + m[2:]] = sys.modules[m]
In other words, by explicitly extending sys.modules you can use exported_os as if it were os, together with its submodules.
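With those aliases registered, the import from the question should resolve: the from statement first imports imports (which runs the loop above), and the import machinery then finds the dotted names already present in sys.modules. A small usage sketch:

```python
# foo.py – usage sketch for the sys.modules approach above
from imports.exported_os.path import devnull  # importing `imports` registers the
                                              # aliases as a side effect, so this resolves
print(devnull)
```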
Why you cannot simply change the name of os
If you open .../lib/python3.9/os.py you can find the following line:
sys.modules['os.path'] = path
So even if you copy .../lib/python3.9/os.py to .../lib/python3.9/exported_os.py, the following does not work:
from exported_os.path import devnull
But if you change that line to sys.modules['exported_os.path'] = path, it works.
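The same trick works without copying or editing the standard library at all, by registering the aliases in sys.modules yourself. A minimal sketch (the name exported_os is taken from the question):

```python
# register_alias.py – sketch: alias the real os / os.path under a new top-level name
import os
import os.path
import sys

sys.modules['exported_os'] = os
sys.modules['exported_os.path'] = os.path

# now this form works too
from exported_os.path import devnull
print(devnull)
```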
The error you are getting would be something like:
ModuleNotFoundError: No module named 'imports.exported_os'; 'imports' is not a package
When you write from imports import exported_os, imports can refer to a module implemented by the file imports.py. But when the name imports is part of a dotted hierarchy, as in from imports.exported_os.path import devnull, imports must be a package, implemented as a directory in a directory structure such as the following:
__init__.py
imports/
├ __init__.py
└ exported_os/
  ├ __init__.py
  └ path.py
where the directory containing the top-most __init__.py must be on the sys.path search path.
So, unless you want to rearrange your directory structure to something like the above, the syntax (and selective importing) you want to use is really not available to you without getting into the internals of Python's module system.
Although this does not give you the from ... import ... syntax you wanted for your versioning issue, let me suggest an alternative way of handling the versioning. In your situation you could do the following. Create a package, my_imports (give it any name you want):
my_imports/
└ __init__.py
The contents of __init__.py are:
import some_library_version_n as some_library_version
Then in foo.py and in any other file that needs this module:
from my_imports import *
This is another method of putting the versioning dependency in one file. If you had other similar dependencies, you would, of course, add them to this file, and you could import from my_imports just the names you are interested in. You still have the issue that you are importing the entire module some_library_version.
However, we could take this one step further. Suppose the various versions of your library had components A, B and C that you might be interested in importing individually or all together. Then you could do the following. Let's instead name the package some_library_version, since it will only be dealing with this one versioning issue:
some_library_version/__init__.py:
from some_library_version_n import A
from some_library_version_n import B
from some_library_version_n import C
foo.py
from some_library_version import A, C
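The payoff is that a version bump only touches that one file. A sketch (the module name some_library_version_m for the newer version is hypothetical):

```python
# some_library_version/__init__.py – switching every consumer to a newer
# library version means editing only this file
from some_library_version_m import A  # hypothetical name of the newer version
from some_library_version_m import B
from some_library_version_m import C
```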
Most of the answers here are accurate but don't add context about why it works that way. GyuHyeon explains it well, but it boils down to this: import is a fancy file-include system that first checks the standard libraries, then the installed packages, and finally the context provided by where the import is called and by the from clause given.
This example shows the various ways of importing a specific function, dirname(). The os library is just a folder; if you imagine os sitting in your working folder, the import path would be the same (i.e. './os'). Because everything in Python is an object, import will look for ./os/__init__.py, so if your library doesn't have one that imports the subdirectories, importing them will have no effect.
from os.path import dirname as my_function  # (A_2)
from os import path as my_lib               # (B_2)
from os.path import dirname                 # (C_1)
from os import path                         # (B_1)
import os                                   # (A_1)

if __name__ == '__main__':
    print(os.path.dirname(__file__))  # (A_1)
    print(path.dirname(__file__))     # (B_1)
    print(dirname(__file__))          # (C_1)
    print(my_lib.dirname(__file__))   # (B_2)
    print(my_function(__file__))      # (A_2)
You could try to go with sys.path.append(...), e.g.:
import sys
sys.path.append(<your path to devnull goes here>)
Maybe not so nice, but you could use the pathlib library and path joins to construct the path (though some assumptions about the file structure unfortunately have to be made if the files are in separate folder structures):
import sys
from pathlib import Path
from os import path

sys.path.append(path.join(str(Path(__file__).parents[<integer that tells how many folders to go up>]), <path to devnull>))
Instead of pathlib you could also use the dirname function from os.path.
After appending to the system path, you could just use:
import devnull
Related
I'm struggling with something that feels like it should be simple.
My current directory looks like this:
root/
├─ __init__.py (tried with it and without)
├─ file_with_class.py
└─ tests_folder/
   ├─ __init__.py (tried with it and without)
   └─ unittest_for_class.py
unittest_for_class.py needs to import the class from file_with_class.py to test it. I tried to import it in various ways I found online, but I just keep getting errors like:
(the class name is the same as the file name; let's say it's called file_with_class)
File "tests_folder/unittest_for_class.py", line 3, in <module>
from ..file_with_class import file_with_class
ValueError: Attempted relative import in non-package
File "tests_folder/unittest_for_class.py", line 3, in <module>
from file_with_class import file_with_class
ImportError: No module named file_with_class
and others.
What is the correct way to import a class from a .py file that is in the parent folder?
As a short explanation
from ..parent import * works if your program started at the parent level.
You import submodules, which can have cross-relations to other submodules or files in the package; these relations are resolved relative to the package structure, not the OS directory structure.
Option 1
You actually enter via a script in your parent folder and import the file you mentioned as a submodule. This is the nicest, cleanest and intended way, but then your file is not standalone.
Option 2 - Add the parent directory to your path
import sys

sys.path.append('/path/to/parent')
import parent
This is a little bit dirty, as you now have an extra path for your imports, but it is still one of the easiest options without much trickery.
Further Options and theory
There are quite a few posts here covering this topic; the question on relative imports in particular covers quite a few good definitions and concepts in its answers.
Option 3 - Deprecated and not future-proof: importlib.find_loader
import os
import importlib

current = os.getcwd()                     # remember where we were, for rollback
os.chdir("..")                            # change to the parent (or any arbitrary) path
loader = importlib.find_loader("parent")  # look up the module by name
assert loader
parent = loader.load_module()             # now this is your module
assert parent
os.chdir(current)                         # change back to the original working directory
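If you want to avoid the deprecated find_loader/load_module pair, a rough equivalent with the non-deprecated importlib.util API would look like this (a sketch, not part of the original option; the file name parent.py and its location are assumed):

```python
import importlib.util
from pathlib import Path

# load a module directly from a file path, without touching sys.path or os.chdir
parent_path = Path(__file__).resolve().parent.parent / "parent.py"  # assumed location
spec = importlib.util.spec_from_file_location("parent", parent_path)
parent = importlib.util.module_from_spec(spec)
spec.loader.exec_module(parent)
```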
(Half of an) Option 4
When working with an IDE this might work; Spyder allows the following code. The standard Python console does NOT.
import os
current = os.getcwd()
os.chdir("..")
import parent
os.chdir(current)
Following up on Daraan's answer:
You import submodules which can have cross relations to other submodules or files in the package -> They are only relative inside a package not the os structure.
I've written an experimental new import library, ultraimport, which allows you to do just that: relative imports from anywhere in the file system. It gives you more control over your imports.
You could then write in your unittest_for_class.py:
import ultraimport
MyClass = ultraimport("__dir__/../file_with_class.py", "MyClass")
# or to import the whole module
file_with_class = ultraimport("__dir__/../file_with_class.py")
The advantage is that this will always work, independent of sys.path, no matter how you run your script, and regardless of all the other things that were mentioned.
You can add the parent folder to the search path with sys.path.append() like so:
import sys
sys.path.append('/path/to/parentdir')
from file_with_class import file_with_class
...
See also the tutorial on how Python modules and packages are handled.
Just keep from file_with_class import file_with_class. Then run python -m tests_folder.unittest_for_class from the root directory. This runs the script as a module of the package.
I am developing a Blender addon which has several submodules:
A/
├ __init__.py
└ B/
  └ __init__.py
In A/__init__.py I can do import A.B to import the content of submodule B, but I would like to import these files automatically. Is there any way to achieve that?
The idea is that my addon is used to implement several glTF extensions. Each extension is in its own submodule and the glTF exporter expects me to return some classes, one for each extension.
Instead of manually importing each extension and adding its class to the list of extensions, I want that to happen automatically.
instead of
from A.B import B_extension
glTF2ExportUserExtensions = [A.B.B_extension]
I want something like
# A.submodules returns all the submodules and extension() returns the extension class.
# I'm assuming each submodule has an extension() function.
glTF2ExportUserExtensions = [A.submodules.extension()]
I need to return a list of classes
You can import things from a central file that holds the imports.
Example of ./import_file.py
import example
import ...
You can use "*" to import everything from the import file, or you can import something specific.
Example of ./main_file.py
from import_file import *
from import_file import example
Ok, I guess I understand what you want to do.
So to import several submodules under the same "submodule-name" submodules, import them in a separate file submodules.py at the same level as A/__init__.py:
from .B import B_extension as extension
from .C import C_extension as someothername
And in A/__init__.py add:
from . import submodules
Now if you import A with
import A
you can access the submodules with A.submodules.extension, A.submodules.someothername, etc...
If you want to access the submodules functions/classes/etc. directly from submodules, such as A.submodules.extension(), your submodules.py files has to look like:
from .B.B_extension import extension, anotherBmethod, SomeBClass
from .C.C_extension import Cextension, anotherCmethod, SomeCClass
If you want to have a fully automatic import, use pkgutil (credit goes to this answer), even though I strongly object to importing all submodules automatically. Explicit is better than implicit: you never know what happens when you change a single line in a submodule if all the imports happen implicitly and untested... Add this to A/__init__.py:
import pkgutil

__all__ = []
for loader, module_name, is_pkg in pkgutil.walk_packages(__path__):
    __all__.append(module_name)
    _module = loader.find_module(module_name).load_module(module_name)
    globals()[module_name] = _module
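With that in place, the extension classes from the question could then be collected in one pass. A sketch, assuming (as the question does) that every submodule defines an extension() function:

```python
import A

# collect one extension class per automatically imported submodule
glTF2ExportUserExtensions = [getattr(A, name).extension() for name in A.__all__]
```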
I have code in one folder, and want to import code in an adjacent folder like this:
I am trying to import a Python file in innerLayer2 into a file in innerLayer1:
outerLayer/
├─ innerLayer1/
│  └─ main.py
└─ innerLayer2/
   └─ functions.py
I created the following function to solve my problem, but there must be an easier way. It also only works on Windows, and I need it to work on both Linux and Windows.
# main.py
import sys

def goBackToFile(layerBackName, otherFile):
    for path in sys.path:
        titles = path.split('\\')
        for index, name in enumerate(titles):
            if name == layerBackName:
                finalPath = '\\'.join(titles[:index + 1])
                return finalPath + '\\' + otherFile if otherFile != False else finalPath

sys.path.append(goBackToFile('outerLayer', 'innerLayer2'))
import functions
Is there an easier method which will work on all operating systems?
Edit: I know the easiest method is to put innerLayer2 inside of innerLayer1 but I cannot do that in this scenario. The files have to be adjacent.
Edit: After analysing the answers this has received, I have discovered the easiest method and have posted it as an answer below. Thank you for your help.
Use . and .. to address modules within the package structure, as specified by PEP 328 et al.
Suppose you have the following structure:
proj/
├ script.py        # supposed to be installed in the bin folder
└ mypackage/       # supposed to be installed in the sitelib folder
  ├ __init__.py    # defines default exports, if any
  ├ Inner1/
  │ ├ __init__.py  # defines default exports from Inner1, if any
  │ └ main.py
  └ Inner2/
    ├ __init__.py  # defines default exports from Inner2, if any
    └ functions.py
Inner1/main.py should contain an import statement like this:
from ..Inner2 import functions
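Relative imports like this only resolve when main.py runs as part of the package, so the entry point should go through the package, e.g. via script.py from the layout above. A sketch (run() is a hypothetical entry function assumed to exist in main.py):

```python
# script.py – sketch of the bin-folder entry point; it imports main.py as a
# package member, which makes the relative import inside main.py work
from mypackage.Inner1 import main

main.run()  # hypothetical entry point defined in main.py
```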
If you have to use the current directory design, I would suggest using a combination of sys and os to simplify your code:
import sys, os
sys.path.insert(1, os.path.join(sys.path[0], '..'))
from innerLayer2 import functions
After analysing the answers I have received, I have discovered the easiest solution: simply use this syntax to add the outerLayer directory to sys.path, then import functions from innerLayer2:
# main.py
import sys
sys.path.append('..') # adds outerLayer to the sys.path (one layer up)
from innerLayer2 import functions
The easiest way is:
Move the innerLayer2 folder to inside the innerLayer1 folder
Add an empty file named __init__.py inside innerLayer2
In main.py use the following:
import innerLayer2.functions as innerLayer2
# Eg of usage:
# innerLayer2.sum(1, 2)
For Python 3, in this sandbox, foo has a function print_bar() that I want to use in script1.
myApp/
└─ main_folder/
   ├─ helper/
   │  ├─ __init__.py
   │  └─ foo.py
   └─ scripts/
      └─ script1.py
In script1:
from ..helpers import foo
foo.print_bar()
I am met with "ValueError: attempted relative import beyond top-level package".
Fine, I'll try the sys.path approach I see in other SO questions.
import os
import sys
# Add parent folder path to sys.path
cur_path = os.path.dirname(os.path.realpath(__file__))
parent_path = os.path.dirname(cur_path)
sys.path.append(parent_path)
from helpers import foo
foo.print_bar()
Okay, this works, but it is a bit convoluted. Is there not a simpler way to say, "go up a directory, then import x"?
It seems like a relative import along the lines of from ..helpers import foo is what should work, but am I not getting it right, or am I not using it for its intended purpose?
The following methods can be used to include the parent folder in the Python path:
import sys, os
sys.path.append('..') # method 1
sys.path.append(os.path.abspath('..')) # method 2
sys.path.append(os.pardir) # method 3
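Note that all three append a path relative to the current working directory, not to the file itself. A variant anchored to the script's own location (a sketch, not part of the original list; the package name helper comes from the question's layout):

```python
import os
import sys

# method 4 (sketch): anchor the parent folder to this file's location,
# so it works no matter where the script is launched from
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))

from helper import foo
foo.print_bar()
```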
I want to import main_file.py from sub_file.py; I have my __init__.py files set up. In main_file.py I am able to do:
from sub_folder.sub_file import *
However, I do not know how to do it the other way around.
This is my structure:
|+main_folder
|--_init_.py
|--main_file.py
|++sub_folder
|---_init_.py
|---sub_file.py
The old way:
Make sure the directory with main_folder is on your sys.path;
from main_folder import main_file.
The new, usually better way:
from ..main_folder import main_file
This has the advantage of never clashing with system imports. If you rename your main_folder to e.g. math, then from math import my_func will either crash (because stdlib's math does not have this function) or import from your module, depending on sys.path. On the other hand, from ..math import my_func will always import from your own module.
If in doubt, always print sys.path before your failing import statement to understand if you're actually looking at the right directories.
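A minimal debugging sketch for that last tip:

```python
import sys

# print the import search path right before the failing import,
# one entry per line, to check which directories are actually searched
print('\n'.join(sys.path))
```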