How to import top-level files in Python using __init__

I want to import main_file.py from sub_file.py; I have my __init__.py files set up. In my main_file.py I am able to do:
from sub_folder.sub_file import *
However, I do not know how to do it the other way around.
This is my structure:
main_folder/
    __init__.py
    main_file.py
    sub_folder/
        __init__.py
        sub_file.py

The old way:
Make sure the directory with main_folder is on your sys.path;
from main_folder import main_file.
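For example, a hedged sketch of this "old way" from inside sub_file.py (the path juggling assumes exactly the structure shown above):
# sub_folder/sub_file.py -- "old way": put the directory that contains main_folder on sys.path
import os
import sys
sys.path.append(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))))
from main_folder import main_file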
The new, usually better way:
from .. import main_file
This has the advantage of never clashing with system imports. If you rename main_folder to e.g. math, the absolute form from math import main_file may pick up the stdlib math module (and fail, since it has no such attribute) or your own package, depending on sys.path. The relative form from .. import main_file always refers to your own package.
If in doubt, always print sys.path before your failing import statement to understand if you're actually looking at the right directories.
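As a minimal sketch of the relative-import version (greet() is a made-up name, standing in for whatever main_file.py actually defines):
# sub_folder/sub_file.py -- "new way": relative import
from .. import main_file   # ".." is the main_folder package

def run():
    main_file.greet()       # greet() is assumed to exist in main_file.py

if __name__ == '__main__':
    run()
Run it as a module from the directory that contains main_folder, e.g. python -m main_folder.sub_folder.sub_file, so the package hierarchy is in place and the relative import can resolve.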

Related

Structuring python packages for straightforward importing

I have a number of functions that I have written myself over time, and I have placed each of these functions in its own module (as I thought this was best practice).
I would like to take the next step and organise these modules so that I can import them in fewer lines of code. However, I'm finding that I need a lot of 'repeated' lines of code to import the functions I want to have access to.
my_functions/
    src/
        __init__.py
        func1.py
        func2.py
        func3.py
I have followed this tutorial to build the collection of modules into a package, thinking that I could use something like
import my_functions as mf
but instead I have to import each module:
import func1 as fu1
import func2 as fu2
...
a = fu1.func1()
My question is: what do I have to do to be able to import my functions as a package, the same way that I would import something like pandas?
import my_functions as mf
a = mf.func1(arg)
I'm finding it difficult to find either a tutorial or a clear, simple example of how to structure this, so any guidance at all would be useful. If it's just not doable without building something as complex as pandas or numpy, that's OK too; I just thought I'd give it one last shot.
Create an __init__.py file in both the my_functions directory and the src directory.
Then put the following in the my_functions __init__.py; there is no need to edit the __init__.py inside src.
from .src.func1 import func1
from .src.func2 import func2
__all__ = [
    'func1',
    'func2',
]
Now you can use my_functions like below
import my_functions
my_functions.func1()
__all__ is consulted when you do from my_functions import *: it lists the names the package re-exports, and those are the same names you can reach with dot notation as shown above.
To run a Python script from the project root, invoke it as a module using dot notation, as below:
python3 -m <some_directory>.<file>
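For completeness, a minimal func1.py under that layout might look like this (the body is just a placeholder for illustration):
# src/func1.py
def func1(arg):
    # placeholder implementation
    return arg
With that in place, import my_functions followed by my_functions.func1('x') works as shown above.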

Automatically import files in python

I am developing a Blender addon which has several submodules:
A/
    __init__.py
    B/
        __init__.py
In A/__init__.py I can do import A.B to import the content of submodule B, but I would like to import these files automatically. Is there any way to achieve that?
The idea is that my addon is used to implement several glTF extensions. Each extension is in its own submodule and the glTF exporter expects me to return some classes, one for each extension.
Instead of importing manually each extension and adding the class to the list of extensions, I want that to happen automatically
instead of
from A.B import B_extension
glTF2ExportUserExtensions = [B_extension]
I want something like
# A.submodules returns all the submodules and extension() returns the extension class.
# I'm assuming each submodule has an extension() function.
glTF2ExportUserExtensions = [A.submodules.extension()]
I need to return a list of classes
You can put all the imports in one central file and then import from that file.
Example of ./import_file.py
import example
import ...
You can use "*" to import everything from the import file, or you can import something specific.
Example of ./main_file.py
from import_file import *
from import_file import example
Ok, I guess I understand what you want to do.
So to expose several submodules under a single name, submodules, import them in a separate file submodules.py at the same level as A/__init__.py:
from .B import B_extension as extension
from .C import C_extension as someothername
And in A/__init__.py add:
from . import submodules
Now if you import A with
import A
you can access the submodules with A.submodules.extension, A.submodules.someothername, etc...
If you want to access the submodules' functions/classes/etc. directly from submodules, such as A.submodules.extension(), your submodules.py file has to look like:
from .B.B_extension import extension, anotherBmethod, SomeBClass
from .C.C_extension import Cextension, anotherCmethod, SomeCClass
If you want a fully automatic import, use pkgutil (credit goes to this answer), even though I strongly object to importing all submodules automatically. Explicit is better than implicit. You never know what will happen when you change a single line in a submodule if everything is imported implicitly and the imports aren't all tested... Add this to A/__init__.py:
import pkgutil
__all__ = []
for loader, module_name, is_pkg in pkgutil.walk_packages(__path__):
    __all__.append(module_name)
    _module = loader.find_module(module_name).load_module(module_name)
    globals()[module_name] = _module
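If you go that route, collecting the extension classes afterwards could look like the sketch below, assuming (as the question does) that every submodule exposes an extension() function returning its extension class; that function name is the question's convention, not part of the glTF exporter API:
# A/__init__.py, continued: one extension class per auto-imported submodule
glTF2ExportUserExtensions = [
    globals()[module_name].extension() for module_name in __all__
]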

Python import from submodule of package exported by another file

I have imports.py containing:
import os as exported_os
and foo.py containing:
from imports import exported_os
print(exported_os.path.devnull) # works
from imports.exported_os.path import devnull # doesn't
Is there a way to make the second import work? I tried adding __path__ to imports.py and fiddling with it but couldn't get anything.
Actual usecase: os is some_library_version_n and exported_os is some_library_version (I'm trying to avoid having many instances of some_library_version_n across different files).
One approach
Directory structure:
__init__.py
foo.py
imports/
├ __init__.py
└ exported_os/
├ __init__.py
└ path.py
imports/exported_os/__init__.py:
from . import path
from os import * # not necessary for the question
# but it makes `exported_os` more like `os`
# e.g., `exported_os.listdir` will be callable
imports/exported_os/path.py:
from os.path import *
In this way, you can use exported_os as if it were os with a submodule path. Unlike import, the from form can pull in submodules as well as classes and other attributes.
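With that layout in place, both imports from the question work; a quick check in foo.py:
# foo.py
from imports import exported_os
print(exported_os.path.devnull)               # works as before
from imports.exported_os.path import devnull  # now works too
print(devnull)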
Another approach
imports.py:
import os
import sys
ms = []
for m in sys.modules:
    if m == 'os' or m.startswith('os.'):  # os and its submodules
        ms.append(m)
for m in ms:
    # register e.g. 'imports.exported_os.path' as an alias for 'os.path'
    sys.modules['imports.exported_os' + m[2:]] = sys.modules[m]
That is, by explicitly extending sys.modules you can use exported_os as if it were os, together with its submodules.
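A usage sketch for this second approach; imports has to be imported so its registration code runs:
# foo.py
import imports  # executing imports.py fills in the sys.modules aliases
from imports.exported_os.path import devnull
print(devnull)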
Why you cannot simply change the name of os
If you open .../lib/python3.9/os.py you can find the following line:
sys.modules['os.path'] = path
So even if you copy .../lib/python3.9/os.py to .../lib/python3.9/exported_os.py, the following does not work:
from exported_os.path import devnull
But if you change the line sys.modules['os.path'] to sys.modules['exported_os.path'] it works.
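You can see that registration directly in the interactive interpreter:
import os, sys
print(sys.modules['os.path'] is os.path)  # True: os.py registered the alias itself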
The error you are getting would be something like:
ModuleNotFoundError: No module named 'imports.exported_os'; 'imports' is not a package
When you code from imports import exported_os, then imports can refer to a module implemented by file imports.py. But when the name imports is part of a hierarchy as in from imports.exported_os.path import devnull, then imports must be a package implemented as a directory in a directory structure such as the following:
__init__.py
imports
__init__.py
exported_os
__init__.py
path.py
where the directory containing the top-most __init__.py must be on the sys.path search path.
So, unless you want to rearrange your directory structure to something like the above, the syntax (and selective importing) you want to use is really not available to you without getting into the internals of Python's module system.
Although this is not a solution to your wanting to do a from ... import ... for your unique versioning issue, let me suggest an alternate way of handling the versioning. In your situation you could do the following. Create a package, my_imports (give it any name you want):
my_imports/
    __init__.py
The contents of __init__.py is:
import some_library_version_n as some_library_version
Then in foo.py and in any other file that needs this module:
from my_imports import *
This is another method of putting the versioning dependency in one file. If you had other similar dependencies, you would, of course, add them to this file, and you could import from my_imports just the names you are interested in. You still have the issue that you are importing the entire module some_library_version.
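A usage sketch under that layout (some_call() is a made-up attribute, purely for illustration):
# foo.py
from my_imports import *          # brings the name some_library_version into scope
some_library_version.some_call()  # some_call() is hypothetical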
However, we could take this one step further. Suppose the various versions of your library had components A, B and C that you might be interested in importing individually or all together. Then you could do the following. Let's instead name the package some_library_version, since it will only be dealing with this one versioning issue:
some_library_version/__init__.py
from some_library_version_n import A
from some_library_version_n import B
from some_library_version_n import C
foo.py
from some_library_version import A, C
Most of the answers here are accurate but don't add context on why it works that way. GyuHyeon explains it well, but it boils down to this: import is a fancy file-include system that looks first in the standard library, then in the installed packages, and finally in the context provided; the context comes from where the import is called and from the from clause given.
This example shows the various ways of importing a specific function, dirname(). The lib os is just a folder; if you imagine that os sat in your working folder, the import path would look the same, or './os'. Since a package is a directory, import looks for ./os/__init__.py, so if your library doesn't have one, importing its subdirectories will have no effect.
from os.path import dirname as my_function  # (A_2)
from os import path as my_lib               # (B_2)
from os.path import dirname                 # (C_1)
from os import path                         # (B_1)
import os                                   # (A_1)

if __name__ == '__main__':
    print(os.path.dirname(__file__))   # (A_1)
    print(path.dirname(__file__))      # (B_1)
    print(dirname(__file__))           # (C_1)
    print(my_lib.dirname(__file__))    # (B_2)
    print(my_function(__file__))       # (A_2)
You could try to go with sys.path.append(...), e.g.:
import sys
sys.path.append(<your path to devnull goes here>)
Maybe not so nice, but you could use the pathlib library and path joins to construct the path (some assumptions about the file structure unfortunately have to be made if the files are in separate folder structures):
import sys
from pathlib import Path
from os import path
sys.path.append(path.join(str(Path(__file__).parents[<integer that tells how many folders to go up>]), <path to devnull>))
Instead of pathlib you could also use the dirname function from os.path.
After appending to the system path, you could just use:
import devnull

Why can't I import a sub module?

My project structure looks like this:
/project
main.py
/a_module
__init__.py
/sub_module
__init__.py
some_file.py
main.py
from a_module import main_api
a_module/__init__.py
from sub_module import sub_api
sub_module/__init__.py
from some_file import detail_api
In a_module/__init__.py this gives an Unable to import 'sub_module' error.
Why I cannot import 'sub_module'?
When I change to a relative import, the error goes away:
from .sub_module import sub_api
But I don't understand: isn't __init__.py designed to expose the public API of the module? Why isn't sub_module treated as a module instead of a directory? It seems like bad design to me...
__init__.py is executed when you import the package that contains it. But it's not your problem. Your problem is that module imports are always absolute unless explicitly relative. That means that they must chain from some directory in sys.path. By default this includes the working directory, so when you run main.py from within project, it can find a_module, and nothing else.
from sub_module import sub_api
In a_module/__init__.py doesn't work though, because imports are always absolute unless explicitly relative. So that import says "starting from some sys.path root, find a top level package named sub_module and import sub_api from it". Since no such module exists you get an error. from .sub_module import sub_api works because you opted into relative imports, so it doesn't start over from sys.path.
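Concretely, the fixed files from the question would look like this (the same fix applies one level down, since from some_file import detail_api has the same problem):
# a_module/__init__.py
from .sub_module import sub_api

# sub_module/__init__.py
from .some_file import detail_api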
For an example of why you would want this, I'll give you something that broke in our own code back in the Python 2 days, before absolute imports by default were the law. (from __future__ import absolute_import enabled the Py3 behavior, which is how we fixed it; despite what the docs say, it was never enabled by default in Py2, only implicit relative imports were.) Our layout was:
teamnamespace/
module.py
math/
mathrelatedsubmodule.py
othermathsubmodule.py
Now, we innocently thought hey, we'll put all our packages under a single shared top level namespace, and subpackages cover broad categories within them, and since we had a lot of additional utilities for basic mathematics, we put them under teamnamespace.math. Problem was, for the non-math modules, like teamnamespace.module, when they did:
import math # or
from math import ceil
it defaulted to relative lookup and imported teamnamespace.math as math (a thoroughly useless import, since it was a namespace package only; all the functionality was in the submodules), not the built-in math module. In fact, without the Python 3 behavior, there was no reasonable way to get the built-in math module from a module under teamnamespace. With the Python 3 behavior, you can get either one or both (by aliasing one or the other with as), with no ambiguity:
# Gets built-in
import math
# Gets teamnamespace.math
from . import math

Python: Is there a place where I can put default imports for all my modules?

Is there a place where I can put default imports for all my modules?
If you want default imports when using the Python shell, you can also set the PYTHONSTARTUP environment variable to point to a Python file that will be executed whenever you start the shell. Put all your default imports in this file.
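For example (the file path is arbitrary):
# ~/.python_startup.py -- executed every time the interactive shell starts
import os
import sys
import json
Then point the environment variable at it in your shell profile, e.g. export PYTHONSTARTUP=~/.python_startup.py.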
Yes, just create a separate module and import it into yours.
Example:
# my_imports.py
'''Here go all of my imports'''
import sys
import functools
from contextlib import contextmanager # This is a long name, no chance to confuse it.
....
# something1.py
'''One of my project files.'''
from my_imports import *
....
# something2.py
'''Another project file.'''
from my_imports import *
....
Note that according to standard guidelines, from module import * should be avoided. If you're managing a small project with several files that need common imports, I think you'll be fine with from module import *, but it still would be a better idea to refactor your code so that different files need different imports.
So do it like this:
# something1.py
'''One of my project files. Takes care of main cycle.'''
import sys
....
# something2.py
'''Another project file. Main program logic.'''
import functools
from contextlib import contextmanager # This is a long name, no chance to confuse it.
....
