I am developing a Blender addon which has several submodules:
A/
├ __init__.py
└ B/
  └ __init__.py
In A/__init__.py I can do import A.B to import the contents of submodule B, but I would like these files to be imported automatically. Is there any way to achieve that?
The idea is that my addon is used to implement several glTF extensions. Each extension is in its own submodule and the glTF exporter expects me to return some classes, one for each extension.
Instead of importing each extension manually and adding its class to the list of extensions, I want this to happen automatically.
So instead of
from A.B import B_extension
glTF2ExportUserExtensions = [B_extension]
I want something like
# A.submodules returns all the submodules, and extension() returns the extension class.
# I'm assuming each submodule has an extension() function.
glTF2ExportUserExtensions = [A.submodules.extension()]
I need to return a list of classes
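For concreteness, here is a rough sketch of what I imagine (assuming, as above, that each submodule defines an extension() function returning its class; the hasattr guard is just illustrative):
# A/__init__.py -- sketch, assumes each submodule defines extension()
import importlib
import pkgutil

glTF2ExportUserExtensions = []
for _, name, _ in pkgutil.iter_modules(__path__):
    module = importlib.import_module("." + name, __name__)
    if hasattr(module, "extension"):  # skip submodules without an extension
        glTF2ExportUserExtensions.append(module.extension())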
You can import things from a central file that holds all the imports.
Example of ./import_file.py
import example
import ...
You can use "*" to import everything from the import file, or you can import something specific.
Example of ./main_file.py
from import_file import *
from import_file import example
Ok, I guess I understand what you want to do.
So, to import several submodules under the common name submodules, import them in a separate file submodules.py at the same level as A/__init__.py:
from .B import B_extension as extension
from .C import C_extension as someothername
And in A/__init__.py add:
from . import submodules
Now if you import A with
import A
you can access the submodules with A.submodules.extension, A.submodules.someothername, etc...
If you want to access the submodules' functions/classes/etc. directly from submodules, such as A.submodules.extension(), your submodules.py file has to look like:
from .B.B_extension import extension, anotherBmethod, SomeBClass
from .C.C_extension import Cextension, anotherCmethod, SomeCClass
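Tying this back to the glTF use case, the export hook can then be built explicitly from those re-exported names (a sketch using the names from the first submodules.py variant above):
# A/__init__.py
from . import submodules

glTF2ExportUserExtensions = [submodules.extension, submodules.someothername]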
If you want a fully automatic import, use pkgutil (credit goes to this answer), even though I strongly object to importing all submodules automatically. Explicit is better than implicit: you never know what happens when you change a single line in a submodule, without testing all imports, when doing it implicitly. Add this to A/__init__.py:
import pkgutil

__all__ = []
for loader, module_name, is_pkg in pkgutil.walk_packages(__path__):
    __all__.append(module_name)
    _module = loader.find_module(module_name).load_module(module_name)
    globals()[module_name] = _module
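Note that loader.find_module()/load_module() are deprecated (and removed in Python 3.12); on current Pythons the same idea can be written with importlib, e.g.:
import importlib
import pkgutil

__all__ = []
for _, module_name, _ in pkgutil.walk_packages(__path__):
    __all__.append(module_name)
    # import the submodule relative to this package and publish it here
    globals()[module_name] = importlib.import_module("." + module_name, __name__)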
I have imports.py containing:
import os as exported_os
and foo.py containing:
from imports import exported_os
print(exported_os.path.devnull) # works
from imports.exported_os.path import devnull # doesn't
Is there a way to make the second import work? I tried adding __path__ to imports.py and fiddling with it but couldn't get anything.
Actual use case: os is some_library_version_n and exported_os is some_library_version (I'm trying to avoid having many instances of some_library_version_n across different files).
One approach
Directory structure:
__init__.py
foo.py
imports/
├ __init__.py
└ exported_os/
├ __init__.py
└ path.py
imports/exported_os/__init__.py:
from . import path
from os import *  # not necessary for the question,
                  # but it makes `exported_os` more like `os`;
                  # e.g., `exported_os.listdir` will be callable
imports/exported_os/path.py:
from os.path import *
In this way, you can use exported_os as if it were os with a submodule path. Unlike import, from accepts modules, classes, and other names.
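With that structure in place, both imports from the question work (a quick check):
# foo.py
from imports import exported_os
print(exported_os.path.devnull)               # works as before
from imports.exported_os.path import devnull  # now works too
print(devnull)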
Another approach
imports.py:
import os
import sys

ms = []
for m in sys.modules:
    if m == 'os' or m.startswith('os.'):  # match os and its submodules only
        ms.append(m)
for m in ms:
    sys.modules['imports.exported_os' + m[2:]] = sys.modules[m]
By explicitly extending sys.modules like this, you can use exported_os as if it were os, with its submodules.
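A quick check that the aliasing does what the question asks (foo.py from the question):
# foo.py
import imports  # executing imports.py fills in the sys.modules aliases
from imports.exported_os.path import devnull  # resolves via sys.modules
print(devnull)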
Why you cannot simply change the name of os
If you open .../lib/python3.9/os.py you can find the following line:
sys.modules['os.path'] = path
So even if you copy .../lib/python3.9/os.py to .../lib/python3.9/exported_os.py, the following does not work:
from exported_os.path import devnull
But if you change the line sys.modules['os.path'] to sys.modules['exported_os.path'] it works.
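You can see this registration directly (a quick check):
import os
import sys

# os.py registered its submodule under the dotted name at import time
print(sys.modules['os.path'] is os.path)  # True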
The error you are getting would be something like:
ModuleNotFoundError: No module named 'imports.exported_os'; 'imports' is not a package
When you write from imports import exported_os, imports can refer to a module implemented by the file imports.py. But when the name imports is part of a hierarchy, as in from imports.exported_os.path import devnull, then imports must be a package implemented as a directory, in a directory structure such as the following:
__init__.py
imports/
    __init__.py
    exported_os/
        __init__.py
        path.py
where the directory containing the top-most __init__.py must be on the sys.path search path.
So, unless you want to rearrange your directory structure to something like the above, the syntax (and selective importing) you want to use is really not available to you without getting into the internals of Python's module system.
Although this is not a solution to your wanting to be able to do a from ... import ... due to your unique versioning issue, let me suggest an alternative method of handling this versioning. In your situation you could do the following. Create a package, my_imports (give it any name you want):
my_imports/
    __init__.py
The contents of __init__.py is:
import some_library_version_n as some_library_version
Then in foo.py and in any other file that needs this module:
from my_imports import *
This is another way of putting the versioning dependency in one file. If you had other similar dependencies, you would, of course, add them to this file, and you could import from my_imports just the names you are interested in. You still have the issue that you are importing the entire some_library_version module.
However, we could take this one step further. Suppose the various versions of your library had components A, B and C that you might be interested in importing individually or all together. Then you could do the following. Let's instead name the package some_library_version, since it will only be dealing with this one versioning issue:
some_library_version/__init__.py
from some_library_version_n import A
from some_library_version_n import B
from some_library_version_n import C
foo.py
from some_library_version import A, C
Most of the answers here are accurate but don't add context about why imports work the way they do. GyuHyeon explains it well, but to summarize: import is a fancy file-include system that searches the standard libraries first, then the installed packages, and finally the provided context; the context is determined by where the import is called from and by the from clause.
This example shows the various ways of importing a specific function, dirname(). The os library is just a folder: if you imagine that os were in your working folder, the import path would be the same (or './os'). And because in Python everything is an object, import will look for ./os/__init__.py, so if your library doesn't have one that imports the subdirectories, importing it will have no effect on them.
from os.path import dirname as my_function  # (A_2)
from os import path as my_lib  # (B_2)
from os.path import dirname  # (C_1)
from os import path  # (B_1)
import os  # (A_1)

if __name__ == '__main__':
    print(os.path.dirname(__file__))  # (A_1)
    print(path.dirname(__file__))  # (B_1)
    print(dirname(__file__))  # (C_1)
    print(my_lib.dirname(__file__))  # (B_2)
    print(my_function(__file__))  # (A_2)
You could try to go with sys.path.append(...), e.g.:
import sys
sys.path.append(<your path to devnull goes here>)
Maybe not so nice, but you could use the pathlib library and path joins to construct the path (though some assumptions about the file structure unfortunately have to be made if the files are in separate folder structures):
import sys
from pathlib import Path
from os import path

sys.path.append(path.join(str(Path(__file__).parents[<integer that tells how many folders to go up>]), <path to devnull>))
Instead of pathlib you could also use the dirname function from os.path.
After appending to the system path, you could just use:
import devnull
Behold my directory structure
exchange.py
exchanges/
kraken.py
gemini.py
bitfinex.py
Now in exchange.py I'd like to dynamically load all modules from the exchanges folder so that I can iterate over them and instantiate the classes within programmatically. Pseudocode:
exchanges = load_modules('exchanges')
for module in exchanges:
    config = module.CONFIG  # a global
    for class in module:
        loaded_classes.append(module.class)  # add each class in each module to a list
Goal is to allow any contributor to create a class that inherits from exchange.py, put it into the exchanges folder and the app will automatically load it and its configuration.
I've seen this answer which uses the os module to load all files in a directory but this seems unreasonably hacky to me.
Add an __init__.py to the exchanges folder to make it a Python package:
import sys
sys.path.append(path_to_modules_directory)
By doing so, you can import all the modules like this:
from exchanges import *
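Note that with an empty __init__.py, a star-import does not pull in the submodules automatically; the package's __all__ must list them. A minimal sketch of exchanges/__init__.py:
# exchanges/__init__.py
# a star-import on a package only follows __all__
__all__ = ["kraken", "gemini", "bitfinex"]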
You can use the __import__ function or the importlib module for this task. First scan the directory for Python files, then import them with __import__ or importlib. importlib is the better choice.
Example
import os
import sys
import importlib

def import_libs(path):
    sys.path.append(path)  # make the directory's modules importable by name
    # import every ".py" file, stripping the extension to get the module name
    return [importlib.import_module(f[:-3]) for f in os.listdir(path) if f.endswith(".py")]
print(import_libs('exchanges'))
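To also collect the classes from each imported module, as the question's pseudocode does, inspect can help (a sketch; the __module__ check skips classes that were merely imported into a module):
import inspect

loaded_classes = []
for module in import_libs('exchanges'):
    for _, obj in inspect.getmembers(module, inspect.isclass):
        if obj.__module__ == module.__name__:
            loaded_classes.append(obj)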
Also, I have one question: why do you need to do this?
I think it would be better if you looked at the factory design pattern.
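For illustration, a minimal registry-style sketch of that idea (the names are hypothetical, not from the question):
# exchange.py -- base class that records every subclass as it is defined
class Exchange:
    registry = []

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        Exchange.registry.append(cls)

# each module in exchanges/ then just subclasses Exchange; once the
# modules are imported, Exchange.registry holds all contributed classes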
I simply want to take all my .py files from a single folder (I don't care about the sub-folders for now) and put them into a single module.
The use case I'm having here is that I'm writing some pretty standard object-oriented code, I'm using a single file for every class, and I don't want to have to write from myClass import myClass for every class in my __init__.py. I can't use Python 3, so I'm still working with imp and reload and such.
At the moment I'm using
# this is __init__.py
import pkgutil
for loader, name, is_pkg in pkgutil.walk_packages(__path__):
    if not is_pkg:
        __import__(__name__ + "." + name)
and it doesn't seem to work; it includes the packages, but it includes them as modules, so that I have to write MyClass.MyClass for a class that is defined in a file with its own name. That's silly and I don't like it.
I've been searching forever and I'm just getting more confused how complicated this seemingly standard use case seems to be. Do python devs just write everything into a single file? Or do they always have tons of imports?
Is this something that should be approached in an entirely different way?
What you really want to do
To do the job you need to bind your class names to the namespace of your __init__.py script.
After this step you will be able to just do from YourPackageName import * and use your classes directly. Like this:
import YourPackageName
c = YourPackageName.MyClass()
or
from YourPackageName import *
c = MyClass()
Ways to achieve this
You have multiple ways to import modules dynamically: __import__(), importlib, __all__.
But.
The only way to bind names into the namespace of the current module is the from myClass import myClass statement. A static statement.
In other words, the content of each of your __init__.py scripts should look like this:
#!/usr/bin/env python
# coding=utf-8
from .MySubPackage import *
from .MyAnotherSubPackage import *
from .my_pretty_class import myPrettyClass
from .my_another_class import myAnotherClass
...
And you should know that even for a dynamic __all__:
It is up to the package author to keep this list up-to-date when a new version of the package is released.
(https://docs.python.org/2/tutorial/modules.html#importing-from-a-package)
So, clear answers to your questions:
Do python devs just write everything into a single file?
No, they don't.
Or do they always have tons of imports?
Almost, but definitely not tons. You need to import each of your modules just once (into the appropriate __init__.py script), and then just import the whole package or sub-package at once.
Example
Let's assume the following package structure:
MyPackage
|---MySubPackage
| |---__init__.py
| |---pretty_class_1.py
| |---pretty_class_2.py
|---__init__.py
|---sleepy_class_1.py
|---sleepy_class_2.py
Content of the MyPackage/MySubPackage/__init__.py:
#!/usr/bin/env python
# coding=utf-8
from .pretty_class_1 import PrettyClass1
from .pretty_class_2 import PrettyClass2
Content of the MyPackage/__init__.py:
#!/usr/bin/env python
# coding=utf-8
from .MySubPackage import *
from .sleepy_class_1 import SleepyClass1
from .sleepy_class_2 import SleepyClass2
As a result, we are now able to write the following code in our application:
import MyPackage
p = MyPackage.PrettyClass1()
s = MyPackage.SleepyClass2()
or
from MyPackage import *
p = PrettyClass1()
s = SleepyClass2()
I have files as follows:
file1.py
file2.py
file3.py
Let's say that all three use
lib7.py
lib8.py
lib9.py
Currently each of the three files has the lines
import lib7
import lib8
import lib9
How can I setup my directory/code such that the libs are imported only once, and then shared among the three files?
You will have to import something at least once per file. But you can set it up such that this is a single import line:
Probably the cleanest way is to create a folder lib, move all the lib?.py files in there, and add an empty file called __init__.py to it.
This way you create a package out of your lib?.py files. It can then be used like this:
import lib.lib7  # with an empty __init__.py, submodules must be imported explicitly
lib.lib7
Depending on where you want to end up, you might also want to have some code in the __init__.py:
from .lib7 import *  # relative imports, required inside a package on Python 3
from .lib8 import *
from .lib9 import *
This way you get all symbols from the individual lib?.py in a single import lib:
import lib
lib.something_from_lib7
Import each of them in a separate module, and then import that:
lib.py:
import lib7
import lib8
import lib9
In each of the files (file1.py, file2.py, file3.py), just use import lib. Of course, you then have to reference them with lib.lib7 – to avoid that, you can use from lib import *.
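For example, file1.py then reduces to a single import (the function name from lib7 is hypothetical):
# file1.py
import lib

lib.lib7.some_function()  # hypothetical name defined in lib7.py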
What would be the best (read: cleanest) way to tell Python to import all modules from some folder?
I want to allow people to put their "mods" (modules) in a folder in my app which my code should check on each startup and import any module put there.
I also don't want an extra scope added to the imported stuff (not "myfolder.mymodule.something", but "something")
If transforming the folder itself into a package, through the use of an __init__.py file, and using from <foldername> import * suits you, you can iterate over the folder contents with os.listdir or glob.glob, and import each file ending in ".py" with the __import__ built-in function:
import os

for name in os.listdir("plugins"):
    if name.endswith(".py"):
        # strip the extension
        module = name[:-3]
        # set the module name in the current global namespace;
        # a non-empty fromlist makes __import__ return the submodule
        # itself rather than the top-level "plugins" package
        globals()[module] = __import__("plugins." + module, fromlist=[module])
The benefit of this approach is that it allows you to pass the module names to __import__ dynamically (while the import statement needs the module names hardcoded), and it allows you to check other things about the files, such as their size, or whether they import certain required modules, before importing them.
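If the folder is not a package (no __init__.py), a file-path-based variant with importlib.util achieves the same thing (a sketch; load_by_path is a made-up helper name):
import importlib.util
import os

def load_by_path(filepath):
    # derive a module name from the file name, e.g. "plugins/foo.py" -> "foo"
    name = os.path.splitext(os.path.basename(filepath))[0]
    spec = importlib.util.spec_from_file_location(name, filepath)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module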
Create a file named
__init__.py
inside the folder and import the folder name like this:
>>> from <folder_name> import * #Try to avoid importing everything when you can
>>> from <folder_name> import module1,module2,module3 #And so on
You might want to try that project: https://gitlab.com/aurelien-lourot/importdir
With this module, you only need to write two lines to import all plugins from your directory, and you don't need an extra __init__.py (or any other extra file):
import importdir
importdir.do("plugins/", globals())