Is it possible to make __init__ files describe parent package dynamically? - python

So I have a tree of modules and packages; it's two levels deep.
My goal, at the top level in main.py, is to refer to a sub-module using this syntax:
import pkg1
pkg1.bebrs.fn_3.run()
The dotted notation is very important to me.
Inside the __init__.py in the bebrs folder, my init states
from pkg1.bebrs import fn_3
which explicitly names both the parent and the current package.
Is there any way to write the pkg1 and bebrs parts of from pkg1.bebrs import fn_3 in a more generalized way? I wish I could just put generic __init__.py files in place and import nested stuff with dot notation, instead of having a very explicit nested import inside each nested __init__.py file.
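One generic option is a relative import: inside bebrs/__init__.py, write from . import fn_3, so neither pkg1 nor bebrs is ever spelled out. The sketch below builds the pkg1/bebrs layout in a temporary directory to show this working end to end; the fn_3 contents are invented for the demo.

```python
# Sketch: generic __init__.py files using relative imports, so no
# package names the parent name in its own imports.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "pkg1", "bebrs"))

# pkg1/__init__.py -- generic: re-export subpackages relatively
with open(os.path.join(root, "pkg1", "__init__.py"), "w") as f:
    f.write("from . import bebrs\n")

# pkg1/bebrs/__init__.py -- "from . import fn_3" instead of
# "from pkg1.bebrs import fn_3"
with open(os.path.join(root, "pkg1", "bebrs", "__init__.py"), "w") as f:
    f.write("from . import fn_3\n")

# A stand-in fn_3 module, invented for the demo
with open(os.path.join(root, "pkg1", "bebrs", "fn_3.py"), "w") as f:
    f.write("def run():\n    return 'fn_3 ran'\n")

sys.path.insert(0, root)
import pkg1

print(pkg1.bebrs.fn_3.run())  # -> fn_3 ran
```

The relative form is resolved against each package's own __name__ at import time, so the same one-line __init__.py works no matter where the package sits in the tree.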

Related

Group a collection of module files in a directory into a single import

I have a Python application that has grown in size and complexity. I now have two folders: one that contains some utility classes and another that contains some other classes. Each list is quite long and the classes are referenced all over the place.
MyApp Folder
  main_app.py
  states/
    Class1.py
    Class2.py
  util/
    Util1.py
    Util2.py
In main_app.py, is there a way I can just do import states and then reference any class within that folder as states.Class1? I'd do the same for the util folder; the difference there is that some of those classes reference each other.
I've tried __init__.py and some other things, but I'm a legacy C++/C developer, relatively new to Python, and I think this is handled much differently.
__init__.py is the file that is executed the first time you import something from a package.
That means import states is a way to execute the contents of states/__init__.py:
# states/__init__.py
from states.Class1 import MyClass
# Or with relative path
from .Class2 import MyClass as MyClass2
# main_app.py
import states
states.MyClass() # It works
from states import MyClass2
MyClass2() # It works too.
If you decide to use a star (asterisk) import, as Eldamir stated, you may be interested in the __all__ attribute.
If states/__init__.py does from .Class1 import * and from .Class2 import *, then main.py can do import states and then use states.SomeClassFromClass1Module.
Define an __init__.py file and import the classes you want in it, e.g. from .class1 import class11. Then define __all__, a list of all the class names you want to be usable as states.class11: __all__ = ['class11', 'class12', ...]
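The interaction between star imports and __all__ can be shown concretely. The demo below writes a small states package to a temporary directory (the class names Alpha and Beta are made up for illustration) and checks what a star import pulls in:

```python
# Demo: star imports in states/__init__.py plus __all__ controlling
# what "from states import *" exposes. Alpha and Beta are invented names.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "states"))

files = {
    "Class1.py": "class Alpha:\n    pass\n",
    "Class2.py": "class Beta:\n    pass\n",
    "__init__.py": (
        "from .Class1 import *\n"
        "from .Class2 import *\n"
        "__all__ = ['Alpha', 'Beta']\n"
    ),
}
for name, body in files.items():
    with open(os.path.join(root, "states", name), "w") as f:
        f.write(body)

sys.path.insert(0, root)
import states

print(states.Alpha, states.Beta)        # both reachable via the package

ns = {}
exec("from states import *", ns)        # star import honours __all__
names = sorted(n for n in ns if not n.startswith("__"))
print(names)  # -> ['Alpha', 'Beta']
```

Only the names listed in __all__ land in the importer's namespace; everything else stays accessible as an attribute of states but is not copied over by the star import.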

How to share python classes up a directory tree?

I have an example file structure provided below.
/.git
/README.md
/project
  /Operation A
    generateinsights.py
    insights.py
  /Operation B
    generatetargets.py
    targets.py
generateinsights.py is run; it references insights.py to get the definition of an insight object. Next, generatetargets.py is run; it references targets.py to get the definition of a target object. The issue I have is that generatetargets.py also needs to understand what an insight object is. How can I set up my imports so that insights.py and targets.py can be referenced by anything in the project directory? It seems like I should use __init__.py for this, but I can't get it to work properly.
Firstly, you have to rename Operation A and Operation B so that they are composed of only letters, numbers and underscores, for example Operation_A - this is needed to be able to use these in an import statement without raising a SyntaxError.
Then, put an __init__.py file into the project, Operation_A and Operation_B folders. You can leave it empty, but you can also for example define additional attributes for your module.
Finally, you need to make Python find your modules - for this, either:
set your PYTHONPATH environment variable so that it includes the folder containing project or
put the package folder somewhere into Python's default import directories, for example in `/usr/lib/python3/site-packages` (requires root permissions)
After that you can import both targets.py and insights.py from any place like this:
from project.Operation_A import insights
from project.Operation_B import targets
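A third option, if you cannot control the environment, is to do what PYTHONPATH does from inside the script: prepend the folder containing project to sys.path before importing. The sketch below builds the (renamed) layout in a temporary directory, with invented Insight and Target classes, to show the imports resolving:

```python
# Demo: sys.path as a runtime equivalent of PYTHONPATH.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "project", "Operation_A"))
os.makedirs(os.path.join(root, "project", "Operation_B"))

# Empty __init__.py files mark each folder as a package
for d in ("project",
          os.path.join("project", "Operation_A"),
          os.path.join("project", "Operation_B")):
    open(os.path.join(root, d, "__init__.py"), "w").close()

with open(os.path.join(root, "project", "Operation_A", "insights.py"), "w") as f:
    f.write("class Insight:\n    pass\n")
with open(os.path.join(root, "project", "Operation_B", "targets.py"), "w") as f:
    f.write("class Target:\n    pass\n")

# Equivalent of PYTHONPATH=<root>, done at runtime:
sys.path.insert(0, root)

from project.Operation_A import insights
from project.Operation_B import targets
print(insights.Insight, targets.Target)
```

With this in place, generatetargets.py can import both insights and targets the same way, regardless of which subfolder it lives in.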

How to structure modules to avoid things like "import module.module.module"

I have a module structured as follows:
/module
__init__.py
/submod_1
__init__.py
submod_1_class.py
/submod_2
__init__.py
submod_2_class.py
but I find it incredibly annoying to have to import a class within submod_1_class.py with:
from module.submod_1.submod_1_class import my_class
What I would prefer to be able to type is:
from module import my_class
I have browsed through the site-packages folder and looked through popular modules like numpy, but I haven't been able to figure out how, for example:
import numpy
a = numpy.array([1,2,3,4,5])
can be used when the definition of numpy array objects is buried deep within several subfolders of the numpy package.
You can use __init__.py files to define what can be imported from a given module. A very simple addition to your structure, for example, would pull Class up from submod_1_class so that external users can simply do from module import Class.
/module
  __init__.py          # contains: from .submod_1 import Class
  /submod_1
    __init__.py        # contains: from .submod_1_class import Class
    submod_1_class.py
  /submod_2
    submod_2_class.py
In numpy, for example, the top-level __init__.py contains a line that reads:
from .core import *
This means everything defined within /core/__init__.py is available externally directly in the numpy namespace, even though it may actually be buried deep in some complex structure.
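The same two-hop re-export can be demonstrated in a few lines. The sketch below builds the /module tree from the answer in a temporary directory (the Class body is invented) and shows that the class becomes importable from the top level:

```python
# Demo: pulling a class up through two __init__.py files so that
# "from module import Class" works without spelling out the full path.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "module", "submod_1"))

with open(os.path.join(root, "module", "submod_1", "submod_1_class.py"), "w") as f:
    f.write("class Class:\n    pass\n")

# Each __init__.py re-exports one level up, numpy-style
with open(os.path.join(root, "module", "submod_1", "__init__.py"), "w") as f:
    f.write("from .submod_1_class import Class\n")
with open(os.path.join(root, "module", "__init__.py"), "w") as f:
    f.write("from .submod_1 import Class\n")

sys.path.insert(0, root)
from module import Class   # the deep location is hidden from users
print(Class)
```

Each __init__.py only ever names its immediate child, so the chain stays maintainable no matter how deep the real definition sits.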

avoid sub-modules and external packages in a module's namespace

I'm writing a module to load a dataset. I want to keep the interface/API as clean as possible - so I've made internal functions and variables hidden by prefacing their names with __. Awesome. My module, however, imports other packages (e.g. numpy) which still appear in my module's namespace, how can I avoid this?
i.e. my file looks something like:
Loader.py:
import numpy as np

__INTERNAL_VAR1 = True
EXTERNAL_VAR = True

def loadData():
    data = __INTERNAL_FUNC1()
    ...
    return data

def __INTERNAL_FUNC1():
    ...
    return data
and when I import my module np is exposed:
> import Loader
> Loader.[TAB]
Loader.EXTERNAL_VAR Loader.loadData Loader.np
If the autocompletion you are using is correctly implemented, it should honour the __all__ attribute of modules, if set.
Assign a list of all the names your module exports to that attribute:
__all__ = ['loadData', 'EXTERNAL_VAR']
The __all__ variable is used to determine what names are imported if you use a from modulename import * wildcard import, as well as by the help() function when documenting your module.
There is no point in using double-underscore names as globals; it is a single underscore at the start that marks such names as 'internal' (by convention).
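The effect of __all__ is easy to verify. The demo below writes a minimal Loader.py to a temporary directory, substituting the stdlib math module for numpy so it runs without third-party installs, and checks what a star import exposes:

```python
# Demo: __all__ limits "from Loader import *", though Loader.np is
# still reachable as a module attribute. math stands in for numpy.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
with open(os.path.join(root, "Loader.py"), "w") as f:
    f.write(
        "import math as np\n"          # stand-in for "import numpy as np"
        "__all__ = ['loadData', 'EXTERNAL_VAR']\n"
        "EXTERNAL_VAR = True\n"
        "def loadData():\n"
        "    return 'data'\n"
    )

sys.path.insert(0, root)
ns = {}
exec("from Loader import *", ns)       # star import honours __all__
names = sorted(n for n in ns if not n.startswith("__"))
print(names)  # -> ['EXTERNAL_VAR', 'loadData']

import Loader
print(hasattr(Loader, "np"))           # True: np is hidden, not removed
```

So __all__ cleans up the wildcard-import and documented surface, but a determined user (or a naive autocompleter) can still reach Loader.np as an attribute.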
Another solution could be to create an __init__.py that contains a line that imports the functions and variables you need:
from .Loader import EXTERNAL_VAR, loadData
This file should be placed inside a folder containing your module:
└── Loader
├── __init__.py
└── Loader.py
Now when you import Loader you can access directly only EXTERNAL_VAR and loadData while all the other classes and modules are in Loader.Loader.
This solution should be independent of your autocompletion system.

Dynamically create subpackage

Is it possible to create a package dynamically, something like:
subpackage = create_subpackage(package_name, package_path)
The package should be associated with a physical path so that modules from that path can be imported through it.
The purpose is to be able to have subpackages that are not subdirectories of their parent package.
e.g.
main_package/
  __init__.py
  sub_package/
    __init__.py
    some_module.py
Contents of main_package/__init__.py:
sub_package = create_subpackage("sub_package", "/a/path/to/sub_package")
globals()["sub_package"] = sub_package
Contents of some_random_script.py
from main_package.sub_package import some_module
While this won't give you exactly the layout you're asking for, this might help: http://docs.python.org/tutorial/modules.html#packages-in-multiple-directories
Basically, each package has a __path__ attribute that contains a list of places to search for submodules. And you can modify it to your liking.
e.g.
main_package/__init__.py:
__path__ += ['/tmp/some/other/path/']
/tmp/some/other/path/sub_package/__init__.py:
value = 42
test.py:
from main_package.sub_package import value
print(value)
If that doesn't cut it, you can go read up on import hooks, the all-powerful (and correspondingly complicated) way to modify Python's import behavior.
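The __path__ approach from the answer above can be exercised without touching /tmp/some/other/path/ directly. The sketch below uses two temporary directories in its place: sub_package physically lives outside main_package, yet imports as main_package.sub_package:

```python
# Demo: extending a package's __path__ so a subpackage can live
# outside the parent package's directory.
import os
import sys
import tempfile

pkg_root = tempfile.mkdtemp()      # holds main_package/
other_root = tempfile.mkdtemp()    # holds sub_package/, elsewhere on disk

os.makedirs(os.path.join(pkg_root, "main_package"))
os.makedirs(os.path.join(other_root, "sub_package"))

# main_package/__init__.py appends the external location to __path__
with open(os.path.join(pkg_root, "main_package", "__init__.py"), "w") as f:
    f.write("__path__ += [%r]\n" % other_root)

with open(os.path.join(other_root, "sub_package", "__init__.py"), "w") as f:
    f.write("value = 42\n")

sys.path.insert(0, pkg_root)
from main_package.sub_package import value
print(value)  # -> 42
```

Because __path__ is consulted whenever a submodule of main_package is requested, every module under other_root/sub_package becomes importable through the main_package namespace.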
