I am trying to organize some modules for my own use. I have something like this:
lib/
    __init__.py
    settings.py
    foo/
        __init__.py
        someobject.py
    bar/
        __init__.py
        somethingelse.py
In lib/__init__.py, I want to define some classes to be used if I import lib. However, I can't seem to figure it out without separating the classes into files and importing them in __init__.py.
Rather than say:
lib/
    __init__.py
    settings.py
    helperclass.py
    foo/
        __init__.py
        someobject.py
    bar/
        __init__.py
        somethingelse.py
from lib.settings import Values
from lib.helperclass import Helper
I want something like this:
lib/
    __init__.py     # Helper defined in this file
    settings.py
    foo/
        __init__.py
        someobject.py
    bar/
        __init__.py
        somethingelse.py
from lib.settings import Values
from lib import Helper
Is it possible, or do I have to separate the class into another file?
EDIT
OK, if I import lib from another script, I can access the Helper class. But how can I access the Helper class from settings.py?
The example here describes intra-package references. I quote: "submodules often need to refer to each other". In my case, lib/settings.py needs Helper and lib/foo/someobject.py needs access to Helper, so where should I define the Helper class?
'lib/'s parent directory must be in sys.path.
Your 'lib/__init__.py' might look like this:
from . import settings  # or just 'import settings' on old Python versions

class Helper(object):
    pass
Then the following example should work:
from lib.settings import Values
from lib import Helper
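To sanity-check the layout above end to end, here is a Python 3 sketch that writes the package into a temporary directory at runtime; the file contents mirror the answer, and the temp-dir scaffolding exists only for the demo:

```python
import os
import sys
import tempfile

# Build the lib/ package described above in a throwaway directory.
tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "lib")
os.makedirs(pkg)

with open(os.path.join(pkg, "settings.py"), "w") as f:
    f.write("class Values:\n    pass\n")

with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from . import settings\n\nclass Helper:\n    pass\n")

sys.path.insert(0, tmp)  # make lib/ importable

from lib.settings import Values
from lib import Helper

print(Values.__name__, Helper.__name__)  # Values Helper
```

Both imports succeed: Values comes from the submodule, while Helper lives directly on the package because it is defined in __init__.py.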
Answer to the edited version of the question:
__init__.py defines how your package looks from outside. If you need to use Helper in settings.py, then define Helper in a different file, e.g. 'lib/helper.py'.
.
|-- import_submodule.py
`-- lib
    |-- __init__.py
    |-- foo
    |   |-- __init__.py
    |   `-- someobject.py
    |-- helper.py
    `-- settings.py

2 directories, 6 files
The command:
$ python import_submodule.py
Output:
settings
helper
Helper in lib.settings
someobject
Helper in lib.foo.someobject
# ./import_submodule.py
import fnmatch, os

from lib.settings import Values
from lib import Helper

print

for root, dirs, files in os.walk('.'):
    for f in fnmatch.filter(files, '*.py'):
        print "# %s/%s" % (os.path.basename(root), f)
        print open(os.path.join(root, f)).read()
        print
# lib/helper.py
print 'helper'

class Helper(object):
    def __init__(self, module_name):
        print "Helper in", module_name

# lib/settings.py
print "settings"
import helper

class Values(object):
    pass

helper.Helper(__name__)

# lib/__init__.py
#from __future__ import absolute_import
import settings, foo.someobject, helper

Helper = helper.Helper

# foo/someobject.py
print "someobject"
from .. import helper

helper.Helper(__name__)

# foo/__init__.py
import someobject
If lib/__init__.py defines the Helper class then in settings.py you can use:
from . import Helper
This works because . refers to the current package, so it acts as a synonym for the lib package from the point of view of the settings module. Note that it is not necessary to export Helper via __all__.
(Confirmed with python 2.7.10, running on Windows.)
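Here is a Python 3 sketch of that pattern with a throwaway package written at runtime. One ordering detail matters: Helper is defined before settings is imported, so the name already exists on the package when settings asks for it:

```python
import os
import sys
import tempfile

tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "mypkg")
os.makedirs(pkg)

# Helper is defined first, then settings is imported, so settings
# can do 'from . import Helper' while the package is initializing.
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write(
        "class Helper:\n"
        "    def help(self):\n"
        "        return 'helping'\n"
        "\n"
        "from . import settings\n"
    )

with open(os.path.join(pkg, "settings.py"), "w") as f:
    f.write("from . import Helper\nvalue = Helper().help()\n")

sys.path.insert(0, tmp)
import mypkg

print(mypkg.settings.value)  # helping
```

If __init__.py imported settings before defining Helper, the `from . import Helper` inside settings.py would fail with an ImportError, because the package attribute would not exist yet.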
You just put them in __init__.py.
So with test/classes.py being:
class A(object): pass
class B(object): pass
... and test/__init__.py being:
from classes import *  # Python 2 implicit relative import; on Python 3 use: from .classes import *
class Helper(object): pass
You can import test and have access to A, B and Helper:
>>> import test
>>> test.A
<class 'test.classes.A'>
>>> test.B
<class 'test.classes.B'>
>>> test.Helper
<class 'test.Helper'>
Add something like this to lib/__init__.py
from .helperclass import Helper
Now you can import it directly:
from lib import Helper
Edit, since I misunderstood the question:
Just put the Helper class in __init__.py. That's perfectly Pythonic. It just feels strange coming from languages like Java.
Yes, it is possible. You might also want to define __all__ in your __init__.py files. It's a list of the names that will be imported when you do
from lib import *
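A quick sketch of the effect, with a package generated on the fly; the name Internal is deliberately left out of __all__ to show that it is not exported by the star import:

```python
import os
import sys
import tempfile

tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "starlib")
os.makedirs(pkg)

with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write(
        "__all__ = ['Helper']  # only these names leak out via *\n"
        "class Helper: pass\n"
        "class Internal: pass\n"
    )

sys.path.insert(0, tmp)
from starlib import *

print('Helper' in dir(), 'Internal' in dir())  # True False
```

Without __all__, a star import would pull in every public (non-underscore) name instead.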
Maybe this could work:
import __init__ as lib
Related
I have the following project structure:
- workflow/
    file1.ipynb
    file2.ipynb
    ...
- utils/
    __init__.py
    function_one.py
    function_two.py
    ...
I am working on file1.ipynb; so far I have found a way to import the variables defined in __init__.py through the following code:
utils = importlib.machinery.SourceFileLoader('utils', '/home/utils/__init__.py').load_module()
Let's assume my __init__.py contains the following:
from .function_one import *
I can then use the variables defined inside the __init__.py file.
However, every time I want to call any of these variables I need to use the following syntax:
utils.function_one ...
I want to be able to write function_one without the utils at the beginning.
How can I import directly the variables defined inside the __init__.py ?
I don't know why you don't import your module with the normal import mechanism: from ..utils import *, or, depending on where your Python interpreter was started, just from utils import *. But if you insist on using utils = importlib.machinery.SourceFileLoader('utils', '/home/utils/__init__.py').load_module(), you can hack all the values into your globals like this:
tmp = globals()
for attr_name in dir(utils):
    if not attr_name.startswith("_"):  # skip private and dunder attributes
        tmp[attr_name] = getattr(utils, attr_name)
del tmp

function_one(...)  # should work now
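The same trick can be exercised against any already-imported module; in this self-contained sketch, the stdlib math module stands in for the utils module loaded above:

```python
import math

utils = math  # stand-in for the SourceFileLoader-loaded module

tmp = globals()
for attr_name in dir(utils):
    if not attr_name.startswith("_"):  # skip private and dunder attributes
        tmp[attr_name] = getattr(utils, attr_name)
del tmp

# math.sqrt is now reachable as a bare global name
print(sqrt(25.0))  # 5.0
```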
Try this:
from ..utils import *
So I have a module/directory called A; it has an __init__.py file, and in it there is another module/directory called B, which has its own __init__.py and a file called function.py containing a function called dummy().
here is the structure of directories
A
|-- __init__.py
`-- B
    |-- __init__.py
    `-- function.py
So what I want is to be in the same directory that contains directory A and do this:
from A import *
dummy()
What I have done is put this in B/__init__.py:
from dummy import *
and this in A/__init__.py:
import B
With that, I can do:
from A.B import *
but I want to write A instead of A.B.
I changed your import code a bit and it seems to work now as you wanted.
So the B directory's __init__.py has:
# __init__.py in B
from .function import *
In the A directory's init.py:
# __init__.py in A
from .B import *
Now, when I run a Python shell in the directory that contains A and use from A import *, it calls dummy() with no problem.
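Scripted end to end, with the two __init__.py files generated in a temp directory (the package name A and the function dummy are taken from the question; the return value is made up for the demo):

```python
import os
import sys
import tempfile

tmp = tempfile.mkdtemp()
a = os.path.join(tmp, "A")
b = os.path.join(a, "B")
os.makedirs(b)

with open(os.path.join(b, "function.py"), "w") as f:
    f.write("def dummy():\n    return 'dummy called'\n")
with open(os.path.join(b, "__init__.py"), "w") as f:
    f.write("from .function import *\n")   # re-export function.py's names
with open(os.path.join(a, "__init__.py"), "w") as f:
    f.write("from .B import *\n")          # re-export B's names one level up

sys.path.insert(0, tmp)
from A import *

print(dummy())  # dummy called
```

Each __init__.py just hoists the names from the level below, which is why dummy ends up visible directly on A.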
However, there are discussions on using wildcard imports in Python. Check this post for example: Should wildcard import be avoided?
I have an extremely complicated module and I want to break out the subpackages into individual packages. My first attempt will be for the "utilities" submodule. I want to be able to import everything from the parent package example_utils.py into example_module.utils, but I also want example_module.utils to have its own functions as well.
In the end I want to be able to do the following:
import example_module as em
x = 10
y1 = em.utils.f_parent1(x)
y2 = em.utils.f_child1(x)
# and do this
from example_module.utils import f_parent1, f_child1
# and use the parent module as a standalone
from example_utils import f_parent1, f_parent2
How can I structure my child module example_module to have this functionality?
The utilities module, saved as a standalone module example_utils.py:
def f_parent1(x):
    return x

def f_parent2(x, y):
    return x + y
This module will be installed in my environment:
pip install path/to/example_module
Larger module (example_module) using example_utils as a dependency
# Directory structure for the larger module
example_module/
    __init__.py
    utils/
        __init__.py
        utils.py
Contents of example_module/utils/__init__.py:
from .utils import *
Contents of example_module/utils/utils.py:
from example_utils import *

def f_child1(x):
    return x**2
Contents of example_module/__init__.py:
__version__= "0.1"
__developmental__ = True
# Utilities
from .utils import utils
# =======
# Direct Exports
# =======
_submodules = ["utils"]
__all__ = sorted(__all__)
Apologies in advance if namespace is not the correct term. I get confused with namespace, scope, etc.
With suggestions from @r-ook I found that I could use getattr to grab the function by string name from the parent module. After that, I could add the function into the namespace (scope?) of the child module.
example.py
import example_utils as emu

functions_from_parent = ["f_parent1", "f_parent2"]
__all__ = {"f_child1"}
for function_name in functions_from_parent:
    globals()[function_name] = getattr(emu, function_name)
    __all__.add(function_name)
__all__ = sorted(__all__)

def f_child1(x):
    return x**2
I created a module with one function (read_file) which reads a file and adds data to one dict (DATA).
The goal is to have access to the dict (DATA) everywhere in other modules, but with only one call to the function read_file. I need this module to be compatible with Python 2 and Python 3.
The tree is :
Config\
    __init__.py
    config.py
OtherModule\
    __init__.py
    module1.py
    module2.py
The code is:
Config
==== __init__.py ====
from config.config import read_file
DATA = dict()
==== config.py ====
import config

def read_file(param):
    ...
    config.DATA = (depends on param)
OtherModule
==== module1.py ====
import config
config.read_file(param)

==== module2.py ====
import config
print(config.DATA)   # -> Empty

import module2
print(config.DATA)   # -> NotEmpty
The question is: why does this work fine in Python 3, but DATA is always empty in Python 2?
I know I can go through another module, like config.variable.DATA, but I would like not to. Is that possible?
I am using Python 2.7 and Python 3.6.
Somehow I couldn't find the exact answer to this elsewhere on SO.
Given:
root\
    __init__.py
    main.py
    folder0\
        __init__.py
        folder1\
            __init__.py
            class1.py
        folder2\
            __init__.py
            class2.py
Is there a way to import the top level directory as a whole? e.g.
# main.py
import folder0
obj1 = folder0.folder1.class1.Class1()
obj2 = folder0.folder2.class2.Class2()
Or do I have to import the modules directly? e.g.
# main.py
from folder0.folder1 import class1
from folder0.folder2 import class2
obj1 = class1.Class1()
obj2 = class2.Class2()
Sure. You just need to add the relevant imports into the __init__.py all the way down. e.g.:
# folder2/__init__.py
from . import class2
and
# folder0/__init__.py
from . import folder1
from . import folder2
and so on.
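Sketched end to end with the folder names from the question; the package files are written to a temp directory purely so the example is self-contained:

```python
import os
import sys
import tempfile

def write(path, text):
    with open(path, "w") as f:
        f.write(text)

tmp = tempfile.mkdtemp()
f1 = os.path.join(tmp, "folder0", "folder1")
f2 = os.path.join(tmp, "folder0", "folder2")
os.makedirs(f1)
os.makedirs(f2)

write(os.path.join(f1, "class1.py"), "class Class1: pass\n")
write(os.path.join(f2, "class2.py"), "class Class2: pass\n")
# Each __init__.py imports the level below, all the way down.
write(os.path.join(f1, "__init__.py"), "from . import class1\n")
write(os.path.join(f2, "__init__.py"), "from . import class2\n")
write(os.path.join(tmp, "folder0", "__init__.py"),
      "from . import folder1\nfrom . import folder2\n")

sys.path.insert(0, tmp)
import folder0

obj1 = folder0.folder1.class1.Class1()
obj2 = folder0.folder2.class2.Class2()
print(type(obj1).__name__, type(obj2).__name__)  # Class1 Class2
```

One top-level import then exposes the whole tree, because each package eagerly imports its children at initialization time.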