I'm using Django, and I like to separate my models, views, and tests into subdirectories.
But this means I need to maintain an __init__.py in each subdirectory that imports every module in that directory.
I'd rather just put a call in each one that says:
from some_library import import_everything
import_everything()
That would have the same effect as iterating over the current directory and importing every .py file in that directory.
What's the best/easiest way to implement this?
Here are what my django application directories (essentially) look like:
some_django_app/
    models/
        __init__.py
        create.py
        read.py
        update.py
        delete.py
    views/
        __init__.py
        create.py
        read.py
        update.py
        delete.py
    forms/
        __init__.py
        create.py
        update.py
    tests/
        __init__.py
        create.py
        read.py
        update.py
        delete.py
So, you can see that to make a "proper" Django app, all my __init__.py files need to import all the other .py files in each directory. I'd rather just have some simple boilerplate there.
Within your app/models/__init__.py add these lines:
from app.models.create import *
from app.models.read import *
from app.models.update import *
from app.models.delete import *
This'll be your best bet for conciseness and readability. from app.models import * will now load all classes/etc from within each of the other files. Likewise, from app.models import foo will load foo no matter which of these files it's defined in.
Using the information given in synthesizerpatel's answer, you could implement import_everything this way:
import os
import sys
def import_everything(path):
    # Insert near the beginning so this exact entry is the one removed by
    # sys.path.remove(path) below (the case where sys.path[0] == path works too).
    # Do not insert at index 0, since sys.path[0] may have a special meaning.
    sys.path.insert(1, path)
    for filename in os.listdir(path):
        if filename.endswith('.py') and filename != '__init__.py':
            modname = filename[:-3]  # strip the '.py' suffix
            # A non-empty fromlist makes __import__ return the module itself
            module = __import__(modname, fromlist=[True])
            attrs = getattr(module, '__all__',
                            (attr for attr in dir(module) if not attr.startswith('_')))
            for attr in attrs:
                globals()[attr] = getattr(module, attr)
    sys.path.remove(path)
and could be used like this:
print(globals().keys())
# ['import_everything', '__builtins__', '__file__', '__package__', 'sys', '__name__', 'os', '__doc__']
import_everything(os.path.expanduser('~/test'))
print(globals().keys())
# ['hashlib', 'pythonrc', 'import_everything', '__builtins__', 'get_input', '__file__', '__package__', 'sys', 'mp', 'time', 'home', '__name__', 'main', 'os', '__doc__', 'user']
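On newer Python versions, the same idea can be written without touching sys.path at all, using pkgutil and importlib. This is only a sketch: the caller passes in its own globals() so the names land in the calling package rather than in this helper's module (the weak point of the globals() approach above).

```python
import importlib
import pkgutil

def import_everything(package_name, package_path, namespace):
    """Copy every public name from each module in a package into namespace.

    Call it from a package's __init__.py as:
        import_everything(__name__, __path__, globals())
    """
    for _, modname, _ in pkgutil.iter_modules(package_path):
        module = importlib.import_module('.' + modname, package_name)
        # Honor __all__ when the module defines one; otherwise take
        # every name that does not start with an underscore.
        attrs = getattr(module, '__all__',
                        [a for a in dir(module) if not a.startswith('_')])
        for attr in attrs:
            namespace[attr] = getattr(module, attr)
```

Because the target namespace is passed explicitly, this also works when the helper lives in a shared library module.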
Related
I am working in a Jupyter notebook. I created a simple module called conv.py for converting miles to km. When I try to import this module from other code in the same directory, the import seems to succeed, but neither of the functions I defined in the conv module is recognized.
I have imported os, and os.getcwd() shows the correct folder for conv.py...
code for conv.py
in_n_ft = 12
ft_n_mile = 5280
m_n_km = 1000
cm_n_in = 2.54
cm_n_m = 100
mm_n_m = 1000
def ft_to_km(feet):
    return feet*in_n_ft*cm_n_in/cm_n_m/m_n_km
print(ft_to_km(5280))
def mil_to_km(mile):
    return mile*ft_n_mile*in_n_ft*cm_n_in/cm_n_m/m_n_km
print(mil_to_km(3.2))
Code for new module
import conv
km = conv.mil_to_km(5)
Error provided
AttributeError Traceback (most recent call last)
<ipython-input-111-bfd778724ae2> in <module>
3 import conv
4
----> 5 km = conv.mil_to_km(5)
AttributeError: module 'conv' has no attribute 'mil_to_km'
When I type
dir(conv)
I get
['__builtins__',
'__cached__',
'__doc__',
'__file__',
'__loader__',
'__name__',
'__package__',
'__spec__']
What am I doing wrong?
EDIT
I have also tried
from conv import mil_to_km
when I do that I get a different error
cannot import name 'mil_to_km' from 'conv' (C:\Users\223023441\Documents\python\conv.py)
I have also queried the module using:
from inspect import getmembers, isfunction
import conv
print(getmembers(conv, isfunction))
from here I get:
['__builtins__',
'__cached__',
'__doc__',
'__file__',
'__loader__',
'__name__',
'__package__',
'__spec__']
I am also unable to access any of the variables within the conv.py file after import... Am I doing something wrong when I save the .py file? Jupyter's native format is .ipynb; when I 'save as' to conv.py, is that screwing it up?
You should import from the module.
Try this:
from conv import mil_to_km
km = mil_to_km(5)
The reason is that with import conv you get the module object and must access its functions as attributes.
In the way shown, you bind just the needed functions directly in the current namespace.
So the ultimate issue was the way I was saving the .py file... I was using the 'save as' command in Jupyter notebook and typing 'conv.py' as the file name. This showed up in the directory as a .py file, but my main file wasn't recognizing it properly. Once I downloaded the file as a .py file, cut it from my Downloads folder, and pasted it into my working directory, everything worked...
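One way to confirm this diagnosis is to check which file Python actually imported and what its first bytes look like: a notebook mis-saved as .py contains JSON and starts with '{', while real Python source starts with code. This is a hedged sketch; the temp-directory setup just stands in for your working directory.

```python
import importlib
import os
import sys
import tempfile

# Write a known-good plain .py file, standing in for a correctly saved conv.py
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "conv.py"), "w") as f:
    f.write("def mil_to_km(mile):\n    return mile * 1.609344\n")

sys.path.insert(0, tmp)
conv = importlib.import_module("conv")
print(conv.__file__)          # confirm exactly which file was imported
with open(conv.__file__) as f:
    print(f.read(40))         # notebook JSON would start with '{'
print(conv.mil_to_km(5))      # 8.04672
```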
Are the following import statements equivalent: import mypack.mymod and from mypack import mymod in Python3?
Suppose I've the following directory directory hierarchy:
.
├── main.py
└── mypack
├── __init__.py
├── __pycache__
│ ├── __init__.cpython-37.pyc
│ └── mymod.cpython-37.pyc
└── mymod.py
When importing the mymod module in main.py are the import mypack.mymod and from mypack import mymod statements equivalent? I've been experimenting with both, and they seem to perform the exact same job.
The difference is in how the package is made available for use in your main.py. Consider the three possible ways:
(assume the dir structure is the same as yours and a foo function is present in mymod.py)
case 1:
# main.py
from mypack import mymod

if __name__ == '__main__':
    print(dir())
    mymod.foo()
This results in
['__builtins__', '__doc__', '__file__', '__name__', '__package__', 'mymod']
bar
case 2:
# main.py
import mypack.mymod

if __name__ == '__main__':
    print(dir())
    mypack.mymod.foo()
This results in
['__builtins__', '__doc__', '__file__', '__name__', '__package__', 'mypack']
bar
case 3:
# main.py
import mypack.mymod as md

if __name__ == '__main__':
    print(dir())
    md.foo()
This results in
['__builtins__', '__doc__', '__file__', '__name__', '__package__', 'md']
bar
Observation:
As the printed dir() shows in each case: in case 1 the module is bound as mymod, so you call foo as mymod.foo; in case 2 the top-level package is bound as mypack, so you must call mypack.mymod.foo; and in case 3 the module is bound as md, so you call md.foo.
The two statements import the same module, but make it accessible through different identifiers. I will give examples using the os.path module:
import os.path

# Try accessing os.path.realpath
try:
    rp = os.path.realpath(".")
except NameError:
    print("!! Cannot access os.path")
else:
    print("OK", rp)

# Try accessing path.realpath
try:
    rp = path.realpath(".")
except NameError:
    print("!! Cannot access path")
else:
    print("OK", rp)
This yields:
OK C:\Users\ME\Documents
!! Cannot access path
Change the import in the first line to:
from os import path
And the output switches to:
!! Cannot access os.path
OK C:\Users\ME\Documents
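The equivalence can also be checked directly: both spellings put the same module object into sys.modules, and only the local binding differs. A quick check with os.path:

```python
import sys
import os.path
from os import path as p

print(os.path is p)                  # True: both names are the same module object
print(sys.modules["os.path"] is p)   # True: one shared entry in sys.modules
```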
Package structure:
World\
    __init__.py
    Chip\
        __init__.py
        grass.py
        snow.py
        water.py
(both __init__.pys are empty.)
When I do from world.chip import * I can use grass.Grass, but not snow.Snow. Why is this?
MUSIC = {
    grass.Grass: "mus_grass",
    snow.Snow: "mus_snow",
    water.Water: "mus_water",
    "default": "mus_grass"
}
NameError: name 'snow' is not defined
It's not the ideal way, but you can force loading of your submodules:
# in your world/chip/__init__.py
from .grass import *
from .snow import *
from .water import *
Then, when you import the chip package, all the other modules are loaded too:
# Your structure dirs
$ tree
.
`-- world
|-- __init__.py
`-- chip
|-- __init__.py
|-- grass.py
|-- snow.py
|-- water.py
In your shell:
$ python
>>> dir()
['__builtins__', '__doc__', '__name__', '__package__', 'help']
>>> from world.chip import *
>>> dir()
['Grass', 'Snow', 'Water', '__builtins__', '__doc__', '__name__', '__package__', 'grass', 'help', 'snow', 'water']
If __init__.py is empty, there is no reason to expect that from world.chip import * would bring in either the snow or the grass module.
Your post indicates that it brought in the grass module, but we can't see everything on your system.
Is there a world/__init__.py? It could have a from chip import grass, which would explain the behavior. You could also have a stale .pyc lurking around even though the .py files are as you describe.
I'm still figuring this out myself, but I believe that you need to explicitly tell the chip portion of the package what sub-modules Python should import when using the * wildcard. Inside the __init__.py in the chip folder, add this:
__all__ = ["grass", "snow", "water"]
Without this addition, I get a NameError on grass. With this change, there is no error.
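To see __all__ in action without the original project, here is a self-contained sketch that builds a throwaway package (names borrowed from the question) and star-imports it:

```python
# Build a tiny package on the fly and show that __all__ in __init__.py
# controls which submodules "from pkg import *" loads and binds.
import os
import sys
import tempfile

tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "chip")
os.makedirs(pkg)
with open(os.path.join(pkg, "grass.py"), "w") as f:
    f.write("class Grass: pass\n")
with open(os.path.join(pkg, "snow.py"), "w") as f:
    f.write("class Snow: pass\n")
# __all__ tells the star import which submodules to pull in
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write('__all__ = ["grass", "snow"]\n')

sys.path.insert(0, tmp)
ns = {}
exec("from chip import *", ns)
print(sorted(k for k in ns if not k.startswith("__")))  # ['grass', 'snow']
```

Note that this binds the submodules themselves, not the classes inside them; to get Grass and Snow directly, the __init__.py would also need explicit imports such as from .grass import Grass.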
I need to know if there is a way to access parent modules from submodules. If I import a submodule:
from subprocess import types
I have types: is there some Python magic to get access to the subprocess module from types? Something similar to what ().__class__.__bases__[0].__subclasses__() does for classes.
If you've accessed a module you can typically get to it from the sys.modules dictionary. Python doesn't keep "parent pointers" with names, particularly because the relationship is not one-to-one. For example, using your example:
>>> from subprocess import types
>>> types
<module 'types' from '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/types.pyc'>
>>> import sys
>>> sys.modules['subprocess']
<module 'subprocess' from '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.pyc'>
Note that the presence of types in the subprocess module is just an artifact of the import types statement inside it. You just import types directly if you need that module.
In fact, a future version of subprocess may not import types any more, and your code will break. You should only import the names that appear in the __all__ list of a module; consider other names as implementation details.
So, for example:
>>> import subprocess
>>> dir(subprocess)
['CalledProcessError', 'MAXFD', 'PIPE', 'Popen', 'STDOUT', '_PIPE_BUF', '__all__', '__builtins__', '__doc__',
'__file__', '__name__', '__package__', '_active', '_cleanup', '_demo_posix', '_demo_windows', '_eintr_retry_call',
'_has_poll', 'call', 'check_call', 'check_output', 'errno', 'fcntl', 'gc', 'list2cmdline', 'mswindows', 'os',
'pickle', 'select', 'signal', 'sys', 'traceback', 'types']
>>> subprocess.__all__
['Popen', 'PIPE', 'STDOUT', 'call', 'check_call', 'check_output', 'CalledProcessError']
You can see that most of the names visible in subprocess are just other top-level modules that it imports.
For posterity, I ran into this also and came up with the one liner:
import sys
parent_module = sys.modules['.'.join(__name__.split('.')[:-1]) or '__main__']
The or '__main__' part is just in case you load the file directly it will return itself.
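For a genuinely dotted module name, the one-liner resolves the parent like this (using logging.handlers from the standard library as a stand-in):

```python
import sys
import logging.handlers

full_name = logging.handlers.__name__                     # 'logging.handlers'
parent_name = '.'.join(full_name.split('.')[:-1]) or '__main__'
parent = sys.modules[parent_name]
print(parent is logging)                                  # True
```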
full_module_name = module.__name__
parent, _, sub = full_module_name.rpartition('.')
if parent:
    parent = __import__(parent, fromlist=['dummy'])
Assuming you are not inside the subprocess module already, you could do:
import somemodule
children = dir(somemodule)
Then you could inspect the children of subprocess with the inspect module:
http://docs.python.org/library/inspect.html
Maybe the getmodule method would be useful for you?
http://docs.python.org/library/inspect.html#inspect.getmodule
import inspect
parent_module = inspect.getmodule(somefunction)
children = dir(parent_module)
package = parent_module.__package__
On my machine __package__ is empty for 'types', but it can be more useful for my own modules, as it does return the parent package as a string.
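A concrete illustration of that approach, using only stdlib names:

```python
import inspect
from os import path

mod = inspect.getmodule(path.join)   # the module that defines os.path.join
print(mod.__name__)                  # 'posixpath' on POSIX, 'ntpath' on Windows
print(mod.__package__)               # '' for a top-level module
```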
The best way that worked for us:
Let's say the folder structure is:
src
|-- demoproject
|   |-- uimodule --> ui.py
|   |-- backendmodule --> be.py
|-- setup.py
1. Create an installable package out of the project.
2. Have an __init__.py in every directory (module).
3. Create setup.py (keep it in the top-level folder, here inside src).
Sample
from setuptools import setup, find_packages

setup(
    name="demopackage",
    version="1",
    packages=find_packages(exclude=["tests.*", "tests"]),
    author='',
    author_email='',
    description="",
    url="",
)
4. From the src folder, install the package:
pip3 install .
5. This installs a package named demopackage.
6. Now any of your modules can access any other module.
7. For example, to access be.py's calldb() function from ui.py, use this import:
from demopackage.backendmodule.be import calldb
8. And so on: when you add a new folder to your project, just add an __init__.py to it and it becomes accessible the same way, after you re-run `"pip3 install ."`
zjm_code
|-----a.py
|-----a
|     |----- __init__.py
|-----b.py
a.py contains:
c='ccc'
b.py contains:
import a
print dir(a)
When I execute b.py, it shows this (it imports the a folder):
['__builtins__', '__doc__', '__file__', '__name__', '__path__']
and when I delete the a folder, it shows this (it imports a.py):
['__builtins__', '__doc__', '__file__', '__name__', 'c']
So my question is: how do I import a.py without deleting the a folder?
Thanks
updated
I used imp.load_source, so b.py is now:
import imp, os

path = os.path.join(os.path.dirname(__file__), 'aaa.py')
ok = imp.load_source('aaa', path)
print ok.c
It works now and prints 'ccc'.
And how can I show 'ccc' via "print c" rather than via "print ok.c"?
thanks
updated2
It is OK now:
imp.load_source('anyname', path)
from anyname import *
print c
It shows 'ccc'.
updated3
This is also OK:
import imp,os
imp.load_source('anyname','aaa.py')
from anyname import *
print c
Use imp.load_module - there you can specify the file directory, overriding the behaviour of import.
Rename the folder to a different name. A package folder with the same name takes precedence over the module.
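Note that the imp module used above is deprecated and removed in Python 3.12; the modern equivalent loads a module from an explicit file path with importlib, bypassing the package directory that shadows it. A sketch (the temp file stands in for the shadowed a.py):

```python
import importlib.util
import os
import tempfile

# Create a stand-in for the shadowed a.py containing c = 'ccc'
tmp = tempfile.mkdtemp()
path = os.path.join(tmp, "a.py")
with open(path, "w") as f:
    f.write("c = 'ccc'\n")

# Load the module directly from its file path, regardless of sys.path
spec = importlib.util.spec_from_file_location("a_file", path)
mod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mod)
print(mod.c)  # ccc
```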