Python: Some things in package visible, others not

Package structure:
world\
    __init__.py
    chip\
        __init__.py
        grass.py
        snow.py
        water.py
(Both __init__.py files are empty.)
When I do from world.chip import * I can use grass.Grass, but not snow.Snow. Why is this?
This dictionary, for example:
MUSIC = {
    grass.Grass: "mus_grass",
    snow.Snow: "mus_snow",
    water.Water: "mus_water",
    "default": "mus_grass"
}
raises:
NameError: name 'snow' is not defined

This is not necessarily the right way, but you can force the loading of your submodules:
# in your world/chip/__init__.py
from grass import *
from snow import *
from water import *
(On Python 3 these must be relative imports: from .grass import *, and so on.)
And then, when you import the chip package, you will load all the other modules:
# Your directory structure
$ tree
.
`-- world
    |-- __init__.py
    `-- chip
        |-- __init__.py
        |-- grass.py
        |-- snow.py
        `-- water.py
In your shell:
$ python
>>> dir()
['__builtins__', '__doc__', '__name__', '__package__', 'help']
>>> from world.chip import *
>>> dir()
['Grass', 'Snow', 'Water', '__builtins__', '__doc__', '__name__', '__package__', 'grass', 'help', 'snow', 'water']

If __init__.py is empty, there is no reason to expect that from world.chip import * would bring in either the snow or the grass module.
Your post indicates that it brought in the grass module, but we can't see everything on your system.
Is there a world/__init__.py? It could have a from chip import grass, which would explain the behavior. You could also have a stale .pyc file lurking around even though the .py files are as you describe.
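For example, a world/__init__.py like the following (hypothetical, since the question says both __init__.py files are empty) would make grass visible after the wildcard import while snow stays undefined:
# world/__init__.py (hypothetical)
from world.chip import grass  # importing the submodule makes it an attribute of world.chip,
                              # so a later "from world.chip import *" picks up grass but not snow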

I'm still figuring this out myself, but I believe that you need to explicitly tell the chip portion of the package what sub-modules Python should import when using the * wildcard. Inside the __init__.py in the chip folder, add this:
__all__ = ["grass", "snow", "water"]
Without this addition, I get a NameError on grass. With this change, there is no error.
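A minimal sketch of the effect, assuming the layout from the question and that snow.py defines a Snow class:
# world/chip/__init__.py
__all__ = ["grass", "snow", "water"]

# elsewhere
from world.chip import *  # the names listed in __all__ are imported as submodules and bound here
print(snow.Snow)          # no NameError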

Are the following import statements equivalent: import mypack.mymod and from mypack import mymod in Python3?

Suppose I have the following directory hierarchy:
.
├── main.py
└── mypack
    ├── __init__.py
    ├── __pycache__
    │   ├── __init__.cpython-37.pyc
    │   └── mymod.cpython-37.pyc
    └── mymod.py
When importing the mymod module in main.py, are the import mypack.mymod and from mypack import mymod statements equivalent? I've been experimenting with both, and they seem to do exactly the same job.
The difference is in how the package is made available for use in your main.py. Let's take three possible ways, assuming the directory structure is the same as yours and a foo function (which prints "bar") is defined in mymod.py.
Case 1:
# main.py
from mypack import mymod

if __name__ == '__main__':
    print(dir())
    mymod.foo()
This results in:
['__builtins__', '__doc__', '__file__', '__name__', '__package__', 'mymod']
bar
Case 2:
# main.py
import mypack.mymod

if __name__ == '__main__':
    print(dir())
    mypack.mymod.foo()
This results in:
['__builtins__', '__doc__', '__file__', '__name__', '__package__', 'mypack']
bar
Case 3:
# main.py
import mypack.mymod as md

if __name__ == '__main__':
    print(dir())
    md.foo()
This results in:
['__builtins__', '__doc__', '__file__', '__name__', '__package__', 'md']
bar
Observation:
As you can see when dir() is printed, the module in case 1 is available as mymod, so you call foo as mymod.foo; in case 2 it is available as mypack, so you call foo as mypack.mymod.foo; and in case 3 it is available as md, so you call foo as md.foo.
The two statements import the same module, but make it accessible through different identifiers. I will give examples using the os.path module:
import os.path

# Try accessing os.path.realpath
try:
    rp = os.path.realpath(".")
except NameError:
    print("!! Cannot access os.path")
else:
    print("OK", rp)

# Try accessing path.realpath
try:
    rp = path.realpath(".")
except NameError:
    print("!! Cannot access path")
else:
    print("OK", rp)
This yields:
OK C:\Users\ME\Documents
!! Cannot access path
Change the import in the first line to:
from os import path
And the output switches to:
!! Cannot access os.path
OK C:\Users\ME\Documents
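One more point worth noting (not from the original answer): both spellings fully import the submodule and register it under its dotted name in sys.modules; only the name bound in your own namespace differs. A quick check:
import sys
from os import path

print('os.path' in sys.modules)        # True
print(path is sys.modules['os.path'])  # True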

ModuleNotFoundError using "classic" import on module imported from importlib

TL;DR
I have:
mod = import_module('path.module')
After that, what I want/need to do is:
from mod.script import func
But that gives me:
ModuleNotFoundError: No module named 'mod.script'
Warning: calling it as "mod.script.func()" or something like that doesn't meet my need (project constraint); I'm looking for a syntax like "from [module_imported_from_importlib] import XXX".
Introduction:
I need to split existing code into different folders with multiple versions. The goal is to have different parts in the app, with each part using a specified version of another part.
Example tree:
ref.py
block1/
    __init__.py
    v1/
        __init__.py
        script1.py
block2/
    __init__.py
    v1/
        __init__.py
        script2.py
    v2/
        __init__.py
        script2.py
With this, I need to run functions from /block2/v1/script2.py in /block1/v1/script1.py.
Goal
What I am trying to do is specify where script1 should get "script2" from (v1 or v2 of block2), using the same import syntax but only naming the block, not the version (which will change):
Old script1.py:
from script2 import <func>
New script1.py:
from block2.script2 import <func>
Code
I've tried a lot of things without success; what I have now seems close to a solution, but I can't quite get there (maybe it's not possible?):
In block1/v1/__init__.py:
from importlib import import_module, reload

MODULE = import_module('block2.v1')  # with 'block2.v1' defined as a variable somewhere else (e.g. in ref.py)
reload(MODULE)
In block1/v1/script1.py:
from block1.v1 import MODULE as block2

print(block2)
print(f'block2 : {dir(block2)}')
from block2.script2 import test
In block2/v1/__init__.py:
from block2.v1 import script2

print(script2)
In block2/v1/script2.py:
def test():
    print("hello")
Result of python block1/v1/script1.py:
<module 'block2.v1.script2' from 'xxx/block2/v1/script2.py'>
<module 'block2.v1.script2' from 'xxx/block2/v1/script2.py'>
<module 'block2.v1' from 'xxx/block1/v1/__init__.py'>
block2 : ['__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', 'script2']
Traceback (most recent call last):
File "block1/v1/script1.py", line 17, in <module>
from block2.script2 import test
ModuleNotFoundError: No module named 'block2.script2'
Expected result
In short, I expect to be able to use this call syntax in script1:
from block2.script2 import test
test()
to run my test function.
Thanks a lot for the help; I don't know if this is really clear!
After a lot of testing, I finally found something nice.
If you want to use a custom module under a custom name, the best thing to do is load the module and register it under the name you want :D
Tree:
/GLOBALCONFIG
    /module_manager
        block1.py
        block2.py
        blocX..
    module_version_manager.py
/block1
    /v1
        script1.py
    /v2
        script1.py
/block2
    /v1
        script2.py
    /v2
        script2.py
For that, I've made a dir named "GLOBALCONFIG" at the project root that contains a .py manager with this:
from importlib import import_module
import sys
import os
block1 = sys.modules['block1'] = import_module('block1.v1')
block2 = sys.modules['block2'] = import_module('block2.v1')
Using this, I can keep all my imports and specify the version only in the "manager" file, e.g.:
block1/v1/script1.py:
from GLOBALCONFIG.module_manager.block1 import block2
from block2 import script2
If I put script2 in block2/v2 or /v3 or anywhere else, I just have to change the version to use for block2 in the block1.py manager, and the code will still work.
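The mechanism here is simply that sys.modules acts as the import cache: once a module is registered under a name, a plain import statement for that name resolves to the cached module instead of searching the filesystem. A stripped-down sketch of the idea (block names as in the tree above):
import sys
from importlib import import_module

impl = import_module('block2.v1')   # load a specific version
sys.modules['block2'] = impl        # register it under the generic alias

import block2                       # resolves to block2.v1 via the cache
from block2 import script2          # same: 'block2' is already in sys.modules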
Moreover, if you want:
block1/v1 to use block2/v1
block1/v2 to use block2/v2
you can create a dict that maps module to version in your global config:
module_version_manager.py:
block1_modules_versions = {
    'block1.v1': {
        'block2': 'block2.v1',
    },
    'block1.v2': {
        'block2': 'block2.v2',
    },
}
block2_modules_versions = {
    'block2.v1': {
        'block1': 'block1.v1',
    },
    'block2.v2': {
        'block1': 'block1.v2',
    },
}
Then, in your block1.py manager:
from GLOBALCONFIG.module_version_manager import block1_modules_versions
from importlib import import_module
import sys
import os

# gives me the current version of the current block: block1.vX
block1 = sys.modules['u1'] = import_module('.'.join(str.rsplit(sys.argv[0], "/")[3:5]))
# gives me the block2 to use for the right block1.vX
block2 = sys.modules['block2'] = import_module(block1_modules_versions['.'.join(str.rsplit(sys.argv[0], "/")[3:5])]['block2'])
In block1/v1/script1.py:
import sys
sys.argv[0] = __file__  # [path]/block1/v1/script1 => path of the executed script, so the manager picks the right folder and imports the right modules
from GLOBALCONFIG.module_manager.block1 import block1, block2

block1 => block1/v1
block2 => block2/v1
But here is the magic trick:
In block1/v2/script1.py:
import sys
sys.argv[0] = __file__
from GLOBALCONFIG.module_manager.block1 import block1, block2

block1 => block1/v2
block2 => block2/v2
Different versions, but exactly the same code.
FYI: the sys.argv[0] = __file__ line that forces the current file path is mandatory for me because my service starts with gunicorn wsgi:app, which makes the default path [venv]/bin/gunicorn; without this constraint, the GLOBALCONFIG import should work nicely.
Regards,

What's the best way to implement "from . import *" in Python?

I'm using Django, and I like to separate my models, views, and tests into subdirectories.
But this means that I need to maintain an __init__.py in each subdirectory that imports every module in that directory.
I'd rather just put in some call that says:
from some_library import import_everything
import_everything()
That would have the same effect as iterating over the current directory and importing every .py file in that directory.
What's the best/easiest way to implement this?
Here are what my django application directories (essentially) look like:
some_django_app/
    models/
        __init__.py
        create.py
        read.py
        update.py
        delete.py
    views/
        __init__.py
        create.py
        read.py
        update.py
        delete.py
    forms/
        __init__.py
        create.py
        update.py
    tests/
        __init__.py
        create.py
        read.py
        update.py
        delete.py
So, you can see that to make a "proper" Django app, all my __init__.py files need to import all the other .py files in each directory. I'd rather just have some simple boilerplate there.
Within your app/models/__init__.py add these lines:
from app.models.create import *
from app.models.read import *
from app.models.update import *
from app.models.delete import *
This'll be your best bet for conciseness and readability. from app.models import * will now load all classes/etc from within each of the other files. Likewise, from app.models import foo will load foo no matter which of these files it's defined in.
Using the information given in synthesizerpatel's answer, you could implement import_everything this way:
import os
import sys

def import_everything(path):
    # Insert near the beginning so path will be the item removed with sys.path.remove(path) below
    # (the case when sys.path[0] == path works fine too).
    # Do not insert at index 0 since sys.path[0] may have a special meaning.
    sys.path.insert(1, path)
    for filename in os.listdir(path):
        if filename.endswith('.py'):
            modname = filename.replace('.py', '')
            module = __import__(modname, fromlist=[True])
            attrs = getattr(module, '__all__',
                            (attr for attr in dir(module) if not attr.startswith('_')))
            for attr in attrs:
                # print('Adding {a}'.format(a=attr))
                globals()[attr] = getattr(module, attr)
    sys.path.remove(path)
and could be used like this:
print(globals().keys())
# ['import_everything', '__builtins__', '__file__', '__package__', 'sys', '__name__', 'os', '__doc__']
import_everything(os.path.expanduser('~/test'))
print(globals().keys())
# ['hashlib', 'pythonrc', 'import_everything', '__builtins__', 'get_input', '__file__', '__package__', 'sys', 'mp', 'time', 'home', '__name__', 'main', 'os', '__doc__', 'user']
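A variant of the same idea (not from the original answers) that avoids modifying sys.path, using pkgutil and importlib to walk the package's own directory. This is a sketch that assumes it is called from a package's __init__.py:
import importlib
import pkgutil

def import_submodule_names(package_name, package_path, namespace):
    # Import every submodule of the package and copy its public names into namespace.
    for _, modname, _ in pkgutil.iter_modules(package_path):
        module = importlib.import_module('.' + modname, package_name)
        public = getattr(module, '__all__',
                         [n for n in dir(module) if not n.startswith('_')])
        for name in public:
            namespace[name] = getattr(module, name)

# e.g. inside some_django_app/models/__init__.py:
# import_submodule_names(__name__, __path__, globals())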

Is there a way to access parent modules in Python

I need to know if there is a way to access parent modules from submodules. If I import submodule:
from subprocess import types
I have types; is there some Python magic to get access to the subprocess module from types? Something similar to this trick for classes: ().__class__.__bases__[0].__subclasses__().
If you've accessed a module, you can typically get to it from the sys.modules dictionary. Python doesn't keep "parent pointers" with names, particularly because the relationship is not one-to-one. Using your example:
>>> from subprocess import types
>>> types
<module 'types' from '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/types.pyc'>
>>> import sys
>>> sys.modules['subprocess']
<module 'subprocess' from '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.pyc'>
Note that the presence of types in the subprocess module is just an artifact of the import types statement in it. Just import types yourself if you need that module.
In fact, a future version of subprocess may not import types any more, and your code will break. You should only import the names that appear in the __all__ list of a module; consider other names as implementation details.
So, for example:
>>> import subprocess
>>> dir(subprocess)
['CalledProcessError', 'MAXFD', 'PIPE', 'Popen', 'STDOUT', '_PIPE_BUF', '__all__', '__builtins__', '__doc__',
'__file__', '__name__', '__package__', '_active', '_cleanup', '_demo_posix', '_demo_windows', '_eintr_retry_call',
'_has_poll', 'call', 'check_call', 'check_output', 'errno', 'fcntl', 'gc', 'list2cmdline', 'mswindows', 'os',
'pickle', 'select', 'signal', 'sys', 'traceback', 'types']
>>> subprocess.__all__
['Popen', 'PIPE', 'STDOUT', 'call', 'check_call', 'check_output', 'CalledProcessError']
You can see that most of the names visible in subprocess are just other top-level modules that it imports.
For posterity, I ran into this also and came up with the one liner:
import sys
parent_module = sys.modules['.'.join(__name__.split('.')[:-1]) or '__main__']
The or '__main__' part is there so that if you load the file directly, it returns the module itself.
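For example (illustrative layout), inside a module imported as mypack.mymod:
# mypack/mymod.py
import sys

parent_module = sys.modules['.'.join(__name__.split('.')[:-1]) or '__main__']
# __name__ is 'mypack.mymod', so this is sys.modules['mypack'] -- the parent package,
# which is always imported before its submodules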
Another option: given a module object, derive the parent's name and import it:
full_module_name = module.__name__
parent, _, sub = full_module_name.rpartition('.')
if parent:
    # __import__ with a non-empty fromlist returns the named module itself rather than the top-level package
    parent = __import__(parent, fromlist=['dummy'])
Assuming you are not already inside the subprocess module, you could do:
import somemodule
children = dir(somemodule)
Then you could inspect the children of subprocess with the inspect module:
http://docs.python.org/library/inspect.html
Maybe the getmodule method would be useful for you?
http://docs.python.org/library/inspect.html#inspect.getmodule
import inspect
parent_module = inspect.getmodule(somefunction)
children = dir(parent_module)
package = parent_module.__package__
On my machine __package__ is empty for 'types', but it can be more useful for my own modules, as it does return the parent package as a string.
The best way that worked for us was:
Let's say the folder structure is:
src/
    demoproject/
        uimodule/
            ui.py
        backendmodule/
            be.py
    setup.py
1. Create an installable package out of the project.
2. Have an __init__.py in every directory (module).
3. Create setup.py [keep it in the top-level folder, here inside src].
Sample:
from setuptools import setup, find_packages

setup(
    name="demopackage",
    version="1",
    packages=find_packages(exclude=["tests.*", "tests"]),
    author='',
    author_email='',
    description="",
    url="",
)
4. From the src folder, create the installable package:
pip3 install .
5. This will install a package --> demopackage.
6. Now from any of your modules you can access any other module. For example:
7. From ui.py, to access the be.py function calldb(), make the import below:
from demopackage.backendmodule.be import calldb
8. And so on; when you add a new folder to your project, just add __init__.py to that folder and it will be accessible just like above, but you have to execute "pip3 install ." again.

how to import a.py not a folder

zjm_code/
    a.py
    a/
        __init__.py
    b.py
In a.py is:
c = 'ccc'
In b.py is:
import a
print dir(a)
When I execute b.py, it shows (it imports the 'a' folder):
['__builtins__', '__doc__', '__file__', '__name__', '__path__']
And when I delete the a folder, it shows (it imports a.py):
['__builtins__', '__doc__', '__file__', '__name__', 'c']
So my question is: how do I import a.py without deleting the a folder?
Thanks
Updated
I use imp.load_source, so b.py is now:
import imp, os

path = os.path.join(os.path.dirname(__file__), 'aaa.py')
ok = imp.load_source('*', path)
print ok.c
It works now and prints 'ccc'.
And how can I show 'ccc' via "print c" rather than "print ok.c"?
Thanks
Updated 2
It works now:
imp.load_source('anyname', path)
from anyname import *
print c
It shows 'ccc'.
Updated 3
This also works:
import imp, os
imp.load_source('anyname', 'aaa.py')
from anyname import *
print c
Use imp.load_module - there you can specify the file directory, overriding the behaviour of import.
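On current Python versions the imp module is deprecated (and removed in 3.12), so the same effect is usually achieved with importlib.util. A minimal sketch, assuming a.py sits next to the importing script:
import importlib.util
import os

path = os.path.join(os.path.dirname(__file__), 'a.py')
spec = importlib.util.spec_from_file_location('a_file_module', path)
mod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mod)
print(mod.c)  # 'ccc'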
Rename the folder to a different name. A folder with the same name takes precedence.
