I have a file A.py in a package containing a class with a list as a class attribute (class variable). A.py runs a loop that appends values to that list. How can I access the list, with all its changes, from file B.py (in another package)?
When using only 'class attributes', the values live on the class itself, so a simple 'import' should work (a sketch covering the mutable-list case follows the examples below).
For example, proj/package1/A.py:
class A:
    attr1 = "foo"
    attr2 = "bar"
Create file __init__.py in all directories: current, package1, package2
Import A from proj dir in proj/B.py:
from package1.A import *
print(A.attr1)
print(A.attr2)
Import A from a non-parent dir, e.g. proj/package2/B.py:
import sys
sys.path.append("package1")  # run the program from the `proj` dir
from A import *
print(A.attr1)
print(A.attr2)
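The same mechanism covers the original question's mutable list: a module that imports the class gets a reference to the one class object, so changes made by A.py's loop are visible from B.py as long as both modules run in the same Python process and the loop mutates the list in place (e.g. with append) rather than rebinding the name. A minimal sketch, with illustrative names not taken from the original code:
# proj/package1/A.py
class A:
    values = []                    # class attribute, shared by every importer

    @classmethod
    def run_loop(cls):
        for i in range(5):
            cls.values.append(i)   # mutate the shared list in place

# proj/B.py
from package1.A import A

A.run_loop()
print(A.values)                    # [0, 1, 2, 3, 4] -- the same list object A.py updated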
Related
I have the following project structure:
- workflow/
file1.ipynb
file2.ipynb
...
- utils/
__init__.py
function_one.py
function_two.py
...
I am working on file1.ipynb; so far I have found a way to import the variables defined in __init__.py through the following code:
utils = importlib.machinery.SourceFileLoader('utils', '/home/utils/__init__.py').load_module()
Let's assume my __init__.py contains the following:
from .function_one import *
I can then use the variables defined inside the __init__.py file.
However, every time I want to call any of these variables I need to use the following syntax:
utils.function_one ...
I want to be able to write function_one without the utils at the beginning.
How can I import directly the variables defined inside the __init__.py ?
I don't know why you don't import your module with the normal import mechanism: from ..utils import * or, depending on where your Python interpreter was started, just from utils import *. But if you insist on using utils = importlib.machinery.SourceFileLoader('utils', '/home/utils/__init__.py').load_module(), you can hack all of its values into your globals like this:
tmp = globals()
for attr_name in dir(utils):
    if not attr_name.startswith("_"):  # don't import private and dunder attributes
        tmp[attr_name] = getattr(utils, attr_name)
del tmp
function_one(...)  # should work now
Try this:
from ..utils import *
I have a project organized in directories as follows:
main_dir/
    __init__.py
    core/
        __init__.py
        foo.py
        test1.py
    scr/
        test2.py
In foo.py, I define a class as follows
class foo(object):
    def __init__(self):
        pass
This is what I have in core/test1.py
from foo import foo
f1=foo()
print(type(f1))
This is what I have in scr/test2.py:
import sys
sys.path.append('../')
from core.foo import foo
f2 = foo()
print(type(f2))
main_dir/__init__.py contains this:
__all__ = ["core"]
import sys, os
dir = os.path.dirname(__file__)
for subpackage in __all__:
    sys.path.append(os.path.join(dir, subpackage))
and core/__init__.py contains:
__all__ = ["foo"]
import sys, os
dir = os.path.dirname(__file__)
for subpackage in __all__:
    sys.path.append(os.path.join(dir, subpackage))
When I run test1.py, the result is <class 'foo.foo'>, while when I run test2.py, I get <class 'core.foo.foo'>. I understand that this is how Python behaves, but I am not sure how to get around it. I would like type(f1) == type(f2) to be True, as I need to access the foo object from different locations.
Add the first two lines from test2.py:
import sys
sys.path.append('../')
to the top of test1.py. Then change your imports of foo in test1.py and test2.py to use the fully-qualified name:
from main_dir.core.foo import foo
Afterward:
# core/test1.py
<class 'main_dir.core.foo.foo'>
# scr/test2.py
<class 'main_dir.core.foo.foo'>
I think that in core/test1.py you can try to import foo like this:
from .foo import foo
The dot before the module name ensures it refers to the file in the same directory; give it a try.
Or you can import foo under different names using as, like this:
from .foo import foo as foo_one
from core.foo import foo as foo_core
I hope I understood what you need and that this helps.
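Note that a relative import like from .foo import foo only works when test1.py is run as part of the package, e.g. with python -m main_dir.core.test1 from the directory containing main_dir; running test1.py directly as a script raises an ImportError about a relative import with no known parent package.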
Using Python, I launch a script from a certain directory. That directory contains a subdirectory "A", which holds the file "B.py", which in turn contains a class called "C".
from A import C
Traceback:
ImportError: cannot import name 'C' from 'A' (unknown location)
Is there an easy way to make Python look in A, then in B, and finally get C? Thank you
If I have a directory structure like the following:
A/
    B.py
test.py
B.py contains the following code:
class C:
    def __init__(self):
        print("I made it to C!")
If I'm writing test.py and need an instance of C, I need to import it like this:
from A.B import C
myInst = C()
And I get the following output when I run test.py
I made it to C!
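If you would rather keep writing from A import C, another option (a common pattern, not part of the answer above) is to re-export C from the package's __init__.py, assuming A/ sits next to test.py and contains an __init__.py:
# A/__init__.py
from .B import C      # re-export so `from A import C` works

# test.py
from A import C
myInst = C()          # prints "I made it to C!"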
Alternatively, you can add the path to the directory containing B.py to sys.path using the sys module. To show this, I have put the file E.py at A/B/C/D/E.py, along with an empty file named __init__.py
E.py contains the following code:
class F:
    def __init__(self):
        print("I made it to C!")
test.py contains the following code:
import sys
sys.path.insert(0, "./A/B/C/D/") # You can put the absolute path
from E import F
myInst = F()
And I get the output:
I made it to C!
I'm trying to create an auto_import function as part of a library. The purpose is to avoid listing from .x import y many times in __init__ files; instead I would just write import lib; lib.auto_import(__file__), which would search for Python files in the folder where that __init__ lives and import all of their contents via an exec statement (i.e. exec('from .x import abc')).
My problem is that the 'from' statement always tries to import .x from the lib directory, even if I change the cwd to the directory where the actual __init__ file is placed. How should I solve this? How can I change the directory that a from . statement resolves against?
Structure:
$ ls -R
.:
app.py lib x
./lib:
__init__.py auto_import.py
./x:
__init__.py y
./x/y:
__init__.py y.py
e.g.: ./x/y/__init__.py contains import lib; lib.auto_import(__file__)
auto_import checks for files in the directory of __file__ and imports them with exec('from .{} import *'), but this from . always resolves to the lib folder instead of the directory of __file__. That is my question: how do I change it to the directory of __file__?
Of course, all of this is then imported in app.py like:
import x
print(x.y)
Thanks
EDIT1: final auto_import (globals() / gns cannot be avoided):
import os, sys, inspect

def auto_import(gns):
    current_frame = inspect.currentframe()
    caller_frame = inspect.getouterframes(current_frame)[1]
    src_file = caller_frame[1]
    for item in os.listdir(os.path.dirname(src_file)):
        item = item.split('.py')[0]
        if item in ['__init__', '__pycache__']:
            continue
        gns.update(__import__(item, gns, locals(), ['*'], 1).__dict__)
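For reference, a sketch of how this final version might be called from a package __init__ (not spelled out above; it assumes lib/__init__.py re-exports the function, e.g. via from .auto_import import auto_import, and note that it now takes globals() instead of __file__):
# x/y/__init__.py
import lib
lib.auto_import(globals())   # pulls the public names of sibling modules into this namespace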
The problem with your approach is that auto_import is defined in lib/auto_import.py, so the context for exec('from .x import *') is always lib/. Even if you manage to fix the path problem, lib.auto_import(__file__) will not import anything into the namespace of lib.x.y, because the function lives in another module.
Use the built-in function __import__
Here is the auto_import script:
myimporter.py
# myimporter.py
def __import_siblings__(gns, lns={}):
    for name in find_sibling_names(gns['__file__']):
        gns.update((k, v) for k, v in __import__(name, gns, lns).__dict__.items() if not k.startswith('_'))

import re, os

def find_sibling_names(filename):
    pyfp = re.compile(r'([a-zA-Z]\w*)\.py$')
    files = (pyfp.match(f) for f in os.listdir(os.path.dirname(filename)))
    return set(f.group(1) for f in files if f)
Inside your lib/x/y/__init__.py
#lib/x/y/__init__.py
from myimporter import __import_siblings__
__import_siblings__(globals())
Let's say you have a dummy module that needs to be imported into y:
#lib/x/y/dummy.py
def hello():
    print('hello')
Test it:
import x.y
x.y.hello()
Please be aware that from lib import * is usually a bad habit because of namespace pollution. Use it with caution.
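If you want to keep the star-import convenience while limiting the pollution, one general mitigation (standard Python behaviour, not something the importer above enforces) is to declare __all__ in the package __init__, so that a downstream from ... import * only re-exports the names you list:
#lib/x/y/__init__.py
from myimporter import __import_siblings__
__import_siblings__(globals())

__all__ = ['hello']    # a downstream `from x.y import *` now only picks up `hello`

# client code
from x.y import *
hello()                # works; names not listed in __all__ are not copied in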
Use the sys module
With this folder structure:
RootDir/
|-- module.py
|-- ChildDir/
|   |-- main.py
Now in main.py you can do
import sys
sys.path.append('..')
import module
I believe there are other hacks, but this is the one I know of and it works for my purpose. I am not sure whether going for some kind of auto_import machinery is the best option, though.
Here's the structure I'm working with:
directory/
    script.py
    subdir/
        __init__.py
        myclass01.py
        myclass02.py
What I want to do is to import into script.py the classes defined in myclass01.py and myclass02.py. If I do:
from subdir.myclass01 import *
It works fine for the class defined in myclass01.py. But with this solution, if there are many classes defined in different files in subdir and I want to import all of them, I'd have to type one line for each file. There must be a shortcut for this. I tried:
from subdir.* import *
But it didn't work out.
EDIT: here are the contents of the files:
This is __init__.py (using __all__ as Apalala suggested):
__all__ = ['MyClass01','MyClass02']
This is myclass01.py:
class MyClass01:
    def printsomething(self):
        print('hey')
This is myclass02.py:
class MyClass02:
    def printsomething(self):
        print('sup')
This is script.py:
from subdir import *
MyClass01().printsomething()
MyClass02().printsomething()
This is the traceback that I get when I try to run script.py:
File "script.py", line 1, in <module>
from subdir import *
AttributeError: 'module' object has no attribute 'MyClass01'
Although the names used there are different from what's shown in your question's directory structure, you could use my answer to the question titled Namespacing and classes. The __init__.py shown there would have also allowed the usepackage.py script to have been written this way (package maps to subdir in your question, and Class1 to myclass01, etc):
from package import *
print(Class1)
print(Class2)
print(Class3)
Revision (updated):
Oops, sorry, the code in my other answer doesn't quite do what you want — it only automatically imports the names of any package submodules. To make it also import the named attributes from each submodule requires a few more lines of code. Here's a modified version of the package's __init__.py file (which also works in Python 3.4.1):
def _import_package_files():
    """ Dynamically import all the public attributes of the python modules in this
        file's directory (the package directory) and return a list of their names.
    """
    import os
    exports = []
    globals_, locals_ = globals(), locals()
    package_path = os.path.dirname(__file__)
    package_name = os.path.basename(package_path)

    for filename in os.listdir(package_path):
        modulename, ext = os.path.splitext(filename)
        if modulename[0] != '_' and ext in ('.py', '.pyw'):
            subpackage = '{}.{}'.format(package_name, modulename)  # pkg relative
            module = __import__(subpackage, globals_, locals_, [modulename])
            modict = module.__dict__
            names = (modict['__all__'] if '__all__' in modict else
                     [name for name in modict if name[0] != '_'])  # all public
            exports.extend(names)
            globals_.update((name, modict[name]) for name in names)
    return exports

if __name__ != '__main__':
    __all__ = ['__all__'] + _import_package_files()  # '__all__' in __all__
Alternatively you can put the above into a separate .py module file of its own in the package directory—such as _import_package_files.py—and use it from the package's __init__.py like this:
if __name__ != '__main__':
    from ._import_package_files import *  # defines __all__
    __all__.remove('__all__')  # prevent export (optional)
Whatever you name the file, it should be something that starts with an _ underscore character so it doesn't try to import itself recursively.
Your best option, though probably not the best style, is to import everything into the package's namespace:
# this is subdir/__init__.py
from .myclass01 import *  # note the leading dot: package-relative import
from .myclass02 import *
from .myclass03 import *
Then, in other modules, you can import what you want directly from the package:
from subdir import Class1
I know it's been a couple months since this question was answered, but I was looking for the same thing and ran across this page. I wasn't very satisfied with the chosen answer, so I ended up writing my own solution and thought I'd share it. Here's what I came up with:
# NOTE: The function name starts with an underscore so it doesn't get deleted by itself
def _load_modules(attr_filter=None):
    import os
    curdir = os.path.dirname(__file__)
    imports = [os.path.splitext(fname)[0] for fname in os.listdir(curdir) if fname.endswith(".py")]
    pubattrs = {}
    for mod_name in imports:
        mod = __import__(mod_name, globals(), locals(), ['*'], 1)  # level 1 = relative to this package
        for attr in mod.__dict__:
            if not attr.startswith('_') and (not attr_filter or attr_filter(mod_name, attr)):
                pubattrs[attr] = getattr(mod, attr)

    # Restore the global namespace to its initial state
    for var in globals().copy():
        if not var.startswith('_'):
            del globals()[var]

    # Update the global namespace with the specific items we want
    globals().update(pubattrs)

# EXAMPLE: Only load classes that end with "Resource"
_load_modules(attr_filter=lambda mod, attr: True if attr.endswith("Resource") else False)
del _load_modules  # Keep the namespace clean
This simply imports * from all .py files in the package directory and then only pulls the public ones into the global namespace. Additionally, it allows a filter if only certain public attributes are desired.
I use this simple way: add the directory to the system path, then use import module or from module import function1, class1 for the module in that directory. Note that module is nothing but the name of your *.py file, without the extension.
Here is a general example:
import sys
sys.path.append("/path/to/folder/")
import module # in that folder
In your case it could be something like this:
import sys
sys.path.append("subdir/")
import myclass01
# or
from myclass01 import func1, class1, class2 # .. etc
from subdir.* import *
You cannot use '*' this way directly after the 'from' statement. You need explicit imports. Please check the Python documentation on imports and packages.
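For completeness, this is what the explicit imports look like with the file and class names from the question:
# script.py
from subdir.myclass01 import MyClass01
from subdir.myclass02 import MyClass02

MyClass01().printsomething()
MyClass02().printsomething()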