Import a whole folder of python files - python

I am making a bot in Python 3 and want it to be easy to extend, so I have a central file and then one file per command. I would like to know if there is a way to import a sub-directory full of modules without importing each one separately. For example:
example
├── commands
│   ├── bar.py
│   └── foo.py
└── main.py
And the code in main.py would be something like:
import /commands/*
Thanks :D
Solution:
Import each module separately with:
from commands import foo, bar
from commands import * does not work on its own.
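If you do want from commands import * to work, one option (a minimal sketch, assuming commands/ also contains an __init__.py, which the question does not show) is to list the submodules in __all__:
# commands/__init__.py
__all__ = ["bar", "foo"]  # submodule names exported by 'from commands import *'
Then from commands import * in main.py imports commands.bar and commands.foo and binds them as bar and foo. The trade-off is that __all__ has to be kept in sync by hand; the importlib answer below avoids that.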

If you're using Python 3, the importlib module can be used to dynamically import modules. On Python 2.x there is the __import__ function, but I'm not very familiar with its semantics. As a quick example,
I have 2 files in the current directory
# a.py
name = "a"
and
# b.py
name = "b"
In the same directory, I have this
import glob
import importlib
for f in glob.iglob("*.py"):
    if f.endswith("load.py"):
        continue
    mod_name = f.split(".")[0]
    print("importing {}".format(mod_name))
    mod = importlib.import_module(mod_name, "")
    print("Imported {}. Name is {}".format(mod, mod.name))
This will print
importing b
Imported <module 'b' from '/tmp/x/b.py'>. Name is b
importing a
Imported <module 'a' from '/tmp/x/a.py'>. Name is a
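Applied to the question's layout, the same idea can target the commands sub-directory. A rough sketch, assuming commands/ is importable as a package (pkgutil and the loaded dictionary are my additions, not part of the original answer):
# main.py
import importlib
import pkgutil

import commands  # the package containing bar.py, foo.py, ...

loaded = {}
for info in pkgutil.iter_modules(commands.__path__):
    # import each submodule as commands.<name>
    loaded[info.name] = importlib.import_module("commands." + info.name)

print(loaded)  # e.g. {'bar': <module 'commands.bar' ...>, 'foo': <module 'commands.foo' ...>}
Each command module is then reachable as loaded['bar'], loaded['foo'], and so on.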


Related

import modules from __init__.py in another folder

I have the following project structure:
- workflow/
file1.ipynb
file2.ipynb
...
- utils/
__init__.py
function_one.py
function_two.py
...
I am working on file1.ipynb. So far I have found a way to import the variables defined in __init__.py through the following code:
utils = importlib.machinery.SourceFileLoader('utils', '/home/utils/__init__.py').load_module()
Let's assume my __init__.py contains the following:
from .function_one import *
I can then use the variables defined inside the __init__.py file.
However, every time I want to call any of these variables I need to use the following syntax:
utils.function_one ...
I want to be able to write function_one without the utils at the beginning.
How can I import directly the variables defined inside the __init__.py ?
I don't know why you don't import your module with the normal import mechanism: from ..utils import *, or, depending on where your Python interpreter was started, just from utils import *. But if you insist on using utils = importlib.machinery.SourceFileLoader('utils', '/home/utils/__init__.py').load_module(), you can hack all of its values into your globals like this:
tmp = globals()
for attr_name in dir(utils):
    if not attr_name.startswith("_"):  # don't import private and dunder attributes
        tmp[attr_name] = getattr(utils, attr_name)
del tmp

function_one(...)  # should work now
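If the notebook is started from a directory that is not inside a package, the plain from utils import * can also be made to work by putting the directory that contains utils/ on sys.path first. A minimal sketch, assuming the /home prefix from the loader call above:
import sys
sys.path.insert(0, "/home")  # the directory that contains the utils/ package

from utils import *          # brings in whatever __init__.py re-exports

# names from function_one.py are now usable unqualified, e.g.:
# result = function_one(...)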
Try this:
from ..utils import *

Python: A problem with the package import from ... import * using __all__ and __init__

I have the following Python package with 2 modules:
-pack1
|-__init__.py
|-mod1.py
|-mod2.py
-import_test.py
with the code:
# in mod1.py
a = 1
and
# in mod2.py
from mod1 import a
b = 2
and the __init__ code:
# in __init__.py
__all__ = ['mod1', 'mod2']
Next, I am trying to import the package:
# in import_test.py
from pack1 import *
But I get an error:
ModuleNotFoundError: No module named 'mod1'
If I remove the dependency "from mod1 import a" from mod2.py, the import works correctly. With that dependency in place, the import fails with the ModuleNotFoundError above.
???
The issue here is that, from mod2's perspective, the first place Python searches for a module is the path from which you are running the import (here I am assuming that pack1 is not on your PYTHONPATH and that you are importing it from the directory that contains pack1).
This means that if pack1 lives at /dir/to/pack1 and mod2.py does:
from mod1 import a
Python will look for mod1 next to pack1, i.e. in /dir/to (and the rest of sys.path), not inside pack1 itself.
To solve your issue it is enough to do either:
from pack1.mod1 import a
or, since the import happens inside the package, a relative import (Python 3):
from .mod1 import a
As a side note, unless this is a must for you, I do not recommend designing your package to be used as from pack import *, even if __all__ exists to give you better control of your public API.
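Putting the pieces together, a minimal sketch of the fixed package (same files as in the question):
# in mod1.py
a = 1
# in mod2.py
from .mod1 import a  # relative import, resolved inside pack1
b = 2
# in __init__.py
__all__ = ['mod1', 'mod2']
# in import_test.py, next to pack1
from pack1 import *
print(mod1.a, mod2.b)  # prints: 1 2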

How to store a python module and data in a directory, and import that module with access to the data

setup
I have the following structure (printed using tree from project root):
└── stuff
    ├── __init__.py
    ├── mod.py
    └── stuff_data.py
I would like to be able to open an ipython session from the project root, and do the following:
import stuff.mod
But it's not currently working.
file content
The files have the following content:
stuff_data.py
paper = {"type_0": "lined", "type_1": "plain"}
mod.py
from stuff_data import paper

def f():
    """print something...
    """
    print(paper)
When I try the following (from project root)
import stuff.mod
I get the error
----> 1 from stuff_data import paper
      2
      3
      4 def f():
      5     """print x + 1

ModuleNotFoundError: No module named 'stuff_data'
I'm wondering either how I should structure things so that I can use them in the way I've outlined above, or what should be done instead.
Your mod.py file's import should either be a relative import:
from .stuff_data import paper
OR
an absolute import starting from the project's root:
from stuff.stuff_data import paper
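With the relative import in place, a quick sanity check from the project root (a sketch; the dictionary comes from stuff_data.py above):
# started from the project root, e.g. in an ipython session
import stuff.mod

stuff.mod.f()  # prints {'type_0': 'lined', 'type_1': 'plain'}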

How to access symbols in __init__.py from __main__.py?

I have a module - let's call it foo - and I want to make it usable via a python -m foo call. My program looks like this:
my_project
├── foo
│   └── __init__.py
└── my_program.py
In __init__.py I have some code which I run when calling python -m foo:
def bar(name):
    print(name)

# -- code used to 'run' the module
def main():
    bar("fritz")

if __name__ == "__main__":
    main()
Since I have a fair amount of execution code in __init__.py now (argparse stuff and some logic) I want to separate it into a __main__.py:
my_project
├── foo
│   ├── __init__.py
│   └── __main__.py
└── my_program.py
Although this looks very simple to me, I haven't managed to import the things defined in __init__.py from __main__.py yet.
I know that if foo is located in site-packages or accessible via PYTHONPATH I can just import foo.
But in case I want to execute __main__.py directly (e.g. from some IDE) with foo located anywhere (i.e. not a folder where Python looks for packages) - is there a way to import foo (__init__.py from the same directory)?
I tried import . and import foo - but both approaches fail (because they just mean something else of course)
What I can do - at least to explain my goal - is something like this:
sys.path.append(os.path.join(os.path.dirname(__file__), ".."))
import foo
Works, but is ugly and a bit dangerous since I don't even know if I really import foo from the same directory.
You can manually set the module import state as if __main__.py were executed with -m:
# foo/__main__.py
import os
import sys

if __package__ is None and __name__ == "__main__":  # executed without -m
    # set special attributes as if run as part of the package
    __file__ = os.path.abspath(__file__)
    __package__ = os.path.basename(os.path.dirname(__file__))
    # replace the import path entry for __main__ (the script directory)
    # with the path for the package (its parent directory)
    dir_path = os.path.dirname(__file__)
    main_path = os.path.dirname(dir_path)
    try:
        index = sys.path.index(dir_path)
    except ValueError:
        raise RuntimeError('sys.path does not include script directory as expected')
    if index not in (0, 1):
        raise ValueError('expected script directory after current directory or matching it')
    sys.path[index] = main_path

# import regularly
from . import bar
This exploits the fact that python3 path/to/foo/__main__.py executes __main__ as a standalone script: __package__ is None and __name__ does not include the package either. The search path in this case is <current directory>, <__main__ directory>, ..., though it gets collapsed if the two are the same, so the index of the script directory is either 0 or 1.
As with all trickery on internals, there is some transient state where invariants are violated. Do not perform any imports before the module is patched!
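For completeness, a sketch of one way the split could look once the patch is in place (bar and the "fritz" call are taken from the question; the patch block is the code above):
# foo/__init__.py
def bar(name):
    print(name)

# foo/__main__.py
# ... patch block from the answer goes here ...
from . import bar

if __name__ == "__main__":
    bar("fritz")
Both python3 -m foo (run from my_project/) and a direct python3 foo/__main__.py should then reach bar from __init__.py.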

How to access the currently executing module's attributes from other modules?

I have several 'app'-modules (which are being started by a main-application)
and a utility module with some functionality:
my_utility/
├── __init__.py
└── __main__.py
apps/
├── app1/
│ ├── __init__.py
│ └── __main__.py
├── app2/
│ ├── __init__.py
│ └── __main__.py
...
main_app.py
The apps are being started like this (by the main application):
python3 -m <app-name>
I need to provide some meta information (tied to the module) about each app which is readable by the main_app and the apps themselves:
apps/app1/__init__.py:
meta_info = {'min_platform_version': '1.0',
             'logger_name': 'mm1'}
... and use it like this:
apps/app1/__main__.py:
from my_utility import handle_meta_info
# does something with meta_info (checking, etc.)
handle_meta_info()
main_app.py:
mod = importlib.import_module('app1')
meta_inf = getattr(mod, 'meta_info')
do_something(meta_inf)
The Problem
I don't know how to access meta_info from within the apps. I know I can
import the module itself and access meta_info:
apps/app1/__main__.py:
import app1
do_something(app1.meta_info)
But this is only possible if I know the name of the module. From inside another module - e.g. my_utility - I don't know how to access the module which has been started in the first place (or its name).
my_utility/__main__.py:
def handle_meta_info():
    import MAIN_MODULE  # <-- don't know what to import here
    do_something(MAIN_MODULE.meta_info)
In other words
I don't know how to access meta_info from within an app's process (started via python3 -m <name>) from another module which does not know the name of the 'root' module that has been started.
Approaches
Always provide the module name when calling meta-info-functions (bad, because it's verbose and redundant)
from my_utility import handle_meta_info
handle_meta_info('app1')
add meta_info to __builtins__ (generally bad to pollute global space)
Parse the command line (ugly)
Analyze the call stack on import my_utility (dangerous, ugly)
The solution I'd like to see
It would be nice to be able to either access the "main" module's global space OR know its name (to import it).
my_utility/__main__.py:
def handle_meta_info():
    do_something(__main_module__.meta_info)
OR
def handle_meta_info():
    if process_has_been_started_as_module():
        mod = importlib.import_module(name_of_main_module())
        meta_inf = getattr(mod, 'meta_info')
        do_something(meta_inf)
Any ideas?
My current (bloody) solution:
Inside my_utility I use psutil to get the command line the module has been started with (why not sys.argv? Because). There I extract the module name. This way I attach the desired meta information to my_utility (so I have to load it only once).
my_utility/__init__.py:
def __get_executed_modules_meta_info__() -> dict:
    def get_executed_module_name():
        from psutil import Process
        from os import getpid
        _cmdline = Process(getpid()).cmdline()
        try:
            # normal case: app has been started via 'python3 -m <app>'
            return _cmdline[_cmdline.index('-m') + 1]
        except ValueError:
            return None

    from importlib import import_module
    module_name = get_executed_module_name()
    if module_name is None:
        return {}
    try:
        return import_module(module_name).meta_info
    except AttributeError:
        return {}

__executed_modules_meta_info__ = __get_executed_modules_meta_info__()
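handle_meta_info can then simply read the cached dictionary. A sketch (do_something is the question's placeholder):
# my_utility/__init__.py, continued
def handle_meta_info():
    # __executed_modules_meta_info__ was filled once at import time above
    meta_info = __executed_modules_meta_info__
    if not meta_info:
        return  # not started via 'python3 -m <app>', or the app defines no meta_info
    do_something(meta_info)  # e.g. check meta_info['min_platform_version']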
