I want to make a function that loads all the .py files in a directory, and imports them using
__import__(), but I keep getting an ImportError: No module named toolboxtool1.
This is the file structure:
project/dirreader.py
project/tools/toolboxtool1.py
project/tools/toolboxtool2.py
project/tools/toolboxtool3.py
What am I doing wrong?
import os

os.chdir(os.getcwd() + "/tools/")
stuff = os.listdir(os.getcwd())
for i in range(0, len(stuff)):
    if stuff[i][-3:] == ".py":
        stuff[i] = stuff[i][:-3]
    else:
        pass
modules = map(__import__, stuff)
Try prefixing the module names with "tools.":
    stuff[i] = 'tools.' + stuff[i][:-3]
because the modules you are trying to import live inside the tools package.
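For reference, here is a minimal sketch of the whole loader with that prefix in place. It uses importlib.import_module instead of a bare __import__, because __import__("tools.toolboxtool1") returns the top-level tools package rather than the submodule; it also assumes dirreader.py is run from the project directory and that tools/ contains an __init__.py so it is importable as a package:

import os
import importlib

# directory of the tools package, relative to this script
tools_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), "tools")

# collect module names, skipping the package's own __init__.py
names = [f[:-3] for f in os.listdir(tools_dir)
         if f.endswith(".py") and f != "__init__.py"]

# import each one as a submodule of the tools package
modules = [importlib.import_module("tools." + name) for name in names]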
I am brand new to working with Python, so this question might be basic. I am attempting to import five helper files into a primary script. The directory setup is as follows: both the script I'm calling from and the helper scripts are located within src, at this path:
/Users/myusername/Desktop/abd-datatable/src
I am currently importing them as:
import helper1 as fdh
import helper2 as hdh
..
import helper5 as constants
The error I see is
File "/Users/myusername/Library/Application Support/JetBrains/Toolbox/apps/PyCharm-P/ch-0/22.77756/PyCharm.app/Contents/plugins/python/helpers/pydev/_pydev_bundle/pydev_import_hook.py", line 21, in do_import
module = self._system_import(name, *args, **kwargs)
ModuleNotFoundError: No module named 'src'
I have also attempted the following, but the import was still unsuccessful:
import src.helper1 as fdh
..
import src.helper5 as constants
and
from src import src.helper1 as fdh
...
from src import helper5 as constants
and tried adding the following to the top of the script:
import sys
import os

module_path = os.path.abspath(os.getcwd())
if module_path not in sys.path:
    sys.path.append(module_path)
Would be very grateful for any pointers on how to debug this!
It seems likely that it's a path issue.
In your case, if you want to import from src, you would need to add /Users/myusername/Desktop/abd-datatable to the path.
os.getcwd() returns the path you invoke the script from, so your snippet would only work if you are invoking the script from /Users/myusername/Desktop/abd-datatable.
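If it helps, here is a minimal sketch of what that could look like at the top of the importing script, assuming the script itself lives directly in /Users/myusername/Desktop/abd-datatable/src; the key point is to put the parent of src on sys.path rather than the current working directory:

import os
import sys

# directory containing this script (abd-datatable/src), independent of where it is invoked from
script_dir = os.path.dirname(os.path.abspath(__file__))
# its parent (abd-datatable), which is the directory that contains src
parent_dir = os.path.dirname(script_dir)
if parent_dir not in sys.path:
    sys.path.insert(0, parent_dir)

import src.helper1 as fdh  # now resolvable, because src's parent is on sys.path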
I have two scripts in the same folder:
5057_Basic_Flow_Acquire.xyzpy
5006_Basic_Flow_Execute.xyzpy
I need to call the run() function from the 5057_Basic_Flow_Acquire file in the 5006_Basic_Flow_Execute file.
I tried two approaches:
1.
import time
import os
import sys
import types
dir_path = os.path.dirname(os.path.realpath(__file__))
#os.chdir(str(dir_path))
sys.path.append(str(dir_path))
sensor = __import__('5057_Basic_Flow_Acquire')
2.
import time
import os
import sys
import types
import importlib
import importlib.util
dir_path = os.path.dirname(os.path.realpath(__file__))
spec = importlib.util.spec_from_file_location('run', dir_path+'\\5057_Basic_Flow_Acquire')
module = importlib.util.module_from_spec(spec)
PyCharm is reporting these errors:
Case 1:
ModuleNotFoundError: No module named '5057_Basic_Flow_Acquire'
Case 2:
AttributeError: 'NoneType' object has no attribute 'loader'
So in both cases the module was not found.
Does anyone have a suggestion?
You should rename your files to something like:
5057_Basic_Flow_Acquire_xyz.py
5006_Basic_Flow_Execute_xyz.py
so that they are recognized as modules and can be imported.
Then, in your 5006_Basic_Flow_Execute_xyz module, you can import the run function from 5057_Basic_Flow_Acquire_xyz. Because the module name starts with a digit, a plain "from 5057_Basic_Flow_Acquire_xyz import run" statement will not parse, so load it through importlib instead (see the sketch below).
Both files are in the same directory, so you should be able to import without touching sys.path, as the directory of the script being run is automatically added at the top of the list.
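One option (a sketch, not the only way) is to load the module by its string name with importlib and call the function off the module object:

import importlib

# loads 5057_Basic_Flow_Acquire_xyz.py from the script's own directory
acquire = importlib.import_module("5057_Basic_Flow_Acquire_xyz")
acquire.run()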
Here's my folder structure:
src
  -> deployment_pipeline
       -> __init__.py, train_pipeline.py
  -> dags
       -> __init__.py, airflow_dag.py
  -> db_connector_mlflow
       -> __init__.py, db_connector_mlflow.py
Now, I am trying to import a function start_final_train from train_pipeline.py (inside the deployment_pipeline folder) into airflow_dag.py, and likewise to import from db_connector_mlflow.py (inside the db_connector_mlflow folder) into airflow_dag.py.
My import statement:
from deployment_pipeline import start_final_train
But I keep getting this error:
ModuleNotFoundError: No module named 'deployment_pipeline'
Imports must be globally installed, or live in the same directory as your main file (or a subdirectory of it).
If you move your main file to the src folder and run everything from there, it works out.
Your main file should import:
from dags.airflow_dag import <stuff you need from airflow_dag.py>
....
You should keep the same structure in airflow_dag.py to import your function (as if you were importing from src):
from deployment_pipeline.train_pipeline import start_final_train
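Put together, airflow_dag.py might start like this sketch; it assumes everything is launched from the src directory (or that src is otherwise on sys.path), and what you pull out of db_connector_mlflow.py depends on what that module actually defines:

# src/dags/airflow_dag.py
from deployment_pipeline.train_pipeline import start_final_train
from db_connector_mlflow import db_connector_mlflow  # then call db_connector_mlflow.<your function>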
I need to dynamically import modules into my project from another package.
The structure is like:
project_folder/
    project/
        __init__.py
        __main__.py
    plugins/
        __init__.py
        plugin1/
            __init__.py
            ...
        plugin2/
            __init__.py
            ...
I made this function to load a module:
import os
from importlib.util import spec_from_file_location, module_from_spec

def load_module(path, name=""):
    """ loads a module by path """
    try:
        name = name if name != "" else path.split(os.sep)[-1]  # take the module name by default
        spec = spec_from_file_location(name, os.path.join(path, "__init__.py"))
        plugin_module = module_from_spec(spec)
        spec.loader.exec_module(plugin_module)
        return plugin_module
    except Exception as e:
        print("failed to load module", path, "-->", e)
It works, unless the module uses relative imports:
failed to load module /path/to/plugins/plugin1 --> Parent module 'plugin1' not loaded, cannot perform relative import
What am I doing wrong?
I managed to solve my own issue after a LOT of googling. It turns out I needed to import the module by its dotted package path rather than by file path:
>>> from importlib import import_module
>>> config = import_module("plugins.config")
>>> config
<module 'plugins.config' from '/path/to/plugins/config/__init__.py'>
>>>
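For completeness, here is a sketch of how the original load-everything goal could be written on top of import_module; the load_plugins name and its parameters are made up for illustration, and it assumes project_folder is on sys.path so that plugins is importable as a package:

import os
from importlib import import_module

def load_plugins(plugins_dir, package="plugins"):
    """Import every sub-package of the plugins package by its dotted name,
    so relative imports inside each plugin resolve against a loaded parent."""
    modules = {}
    for entry in os.listdir(plugins_dir):
        # only treat directories with an __init__.py as plugins
        if os.path.isfile(os.path.join(plugins_dir, entry, "__init__.py")):
            modules[entry] = import_module(package + "." + entry)
    return modules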
I had a similar problem not long ago. I added the path of the project folder to the sys.path using the module's absolute path like this:
import sys
import os
sys.path.append(os.path.dirname(os.path.realpath(__file__))+'/..')
This adds project_folder to sys.path, thus allowing the import statement to find the plugin modules.
In a little project I have the following path structure:
main.py
Data/
    __init__.py
    Actions_2016_01.py
    Actions_2016_02.py
    Actions_2016_03.py
    ... and so on ...
Every 'Actions_date.py' file includes a list called
data = [something_1, something_2, ...]
Now I am trying to get all the data lists of the 'Actions_date.py' files into a single list in the 'main.py' file.
I tried something like
files = os.listdir(os.path.join(os.path.dirname(__name__), 'Data'))
all_data = []
for name in files:
    if name.startswith('Actions'):
        import Data.name
        all_data.extend(name.data)
But this doesn't work at all... I'm getting
ImportError: No module named 'Data.name'
as output.
I found a solution: simply use the importlib module.
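In case it helps someone else, the main.py loop probably ends up looking roughly like this sketch (the variable names here are just illustrative):

import os
from importlib import import_module

data_dir = os.path.join(os.path.dirname(__file__), 'Data')
all_data = []
for name in os.listdir(data_dir):
    if name.startswith('Actions') and name.endswith('.py'):
        module = import_module('Data.' + name[:-3])  # e.g. Data.Actions_2016_01
        all_data.extend(module.data)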
You could make your Data package's __init__.py do the work:
def _import_modules():
    """ Dynamically import certain modules in the package, extract data in each
        of them, and store it in a module global named all_data.
    """
    from fnmatch import fnmatch
    import traceback
    import os

    global __all__
    __all__ = []
    global all_data
    all_data = []
    globals_, locals_ = globals(), locals()

    # dynamically import the desired package modules
    for filename in os.listdir(os.path.join(os.path.dirname(__name__), 'Data')):
        # process desired python files in directory
        if fnmatch(filename, 'Actions*.py'):
            modulename = filename.split('.')[0]  # filename without extension
            package_module = '.'.join([__name__, modulename])
            try:
                module = __import__(package_module, globals_, locals_, [modulename])
            except:
                traceback.print_exc()
                raise
            all_data.extend(module.data)

    __all__.append('all_data')

_import_modules()
Which would allow you to do this in main.py:
import Data
print(Data.all_data)
This is an adaptation of my answer to the question How to import members of modules within a package?