Import many Python files in a bash script - python

I have a problem with imports when running Python files from a bash script.
My project directory looks like this:
project
├── folder1
│   └── file1.py
├── folder2
│   └── file2.py
├── data
│   └── example.txt
├── utils
│   ├── __init__.py
│   └── globals.py
└── run.sh
The globals.py file contains some global config variables that are used throughout the project.
I want to write a bash script, run.sh, to run all the .py files like this:
conda activate pyenv
# Step 1.1
python folder1/file1.py data/example.txt >> data/output_1_1.txt
# Step 1.2
python folder2/file2.py data/output_1_1.txt >> data/output_1_2.txt
But then I got this error:
Traceback (most recent call last):
  File "folder2/file2.py", line 15, in <module>
    from utils import globals
ModuleNotFoundError: No module named 'utils'
This is my file2.py where I import utils.globals:
import sys
sys.path.append("../")  # add the parent directory (which contains utils) to sys.path
from utils import globals
When I run each file individually, it works fine, but I don't know why it doesn't work when I run source run.sh.
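Aside: sys.path.append("../") is resolved against the current working directory, not against the location of file2.py, which is why the result depends on where the script is launched from. A minimal sketch of a cwd-independent variant, anchoring on the file's own location (not the approach I ended up using below):
import os
import sys

# Anchor on this file's directory so the parent folder (which contains
# utils) lands on sys.path no matter where the interpreter was launched from.
_here = os.path.dirname(os.path.abspath(__file__))
sys.path.append(os.path.join(_here, ".."))

from utils import globals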

As this answer points out, it is not good to use a relative path when the project structure is complex. So I changed my project directory to the following to make it a package, as @bigbounty suggested:
project
├── package
│   ├── folder1
│   │   └── file1.py
│   ├── folder2
│   │   └── file2.py
│   ├── data
│   │   └── example.txt
│   └── utils
│       ├── __init__.py
│       └── globals.py
└── run.sh
In file2.py, change the import to this:
from package.utils import globals
The run.sh file:
conda activate pyenv
# Step 1.1
python -m package.folder1.file1 package/data/example.txt >> package/data/output_1_1.txt
# Step 1.2
python -m package.folder2.file2 package/data/output_1_1.txt >> package/data/output_1_2.txt
To run the bash script, cd to the project directory:
(base) user@user1:~/project$ source run.sh
Then it works as expected, since python -m adds the current directory (~/project) to sys.path, so package is importable.

Related

ImportError: No module named - unable to import top-level directory of module in Python

My directory structure:
├── common
│   ├── common.py
│   └── __init__.py
├── project1
│   ├── __init__.py
│   └── scripts
│       ├── example_import.py
│       └── __init__.py
└── project2
    ├── __init__.py
    └── scripts
        └── __init__.py
I need to import the common/common.py module in the project1/scripts/example_import.py file.
example_import.py:
import sys
sys.path.append("../common")
from common import Test
print("Module Not import error")
Error:
Traceback (most recent call last):
  File "project1/scripts/example_import.py", line 3, in <module>
    from common import Test
ImportError: No module named common
How do I fix this issue?
Understanding how Python imports work is tricky in the beginning, but it makes sense once you understand the mechanism.
There are different ways to fix your import issue. I would not recommend messing with sys.path. Depending on where you are calling your script from, you have multiple choices at hand.
Your directory structure:
├── common
│   ├── common.py
│   └── __init__.py
├── project1
│   ├── __init__.py
│   └── scripts
│       ├── example_import.py
│       └── __init__.py
└── project2
    ├── __init__.py
    └── scripts
        └── __init__.py
From the root of the directory,
python -m project1.scripts.example_import
will work, assuming that the import in example_import.py looks like
from common.common import Test
(Running the file directly as python project1/scripts/example_import.py puts project1/scripts on sys.path rather than the root, so the absolute import would not resolve that way.)
If you want to use from common import Test, you need to add from common.common import Test to the __init__.py file in the common folder.
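For example, common/__init__.py could re-export the name (a minimal sketch; Test is assumed to be defined in common/common.py as in the question):
# common/__init__.py
# Re-export Test so that `from common import Test` works for callers.
from common.common import Test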
You can also use the PYTHONPATH environment variable to tell Python where to look for the base module.
PYTHONPATH=/pathto/basefolder python folder/filesx.py
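Applied to this layout (assuming you run it from the root directory that contains common and project1):
PYTHONPATH=. python project1/scripts/example_import.py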
Another way is to create a setup.py at the base and do a development install with python -m pip install --editable . in the environment you are working on.
Example of setup.py
#!/usr/bin/env python
from distutils.core import setup

setup(name='projectX',
      version='1.0',
      description='The World Is Not Enough',
      author='James Bond',
      author_email='agent@007.net',
      url='https://www.python.org/sigs/distutils-sig/',
      packages=['common', 'project1', 'project2'],
      )
See Python Documentation for more setup.py options.

ImportError from tests directory in Python

I have the following project structure:
python_project
├── module
│   ├── constants.py
│   └── __init__.py
├── scripts
│   ├── __init__.py
│   └── script.py
└── tests
    ├── constants.py
    └── __init__.py
python_project is in PYTHONPATH.
module/constants.py:
VAR_MODULE = 25
tests/constants.py:
VAR = 17
I'm facing the following issue in the script.py file:
from module.constants import VAR_MODULE
works, but
from tests.constants import VAR
throws the following exception:
    from tests.constants import VAR
ImportError: cannot import name 'VAR'
I know that there is no point in importing stuff from the tests directory; I'm just wondering why this does not work. Is the tests directory excluded somehow?
Thanks!
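A quick way to narrow this down (a hedged diagnostic sketch; one common cause of this symptom is a different tests package elsewhere on sys.path shadowing the local one):
import tests.constants

# If this prints a path outside python_project, another 'tests' package
# on sys.path is shadowing the local one, and VAR is simply not defined there.
print(tests.constants.__file__)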

How can I import my own package in Python?

I have a project
testci/
├── __init__.py
├── README.md
├── requirements.txt
├── src
│   ├── __init__.py
│   └── mylib.py
└── test
    ├── __init__.py
    └── pow_test.py
When I run python3.6 test/pow_test.py I see an error:
File "test/pow_test.py", line 3, in
import testci.src.mylib as mylib
ModuleNotFoundError: No module named 'testci'
pow_test.py
from testci.src.mylib import get_abs
def test_abs():
    assert get_abs(-10) == 10
How can I fix this error?
System details: Ubuntu 16.04 LTS, Python 3.6.10
Try this:
from .src import mylib
from mylib import get_abs
If that doesn't work, import them one by one. But don't import the root folder: since the file you are importing into is in the same folder as the one you are trying to import, it will always raise an error.
Run Python with the -m argument within the base testci package to execute it as a submodule.
I made a similar mock folder structure:
├───abc_blah
│       abc_blah.py
│       __init__.py
│
└───def
        def.py
        __init__.py
abc_blah.py
print('abc')
def.py
import abc_blah.abc_blah
Execute like such:
python -m def.def
Correctly prints out 'abc' as expected here.
Simply add __package__ = "testci"; it is also good practice to add a try/except block.
Your final code should look something like:
try:
    from testci.src.mylib import get_abs
except ModuleNotFoundError:
    from ..testci.src.mylib import get_abs
To run it, type python -m test.pow_test
I think your issue is how the package is installed; the import looks fine to me. Since it says CI, I'm guessing you have the package installed remotely with only the test folder somehow.
Try adding a setup.py file where you define that both the test and the src packages are part of your testci package, as sketched below.
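A minimal sketch of such a setup.py, assuming it sits in the directory that contains testci/ and that the __init__.py files from the tree above are in place (setuptools is used here, and the version number is a placeholder):
#!/usr/bin/env python
from setuptools import setup

# Declare testci plus its src and test subpackages so that an editable
# install (python -m pip install --editable .) makes them importable.
setup(name='testci',
      version='0.1',
      packages=['testci', 'testci.src', 'testci.test'],
      )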
There are many ways to organize a project. Keep in mind that the structure should be simple and scalable, so that the parts of the codebase can be differentiated easily.
One good possible way to structure a project is below:
project/
├── app.py
├── dockerfile
├── pipfile
├── Readme.md
├── requirements.txt
├── src_code
│   ├── code
│   │   ├── __init__.py
│   │   └── mylib.py
│   └── test
│       ├── __init__.py
│       └── test_func.py
└── travisfile
Here app.py is the main file, which is responsible for running the entire project.
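For instance, with that layout app.py can use plain absolute imports (a sketch; get_abs is a hypothetical helper carried over from the question's mylib.py):
# app.py
from src_code.code.mylib import get_abs  # hypothetical helper in mylib.py

if __name__ == '__main__':
    print(get_abs(-10))  # expected to print 10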

Why is absolute import failing with Python 2, but succeeding with Python 3?

While fiddling with the Python import system, I noticed that this form of absolute import works well with Python 3.6.8 but throws an ImportError with Python 2.7.17. The package structure is as follows:
├── main8.py
├── pkg_a
│   ├── __init__.py
│   ├── mod7.py
│   ├── pkg_c
│   │   ├── __init__.py
│   │   ├── mod2.py
main8.py
import pkg_a.mod7
pkg_a/mod7.py
import pkg_a.pkg_c.mod2
pkg_a/pkg_c/mod2.py
print('Imported pkg_a.pkg_c.mod2')
If I execute main8.py with Python 3, pkg_a.pkg_c.mod2 gets imported successfully.
$ python3 main8.py
Imported pkg_a.pkg_c.mod2
However, if I execute main8.py with Python 2, it throws an ImportError.
$ python2 main8.py
Traceback (most recent call last):
  File "main8.py", line 1, in <module>
    import pkg_a.mod7
  File "pkg_a/mod7.py", line 1, in <module>
    import pkg_a.pkg_c.mod2
ImportError: No module named pkg_c.mod2
Adding the from __future__ import absolute_import directive at the top of main8.py and pkg_a/mod7.py didn't help. Can anyone please explain why the Python 2 import behaves like this?
For Python 2 you need to have an __init__.py next to main8.py to make a package:
.
├── __init__.py
├── main8.py
└── pkg_a
    ├── __init__.py
    ├── __init__.pyc
    ├── mod7.py
    ├── mod7.pyc
    └── pkg_c
        ├── __init__.py
        ├── __init__.pyc
        ├── mod2.py
        └── mod2.pyc

2 directories, 10 files
Running:
>> /usr/bin/python2.7 ./main8.py
Imported pkg_a.pkg_c.mod2
>> python3 ./main8.py
Imported pkg_a.pkg_c.mod2
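As an aside, an explicit relative import in pkg_a/mod7.py is another way to get identical behavior on both versions (a sketch; explicit relative imports work on Python 2.6+ and Python 3):
# pkg_a/mod7.py
from __future__ import absolute_import
from .pkg_c import mod2  # explicit relative import, same meaning on Py2 and Py3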

Absolute import results in ModuleNotFoundError

Python 3.6
I've written some components and I'm trying to import one of them into the other.
Below is what my project structure looks like:
.
└── components
    ├── __init__.py
    ├── extract
    │   └── python3
    │       ├── __init__.py
    │       └── extract.py
    └── transform
        └── python3
            ├── __init__.py
            └── preprocess.py
extract.py
from components.transform.python3.preprocess import my_function

if __name__ == '__main__':
    my_function()
preprocess.py
def my_function():
    print("Found me")
When I run python components/extract/python3/extract.py
I see the following error:
ModuleNotFoundError: No module named 'components'
I've added an empty __init__.py file to the directories that contain modules as well as the top level package directory.
OK, absolute imports require the top-level package to be available on the Python path (sys.path). So to make it work, you should:
1. cd to the directory containing components
2. add . to the Python path:
export PYTHONPATH='.'
3. launch your script:
python components/extract/python3/extract.py
On my system, it successfully displays:
Found me
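Alternatively (a sketch, assuming the same starting directory), you can run the file as a module, which puts the current directory on sys.path without touching PYTHONPATH:
python -m components.extract.python3.extract
On Python 3.3+ the intermediate extract directory without an __init__.py is picked up as a namespace package, so this resolves as well.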
