My project is structured as follows:
$ tree . -I venv
.
├── mydir
│ └── __init__.py
├── myotherdir
│ └── t.py
└── t.py
2 directories, 3 files
Both t.py and myotherdir/t.py have the same content:
$ cat t.py
import mydir
$ cat myotherdir/t.py
import mydir
Now, if I run mypy t.py, then all works fine:
$ mypy t.py
Success: no issues found in 1 source file
However, if I run it from inside myotherdir, then it's unable to find mydir:
$ cd myotherdir/
$ mypy t.py
t.py:1: error: Cannot find implementation or library stub for module named 'mydir'
t.py:1: note: See https://mypy.readthedocs.io/en/latest/running_mypy.html#missing-imports
Found 1 error in 1 file (checked 1 source file)
I expected to be able to solve this by modifying PYTHONPATH; however, that didn't work:
$ PYTHONPATH=.. mypy t.py
t.py:1: error: Cannot find implementation or library stub for module named 'mydir'
t.py:1: note: See https://mypy.readthedocs.io/en/latest/running_mypy.html#missing-imports
Found 1 error in 1 file (checked 1 source file)
How can I let mypy recognise mydir when I'm running inside myotherdir?
I have a project for developing a Python package where the structure of the project is similar to the following:
myproject/
├── README.md
├── examples/
│ ├── ex1.py
│ └── ex2.py
├── pyproject.toml
├── src/
│ └── mypackage/
│ ├── __init__.py
│ ├── adder.py
│ └── divider.py
└── tests/
├── test_adder.py
├── test_divider.py
└── test_examples.py
The project is for developing a Python package named mypackage, which is located in the src directory. The package is uploaded to PyPI, where users can pip install it. Tests for the package are run with pytest and live in the tests directory. Examples of using the package are in the examples directory. The examples are just scripts, as shown below for ex1.py:
"""
Example 1
"""
from mypackage import adder
x = 2.5
y = 8
a = adder(x, y)
print('a is', a)
The purpose of test_examples.py is to test the example files; its contents are shown below:
from examples import ex1
from examples import ex2
def test_ex1():
    ex1

def test_ex2():
    ex2
When I run pytest in the myproject directory I get the error shown here:
$ cd myproject
$ pytest
platform darwin -- Python 3.10.6, pytest-7.1.2, pluggy-1.0.0
rootdir: /Users/gavinw/Desktop/test-examples
collected 2 items / 1 error
================================================================== ERRORS ==================================================================
_________________________________________________ ERROR collecting tests/test_examples.py __________________________________________________
ImportError while importing test module '/Users/gavinw/Desktop/test-examples/tests/test_examples.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/opt/miniconda3/envs/ztest/lib/python3.10/importlib/__init__.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/test_examples.py:1: in <module>
from examples import ex1
E ModuleNotFoundError: No module named 'examples'
========================================================= short test summary info ==========================================================
ERROR tests/test_examples.py
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
============================================================= 1 error in 0.05s =============================================================
It looks like pytest is not able to run the files in the examples directory because the location does not allow them to be imported. Any suggestions on how I can test the example files? Should I even use pytest for testing examples or is there a different testing tool for this?
You should ensure that the examples directory contains __init__.py so it can be imported correctly.
If that is not enough, you can use PYTHONPATH:
PYTHONPATH="/path/to/your/code/project/:/path/to/your/code/project/examples" pytest
PYTHONPATH might be tricky, see https://stackoverflow.com/a/4580120/3800552 and https://stackoverflow.com/a/39682723/3800552 for some usage examples.
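To see why this works: every entry in PYTHONPATH is added to sys.path of the interpreter it launches, which is what lets top-level packages such as examples be imported. A minimal sketch (using a throwaway temporary directory in place of a real project root):

```python
import os
import subprocess
import sys
import tempfile

# Sketch: any directory listed in PYTHONPATH shows up in sys.path of the
# child interpreter. The temporary directory stands in for a project root.
with tempfile.TemporaryDirectory() as project_root:
    env = dict(os.environ, PYTHONPATH=project_root)
    result = subprocess.run(
        [sys.executable, "-c",
         "import os, sys; print(os.environ['PYTHONPATH'] in sys.path)"],
        env=env, capture_output=True, text=True,
    )
    print(result.stdout.strip())  # True
```

The same mechanism applies whether the child process is plain python or pytest.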
There are two approaches that fix the problem. The first approach is to run pytest using the following command:
python -m pytest
The second approach is to add the project directory to the interpreter's search path via the pythonpath setting in the pyproject.toml file, as shown below. Then just use the pytest command to run the tests.
[tool.pytest.ini_options]
pythonpath = ["."]
src/online_serving/main.py cannot import lib/common/util_string.py, although both live under the same /app directory in which the Python process is executed.
/app# python3 src/online_serving/main.py
Traceback (most recent call last):
File "/app/src/online_serving/main.py", line 18, in <module>
from lib.common import util_string
ModuleNotFoundError: No module named 'lib'
My directory layout is fairly straightforward. The Docker image is created using the Dockerfile at the BASE directory.
BASE
├── Dockerfile
├── lib
│ ├── __init__.py
│ ├── common
│ │ ├── __init__.py
│ │ └── util_string.py
├── src
│ ├── online_serving
│ │ ├── __init__.py
│ │ ├── main.py
My Dockerfile contains:
FROM python:3.9-slim
ENV APP_HOME /app
WORKDIR $APP_HOME
COPY . ./
ENTRYPOINT gunicorn --bind :$PORT --timeout 120 src.online_serving.main:app
main.py imports lib.common.util_string, which is what causes the error:
from flask import (
Flask,
request,
Response
)
from lib.common import util_string
# Flask runtime
app = Flask(__name__)
As a workaround, I can add this line to the Dockerfile:
ENV PYTHONPATH /app
But why isn't the PYTHONPATH automatically set to the /app directory in the container?
When the Python interpreter is started with python a/b/c.py, the current directory is not added to sys.path; the directory containing the script c.py is.
$ pwd
/app
$ tree
a
└── b
└── c.py
$ cat a/b/c.py
import sys
for p in sys.path:
    print(p)
$ python3 a/b/c.py
/app/a/b # <--- path[0] is the directory of the script used to invoke the interpreter.
/usr/local/lib/python39.zip
/usr/local/lib/python3.9
/usr/local/lib/python3.9/lib-dynload
/usr/local/lib/python3.9/site-packages
sys.path
A list of strings that specifies the search path for modules. Initialized from the environment variable PYTHONPATH, plus an installation-dependent default.
As initialized upon program startup, the first item of this list, path[0], is the directory containing the script that was used to invoke the Python interpreter. If the script directory is not available (e.g. if the interpreter is invoked interactively or if the script is read from standard input), path[0] is the empty string, which directs Python to search modules in the current directory first.
Notice that the script directory is inserted before the entries inserted as a result of PYTHONPATH.
To add the current directory to sys.path so that a module is searched for in the current directory tree, you need to use PYTHONPATH or start the interpreter with the -m option.
-m
If this option is given, the first element of sys.argv will be the full path to the module file (while the module file is being located, the first element will be set to "-m"). As with the -c option, the current directory will be added to the start of sys.path.
$ pwd
/app
$ tree
a
└── b
└── c.py
$ export PYTHONPATH=${PYTHONPATH}:/app
$ python a/b/c.py
/app/a/b # <--- path[0] is the directory of the script used to invoke the interpreter.
/app # <--- then from the environment variable PYTHONPATH, plus an installation-dependent default.
/usr/local/lib/python39.zip
/usr/local/lib/python3.9
/usr/local/lib/python3.9/lib-dynload
/usr/local/lib/python3.9/site-packages
Or
$ pwd
/app
$ tree
a
└── b
└── c.py
$ cat a/b/c.py
import sys
for p in sys.path:
    print(p)
$ python3 -m a.b.c
/app # <--- Current directory is path[0] for python -m <module> invocation
/usr/local/lib/python39.zip
/usr/local/lib/python3.9
/usr/local/lib/python3.9/lib-dynload
/usr/local/lib/python3.9/site-packages
While working on a Python FastAPI project using Pipenv and Pytest, I was asked to write a Pipenv script to run the tests.
Project has the following structure:
.
├── app
│   ├── main.py
│   ├── my_package
│   │   ├── __init__.py
│   │   ├── (project folders)
│   │   └── tests
│   │       └── tests.py
│   └── __pycache__
└── (deployment scripts and folders, Pipfile, Dockerfile, etc)
I'd like to run Pytest on tests.py from the top level (./app/.../tests.py), at least for now.
As recommended here, I currently run them from the top with:
( cd app ; python -m pytest my_package/tests/tests.py )
... which works as expected.
However, when I add that to my Pipfile [scripts] section:
[scripts]
my_script = "( cd app ; python -m pytest my_package/tests/tests.py )"
... run it with:
pipenv run my_script
I get the error:
Error: the command ( (from my_script) could not be found within PATH.
I've also tried:
[scripts]
my_script = "cd app && python -m pytest my_package/tests/tests.py"
... which returns another similar error:
Error: the command cd (from my_script) could not be found within PATH.
So it's clear I'm wrong to treat these as bash aliases.
I've tried searching for more documentation on how the [scripts] section works, but I've had no luck (yet).
I'm not familiar with the tools you're using, but the error message suggests that it's looking for an executable. ( is part of shell syntax, and cd is a shell builtin, not an executable. Try this:
my_script = "bash -c 'cd app && python -m pytest my_package/tests/tests.py'"
Here bash is the executable, and -c makes it run your snippet.
BTW, keep in mind that cd can fail, so the script should bail out if it does.
I am learning Test Driven development for the first time. I have no experience of software development, but have some experience with scripting.
I have been following LinuxAcademy Python 3 for Sys Admin tutorial.
I created the following structure,
├── Makefile
├── Pipfile
├── Pipfile.lock
├── README.rst
├── setup.py
├── src
│ └── pgbackup
│ ├── cli.py
│ └── __init__.py
└── tests
└── test_cli.py
setup.py file,
from setuptools import setup, find_packages
with open('README.rst', 'r') as f:
readme = f.read()
setup(
name='pgbackup',
version='0.1.0',
description='Database backups locally or to AWS S3.',
long_description=readme,
author='Keith Thompson',
author_email='keith@linuxacademy.com',
packages=find_packages('src'),
package_dir={'': 'src'},
)
Makefile file,
.PHONY: install test

default: test

install:
	pipenv install --dev --skip-lock

test:
	PYTHONPATH=./src pytest
tests/test_cli.py file,
import pytest
from pgbackup import cli
def test_helloworld():
    """
    JUST A HELLO WORLD TEST
    """
    assert cli.hello() == "helloworld"
and src/pgbackup/cli.py file,
def hello():
    return "helloworld"
I wrote helloworld as my first sample test; it is not part of the tutorial. Now when I run the make command from the project root directory, my test passes:
========================================== test session starts ===========================================
platform linux -- Python 3.6.6, pytest-3.8.0, py-1.6.0, pluggy-0.7.1
rootdir: /root/code/pgbackup, inifile:
collected 1 item
tests/test_cli.py . [100%]
======================================== 1 passed in 0.04 seconds ========================================
I know the make command is setting PYTHONPATH to ./src before running pytest, but I can't get my head around how it runs the actual test. I know it only sets a search path for importing Python modules.
If I try to run the pytest command from the tests dir, my test fails:
================================================= ERRORS =================================================
___________________________________ ERROR collecting tests/test_cli.py ___________________________________
ImportError while importing test module '/root/code/pgbackup/tests/test_cli.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
test_cli.py:2: in <module>
from pgbackup import cli
E ModuleNotFoundError: No module named 'pgbackup'
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 1 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
======================================== 1 error in 0.35 seconds =========================================
If I run pytest from the src dir, it doesn't run anything:
====================================== no tests ran in 0.01 seconds ======================================
Can someone please explain how running make runs the test since Makefile is just setting PYTHONPATH variable?
Can someone please explain how running make runs the test since Makefile is just setting PYTHONPATH variable?
It's not only setting a variable; pytest is being run too, because your current line for test:
PYTHONPATH=./src pytest
is equivalent to:
export PYTHONPATH=./src; pytest
Check the 3rd example here: Command not found error in Bash variable assignment, for an explanation.
You will probably want to adjust it accordingly.
If I try to run pytest command from tests dir, my test fails
As for running pytest from different directories: indeed, this gives different results. pytest searches for tests (it collects files named test_* under the current directory). So if you run it inside src, it won't find any tests.
If I try to run pytest command from tests dir, my test fails
pgbackup seems to be imported relative to the current directory, so when you move inside the tests folder it won't be found.
Hope this helps.
I know that when a Python script is imported by another Python script, a .pyc file is created. Is there any other way to create a .pyc file from the Linux bash terminal?
Use the following command:
python -m compileall <your_script.py>
On Python 2 this creates your_script.pyc next to the source file; on Python 3 the compiled file goes into a __pycache__ subdirectory.
You can also pass a directory:
python -m compileall <directory>
This will create .pyc files for all .py files in the directory.
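If you prefer to stay inside Python, the compileall module exposes the same functionality as compile_dir(), mirroring python -m compileall <directory>. A minimal sketch against a throwaway directory:

```python
import compileall
import os
import tempfile

# Sketch: compileall.compile_dir() byte-compiles every .py file under a
# directory; the compiled files land in a __pycache__ subdirectory.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "your_script.py"), "w") as f:
        f.write("print('hello')\n")
    ok = compileall.compile_dir(d, quiet=1)  # true value on success
    cache = os.listdir(os.path.join(d, "__pycache__"))
    print(ok, cache)
```

quiet=1 suppresses the per-file listing that compile_dir normally prints.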
Another way is to do it from a script:
import py_compile
py_compile.compile("your_script.py")
This also creates the compiled file for your_script.py. You can take the file name as a command-line argument.
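A small wrapper along these lines could do that (the script name and argument handling are my own sketch, not from the original answer):

```python
import py_compile
import sys

def compile_file(path: str) -> str:
    # py_compile.compile() returns the path of the generated .pyc file
    # (placed under __pycache__ on Python 3).
    return py_compile.compile(path)

if __name__ == "__main__" and len(sys.argv) > 1:
    # Usage (hypothetical script name):
    #   python compile_it.py your_script.py
    print(compile_file(sys.argv[1]))
```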
You could use the py_compile module. Run it from the command line (-m option):
When this module is run as a script, the main() is used to compile all
the files named on the command line.
Example:
$ tree
.
└── script.py
0 directories, 1 file
$ python3 -mpy_compile script.py
$ tree
.
├── __pycache__
│ └── script.cpython-34.pyc
└── script.py
1 directory, 2 files
compileall provides similar functionality, to use it you'd do something like
$ python3 -m compileall ...
Where ... are files to compile or directories that contain source files, traversed recursively.
Another option is to import the module:
$ tree
.
├── module.py
├── __pycache__
│ └── script.cpython-34.pyc
└── script.py
1 directory, 3 files
$ python3 -c 'import module'
$ tree
.
├── module.py
├── __pycache__
│ ├── module.cpython-34.pyc
│ └── script.cpython-34.pyc
└── script.py
1 directory, 4 files
-c 'import module' is different from -m module, because the former won't execute the if __name__ == '__main__': block in module.py.
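That difference is easy to verify empirically. Here is a sketch (the module name mod is made up) that runs a tiny module both ways and checks which invocation executes the __main__ block:

```python
import os
import subprocess
import sys
import tempfile

# Sketch: a module whose __main__ block prints a marker. Importing it via
# `python -c "import mod"` leaves __name__ == "mod", so the marker is not
# printed; `python -m mod` runs it as __main__, so it is.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "mod.py"), "w") as f:
        f.write("print('imported')\n")
        f.write("if __name__ == '__main__':\n")
        f.write("    print('ran as main')\n")
    via_c = subprocess.run([sys.executable, "-c", "import mod"],
                           cwd=d, capture_output=True, text=True).stdout
    via_m = subprocess.run([sys.executable, "-m", "mod"],
                           cwd=d, capture_output=True, text=True).stdout
    print("ran as main" in via_c)  # False
    print("ran as main" in via_m)  # True
```

Both invocations find mod because the current directory is on sys.path in each case; only the -m form treats it as the main module.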