pytest-cov is not reading its settings from the pyproject.toml file. I am using nox, so I run the tests with:
python3 -m nox
It seems I have the same issue even without nox.
In fact, after running a poetry install:
poetry run pytest --cov=src passes the test
poetry run pytest --cov does not pass the test
In particular, when the test fails I get the following output (cut down to the most important parts):
WARNING: Failed to generate report: No data to report.
/Users/matteo/Library/Caches/pypoetry/virtualenvs/project-Nz69kfmJ-py3.7/lib/python3.7/site-packages/pytest_cov/plugin.py:271: PytestWarning: Failed to generate report: No data to report.
self.cov_controller.finish()
---------- coverage: platform darwin, python 3.7.7-final-0 -----------
FAIL Required test coverage of 100.0% not reached. Total coverage: 0.00%
Code with a reproducible error is here.
To run it you'll need to install poetry and nox.
Turning the comment into an answer:
Check the current treatment of the src directory. Right now, it seems to be a namespace package, which is not what you intend. Either switch to the src layout:
# pyproject.toml
[tool.poetry]
...
packages = [
    { include = 'project', from = 'src' }
]

[tool.coverage.run]
...
source = ['project']
and fix the import in test_code.py:
from src.project import code
to
from project import code
or remove the src dir:
rootdir
├── project
│ └── __init__.py
└── tests
└── test_code.py
and fix the import in test_code.py.
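For reference, with the first option the resulting src layout would look roughly like this (assuming the package is named project, as in the snippets above):
rootdir
├── pyproject.toml
├── src
│   └── project
│       └── __init__.py
└── tests
    └── test_code.py
After a fresh poetry install, both poetry run pytest --cov and the nox session should then pick up the [tool.coverage.run] settings from pyproject.toml.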
Related
I've coded my python project and have succeeded in publishing it to test pypi. However, now I can't figure out how to correctly configure it as a console script. Upon running my_project on the command line, I get the following stack trace:
Traceback (most recent call last):
File "/home/thatcoolcoder/.local/bin/my_project", line 5, in <module>
from my_project.__main__ import main
ModuleNotFoundError: No module named 'my_project'
Clearly, it's created a script to run but the script is then failing to import my actual code.
Folder structure:
pyproject.toml
setup.cfg
my_project
├── __init__.py (empty)
├── __main__.py
Relevant sections of setup.cfg:
[metadata]
name = my-project
version = 1.0.5
...

[options]
package_dir =
    = my_project
packages = find:
...

[options.packages.find]
where = my_project

[options.entry_points]
console_scripts =
    my_project = my_project.__main__:main
pyproject.toml (probably not relevant)
[build-system]
requires = [
    "setuptools>=42",
    "wheel"
]
__main__.py:
from my_project import foo

def main():
    foo.bar()

if __name__ == '__main__':
    main()
To build and upload, I'm running the following: (python is python 3.10)
python -m build
python -m twine upload --repository testpypi dist/*
Then to install and run:
pip install -i https://test.pypi.org/pypi/ --extra-index-url https://pypi.org/simple my-project --upgrade
my_project
How can I make this console script work?
Also, this current method of setting console_scripts only allows it to be run as my_project; is it possible to also make it work by python -m my_project? Or perhaps this will work once my main issue is fixed.
It's funny, but I had the same frustration when trying to install scripts on multiple platforms (as Python calls them: posix and nt).
So I wrote setup-py-script in 2020. It's up on github now.
It installs scripts that use their own modules as a self-contained zip-file. (This method was inspired by youtube-dl.) That means no more leftover files when you delete a script but forget to remove the module et cetera.
It does not require root or administrator privileges; installation is done in user-accessible directories.
You might have to structure your project slightly differently; the script itself is not in the module directory. See the project README.
I finally got back to this problem today and it appears that I was using an incorrect source layout, which caused the pip module installation to not work. I switched to a directory structure like this one:
├── src
│ └── mypackage
│ ├── __init__.py
│ └── mod1.py
├── setup.py
└── setup.cfg
and modified the relevant parts of my setup.cfg:
[options]
package_dir =
    = src
packages = find:

[options.packages.find]
where = src
Then I can run it like python -m mypackage. This also made the console scripts work. It works on Linux but I presume it also works on other systems.
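For completeness, here is roughly how the relevant setup.cfg sections from the question look once combined with that layout (a sketch; the console_scripts entry itself is unchanged):
[options]
package_dir =
    = src
packages = find:

[options.packages.find]
where = src

[options.entry_points]
console_scripts =
    my_project = my_project.__main__:main
After rebuilding and reinstalling, both my_project and python -m my_project should resolve the package from site-packages.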
I'm migrating all my modules to Poetry and I have a problem.
Before, with python setup.py test, I was able to run my tests with the correct coverage information.
Now I'm moving to poetry, so my best option is poetry run pytest, or otherwise poetry install; pytest. In both cases, I have to specify the source location in Sonar to collect the coverage data. I would naturally just pass my src folder, but the references will be wrong because pytest runs against the code installed into the environment by poetry, not the local code as before, so the references will be mismatched. No amount of tinkering seems to be working.
So, is there a way with poetry to use the local references instead of the environment references when running with pytest? Or should I give up and use some weird trick with inspect to retrieve the path of the installed package in the site-packages folder?
Your current setup, where pytest is run against the installed package instead of the source files, is vastly preferable, since it simulates how the code will behave in use. Path errors, files that were not correctly marked/moved for install, or anything else that can go wrong during deployment will be encountered right away at no cost whatsoever.
It also helps give more accurate coverage, since e.g. any build files that are not part of the package will be ignored. All you need in order to tell coverage to look at the package instead of your source files is to tell it exactly that. Having this in your .coveragerc should be enough:
[run]
source = sample_project
Given a project structure like this[1]:
.
├── .coveragerc
├── src
│ └── sample_project
│ ├── __init__.py
│ └── util.py
└── tests
├── __init__.py
└── test_util.py
Running pytest --cov tests/ looks inside the installed package correctly:
Test session starts (platform: linux, Python 3.7.2, pytest 3.10.1, pytest-sugar 0.9.2)
rootdir: /home/user/dev/sample_project, inifile:
plugins: sugar-0.9.2, cov-2.7.1
collecting ...
tests/test_util.py ✓ 100% ██████████
----------- coverage: platform linux, python 3.7.2-final-0 -----------
Name Stmts Miss Cover
----------------------------------------
tests/__init__.py 0 0 100%
tests/test_util.py 6 0 100%
----------------------------------------
TOTAL 6 0 100%
Results (0.10s):
1 passed
[1] It might be important to split off the source code into a separate directory to avoid name shadowing (the import mechanism will prefer a local package foo in its PYTHONPATH, which the working directory is always part of, over an installed package foo). From your description, it seems that you're already doing that. If you aren't, consider setting up your project again with poetry new and its optional --src flag enabled.
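As a side note, if you would rather keep this setting in pyproject.toml instead of a separate .coveragerc (coverage.py 5.0 and later can read it from there, provided TOML support is available), the equivalent configuration would be roughly:
[tool.coverage.run]
source = ["sample_project"]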
I want my tests folder separate to my application code. My project structure is like so
myproject/
    myproject/
        myproject.py
        moduleone.py
    tests/
        myproject_test.py
myproject.py
from moduleone import ModuleOne
class MyProject(object):
....
myproject_test.py
from myproject.myproject import MyProject
import pytest
...
I use myproject.myproject since I use the command
python -m pytest
from the project root directory ./myproject/
However, then the imports within those modules fail with
E ModuleNotFoundError: No module named 'moduleone'
I am running Python 3.7 and have read that since 3.3, empty __init__ files are no longer needed, which means my project becomes an implicit namespace package.
However, I have tried adding an __init__.py file in myproject/myproject/ and also tried adding a conftest.py file in myproject/, but neither works.
I have read answers that say to mess with the paths and then upvoted comments in other questions saying not to.
What is the correct way and what am I missing?
EDIT:
Possibly related, I used a requirements.txt to install pytest using pip. Could this be related? And if so, what is the correct way to install pytest in this case?
EDIT 2:
One of the paths in sys.path is /usr/src/app/, which is a docker volume linked to /my/local/path/myproject/.
Should the volume be /my/local/path/myproject/myproject/ instead?
Not sure if this solution was specific to my problem, but I simply added __init__.py to my tests folder and that solved the problem.
Solution: use the PYTHONPATH env. var
PYTHONPATH=. pytest
As mentioned by @J_H, you need to explicitly add the root directory of your project, since pytest only adds to sys.path the directories where test files are (which is why @Mak2006's answer worked).
Good practice: use a Makefile or some other automation tool
If you do not want to type that long command all the time, one option is to create a Makefile in your project's root dir with, e.g., the following:
.PHONY: test
test:
	PYTHONPATH=. pytest
Which allows you to simply run:
make test
Another common alternative is to use some standard testing tool, such as tox.
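For example, a minimal tox.ini along these lines (a sketch; it assumes the package is installable and the tests live under tests/) installs the project into an isolated environment and runs pytest against it, so no PYTHONPATH tweaking is needed:
[tox]
envlist = py37

[testenv]
deps = pytest
commands = pytest tests/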
Be sure to include . dot in the $PYTHONPATH env var.
Use $ python -m site, or this code fragment to debug such issues:
import pprint
import sys
pprint.pprint(sys.path)
Your question managed to use myproject at three different levels. At least during debugging you might want to use three distinct names, to reduce possible confusion.
In my case I added an __init__.py to my test directory with this inside it:
import sys
sys.path.append('.')
My app code is at the same level as my test directory.
In my case it was because I installed pytest at the system level but not in my virtual environment.
You can test this by python -m pytest. If you see ModuleNotFoundError: No module named 'pytest' then your pytest is at the system level.
Installing pytest while the virtual environment is activated will fix this.
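A quick way to check which pytest you are actually getting (a small sketch; paths will differ per machine):
which pytest
python -c "import pytest; print(pytest.__file__)"
If the printed path points outside your virtual environment, activate the environment and install pytest there.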
I kept everything the same and just added a blank test file at the root folder, and that solved it.
Here are the findings; this problem really bugged me for a while.
My folder structure was
mathapp/
    - server.py
    - configuration.py
    - __init__.py
    - static/
        - home.html
tests/
    - functional
        - test_errors.py
    - unit
        - test_add.py
and pytest would complain with the ModuleNotFoundError.
I introduced a mock test file at the same level as the mathapp and tests directories. The file contained nothing. Now pytest does not complain.
Result without the file
$ pytest
============================= test session starts =============================
platform win32 -- Python 3.8.2, pytest-5.4.2, py-1.8.1, pluggy-0.13.1
rootdir: C:\mak2006\workspace\0github\python-rest-app-cont
collected 1 item / 1 error
=================================== ERRORS ====================================
_______________ ERROR collecting tests/functional/test_func.py ________________
ImportError while importing test module 'C:\mainak\workspace\0github\python-rest-app-cont\tests\functional\test_func.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
tests\functional\test_func.py:4: in <module>
from mathapp.service import sum
E ModuleNotFoundError: No module named 'mathapp'
=========================== short test summary info ===========================
ERROR tests/functional/test_func.py
!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!
============================== 1 error in 0.24s ===============================
Results with the file
$ pytest
============================= test session starts =============================
platform win32 -- Python 3.8.2, pytest-5.4.2, py-1.8.1, pluggy-0.13.1
rootdir: C:\mak2006\workspace\0github\python-rest-app-cont
collected 2 items
tests\functional\test_func.py . [ 50%]
tests\unit\test_unit.py . [100%]
============================== 2 passed in 0.11s ==============================
Better Solution
Try adding a single __init__.py to your tests directory (a level up from your module) with these contents:
import sys
sys.path.append('.')
sys.path.append('./my_module')
Your file structure should look like this:
project
    my_module
        package.py
    tests
        __init__.py
        my_tests.py
The first append to sys.path will enable you to import from <your-module-name> and the second will enable your packages to import as normal.
In your tests you can import by using from my_module.package import function whereas in your module import using simply from package import function.
Edit: Seems like this solution is not universal (like the others).
I was able to solve this issue using help from this answer.
Add an __init__.py to your main module directory that contains
import pathlib, sys
sys.path.append(str(pathlib.Path(__file__).parent))
I also added another __init__.py to my tests directory (thanks to this answer) with
import sys
sys.path.append('.')
So it seems that the sys.path has to include the application directory rather than the project root folder containing the application directory and test directory.
So in my case /my/local/path/myproject/myproject/ had to be in sys.path rather than /my/local/path/myproject/.
Then I could run pytest in /my/local/path/myproject/ (I didn't need python -m pytest). This meant that the modules within /myproject/myproject/ could find each other, and the tests could find them as well, without any namespace nesting.
So my tests looked like
from moduleone import ModuleOne
import pytest

def test_fun():
    assert ModuleOne.example_func() == True
That said, there seem to be many gotchas, so I have no idea if this is correct.
I suggest you have a code structure like this:
myproject/
    helpers/
        moduleone.py
        moduletwo.py
    tests/
        myproject_test.py
    conftest.py
And the content of the conftest.py file is:
pytest_plugins = ['helpers']
Run pytest again.
Using poetry and pytest 5.4.3, I had the following structure (some folders / files have been removed for clarity):
project structure
.
├── my_app
│ ├── __init__.py
│ ├── main.py
│ ├── model.py
│ └── orm.py
├── poetry.lock
├── pyproject.toml
├── README.rst
└── tests
├── __init__.py
├── conftest.py
├── test_my_app.py
└── utilities
└── db_postgresql_inmemory.py
tests/conftest.py
pytest_plugins = [
    "utilities.db_postgresql_inmemory",
]
which generated a module not found error for the fixture:
ImportError: Error importing plugin "utilities.db_postgresql_inmemory": No module named 'utilities'
None of the other answers worked for me; I had tried adding:
[me@linux ~/code/my_app] touch tests/utilities/__init__.py
[me@linux ~/code/my_app] touch ./test_blank.py
I could make the import from conftest.py work by REMOVING both __init__.py files:
[me@linux ~/code/my_app] rm tests/utilities/__init__.py tests/__init__.py
As of February 2023, according to the pytest documentation, you can simply add the following config to your pyproject.toml to solve this problem:
[tool.pytest.ini_options]
pythonpath = "src"
addopts = [
    "--import-mode=importlib",
]
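With that in place (assuming pytest 7.0 or newer, where the pythonpath ini option is built in), running plain pytest from the project root should let the tests import packages that live under src directly, e.g. with a hypothetical package named project inside src/:
from project import code
No PYTHONPATH export or sys.path patching should be needed.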
I ran into this issue as well and am using poetry for dependency management and direnv for my project-specific environment variables. Please note, I am relatively new to Python, so I don't know if this is the correct fix.
Here is my entire .envrc file:
layout_poetry() {
  if [[ ! -f pyproject.toml ]]; then
    log_error 'No pyproject.toml found. Use `poetry new` or `poetry init` to create one first.'
    exit 2
  fi

  local VENV=$(poetry env list --full-path | cut -d' ' -f1)
  if [[ -z $VENV || ! -d $VENV/bin ]]; then
    log_error 'No created poetry virtual environment found. Use `poetry install` to create one first.'
    exit 2
  fi

  VENV=$VENV/bin
  export VIRTUAL_ENV=$(echo "$VENV" | rev | cut -d'/' -f2- | rev)
  export POETRY_ACTIVE=1
  PATH_add "$VENV"
}
layout poetry
export PYTHONDONTWRITEBYTECODE=1
export PYTHONPATH="$PWD/project_name"
I don't know if I need layout poetry, because poetry is supposed to be creating virtual environments for us already, but this is what a coworker recommended, so I went with it. layout poetry also didn't work without that function, and direnv didn't like it when I added the function to my zshenv, so I added it here.
For this specific question, the last line is the money maker.
ANOTHER SUGGESTION
See this answer: https://stackoverflow.com/a/69691436/595305
I was facing the same issue, which I resolved by:
Installing pytest at the root of my project using pip install pytest
Adding a blank __init__.py as a sibling of the test_file.py I wanted to execute.
I have resolved it by adding export PYTHONPATH="your root dir/src"
i.e.
export PYTHONPATH="/builds/project/src"
poetry run pytest .....
The simplest solution I found was to manually add my target module to sys.path. Let's say you have a structure like this:
flaskapp
- src
-- app.py
-- utils
-- ...
- tests
docs
venv
This makes my test folder a sibling to my module's src folder. If I start putting test_* files that need to import some of the module's code, I can simply:
import src.utils.calculator
And this would be fine until I try to import a file that imports another file from the module. The solution is simple: add an __init__.py to your tests folder, and put these lines inside:
import sys, os
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '../src')))
And just modify the last part to match your module's location and folder name.
For me, when I was checking my project structure, I found that the parent directory and a subdirectory had the same name. When I changed the directory name, it started working. So:
# Did not work
- same_name_project/
- same_name_project/
- tests/
# Worked
- different_named_project/
- a_unique_directory/
- tests/
I am learning Test Driven Development for the first time. I have no experience with software development, but have some experience with scripting.
I have been following LinuxAcademy Python 3 for Sys Admin tutorial.
I created the following structure,
├── Makefile
├── Pipfile
├── Pipfile.lock
├── README.rst
├── setup.py
├── src
│ └── pgbackup
│ ├── cli.py
│ └── __init__.py
└── tests
└── test_cli.py
setup.py file,
from setuptools import setup, find_packages

with open('README.rst', 'r') as f:
    readme = f.read()

setup(
    name='pgbackup',
    version='0.1.0',
    description='Database backups locally or to AWS S3.',
    long_description=readme,
    author='Keith Thompson',
    author_email='keith@linuxacademy.com',
    packages=find_packages('src'),
    package_dir={'': 'src'},
)
Makefile file,
.PHONY: install test

default: test

install:
	pipenv install --dev --skip-lock

test:
	PYTHONPATH=./src pytest
tests/test_cli.py file,
import pytest

from pgbackup import cli


def test_helloworld():
    """
    JUST A HELLO WORLD TEST
    """
    assert cli.hello() == "helloworld"
and src/pgbackup/cli.py file,
def hello():
    return "helloworld"
I wrote helloworld as my first sample test; it is not part of the tutorial. Now when I run the make command from the project root directory, my test passes:
========================================== test session starts ===========================================
platform linux -- Python 3.6.6, pytest-3.8.0, py-1.6.0, pluggy-0.7.1
rootdir: /root/code/pgbackup, inifile:
collected 1 item
tests/test_cli.py . [100%]
======================================== 1 passed in 0.04 seconds ========================================
I know the make command is setting PYTHONPATH to ./src before running pytest, but I can't get my head around how it's running the actual test. I know it's only setting a search path for importing Python modules.
If I try to run the pytest command from the tests dir, my test fails:
================================================= ERRORS =================================================
___________________________________ ERROR collecting tests/test_cli.py ___________________________________
ImportError while importing test module '/root/code/pgbackup/tests/test_cli.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
test_cli.py:2: in <module>
from pgbackup import cli
E ModuleNotFoundError: No module named 'pgbackup'
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 1 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
======================================== 1 error in 0.35 seconds =========================================
If I run the test from the src dir, it doesn't run anything:
====================================== no tests ran in 0.01 seconds ======================================
Can someone please explain how running make runs the test, since the Makefile is just setting the PYTHONPATH variable?
Can someone please explain how running make runs the test, since the Makefile is just setting the PYTHONPATH variable?
It's not only setting a variable. pytest is also being run here, because your current line for test:
PYTHONPATH=./src pytest
is equivalent to:
export PYTHONPATH=./src; pytest
Check the 3rd example here (Command not found error in Bash variable assignment) for an explanation.
You will probably want to rewrite it in that form accordingly.
If I try to run the pytest command from the tests dir, my test fails
As for running pytest from different directories: indeed, this will give different results. pytest searches for tests (it looks for files named test_* in the tests folder, or in the current directory). So if you run it inside src, it won't find any tests.
If I try to run the pytest command from the tests dir, my test fails
pgbackup seems to be imported relative to the current directory, so when you move inside the 'tests' folder it won't be found.
Hope this helps.
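To see the difference concretely, a small sketch using a made-up FOO variable:
FOO=bar python -c 'import os; print(os.environ.get("FOO"))'   # set only for this one command
export FOO=bar; python -c 'import os; print(os.environ.get("FOO"))'   # stays set for the rest of the shell session
In the Makefile, the first form means PYTHONPATH=./src applies only to that single pytest invocation.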
I've been googling this issue for over an hour and have absolutely no idea what to do - I'm trying to set up Travis CI on my public repo, and for some reason every time I commit and build, I consistently get this:
============================= test session starts ==============================
platform linux2 -- Python 2.7.13, pytest-3.2.1, py-1.4.34, pluggy-0.4.0
rootdir: /home/travis/build/epitone/digitron, inifile:
plugins: cov-2.5.1
collected 0 items
========================= no tests ran in 0.01 seconds =========================
The command "pytest" exited with 5.
Done. Your build exited with 1.
For some reason, my tests aren't being collected even though on my local machine, pytest runs just fine and even passes my (single) test that I've set up. Does anyone know exactly what's going on here?
My directory setup is something like this:
digitron/
├── bot.py
├── lib
│ ├── _config.py
│ ├── config.py
│ ├── __init__.py
│ └── utils.py
├── README.md
├── requirements.txt
└── tests
└── auth_test.py
my .travis.yml file:
language: python
python:
- "2.7"
- "3.2"
- "3.3"
- "3.4"
- "3.5"
- "3.6"
- "nightly" # currently points to 3.7-dev
before_install:
- pip install pytest pytest-cov
# command to install dependencies
install: "pip install -r requirements.txt"
# command to run tests
script: pytest
my requirements.txt file contains:
pytest>=3.2.1
py>=1.4.31
pluggy>=0.4.0
and my auth_test.py file is simply:
"""
Testing for Digitron
"""
# Necessary to find parent directory
import os, sys
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import bot
import pytest
def test_connect():
    """Test connect() method in bot.py
    Returns True or False
    """
    assert(bot.connect("irc.chat.twitch.tv", 6667, "oauth:key", "username", "#channel")) == True
I'm basically pulling my hair out over here trying to figure out what I'm doing wrong - googling error code 5 gets me nothing, and I'm even following along with someone else's setup and still getting nowhere - is there something I'm missing?
Edit: I tried explicitly calling the tests/ directory in the pytest command - it works locally, but fails yet again on Travis CI.
While I can't tell if this is the cause of your problem, the method used to coerce sys.path is at least not very orthodox. pytest performs this automatically (https://docs.pytest.org/en/latest/goodpractices.html#tests-as-part-of-application-code), so there may be import issues.
Exit code 5 is caused by pytest not finding any tests; see the documentation.
Possible tests for resolution:
Rename your test file to test_auth.py
Add diagnostic information to your script: pwd, ls, etc. Try simply executing the file: python tests/auth_test.py.
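For the diagnostics, something along these lines in the script section of .travis.yml would show what Travis actually sees before running the tests (a sketch; adjust as needed):
script:
  - pwd
  - ls -R
  - pytest tests/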
This is a bit long and annoying to format in a comment, hence the answer.
You should add a pytest.ini file with the contents:
[pytest]
testpaths = <path to your tests here>
Without this, Travis cannot find your tests, which is why you are getting exit code 5.
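For the directory layout shown in the question, that would presumably be:
[pytest]
testpaths = tests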