I've been googling this issue for over an hour and have absolutely no idea what to do. I'm trying to set up Travis CI on my public repo, and every time I commit and build, I consistently get this:
============================= test session starts ==============================
platform linux2 -- Python 2.7.13, pytest-3.2.1, py-1.4.34, pluggy-0.4.0
rootdir: /home/travis/build/epitone/digitron, inifile:
plugins: cov-2.5.1
collected 0 items
========================= no tests ran in 0.01 seconds =========================
The command "pytest" exited with 5.
Done. Your build exited with 1.
For some reason, my tests aren't being collected, even though pytest runs just fine on my local machine and even passes the (single) test I've set up. Does anyone know exactly what's going on here?
My directory setup is something like this:
digitron/
├── bot.py
├── lib
│ ├── _config.py
│ ├── config.py
│ ├── __init__.py
│ └── utils.py
├── README.md
├── requirements.txt
└── tests
└── auth_test.py
my .travis.yml file:
language: python
python:
- "2.7"
- "3.2"
- "3.3"
- "3.4"
- "3.5"
- "3.6"
- "nightly" # currently points to 3.7-dev
before_install:
- pip install pytest pytest-cov
# command to install dependencies
install: "pip install -r requirements.txt"
# command to run tests
script: pytest
my requirements.txt file contains:
pytest>=3.2.1
py>=1.4.31
pluggy>=0.4.0
and my auth_test.py file is simply:
"""
Testing for Digitron
"""
# Necessary to find parent directory
import os, sys
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import bot
import pytest
def test_connect():
    """Test connect() method in bot.py
    Returns True or false
    """
    assert(bot.connect("irc.chat.twitch.tv", 6667, "oauth:key", "username", "#channel")) == True
I'm basically pulling my hair out trying to figure out what I'm doing wrong. Googling exit code 5 gets me nothing, and I'm even following along with someone else's setup and still getting nowhere. Is there something I'm missing?
edit: I tried explicitly passing the tests/ directory to the pytest command; it works locally but fails yet again on Travis CI.
While I can't tell whether this is the cause of your problem, the way you coerce sys.path is not very orthodox. pytest handles this automatically (https://docs.pytest.org/en/latest/goodpractices.html#tests-as-part-of-application-code), so there may be import issues.
Exit code 5 means pytest did not collect any tests; see the documentation.
Possible things to try:
Rename your test file to test_auth.py
Add diagnostic information to your script: pwd, ls, etc. Try simply executing the file: python tests/auth_test.py. (See the sketch below.)
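A sketch of what those diagnostics might look like in the .travis.yml from the question (the extra lines are purely for debugging and can be removed afterwards):
# command to run tests, with temporary diagnostics
script:
  - pwd
  - ls -R
  - python tests/auth_test.py
  - pytest tests/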
This is a bit long and annoying to format in a comment, hence the answer.
You should add a pytest.ini file with the contents:
[pytest]
testpaths = <path to your tests here>
Without this, Travis cannot find your tests, which is why you are getting exit code 5.
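For the directory layout in the question, that would presumably be:
[pytest]
testpaths = tests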
I have read several similar topics but haven't succeeded yet. I feel I'm missing or misunderstanding something fundamental, and that is the reason for my failure.
I have an 'application' written in Python that I want to deploy with the help of a standard setup.py. Due to its complex functionality, it consists of several Python modules, but there is no sense in releasing these modules separately, as they are too specific.
The expected result is to have the package installed on a system with pip install and available from the OS command line via a simple app command.
To simplify a long story into a reproducible example, I have the following directory structure:
<root>
├─ app
| ├─ aaa
| | └── module_a.py
| ├─ bbb
| | └── module_b.py
| └── app.py
├─ docs
| └── .....
├─ tests
| └── .....
└─ setup.py
Below is the code of the modules:
app.py
#!/usr/bin/python
from aaa.module_a import method1
from bbb.module_b import method2
def main():
    print("APP main executed")
    method1()
    method2()

if __name__ == '__main__':
    main()
module_a.py
def method1():
    print("A1 executed")
module_b.py
def method2():
    print("B2 executed")
When I run app.py from the console, it works fine and gives the expected output:
APP main executed
A1 executed
B2 executed
So, this simple 'application' works fine, and I want to distribute it with the help of the following
setup.py
from setuptools import setup
setup(
    name="app",
    version="1.0",
    packages=['app', 'app.aaa', 'app.bbb'],
    package_dir={'app': 'app'},
    entry_points={
        'console_scripts': ['app=app.app:main', ]
    }
)
Again, everything looks good, and a test installation succeeds:
(venv) [user@test]$ pip install <root>
Processing /home/user/<root>
Using legacy 'setup.py install' for app, since package 'wheel' is not installed.
Installing collected packages: app
Running setup.py install for app ... done
Successfully installed app-1.0
(venv) [user@test]$
And now comes the problem. With the aforementioned entry_points from setup.py, I expect to be able to execute my application with the app command. Indeed, the command runs, but the application itself fails with this error message:
File "/test/venv/lib/python3.9/site-packages/app/app.py", line 3, in <module>
from aaa.module_a import method1
ModuleNotFoundError: No module named 'aaa'
I understand the reason for the error: pip install puts the directories aaa and bbb together with app.py into a single app directory. From that point of view, app.py should use import app.aaa instead of import aaa. But if I do so, then my app fails during development with the error:
ModuleNotFoundError: No module named 'app.aaa'; 'app' is not a package
which is also logical, as no app package is available at that time (it is under development and isn't installed in the system).
Finally, the question: what is the correct way to lay out the directory structure and setup.py for a standalone Python application that consists of several modules of its own?
UPD
The most promising result so far (though it proved to be wrong after the discussion in the comments) came after the following changes:
moved app.py from <root>/app into <root> itself
I referenced it in setup.py by py_modules=['app']
I changed imports from import aaa.method1 to import app.aaa.method1 etc.
This way the package works both in my development environment and after installation.
But I have a problem with entry_points: I see no way to configure an entry point to use main() from app.py, which is not part of the app package but a separate module.
I.e. the new structure is
<root>
├─ app
| ├─ aaa
| | └── module_a.py
| ├─ bbb
| | └── module_b.py
| └── __init__.py
├─ docs
| └── .....
├─ tests
| └── .....
├─ app.py
└─ setup.py
I.e. the logic here is to have 2 separate entities:
An empty package app (consisting of __init__.py only) with subpackages aaa, bbb, etc.
A script app.py that uses functions from subpackages app.aaa, app.bbb
But as I wrote, I see no way to define an entry point for app.py that allows it to be run from the OS command line directly.
With that directory (package) structure, in your app.py you should import as one of the following:
from app.aaa.module_a import method1
from .aaa.module_a import method1
Then make sure to call your application in one of the following ways:
app
(this should work thanks to the console entry point)
python -m app.app
(this should work even without the console entry point)
I tried to recreate the complete project here.
Directory structure:
.
├── app
│ ├── aaa
│ │ └── module_a.py
│ ├── app.py
│ └── bbb
│ └── module_b.py
└── setup.py
setup.py
import setuptools
setuptools.setup(
    name="app",
    version="1.0",
    packages=['app', 'app.aaa', 'app.bbb'],
    entry_points={
        'console_scripts': ['app=app.app:main', ]
    },
)
app/app.py
#!/usr/bin/python
from .aaa.module_a import method1
from .bbb.module_b import method2
def main():
    print("APP main executed")
    method1()
    method2()

if __name__ == '__main__':
    main()
app/aaa/module_a.py
def method1():
    print("A1 executed")
app/bbb/module_b.py
def method2():
    print("B2 executed")
Then I run the following commands:
$ python3 -V
Python 3.6.9
$ python3 -m venv .venv
$ .venv/bin/python -m pip install -U pip setuptools wheel
# [...]
$ .venv/bin/python -m pip list
Package Version
------------- -------------------
pip 20.3.3
pkg-resources 0.0.0
setuptools 51.1.0.post20201221
wheel 0.36.2
$ .venv/bin/python -m pip install .
# [...]
$ .venv/bin/python -m app.app
APP main executed
A1 executed
B2 executed
$ .venv/bin/app
APP main executed
A1 executed
B2 executed
$ .venv/bin/python -m pip uninstall app
# [...]
$ .venv/bin/python -m pip install --editable .
# [...]
$ .venv/bin/python -m app.app
APP main executed
A1 executed
B2 executed
$ .venv/bin/app
APP main executed
A1 executed
B2 executed
The answer from sinoroc is mostly right; he provided a correct example that highlights all the options and shows a correct way to structure a Python package. But before any run, the package must first be installed into the venv. The pip install --editable option is then required to keep developing/debugging the package inside an IDE (it installs the package into the venv but keeps the source files in their original place).
After a long discussion in the comments, I came to the page An Overview of Packaging for Python, which explains all the options and highlights that Python's native packaging is mostly built for distributing reusable code, called libraries. I assume this was the reason for my misunderstanding and the initial question.
As an alternative solution, I plan to play with PEP 441 -- Improving Python ZIP Application Support, which looks like the right approach for my case.
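For reference, the standard library's zipapp module implements PEP 441 and can bundle the package into a single executable archive. A minimal sketch, assuming a directory named build_src (a hypothetical name) that contains only the app package from the answer above:
# build a single-file archive with app.app:main as the entry point
python3 -m zipapp build_src -m "app.app:main" -o app.pyz -p "/usr/bin/env python3"
# run the resulting archive directly
./app.pyz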
pytest-cov is not reading its settings from the pyproject.toml file. I am using nox, so I run the tests with:
python3 -m nox
It seems I have the same issue even without nox.
In fact, after running a poetry install:
poetry run pytest --cov=src passes the test
poetry run pytest --cov does not pass the test
In particular, when the test fails I get the following output (cut down to the most important parts):
WARNING: Failed to generate report: No data to report.
/Users/matteo/Library/Caches/pypoetry/virtualenvs/project-Nz69kfmJ-py3.7/lib/python3.7/site-packages/pytest_cov/plugin.py:271: PytestWarning: Failed to generate report: No data to report.
self.cov_controller.finish()
---------- coverage: platform darwin, python 3.7.7-final-0 -----------
FAIL Required test coverage of 100.0% not reached. Total coverage: 0.00%
Code with a reproducible error here.
To run it you'll need to install poetry and to install nox.
Turning the comment into an answer:
Check how the src directory is currently treated. Right now, it seems to be a namespace package, which is not what you intend. Either switch to the src layout:
# pyproject.toml
[tool.poetry]
...
packages = [
    { include = 'project', from = 'src' }
]
[tool.coverage.run]
...
source = ['project']
and fix the import in test_code.py:
from src.project import code
to
from project import code
or remove the src dir:
rootdir
├── project
│ └── __init__.py
└── tests
└── test_code.py
and fix the import in test_code.py.
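For the second option, the test file would then look roughly like this (a sketch; only the import line matters, the test body is illustrative):
# tests/test_code.py
from project import code

def test_code_imports():
    # placeholder check; replace with the real assertions
    assert code is not None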
I'm migrating all my modules to Poetry and I have a problem.
Before, with python setup.py test, I was able to run my tests with the correct coverage information.
Now I'm moving to poetry, so my best options are poetry run pytest or poetry install followed by pytest. In both cases, I have to specify the source location in Sonar to collect the coverage data. I would naturally just pass my src folder, but then the references are wrong, because pytest runs against the code installed in the environment by poetry, not against the local code as before, so the references are mismatched. No amount of tinkering seems to be working.
So, is there a way with poetry to use the local references instead of the environment references when running with pytest? Or should I give up and use some weird trick with inspect to retrieve the path of the installed package in the site-packages folder?
Your current setup, where pytest is run against the installed package instead of the source files, is vastly preferable, since it tests the code as it will behave in use. Path errors, files that were not correctly marked or moved for install, or anything else that can go wrong during deployment will be caught right away at no extra cost.
It also gives more accurate coverage, since e.g. any build files that are not part of the package will be ignored. To make coverage look at the package instead of your source files, just tell it exactly that. Having this in your .coveragerc should be enough:
[run]
source = sample_project
Given a project structure like this[1]
.
├── .coveragerc
├── src
│ └── sample_project
│ ├── __init__.py
│ └── util.py
└── tests
├── __init__.py
└── test_util.py
Running pytest --cov tests/ looks inside the installed package correctly:
Test session starts (platform: linux, Python 3.7.2, pytest 3.10.1, pytest-sugar 0.9.2)
rootdir: /home/user/dev/sample_project, inifile:
plugins: sugar-0.9.2, cov-2.7.1
collecting ...
tests/test_util.py ✓ 100% ██████████
----------- coverage: platform linux, python 3.7.2-final-0 -----------
Name Stmts Miss Cover
----------------------------------------
tests/__init__.py 0 0 100%
tests/test_util.py 6 0 100%
----------------------------------------
TOTAL 6 0 100%
Results (0.10s):
1 passed
[1] It might be important to split off the source code into a separate directory to avoid name shadowing (the import mechanism prefers a local package foo on the PYTHONPATH, which the working directory is always part of, over an installed package foo). From your description, it seems that you're already doing that. If you aren't, consider setting up your project again with poetry new and its optional --src flag enabled.
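For reference, that scaffolding is a one-liner (the exact files generated depend on the poetry version):
# create a new project using the src layout
poetry new --src sample_project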
I want my tests folder separate to my application code. My project structure is like so
myproject/
    myproject/
        myproject.py
        moduleone.py
    tests/
        myproject_test.py
myproject.py
from moduleone import ModuleOne
class MyProject(object):
....
myproject_test.py
from myproject.myproject import MyProject
import pytest
...
I use myproject.myproject since I use the command
python -m pytest
from the project root directory ./myproject/
However, then the imports within those modules fail with
E ModuleNotFoundError: No module named 'moduleone'
I am running Python 3.7 and have read that since 3.3, empty __init__.py files are no longer needed, which means my project becomes an implicit namespace package.
However, I have tried adding an __init__.py file in myproject/myproject/ and also tried adding a conftest.py file in myproject/, but neither works.
I have read answers that say to mess with the paths and then upvoted comments in other questions saying not to.
What is the correct way and what am I missing?
EDIT;
Possibly related, I used a requirements.txt to install pytest using pip. Could this be related? And if so, what is the correct way to install pytest in this case?
EDIT 2:
One of the paths in sys.path is /usr/src/app/, which is a Docker volume linked to /my/local/path/myproject/.
Should the volume be /my/local/path/myproject/myproject/ instead?
Not sure if this solution was specific to my problem, but I simply added an __init__.py to my tests folder and that solved the problem.
Solution: use the PYTHONPATH env. var
PYTHONPATH=. pytest
As mentioned by @J_H, you need to explicitly add your project's root directory, since pytest only adds to sys.path the directories where the test files are (which is why @Mak2006's answer worked).
Good practice: use a Makefile or some other automation tool
If you do not want to type that long command all the time, one option is to create a Makefile in your project's root dir with, e.g., the following:
.PHONY: test
test:
	PYTHONPATH=. pytest
Which allows you to simply run:
make test
Another common alternative is to use some standard testing tool, such as tox.
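A minimal tox.ini sketch for this kind of layout (assuming Python 3.7; the commented-out lines are only needed if there is no setup.py/pyproject.toml to install):
[tox]
envlist = py37
# if there is nothing to install, also set:
# skipsdist = true

[testenv]
deps = pytest
commands = pytest
# without an installable package, the same PYTHONPATH trick applies:
# setenv =
#     PYTHONPATH = {toxinidir}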
Be sure to include . dot in the $PYTHONPATH env var.
Use $ python -m site, or this code fragment to debug such issues:
import pprint
import sys
pprint.pprint(sys.path)
Your question managed to use myproject at three different levels. At least during debugging you might want to use three distinct names, to reduce possible confusion.
In my case I added a __init__.py to my test directory with this inside it:
import sys
sys.path.append('.')
My app code is at the same level as my test directory.
In my case it was because I had installed pytest at the system level but not in my virtual environment.
You can test this with python -m pytest. If you see ModuleNotFoundError: No module named 'pytest', then your pytest is installed at the system level.
Installing pytest while the virtual environment is activated will fix this.
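For example (assuming a virtualenv called venv in the project root):
source venv/bin/activate
pip install pytest
python -m pytest   # now resolves to the virtualenv's pytest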
I kept everything the same and just added a blank test file at the root folder. Solved.
Here are my findings; this problem really bugged me for a while.
My folder structure was
mathapp/
- server.py
- configuration.py
- __init__.py
- static/
  - home.html
tests/
- functional
  - test_errors.py
- unit
  - test_add.py
and pytest would complain with the ModuleNotFoundError.
I introduced a mock test file at the same level as the mathapp and tests directories. The file contained nothing. Now pytest does not complain.
Result without the file
$ pytest
============================= test session starts =============================
platform win32 -- Python 3.8.2, pytest-5.4.2, py-1.8.1, pluggy-0.13.1
rootdir: C:\mak2006\workspace\0github\python-rest-app-cont
collected 1 item / 1 error
=================================== ERRORS ====================================
_______________ ERROR collecting tests/functional/test_func.py ________________
ImportError while importing test module 'C:\mainak\workspace\0github\python-rest-app-cont\tests\functional\test_func.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
tests\functional\test_func.py:4: in <module>
from mathapp.service import sum
E ModuleNotFoundError: No module named 'mathapp'
=========================== short test summary info ===========================
ERROR tests/functional/test_func.py
!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!
============================== 1 error in 0.24s ===============================
Results with the file
$ pytest
============================= test session starts =============================
platform win32 -- Python 3.8.2, pytest-5.4.2, py-1.8.1, pluggy-0.13.1
rootdir: C:\mak2006\workspace\0github\python-rest-app-cont
collected 2 items
tests\functional\test_func.py . [ 50%]
tests\unit\test_unit.py . [100%]
============================== 2 passed in 0.11s ==============================
Better Solution
Try adding a single __init__.py to your tests directory (a level up from your module) with these contents:
import sys
sys.path.append('.')
sys.path.append('./my_module')
Your file structure should look like this:
project
    my_module
        package.py
    tests
        __init__.py
        my_tests.py
The first append to sys.path will enable you to import from <your-module-name> and the second will enable your packages to import as normal.
In your tests you can import by using from my_module.package import function whereas in your module import using simply from package import function.
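A tiny sketch with the names from the structure above (the assertion is only illustrative):
# tests/my_tests.py -- a test importing through the package name
from my_module.package import function

def test_function_exists():
    # placeholder assertion; replace with a real check
    assert function is not None

# inside my_module itself (e.g. in a file next to package.py) you would instead write:
# from package import function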
Edit: Seems like this solution is not universal (like the others).
I was able to solve this issue using help from this answer.
Add an __init__.py to your main module directory that contains
import pathlib, sys
sys.path.append(str(pathlib.Path(__file__).parent))
I also added another __init__.py to my tests directory (thanks to this answer) with
import sys
sys.path.append('.')
So it seems that the sys.path has to include the application directory rather than the project root folder containing the application directory and test directory.
So in my case /my/local/path/myproject/myproject/ had to be in sys.path rather than /my/local/path/myproject/.
Then I could run pytest in /my/local/path/myproject/ (didn't need python -m pytest). This meant that the modules within /myproject/myproject/ could find each other and the tests as well without any namespace nesting.
So my tests looked like
from moduleone import ModuleOne
import pytest
def test_fun():
    assert ModuleOne.example_func() == True
That said, there seem to be many gotchas, so I have no idea if this is correct..
I suggest you have a code structure like this:
myproject/
    helpers/
        moduleone.py
        moduletwo.py
    tests/
        myproject_test.py
    conftest.py
And the content of conftest.py file is:
pytest_plugins = ['helpers']
Run pytest again.
Using poetry and pytest 5.4.3, I had the following structure (some folders / files have been removed for clarity):
project structure
.
├── my_app
│ ├── __init__.py
│ ├── main.py
│ ├── model.py
│ └── orm.py
├── poetry.lock
├── pyproject.toml
├── README.rst
└── tests
├── __init__.py
├── conftest.py
├── test_my_app.py
└── utilities
└── db_postgresql_inmemory.py
tests/conftest.py
pytest_plugins = [
    "utilities.db_postgresql_inmemory",
]
which generated a module not found error for the fixture:
ImportError: Error importing plugin "utilities.db_postgresql_inmemory": No module named 'utilities'
None of the other answers worked for me. I tried adding:
[me@linux ~/code/my_app]touch tests/utilities/__init__.py
[me@linux ~/code/my_app]touch ./test_blank.py
I could make the import from conftest.py work by REMOVING both __init__.py files:
[me@linux ~/code/my_app]rm tests/utilities/__init__.py tests/__init__.py
As of February 2023, according to the pytest documentation, you can simply add the following config to your pyproject.toml to solve this problem:
[tool.pytest.ini_options]
pythonpath = "src"
addopts = [
    "--import-mode=importlib",
]
I ran into this issue as well and am using poetry for dependency management and direnv for my project-specific environment variables. Please note that I am relatively new to Python, so I don't know if this is the correct fix.
Here is my entire .envrc file:
layout_poetry() {
  if [[ ! -f pyproject.toml ]]; then
    log_error 'No pyproject.toml found. Use `poetry new` or `poetry init` to create one first.'
    exit 2
  fi
  local VENV=$(poetry env list --full-path | cut -d' ' -f1)
  if [[ -z $VENV || ! -d $VENV/bin ]]; then
    log_error 'No created poetry virtual environment found. Use `poetry install` to create one first.'
    exit 2
  fi
  VENV=$VENV/bin
  export VIRTUAL_ENV=$(echo "$VENV" | rev | cut -d'/' -f2- | rev)
  export POETRY_ACTIVE=1
  PATH_add "$VENV"
}
layout poetry
export PYTHONDONTWRITEBYTECODE=1
export PYTHONPATH="$PWD/project_name"
I don't know whether I need layout poetry, because poetry is supposed to create virtual environments for us already, but this is what a coworker recommended, so I went with it. layout poetry also didn't work without that function, and it didn't like it when I added the function to my zshenv, so I added it here.
For this specific question, the last line is the money maker.
ANOTHER SUGGESTION
See this answer: https://stackoverflow.com/a/69691436/595305
I was facing the issue, which I resolved by:
Installing pytest at the root of my project using pip install pytest
Adding a blank __init__.py as a sibling of the test_file.py I wanted to execute.
I have resolved it by adding export PYTHONPATH="your root dir/src"
i.e.
export PYTHONPATH="/builds/project/src"
poetry run pytest .....
The simplest solution I found was to manually add my target module to sys.path. Let's say you have a structure like this:
flaskapp
- src
-- app.py
-- utils
-- ...
- tests
docs
venv
This makes my tests folder a sibling of my module's src folder. If I start adding test_* files that need to import some of the module's code, I can simply do:
import src.utils.calculator
And this would be fine until I try to import a file that imports another file from the module. The solution is simple: add an __init__.py to your tests folder and put this inside:
import sys, os
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '../src')))
Just modify the last part to match your module's location and folder name.
For me, when I checked my project structure, I found that a parent directory and a subdirectory had the same name. When I changed the directory name, it worked. So:
# Did not work
- same_name_project/
  - same_name_project/
  - tests/
# Worked
- different_named_project/
  - a_unique_directory/
  - tests/
I am learning test-driven development for the first time. I have no experience with software development, but I have some experience with scripting.
I have been following the LinuxAcademy Python 3 for Sys Admin tutorial.
I created the following structure:
├── Makefile
├── Pipfile
├── Pipfile.lock
├── README.rst
├── setup.py
├── src
│ └── pgbackup
│ ├── cli.py
│ └── __init__.py
└── tests
└── test_cli.py
setup.py file,
from setuptools import setup, find_packages
with open('README.rst', 'r') as f:
    readme = f.read()

setup(
    name='pgbackup',
    version='0.1.0',
    description='Database backups locally or to AWS S3.',
    long_description=readme,
    author='Keith Thompson',
    author_email='keith@linuxacademy.com',
    packages=find_packages('src'),
    package_dir={'': 'src'},
)
Makefile file,
.PHONY: install test
default: test
install:
	pipenv install --dev --skip-lock
test:
	PYTHONPATH=./src pytest
tests/test_cli.py file,
import pytest
from pgbackup import cli
def test_helloworld():
    """
    JUST A HELLO WORLD TEST
    """
    assert cli.hello() == "helloworld"
and src/pgbackup/cli.py file,
def hello():
    return "helloworld"
I wrote helloworld as my first sample test; it is not part of the tutorial. Now when I run the make command from the project root directory, my test passes:
========================================== test session starts ===========================================
platform linux -- Python 3.6.6, pytest-3.8.0, py-1.6.0, pluggy-0.7.1
rootdir: /root/code/pgbackup, inifile:
collected 1 item
tests/test_cli.py . [100%]
======================================== 1 passed in 0.04 seconds ========================================
I know the make command is setting PYTHONPATH to ./src before running pytest, but I can't get my head around how it is running the actual test. I know it is only setting a search path for importing Python modules.
If I try to run the pytest command from the tests dir, my test fails:
================================================= ERRORS =================================================
___________________________________ ERROR collecting tests/test_cli.py ___________________________________
ImportError while importing test module '/root/code/pgbackup/tests/test_cli.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
test_cli.py:2: in <module>
from pgbackup import cli
E ModuleNotFoundError: No module named 'pgbackup'
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 1 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
======================================== 1 error in 0.35 seconds =========================================
If I run pytest from the src dir, it doesn't run anything:
====================================== no tests ran in 0.01 seconds ======================================
Can someone please explain how running make runs the test, since the Makefile is just setting the PYTHONPATH variable?
Can someone please explain how running make runs the test, since the Makefile is just setting the PYTHONPATH variable?
It's not only setting a variable; pytest is also being run, because your current line for test:
PYTHONPATH=./src pytest
is equivalent to:
export PYTHONPATH=./src; pytest
For an explanation, check the 3rd example here: Command not found error in Bash variable assignment.
You may want to rewrite it in that form to make the intent clearer.
If I try to run the pytest command from the tests dir, my test fails
As for running pytest from different directories: indeed, this gives different results. pytest searches for tests by looking for files named test_* (in the configured test paths or under the current directory), so if you run it inside src, it won't find any tests.
If I try to run the pytest command from the tests dir, my test fails
pgbackup seems to be imported from the current directory, so when you move inside the 'tests' folder it won't be found.
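In other words, run pytest from the project root the same way the Makefile does, for example:
cd /root/code/pgbackup
PYTHONPATH=./src pytest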
Hope this helps.