When tests are launched in GitLab CI, pytest-sugar doesn't show the same output as it does when run locally. What could the problem be?
My GitLab config:
image: project.com/path/dir

stages:
  - tests

variables:
  TESTS_ENVIRORMENT:
    value: "--stage my_stage"
    description: "Tests launch my_stage as default"

before_script:
  - python3 --version
  - pip3 install --upgrade pip
  - pip3 install -r requirements.txt

api:
  stage: tests
  script:
    - pytest $TESTS_ENVIRORMENT Tests/API/ -v
Local: (screenshot of the usual pytest-sugar progress output)
GitLab: (screenshot of plain pytest output without pytest-sugar formatting)
It seems that there's a problem with pytest-sugar inside containers. Add the --force-sugar option to the pytest call; it worked for me.
By default, Docker containers do not allocate a pseudo-terminal (tty), so pytest sees a non-interactive stdout and falls back to plain console output. There is no clean solution for this case; mostly you need workarounds or special Python libraries.
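For example, a minimal sketch of the job from the question with both workarounds applied (PY_COLORS is pytest's own color override; --force-sugar assumes a pytest-sugar version that supports that flag):

api:
  stage: tests
  variables:
    PY_COLORS: "1" # ask pytest to emit ANSI color codes even without a tty
  script:
    - pytest --force-sugar --color=yes $TESTS_ENVIRORMENT Tests/API/ -v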
Whenever I open my Gitpod workspace I have to re-install my requirements.txt file. I was reading about the .gitpod.yml file and see that I have to add it in there so the dependencies get installed during the prebuild.
I can't find any examples of this so I just want to see if I understand it correctly.
Right now my .gitpod.yml file looks like this...
image:
  file: .gitpod.Dockerfile

# List the start up tasks. Learn more https://www.gitpod.io/docs/config-start-tasks/
tasks:
  - init: echo 'init script' # runs during prebuild
    command: echo 'start script'

# List the ports to expose. Learn more https://www.gitpod.io/docs/config-ports/
ports:
  - port: 3000
    onOpen: open-preview

vscode:
  extensions:
    - ms-python.python
    - ms-azuretools.vscode-docker
    - eamodio.gitlens
    - batisteo.vscode-django
    - formulahendry.auto-close-tag
    - esbenp.prettier-vscode
Do I just add these two new 'init' and 'command' lines under tasks?
image:
  file: .gitpod.Dockerfile

# List the start up tasks. Learn more https://www.gitpod.io/docs/config-start-tasks/
tasks:
  - init: echo 'init script' # runs during prebuild
    command: echo 'start script'
  - init: pip3 install -r requirements.txt
    command: python3 manage.py

# List the ports to expose. Learn more https://www.gitpod.io/docs/config-ports/
ports:
  - port: 3000
    onOpen: open-preview

vscode:
  extensions:
    - ms-python.python
    - ms-azuretools.vscode-docker
    - eamodio.gitlens
    - batisteo.vscode-django
    - formulahendry.auto-close-tag
    - esbenp.prettier-vscode
Thanks so much for your help. I'm still semi-new to all this and trying to figure my way around.
To install requirements in the prebuild, you have to install them in the Dockerfile. The exception is editable installs (pip install -e .).
For example, to install a package named <package-name>, add this line to .gitpod.Dockerfile:
RUN python3 -m pip install <package-name>
Installing from a requirements file is slightly trickier because the Dockerfile can't "see" the file when it's building. One workaround is to give the Dockerfile the URL of the requirements file in the repo.
RUN python3 -m pip install -r https://gitlab.com/<gitlab-username>/<repo-name>/-/raw/master/requirements.txt
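Putting it together, a sketch of a .gitpod.Dockerfile (gitpod/workspace-full is Gitpod's default base image; the URL placeholders are the same as above):

FROM gitpod/workspace-full

# Fetch the requirements file straight from the repo so it is installed during the prebuild.
RUN python3 -m pip install -r https://gitlab.com/<gitlab-username>/<repo-name>/-/raw/master/requirements.txt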
Edit: Witness my embarrassing struggle with the same issue today: https://github.com/gitpod-io/gitpod/issues/7306
I have a simple python dockerized application whose structure is
/src
- server.py
- test_server.py
Dockerfile
requirements.txt
in which the docker base image is Linux-based, and server.py exposes a FastAPI endpoint.
For completeness, server.py looks like this:
from fastapi import FastAPI
from pydantic import BaseModel

class Item(BaseModel):
    number: int

app = FastAPI(title="Sum one", description="Get a number, add one to it", version="0.1.0")

@app.post("/compute")
async def compute(input: Item):
    return {'result': input.number + 1}
Tests are meant to be done with pytest (following https://fastapi.tiangolo.com/tutorial/testing/) with a test_server.py:
from fastapi.testclient import TestClient
from server import app
import json

client = TestClient(app)

def test_endpoint():
    """test endpoint"""
    response = client.post("/compute", json={"number": 1})
    values = json.loads(response.text)
    assert values["result"] == 2
Dockerfile looks like this:
FROM tiangolo/uvicorn-gunicorn:python3.7
COPY . /app
RUN pip install -r requirements.txt
WORKDIR /app/src
EXPOSE 8000
CMD ["uvicorn", "server:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]
At the moment, if I want to run the tests on my local machine within the container, one way to do this is (see the shell sketch after this list):
Build the Docker container.
Run the container and get its name via docker ps.
Run docker exec -it <mycontainer> bash and execute pytest to see the tests pass.
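A rough sketch of those three steps in shell (the image and container names are made up):

docker build -t sum-one .                  # 1. build the image
docker run -d --name sum-one-api sum-one   # 2. start the container in the background
docker exec -it sum-one-api pytest         # 3. run the tests inside it
docker rm -f sum-one-api                   # clean up afterwards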
Now, I would like to run the tests in Azure DevOps (Server) before pushing the image to my Docker registry and triggering a release pipeline. If this sounds like an OK thing to do, what's the proper way to do it?
So far, I hoped that something along the lines of adding a "PyTest" step in the build pipeline would magically work: (screenshot of the attempted step)
I am currently using a Linux agent, and the step fails. (screenshot of the error)
The failure is not surprising, as (I think) the container is not run after being built, and therefore pytest can't run within it either :(
Another way to solve this is to include pytest commands in the Dockerfile and deal with the tests in a release pipeline. However, I would like to decouple the testing from the container that is ultimately pushed to the registry and deployed.
Is there a standard way to run pytest within a Docker container in Azure DevOps, and get a graphical report?
Update your azure-pipelines.yml file as follows to run the tests in Azure Pipelines.
Method-1 (using docker)
trigger:
  - master

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: Docker@2
    inputs:
      command: 'build'
      Dockerfile: '**/Dockerfile'
      arguments: '-t fast-api:$(Build.BuildId)'

  - script: |
      docker run fast-api:$(Build.BuildId) python -m pytest
    displayName: 'Run PyTest'
Successful pipeline run: (screenshot)
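If you also want the graphical test report in the Azure DevOps UI from Method-1, one hedged extension is to write a JUnit XML inside the container, copy it out, and publish it with the standard PublishTestResults@2 task (the container name and file paths below are assumptions):

- script: |
    docker run --name test-run fast-api:$(Build.BuildId) python -m pytest --junitxml=/app/test-results.xml
    docker cp test-run:/app/test-results.xml test-results.xml
  displayName: 'Run PyTest'

- task: PublishTestResults@2
  condition: succeededOrFailed()   # publish even when tests fail
  inputs:
    testResultsFiles: 'test-results.xml'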
Method-2 (without docker)
trigger:
  - master

pool:
  vmImage: 'ubuntu-latest'

strategy:
  matrix:
    Python37:
      python.version: '3.7'

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '$(python.version)'
    displayName: 'Use Python $(python.version)'

  - script: |
      python -m pip install --upgrade pip
      pip install -r requirements.txt
    displayName: 'Install dependencies'

  - script: |
      pip install pytest pytest-azurepipelines
      python -m pytest
    displayName: 'pytest'
BTW, I have a simple FastAPI project you can reference if you want.
Test your docker script using pytest-azurepipelines:
- script: |
    python -m pip install --upgrade pip
    pip install pytest pytest-azurepipelines
    pip install -r requirements.txt
    pip install -e .
  displayName: 'Install dependencies'

- script: |
    python -m pytest /src/test_server.py
  displayName: 'pytest'
Running pytest with the plugin pytest-azurepipelines will let you see your test results in the Azure Pipelines UI.
https://pypi.org/project/pytest-azurepipelines/
You can run your unit tests directly from within your Docker container using pytest-azurepipelines (which needs to be installed in the Docker image beforehand):
- script: |
    docker run --mount type=bind,source="$(pwd)",target=/results \
      --entrypoint /bin/bash my_docker_image \
      -c "cd results && pytest"
  displayName: 'tests'
  continueOnError: true
pytest will create an XML file containing the test results, which is made available to the Azure DevOps pipeline thanks to the --mount flag in the docker run command. pytest-azurepipelines then publishes the results directly to Azure DevOps.
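For this to work, the plugin has to be present inside the image, e.g. with a line like this in the Dockerfile of my_docker_image:

RUN pip install pytest pytest-azurepipelines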
I have the following yaml pipeline build file:
pr:
  branches:
    include:
      - master

jobs:
  - job: 'Test'
    pool:
      vmImage: 'Ubuntu-16.04'
    strategy:
      matrix:
        Python36:
          python.version: '3.6'
      maxParallel: 4

    steps:
      - task: UsePythonVersion@0
        inputs:
          versionSpec: '$(python.version)'
          architecture: 'x64'
        env:
          POSTGRES: $(POSTGRES)

      - script: python -m pip install --upgrade pip && pip install -r requirements.txt
        displayName: 'Install dependencies'

      - script: |
          pip install pytest
          pytest tests -s --doctest-modules --junitxml=junit/test-results.xml
        displayName: 'pytest'
I set the variable POSTGRES in the pipeline settings as a secret variable. In the Python code, all environment variables are read with a call like:
if not os.getenv(var):
    raise ValueError(f'Environment variable \'{var}\' is not set')
When the build is executed it will throw exactly the above error for the POSTGRES variable. Are the environment variables not set correctly?
To make the environment variable available in the Python script, you need to define it in the step where it's used:
- script: |
    pip install pytest
    pytest tests -s --doctest-modules --junitxml=junit/test-results.xml
  displayName: 'pytest'
  env:
    POSTGRES: $(POSTGRES)
I don't know if you still need this but...
If you take a look at the documentation here it says:
Unlike a normal variable, they are not automatically decrypted into
environment variables for scripts. You can explicitly map them in,
though.
So it looks like you were doing it right. Maybe try using a different name for the mapped variable; the name of the initial encrypted variable could be confounding the mapping (because it's already a variable, it won't be remapped).
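A sketch of that renaming idea, mapping the secret $(POSTGRES) into a differently named environment variable (POSTGRES_URL is a made-up name; the Python code would then read os.getenv('POSTGRES_URL')):

- script: |
    pip install pytest
    pytest tests -s --doctest-modules --junitxml=junit/test-results.xml
  displayName: 'pytest'
  env:
    POSTGRES_URL: $(POSTGRES)   # secret mapped under a new name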
I have a problem with CI/CD on GitLab. I always use "python:latest", but it gives me version 2.7.5, and I want to use Python 2.7.15 or 3.7. How can I install it?
image: python:latest

services:
  - mongo:latest

variables:
  MONGO_DB: ipc_alert

cache:
  paths:
    - ~/.cache/pip/

before_script:
  - python -V
  - pip install -r req.txt

stages:
  - test

test:
  stage: test
  script:
    - echo 'Testing'
In the image you posted, you have a different problem: pip cannot find a django version that satisfies your requirements.
As for the question itself: if you want to test against multiple versions, you need to create more than one test job. For example:
test:
  stage: test
  script:
    - echo 'Testing'
That becomes:
test-python2.7:
  stage: test
  image: python:2.7
  script:
    - echo 'Testing'

test-python3.4:
  stage: test
  image: python:3.4
  script:
    - echo 'Testing'

test-python3.5:
  stage: test
  image: python:3.5
  script:
    - echo 'Testing'

test-python3.6:
  stage: test
  image: python:3.6
  script:
    - echo 'Testing'

test-python3.7:
  stage: test
  image: python:3.7
  script:
    - echo 'Testing'

test-python.latest:
  stage: test
  image: python:latest
  script:
    - echo 'Testing'
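As an aside, newer GitLab versions (13.3+, which postdate this thread) can express the same fan-out more compactly with a parallel: matrix job; a sketch:

test:
  stage: test
  image: python:$PYTHON_VERSION   # the matrix variable is expanded into the image tag
  parallel:
    matrix:
      - PYTHON_VERSION: ["2.7", "3.5", "3.6", "3.7"]
  script:
    - echo 'Testing'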
However, maybe none of this works, because you're using a shell executor. If I remember correctly, that runner executes your code directly on the host machine. You need to install Docker and register a new runner that uses the Docker executor; without it, you cannot test against different environments/versions.
One exception is if you have all the Python versions you need installed on your machine and call each concrete version explicitly. It depends on your environment, but you can check /usr/bin for multiple Python versions. On my machine, I have these:
maqui@kanade:~$ python -V
Python 2.7.15+
maqui@kanade:~$ python2.6 -V
Python 2.6.8
maqui@kanade:~$ python2.7 -V
Python 2.7.15+
maqui@kanade:~$ python3.6 -V
Python 3.6.8rc1
maqui@kanade:~$ python3.7 -V
Python 3.7.2rc1
(As you can see, python is an alias for python2.7).
I'm trying to familiarize myself with the Gitlab CI environment with a test project, https://gitlab.com/khpeek/CI-test. The project has the following .gitlab-ci.yml:
image: python:2.7-onbuild

services:
  - rethinkdb:latest

test_job:
  script:
    - pytest
The problem is that the test_job job in the CI pipeline fails with the following error message:
Running with gitlab-ci-multi-runner 9.0.1 (a3da309)
on docker-auto-scale (e11ae361)
Using Docker executor with image python:2.7-onbuild ...
Starting service rethinkdb:latest ...
Pulling docker image rethinkdb:latest ...
Using docker image rethinkdb:latest ID=sha256:23ecfb08823bc5483c6a955b077a9bc82899a0df2f33899b64992345256f22dd for service rethinkdb...
Waiting for services to be up and running...
Using docker image sha256:aaecf574604a31dd49a9d4151b11739837e4469df1cf7b558787048ce4ba81aa ID=sha256:aaecf574604a31dd49a9d4151b11739837e4469df1cf7b558787048ce4ba81aa for predefined container...
Pulling docker image python:2.7-onbuild ...
Using docker image python:2.7-onbuild ID=sha256:5754a7fac135b9cae7e02e34cc7ba941f03a33fb00cf31f12fbb71b8d389ece2 for build container...
Running on runner-e11ae361-project-3083420-concurrent-0 via runner-e11ae361-machine-1491819341-82630004-digital-ocean-2gb...
Cloning repository...
Cloning into '/builds/khpeek/CI-test'...
Checking out d0937f33 as master...
Skipping Git submodules setup
$ pytest
/bin/bash: line 56: pytest: command not found
ERROR: Job failed: exit code 1
However, there is a requirements.txt in the repository with the single line pytest==3.0.7 in it, and from the Dockerfile of the python:2.7-onbuild image it seems that pip install -r requirements.txt should get run on build. So why is pytest not found?
If you look at the Dockerfile you linked to, you'll see that pip install -r requirements.txt is part of an ONBUILD instruction. This is useful if you want to build a new image from that first one and install a bunch of requirements. The pip install -r requirements.txt command is therefore not executed within the container in your CI pipeline, and even if it were, it would run at the very beginning, before your GitLab repository was cloned.
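For illustration, the onbuild variant declares its deferred steps roughly like this (a paraphrased sketch, not the exact upstream file):

# ONBUILD instructions only fire when another image is built FROM this one
ONBUILD COPY requirements.txt /usr/src/app/
ONBUILD RUN pip install --no-cache-dir -r requirements.txt
ONBUILD COPY . /usr/src/app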
I would suggest you modify your .gitlab-ci.yml file this way
image: python:2.7-onbuild

services:
  - rethinkdb:latest

test_job:
  script:
    - pip install -r requirements.txt
    - pytest
The problem seems to be intermittent: although the first run took 61 minutes (and the tests initially failed), it now takes about a minute. (screen grab omitted)
For reference, the testing repository is at https://gitlab.com/khpeek/CI-test. (I had to add a before_script with some pip installs to make the job succeed).
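For reference, a sketch of the resulting .gitlab-ci.yml (the exact contents of that before_script are not shown in the thread; requirements.txt stands in for them):

image: python:2.7-onbuild

services:
  - rethinkdb:latest

test_job:
  before_script:
    - pip install -r requirements.txt   # assumed contents of the added before_script
  script:
    - pytest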