How to get code coverage of a Python container from JUnit 5?

I am writing tests for Docker containers using the Testcontainers framework. So far I've been collecting code coverage with JaCoCo by reading from a JaCoCo port exposed by the container, like so:
<Container>.addEnv("JAVA_TOOL_OPTIONS", "-javaagent:/agent/jacocoagent.jar=output=tcpserver,address=*,port=" + JACOCO_PORT)
then reading from the socket and consolidating the results in Gradle.
I now need a code coverage tool for a Python Docker container (while still writing the integration tests in Java). Is there a library for cross-language code coverage? Is there a better way to do this?
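Coverage.py has no TCP agent equivalent to JaCoCo's, so one common pattern is to have the Python process write a coverage data file inside the container and pull that file out from the Java side (for example with Testcontainers' copyFileFromContainer) before running coverage report/html on the host. Below is a minimal sketch of the Python side only; the package name myservice, the data path and the entry point are illustrative assumptions, not an established API.

    # entrypoint of the Python container, run instead of the plain service start
    import atexit
    import signal
    import sys

    import coverage

    # write the data file to a fixed, known path so the test can copy it out later
    cov = coverage.Coverage(data_file="/coverage/.coverage", source=["myservice"])
    cov.start()

    def _write_coverage():
        cov.stop()
        cov.save()

    def _handle_sigterm(*_args):
        sys.exit(0)  # triggers the atexit hook below

    atexit.register(_write_coverage)
    signal.signal(signal.SIGTERM, _handle_sigterm)  # "docker stop" sends SIGTERM

    from myservice.app import main  # hypothetical entry point of the service under test
    main()

Alternatively, starting the process with coverage run (as in one of the answers further down) achieves the same thing without touching the application code; the data file still has to be copied off the container and combined/reported on the host.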

Related

Is there a standard way to use Python on the Apache NiFi Docker image?

My team is using NiFi for some data processing tasks. We are planning to deploy the NiFi cluster on Kubernetes, so we are using the apache/nifi Docker image for this purpose. We are just starting with NiFi, so we don't have much experience with it. We run some Python scripts from NiFi, so Python is required in the NiFi environment. I've therefore made some modifications to the Dockerfile in the apache/nifi source code and built a custom image that has Python installed. Currently, everything is working.
Now I'm wondering whether this is the right approach. Are there other ways to work with Python on the NiFi Docker image? Another difficulty I'll run into soon is upgrading NiFi: I'll have to go through the whole process again of fetching the latest NiFi source code, editing it to add Python, and then building and pushing the image to my repository/registry.
Is there a standard way to do this?

Testing a Python Azure function app before deployment using GitHub Workflows

I have a Python function app that I am deploying to Azure using a standard GitHub workflow. I want to add a job before deployment that also runs all the unit tests for the function. Locally, I am using pytest to run the unit tests. The structure of the folders is as shown below.
When running locally I do 'python -m pytest tests' and it runs all the tests.
When I add the same command to the GitHub workflow it gives me the error
ERROR: file or directory not found: tests
The job is described as:
Is there a way to run the tests using the workflow?
I was able to solve this issue by using the command
python -m pytest ./function/tests
(the workflow runs pytest from the repository root, so the tests directory has to be addressed relative to it).

Python: running a Coverage report on the host when the data was collected inside a container

I am using Coverage.py
My use case is as follows:
My test consists of several different processes, each inside its own container. I would like to get a single coverage report for the entire test.
The Python packages I would like to track are usually installed in the containers' base Python; in one container they are in a virtual environment instead of the base Python.
I output a different .coverage file for each and would like to:
Combine them (they do share some libraries that I would like to get coverage on)
Report to HTML
The issue is that the paths in the coverage data are local to the container that captured them.
I would like to run combine and report on the host.
I know that for combine I can use [paths] to alias different paths that refer to the same source (I haven't worked with it yet since I am starting with a single process).
I am not sure how to get the correct paths in the report.
When I run report I get a "No source for code" error, which makes sense, but I can't find a way to run the report on the host even though all of the code exists on the host.
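For the combine/report step itself, the [paths] section is what remaps the container paths onto the host checkout. A rough sketch of the host-side script, assuming the containers' data files were copied into ./coverage-data/ and that a .coveragerc with a suitable [paths] mapping exists (all names here are placeholders):

    # combine_report.py -- run on the host after copying the .coverage files out
    from coverage import Coverage

    # .coveragerc is assumed to contain something like:
    #   [paths]
    #   source =
    #       src/mylib
    #       /usr/local/lib/python3.11/site-packages/mylib
    #       /opt/venv/lib/python3.11/site-packages/mylib
    # The first entry is the host path; the rest are the container paths.
    cov = Coverage(config_file=".coveragerc")
    cov.combine(["coverage-data/"])  # merges every data file found there, applying [paths]
    cov.save()
    cov.html_report(directory="htmlcov")

combine is the step that applies the [paths] aliases, so the remapping has to be in place before report or html_report is run.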

How to measure test coverage of a Python app if the application and tests are in different projects

I want to measure the coverage of my project achieved by integration tests (integration of several microservices). The applications are Python; the tests use pytest.
I know about pytest-cov, but the problem is that my application and tests start in different Docker containers, and all interaction between the app and the tests happens over HTTP, so the tests know nothing about the application's code and vice versa.
I know that in C/C# it is possible to make a special build (an instrumented build, or something like that; the name may be wrong). The main idea is that after it runs, the application generates a coverage report that you can check.
Is there something similar for Python, or maybe another way?
However you start your application under test, you can start it with coverage.py instead. If you normally use python my_app.py, use coverage run my_app.py instead. When the app finishes, you will have coverage data for the application.
Coverage.py is the coverage measurement tool that pytest-cov coordinates. You will get the same measurement.
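For long-running services, or when the application spawns worker subprocesses, coverage.py's documented "measuring subprocesses" hook can be used instead of wrapping the start command: drop a sitecustomize.py onto the container's sys.path and point COVERAGE_PROCESS_START at a coverage config file. A minimal sketch (the config path is an assumption):

    # sitecustomize.py -- placed somewhere on sys.path inside the container
    import coverage

    # Only activates when the COVERAGE_PROCESS_START environment variable is set
    # (e.g. COVERAGE_PROCESS_START=/app/.coveragerc), so the image behaves
    # normally when coverage is not wanted.
    coverage.process_startup()

The data file is written when the Python process exits cleanly, so the container should be stopped gracefully rather than killed.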

How to make images of Docker containers for Python, R and MongoDB work together

The stack I selected for my project is Python, R and MongoDB. I'd like to adopt Docker for this project, but when I did my research on the internet I mostly found examples for MySQL with PHP or WordPress. So I'm curious to know where I can find tutorials or examples of using containers with Python, R and MongoDB, or any ideas on how to put them together. What would the Dockerfile look like? In particular, in my project R (used for data processing and data visualisation) will be called from Python (used as the data collector and, as a sub-module, for data cleaning).
Any help will be appreciated.
Option 1:
Split them into multiple Docker images and run them all using docker-compose from a YAML file that sets them all up together.
There's probably already an image for each of those services that you can use, adding your own code to them via Docker volumes. Just look for them on Docker Hub.
Examples of use with the existing Python image are already in its description. It even shows how to create your own Docker image using a Dockerfile, which you need for each image.
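With the compose approach, the containers find each other by service name on the compose network. A rough sketch of how the Python collector could reach the other pieces (the service name "mongo", the database/collection names and process.R are all placeholder assumptions):

    # collector.py -- inside the Python service container
    import json
    import subprocess

    from pymongo import MongoClient

    # "mongo" is the assumed docker-compose service name of the MongoDB container
    client = MongoClient("mongodb://mongo:27017/")
    docs = list(client["mydb"]["raw_data"].find({}, {"_id": False}))

    # Hand the collected data to R for processing/visualisation. This assumes
    # Rscript and process.R are available in this image; alternatively the R step
    # runs in its own container and the file is exchanged through a shared volume.
    with open("raw_data.json", "w") as f:
        json.dump(docs, f)
    subprocess.run(["Rscript", "process.R", "raw_data.json"], check=True)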
Option 2:
You can build just one image from a less specific base image (say Debian or Ubuntu), install all the interpreters, libraries and other requirements inside it, and then create an ENTRYPOINT that calls a script which starts each service and stays in the foreground to keep the container from exiting.
