Couldn't use data file .coverage: unable to open database file - python

A strange permissions issue occurred when pushing to GitHub. I have a test job which runs the tests with coverage and then pushes the results to Codecov on every push and pull request. However, this only works as the root user.
If run as the digitalshop user, it throws an error:
Couldn't use data file '/digital-shop-app/.coverage': unable to open database file
My question is: how do I run coverage in a Docker container so it won't throw this error? My guess is that it's a permissions problem.
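For context on the error text itself: coverage.py 5+ stores its data in a SQLite database, and "unable to open database file" is SQLite's generic message when the data file (or its parent directory) cannot be created or written. A minimal sketch reproducing the message with only the standard library (the path is made up):

```python
import sqlite3

# SQLite raises this exact message when it cannot create/open the file,
# e.g. because the directory is missing or not writable by the user.
try:
    sqlite3.connect("/no-such-dir/.coverage")
except sqlite3.OperationalError as exc:
    print(exc)  # unable to open database file
```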
docker-compose.yml:
version: '3.9'

services:
  test:
    build: .
    command: >
      sh -c "
        python manage.py wait_for_db &&
        coverage run --source='.' manage.py test mainapp.tests &&
        coverage report &&
        coverage xml
      "
    volumes:
      - ./digital-shop-app:/digital-shop-app
    env_file: .env
    depends_on:
      - db

  db:
    image: postgres:13-alpine
    environment:
      - POSTGRES_DB=${DB_NAME}
      - POSTGRES_USER=${DB_USER}
      - POSTGRES_PASSWORD=${DB_PASS}
Dockerfile:
FROM python:3.9-alpine3.13
ENV PYTHONUNBUFFERED 1

COPY ./requirements.txt /requirements.txt
COPY ./digital-shop-app /digital-shop-app
COPY ./scripts /scripts

WORKDIR /digital-shop-app

RUN python -m venv /py && \
    /py/bin/pip install --upgrade pip && \
    apk add --no-cache bash && \
    apk add --update --no-cache postgresql-client && \
    apk add --update --no-cache --virtual .tmp-deps \
        build-base jpeg-dev postgresql-dev musl-dev linux-headers \
        zlib-dev libffi-dev openssl-dev python3-dev cargo && \
    apk add --update --no-cache libjpeg && \
    /py/bin/pip install -r /requirements.txt && \
    apk del .tmp-deps && \
    adduser --disabled-password --no-create-home digitalshop && \
    chown -R digitalshop:digitalshop /py/lib/python3.9/site-packages && \
    chmod -R +x /scripts

ENV PATH="/scripts:/py/bin:/py/lib:$PATH"

USER digitalshop

CMD ["run.sh"]

So I ended up creating a second Dockerfile called Dockerfile.test with pretty much the same configuration, minus the non-root user creation. Here's that variant:
(Running code as the root user is not recommended, though, so please read the UPDATE section below.)
Dockerfile.test:
FROM python:3.9-alpine3.13
ENV PYTHONUNBUFFERED 1

COPY ./requirements.txt /requirements.txt
COPY ./digital-shop-app /digital-shop-app

WORKDIR /digital-shop-app

RUN python -m venv /py && \
    /py/bin/pip install --upgrade pip && \
    apk add --no-cache bash curl gnupg coreutils && \
    apk add --update --no-cache postgresql-client libjpeg && \
    apk add --update --no-cache --virtual .tmp-deps \
        build-base jpeg-dev postgresql-dev musl-dev linux-headers \
        zlib-dev libffi-dev openssl-dev python3-dev cargo && \
    /py/bin/pip install -r /requirements.txt && \
    apk del .tmp-deps

ENV PATH="/py/bin:/py/lib:$PATH"
docker-compose.yml:
version: '3.9'

services:
  test:
    build:
      context: .
      dockerfile: Dockerfile.test
    command: >
      sh -c "
        python manage.py wait_for_db &&
        coverage run --source='.' manage.py test mainapp.tests &&
        coverage report &&
        coverage xml
      "
    volumes:
      - ./digital-shop-app:/digital-shop-app
    env_file: .env
    depends_on:
      - db
I don't know whether this is good practice, though. If it isn't, please tell me how to do it correctly.
UPDATE:
Thanks to @β.εηοιτ.βε for giving me food for thought.
After some local debugging I found out that coverage needs the user to own the directory where the .coverage file is located. So I created a subdirectory named cov/ inside the project folder and made the digitalshop user its owner, including everything inside it. Finally, I pointed coverage at the data file by setting the environment variable COVERAGE_FILE=/digital-shop-app/cov/.coverage (where /digital-shop-app is the project root), and used the same directory for the coverage.xml report in docker-compose.yml. Here's the code:
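To sanity-check the idea outside Docker, here is a small sketch (directory names are illustrative stand-ins for /digital-shop-app/cov) showing that pointing COVERAGE_FILE at a user-writable directory is enough for the SQLite data file to be created there:

```python
import os
import sqlite3
import tempfile

# Stand-in for the container's writable /digital-shop-app/cov directory.
cov_dir = os.path.join(tempfile.mkdtemp(), "cov")
os.makedirs(cov_dir)

# coverage.py reads COVERAGE_FILE to decide where to put its data file.
os.environ["COVERAGE_FILE"] = os.path.join(cov_dir, ".coverage")

# Simulate what coverage.py does on startup: open/create the SQLite file.
conn = sqlite3.connect(os.environ["COVERAGE_FILE"])
conn.close()

print(os.path.exists(os.environ["COVERAGE_FILE"]))  # True
```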
docker-compose.yml (added -o flag to coverage xml command):
version: '3.9'

services:
  test:
    build:
      context: .
    command: >
      sh -c "
        python manage.py wait_for_db &&
        coverage run --source='.' manage.py test mainapp.tests &&
        coverage xml -o /digital-shop-app/cov/coverage.xml
      "
    env_file: .env
    depends_on:
      - db

  db:
    image: postgres:13-alpine
    environment:
      - POSTGRES_DB=${DB_NAME}
      - POSTGRES_USER=${DB_USER}
      - POSTGRES_PASSWORD=${DB_PASS}
Dockerfile:
FROM python:3.9-alpine3.13
ENV PYTHONUNBUFFERED 1

COPY ./requirements.txt /requirements.txt
COPY ./digital-shop-app /digital-shop-app
COPY ./scripts /scripts

WORKDIR /digital-shop-app

RUN python -m venv /py && \
    /py/bin/pip install --upgrade pip && \
    apk add --no-cache bash && \
    apk add --update --no-cache postgresql-client && \
    apk add --update --no-cache --virtual .tmp-deps \
        build-base jpeg-dev postgresql-dev musl-dev linux-headers \
        zlib-dev libffi-dev openssl-dev python3-dev cargo && \
    apk add --update --no-cache libjpeg && \
    /py/bin/pip install -r /requirements.txt && \
    apk del .tmp-deps && \
    adduser --disabled-password --no-create-home digitalshop && \
    chown -R digitalshop:digitalshop /py/lib/python3.9/site-packages && \
    chmod -R +x /scripts && \
    # New code here
    mkdir -p /digital-shop-app/cov && \
    chown -R digitalshop:digitalshop /digital-shop-app/cov

ENV PATH="/scripts:/py/bin:/py/lib:$PATH"

USER digitalshop

CMD ["run.sh"]

Related

Docker Invalid Reference Format Error while building in bamboo

In my Dockerfile, I was previously using FROM python:3.9-alpine, on top of which librdkafka 1.9.2 was built, and this was successful. But today, with the same Dockerfile, the build failed with the error below:
#error "confluent-kafka-python requires librdkafka v2.0.2 or later. Install the latest version of librdkafka from the Confluent repositories, see http://docs.confluent.io/current/installation.html".
When I searched on the internet, alpine:edge seemed to have the newest version of the librdkafka package. So I changed the Dockerfile to FROM python:3.9-alpine:edge. But on building, this threw me an error:
Step 1/41 : FROM python:3.9-alpine:edge
invalid reference format
An error occurred when executing task
I am new to Docker and I used https://www.docker.com/blog/how-to-use-the-alpine-docker-official-image/ for the format. Please do help me with this.
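For what it's worth, the failure is in the image reference itself: a reference is name[:tag] and only one tag is allowed, so python:3.9-alpine:edge can never parse (alpine:edge is a separate image, not a tag of the python image). A rough, deliberately simplified check of that shape (not Docker's actual parser):

```python
# Simplified sanity check for "name[:tag]" image references; Docker's real
# grammar also covers registries, ports and digests, which this ignores.
def looks_like_valid_ref(ref: str) -> bool:
    last_part = ref.split("/")[-1]    # drop any registry/namespace prefix
    return last_part.count(":") <= 1  # at most one ":tag" is allowed

print(looks_like_valid_ref("python:3.9-alpine"))       # True
print(looks_like_valid_ref("python:3.9-alpine:edge"))  # False
```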
This is my dockerfile currently:
FROM python:3.9-alpine:edge

RUN adduser -D pythonwebapi
WORKDIR /home/pythonwebapi

COPY requirements.txt requirements.txt
COPY logger_config.py logger_config.py
# COPY kong.ini kong.ini
# COPY iot.ini iot.ini
# COPY project.ini project.ini
# COPY eom.ini eom.ini
# COPY notify.ini notify.ini

RUN echo 'http://dl-3.alpinelinux.org/alpine/v3.9/main' >> /etc/apk/repositories
RUN apk update \
    && apk upgrade \
    && apk add --no-cache build-base \
        autoconf \
        bash \
        bison \
        boost-dev \
        cmake \
        flex \
        zlib-dev
RUN apk add make gcc g++
RUN apk add libffi-dev
RUN apk update && apk --no-cache add librdkafka-dev
RUN apk add postgresql-dev gcc python3-dev musl-dev
RUN pip install --upgrade pip && pip install -r requirements.txt && pip install gunicorn
RUN apk del gcc g++ make
RUN pip install --no-cache-dir six pytest numpy cython
RUN pip install --no-cache-dir pandas
RUN pip install --no-cache-dir confluent-kafka

ARG ARROW_VERSION=3.0.0
ARG ARROW_SHA1=c1fed962cddfab1966a0e03461376ebb28cf17d3
ARG ARROW_BUILD_TYPE=release
ENV ARROW_HOME=/usr/local \
    PARQUET_HOME=/usr/local

# Download and build apache-arrow
RUN mkdir -p /arrow \
    && wget -q https://github.com/apache/arrow/archive/apache-arrow-${ARROW_VERSION}.tar.gz -O /tmp/apache-arrow.tar.gz \
    && echo "${ARROW_SHA1} *apache-arrow.tar.gz" | sha1sum /tmp/apache-arrow.tar.gz \
    && tar -xvf /tmp/apache-arrow.tar.gz -C /arrow --strip-components 1 \
    && mkdir -p /arrow/cpp/build \
    && cd /arrow/cpp/build \
    && cmake -DCMAKE_BUILD_TYPE=$ARROW_BUILD_TYPE \
        -DOPENSSL_ROOT_DIR=/usr/local/ssl \
        -DCMAKE_INSTALL_LIBDIR=lib \
        -DCMAKE_INSTALL_PREFIX=$ARROW_HOME \
        -DARROW_WITH_BZ2=ON \
        -DARROW_WITH_ZLIB=ON \
        -DARROW_WITH_ZSTD=ON \
        -DARROW_WITH_LZ4=ON \
        -DARROW_WITH_SNAPPY=ON \
        -DARROW_PARQUET=ON \
        -DARROW_PYTHON=ON \
        -DARROW_PLASMA=ON \
        -DARROW_BUILD_TESTS=OFF \
        .. \
    && make -j$(nproc) \
    && make install \
    && cd /arrow/python \
    && python setup.py build_ext --build-type=$ARROW_BUILD_TYPE --with-parquet \
    && python setup.py install \
    && rm -rf /arrow /tmp/apache-arrow.tar.gz

COPY app app
COPY init_app.py ./
ENV FLASK_APP init_app.py

RUN chown -R pythonwebapi:pythonwebapi ./
RUN chmod -R 777 ./

USER pythonwebapi
EXPOSE 8000 7000
ENTRYPOINT ["gunicorn", "--timeout", "7000", "init_app:app", "-k", "uvicorn.workers.UvicornWorker", "-b", "0.0.0.0"]

Error while building confluent-kafka in Docker for Alpine image

I am trying to build a Python application which requires the confluent-kafka package. But while building in Bamboo, I got the error below:
fatal error: librdkafka/rdkafka.h: No such file or directory
   23 | #include &lt;librdkafka/rdkafka.h&gt;
      |          ^~~~~~~~~~~~~~~~~~~~~~
My dockerfile is as such:
FROM python:3.9-alpine

RUN adduser -D pythonwebapi
WORKDIR /home/pythonwebapi

COPY requirements.txt requirements.txt
COPY logger_config.py logger_config.py

RUN echo 'http://dl-3.alpinelinux.org/alpine/v3.9/main' >> /etc/apk/repositories
RUN apk update \
    && apk upgrade \
    && apk add --no-cache build-base \
        autoconf \
        bash \
        bison \
        boost-dev \
        cmake \
        flex \
        # libressl-dev \
        zlib-dev
RUN apk add make gcc g++
RUN apk add libffi-dev
RUN apk update && apk --no-cache add librdkafka-dev
RUN apk add postgresql-dev gcc python3-dev musl-dev
RUN pip install --upgrade pip && pip install -r requirements.txt && pip install gunicorn
RUN apk del gcc g++ make
RUN pip install --no-cache-dir six pytest numpy cython
RUN pip install --no-cache-dir pandas
RUN pip install --no-cache-dir confluent-kafka

ARG ARROW_VERSION=3.0.0
ARG ARROW_SHA1=c1fed962cddfab1966a0e03461376ebb28cf17d3
ARG ARROW_BUILD_TYPE=release
ENV ARROW_HOME=/usr/local \
    PARQUET_HOME=/usr/local

# Download and build apache-arrow
RUN mkdir -p /arrow \
    && wget -q https://github.com/apache/arrow/archive/apache-arrow-${ARROW_VERSION}.tar.gz -O /tmp/apache-arrow.tar.gz \
    && echo "${ARROW_SHA1} *apache-arrow.tar.gz" | sha1sum /tmp/apache-arrow.tar.gz \
    && tar -xvf /tmp/apache-arrow.tar.gz -C /arrow --strip-components 1 \
    && mkdir -p /arrow/cpp/build \
    && cd /arrow/cpp/build \
    && cmake -DCMAKE_BUILD_TYPE=$ARROW_BUILD_TYPE \
        -DOPENSSL_ROOT_DIR=/usr/local/ssl \
        -DCMAKE_INSTALL_LIBDIR=lib \
        -DCMAKE_INSTALL_PREFIX=$ARROW_HOME \
        -DARROW_WITH_BZ2=ON \
        -DARROW_WITH_ZLIB=ON \
        -DARROW_WITH_ZSTD=ON \
        -DARROW_WITH_LZ4=ON \
        -DARROW_WITH_SNAPPY=ON \
        -DARROW_PARQUET=ON \
        -DARROW_PYTHON=ON \
        -DARROW_PLASMA=ON \
        -DARROW_BUILD_TESTS=OFF \
        .. \
    && make -j$(nproc) \
    && make install \
    && cd /arrow/python \
    && python setup.py build_ext --build-type=$ARROW_BUILD_TYPE --with-parquet \
    && python setup.py install \
    && rm -rf /arrow /tmp/apache-arrow.tar.gz

COPY app app
COPY init_app.py ./
ENV FLASK_APP init_app.py

RUN chown -R pythonwebapi:pythonwebapi ./
RUN chmod -R 777 ./

USER pythonwebapi
EXPOSE 8000 7000
ENTRYPOINT ["gunicorn", "--timeout", "7000", "init_app:app", "-k", "uvicorn.workers.UvicornWorker", "-b", "0.0.0.0"]
I am unable to work out why the error occurs, since librdkafka is already installed. My requirement is to use an Alpine image. Can anyone please help me with this?

Django/Python Docker Libreoffice Subprocess

I am trying to use LibreOffice in my Django app to convert a docx file to PDF using Python's subprocess module.
I have included LibreOffice in my Dockerfile:
Dockerfile:
FROM python:3.8-alpine
ENV PYTHONUNBUFFERED 1

COPY ./requirements.txt /requirements.txt
COPY ./behavioursolutiondjango /behavioursolutiondjango
COPY ./scripts /scripts

WORKDIR /behavioursolutiondjango
EXPOSE 8000

RUN python -m venv /py && \
    /py/bin/pip install --upgrade pip && \
    apk add --update python3-dev \
        xmlsec xmlsec-dev \
        gcc \
        libc-dev \
        libreoffice \
        libffi-dev && \
    apk add --update --no-cache postgresql-client && \
    apk add --update --no-cache --virtual .tmp-deps \
        build-base postgresql-dev musl-dev linux-headers && \
    /py/bin/pip install -r /requirements.txt && \
    apk del .tmp-deps && \
    adduser --disabled-password --no-create-home app && \
    mkdir -p /vol/web/static && \
    mkdir -p /vol/web/media && \
    chown -R app:app /vol && \
    chmod -R 755 /vol && \
    chmod -R +x /scripts

ENV PATH="/scripts:/py/bin:$PATH"

USER app

CMD ["run.sh"]
And have the following running to do the conversion:
subprocess.call(["soffice", "--headless", "--convert-to", "pdf", new_cert.cert.path])
But I am running into the following error:
LibreOffice 7.2 - Fatal Error: The application cannot be started.
User installation could not be completed.
I have spent hours on this and cannot figure out what I'm missing. I would be more than happy to use something other than LibreOffice, but I cannot find anything else that works.
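Not a definitive fix, but a common cause of "User installation could not be completed" in containers is that soffice cannot write its user profile (the non-root user often has no writable $HOME). Pointing the profile at a writable directory with -env:UserInstallation is a frequently suggested workaround; a sketch with illustrative paths (the actual conversion should only be run where LibreOffice is installed):

```python
import subprocess
import tempfile

# A directory the container user can write to, for the LibreOffice profile.
profile_dir = tempfile.mkdtemp()

cmd = [
    "soffice", "--headless",
    f"-env:UserInstallation=file://{profile_dir}",
    "--convert-to", "pdf", "--outdir", "/tmp",
    "document.docx",  # hypothetical input file
]
# subprocess.call(cmd)  # uncomment inside a container with LibreOffice
print(cmd[2])
```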

Dockerfile returns dns error when trying to create an image

I'm trying to build a Docker image, but the build returns an error:
DNS lookup error
Dockerfile:
FROM python:3.7-alpine
LABEL maintainer="r.ofc#hotmail.com"

ENV PROJECT_ROOT /app
WORKDIR $PROJECT_ROOT

RUN apk update \
    && apk add mariadb-dev \
        gcc \
        python3-dev \
        pango-dev \
        cairo-dev \
        libtool \
        linux-headers \
        musl-dev \
        libffi-dev \
        openssl-dev \
        jpeg-dev \
        zlib-dev

RUN pip install --upgrade pip
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
COPY . .

CMD python manage.py runserver 0.0.0.0:8000
I'm running Kubernetes locally using minikube.

Environment variables are not being read in django/docker environment

I am running a dockerized Django app and would like to pass my AWS credentials into my settings.py. So I set my environment variables in my .bash_profile, and then in settings.py I do os.getenv("AWS_ACCESS_KEY_ID") (I also tried os.environ[...]).
When I enter my shell for debugging I get:
/usr/local/lib/python3.6/os.py in __getitem__(self, key)
    667         except KeyError:
    668             # raise KeyError with the original key value
--> 669             raise KeyError(key) from None
    670         return self.decodevalue(value)
So I figure the environment variable is None and thus is not being read. Is that a problem with Docker/bash, in the sense that the variable is not accessible from inside the container? Or should it be? Or do I have to set it in my Dockerfile? If so, how?
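The traceback itself just says the variable is not set in the container's environment (a .bash_profile on the host is never seen by the container). Independently of how the value gets injected, os.environ[...] raises KeyError for a missing name while os.getenv(...) returns None or a default, which is handy for debugging:

```python
import os

# Make sure the variable is genuinely unset for the demonstration.
os.environ.pop("AWS_ACCESS_KEY_ID", None)

print(os.getenv("AWS_ACCESS_KEY_ID"))             # None
print(os.getenv("AWS_ACCESS_KEY_ID", "missing"))  # missing

try:
    os.environ["AWS_ACCESS_KEY_ID"]
except KeyError as exc:
    print(exc)  # 'AWS_ACCESS_KEY_ID'
```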
Any hint or help is very much appreciated! Thanks in advance!
Here is my dockerfile:
FROM python:3.6-alpine
ENV PYTHONUNBUFFERED 1

RUN apk update \
    # psycopg2 dependencies
    && apk add --virtual build-deps gcc python3-dev musl-dev \
    && apk add postgresql-dev \
    # Pillow dependencies
    && apk add jpeg-dev zlib-dev freetype-dev lcms2-dev openjpeg-dev tiff-dev tk-dev tcl-dev \
    # CFFI dependencies
    && apk add libffi-dev py-cffi \
    # Translations dependencies
    && apk add gettext \
    # https://docs.djangoproject.com/en/dev/ref/django-admin/#dbshell
    && apk add postgresql-client

# Requirements are installed here to ensure they will be cached.
COPY ./requirements /requirements
RUN pip install -r /requirements/local.txt

COPY ./compose/production/django/entrypoint /entrypoint
RUN sed -i 's/\r//' /entrypoint
RUN chmod +x /entrypoint

COPY ./compose/local/django/start /start
RUN sed -i 's/\r//' /start
RUN chmod +x /start

WORKDIR /app

ENTRYPOINT ["/entrypoint"]
I run it with docker-compose -f local.yml up and build it with docker-compose -f local.yml build.
