How to run a Docker container by passing an external file as an argument - python

So I have a Docker project: a Python pytest suite that runs an executable file via subprocess as a black-box test. I would like to build the container once, and then on each run copy the executable under test into a dedicated folder inside the container's WORKDIR (e.g. exec/). But I am not sure how to do it.
Currently, I have to first include the executable in the folder then build the container.
The structure is currently like this:
my_test_repo
| |-- exec
| | |-- my_executable
| |-- tests
| | |-- test_the_executable.py
| |-- Dockerfile
I skipped over some other files, such as the setup files.
In the Dockerfile, I do the following:
FROM python:3.7.7-buster
WORKDIR /app
COPY . /app
RUN pip install --trusted-host pypi.python.org .
RUN pip install pytest
ENV NAME "Docker"
RUN pytest ./tests/ --executable=./exec/my_executable
For the last line, I set up a pytest fixture to accept the path of the executable.
Currently the tests run when I build the image:
docker build --tag testproject:1.0 .
How can I change this so that the container contains only the test files, and lets me copy my executable from my local directory into the container and then run the tests?
Many thanks.
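For reference, the subprocess call that a pytest fixture like the one described would drive can be sketched as follows; run_blackbox and its arguments are hypothetical names, not from the original project:

```python
import subprocess
import sys

def run_blackbox(executable, args, expected_stdout):
    """Run the executable as a subprocess and check its output
    (hypothetical helper mirroring the black-box test described above)."""
    result = subprocess.run([executable, *args], capture_output=True, text=True)
    return result.returncode == 0 and result.stdout.strip() == expected_stdout

# Demo: use the Python interpreter itself as a stand-in "executable" under test.
print(run_blackbox(sys.executable, ["-c", "print('hello')"], "hello"))
```

The fixture's job is then only to hand the `--executable` path to a helper like this.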

What do you mean by edit container?
You can copy executable files from a local directory into a running container using the docker cp command, but only one path can be copied per invocation:
docker cp path_of_executable/file_name docker_container_name:path_to_be_copy/

If I understand your question correctly, change your Dockerfile to look like this:
FROM python:3.7.7-buster
WORKDIR /app
COPY . /app
RUN pip install --trusted-host pypi.python.org .
RUN pip install pytest
ENV NAME "Docker"
ENTRYPOINT ["pytest", "./tests/"]
CMD []
The ENTRYPOINT will be executed at runtime (not build time) along with any arguments passed on the docker run command line (default arguments are handled by CMD, here empty).
You can run it like this (after building it like you indicated):
docker run testproject:1.0 --executable=<path to executable>
The documentation for ENTRYPOINT and CMD can be found here.
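As an alternative to docker cp, you could bind-mount the executable's folder at runtime so nothing needs to be copied into the container at all. A sketch, assuming the image was built as testproject:1.0 with the ENTRYPOINT above:

```shell
# Mount the local exec/ directory over /app/exec inside the container,
# then point the pytest fixture at the mounted executable.
docker run -v "$(pwd)/exec:/app/exec" testproject:1.0 --executable=./exec/my_executable
```

Everything after the image name is appended to the ENTRYPOINT, so it reaches pytest as an argument.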

Related

How to import python file into another python file so that it works in Docker?

When I run my app.py file I can access it on localhost and it works.
After running (and having no issues): docker build -t flask-container .
When I run: docker run -p 5000:5000 flask-container
I get: from helpers import apology, login_required, usd
ModuleNotFoundError: No module named 'helpers'
In app.py I have: from helpers import apology, login_required, usd
I have tried putting an empty __init__.py file in the main folder, but it still doesn't work.
Question: How do I fix the ModuleNotFoundError when running the container?
Dockerfile
FROM python:3.8-alpine
# By default, listen on port 5000
EXPOSE 5000/tcp
# Set the working directory in the container
WORKDIR /app
# Copy the dependencies file to the working directory
COPY requirements.txt .
# Install any dependencies
RUN pip install -r requirements.txt
# Copy the content of the local src directory to the working directory
COPY app.py .
# Specify the command to run on container start
CMD [ "python", "./app.py" ]
requirements.txt
flask==2.1.0
flask_session
Python Version: 3.10.5
Copy helpers.py into the working directory as well:
COPY helpers.py .
OR
ADD . /app
# This adds all files from the current local directory to /app
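The root cause can be demonstrated outside Docker: Python puts the directory of the script it is running on sys.path, so `from helpers import ...` only resolves if helpers.py sits next to app.py. A self-contained sketch (the file contents are illustrative, not the asker's code):

```python
import os
import subprocess
import sys
import tempfile

def import_works_when_adjacent():
    """Show that `from helpers import ...` resolves only when helpers.py
    sits in the same directory as the script being run."""
    with tempfile.TemporaryDirectory() as d:
        with open(os.path.join(d, "helpers.py"), "w") as f:
            f.write("def apology():\n    return 'sorry'\n")
        with open(os.path.join(d, "app.py"), "w") as f:
            f.write("from helpers import apology\nprint(apology())\n")
        result = subprocess.run([sys.executable, os.path.join(d, "app.py")],
                                capture_output=True, text=True)
        return result.stdout.strip()

print(import_works_when_adjacent())
```

This is why the image needs COPY helpers.py . (or ADD . /app): inside the container, app.py's directory must actually contain helpers.py.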

When building a Docker container of a webapp, how to import functions from different folders to where the main file (app.py) is located?

My repo has the following structure (simplified):
project
+--x
|  +--y
|     +--z
|        +--app.py
+--util
|  +--database.py
In app.py, we import a function from database.py:
from util.database import create_database_connection
I want to deploy app.py as a webapp using Docker, and created a Dockerfile to do so:
FROM felipederodrigues/python37withpyodbc:v1
# Copy required folders and set working directory.
COPY . .
WORKDIR /x/y/z
# Install required packages.
RUN pip install --upgrade pip && pip install gunicorn eventlet==0.30.2
RUN pip install --no-cache-dir -r requirements.txt
# Set Python path.
ENV PYTHONPATH "${PYTHONPATH}:/home/site/wwwroot/"
# Expose port 8050.
EXPOSE 8050
# Start Flask app.
CMD gunicorn -b :8050 -k eventlet -w 1 app:server
Building the image and pushing it to the container registry works as expected, however, the webapp itself does not work because of the following error:
ModuleNotFoundError: No module named 'util'
Obviously, the app is unable to reach the util folder. Strangely enough, when I moved app.py to folder x and changed the WORKDIR in the Dockerfile to /x, the webapp did actually work.
So my question is, how can I make this work using the /x/y/z folder structure instead of only /x?
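One possible fix (a sketch, not verified against the original repo) is to keep WORKDIR /x/y/z but put the directory that actually contains util/ on PYTHONPATH. Since COPY . . runs before WORKDIR is set, the files land under /, so / is the path to add:

```dockerfile
FROM felipederodrigues/python37withpyodbc:v1
COPY . .
WORKDIR /x/y/z
RUN pip install --upgrade pip && pip install gunicorn eventlet==0.30.2
RUN pip install --no-cache-dir -r requirements.txt
# Make the project root (which holds util/) importable from /x/y/z.
ENV PYTHONPATH "${PYTHONPATH}:/"
EXPOSE 8050
CMD gunicorn -b :8050 -k eventlet -w 1 app:server
```

With the root on the import path, `from util.database import create_database_connection` resolves regardless of the working directory.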

"COPY failed: " While Building a Python Docker Image

I'm trying to create a Docker image using the following Dockerfile.
# syntax=docker/dockerfile:1
FROM python:latest
WORKDIR /project4
COPY pythonCode1.py /project4/pythonCode1.py
COPY requirements.txt /project4/requirements.txt
RUN pip3 install -r requirements.txt
CMD ["python3", "pythonCode1.py"]
I get the Error:
COPY failed: file not found in build context or excluded by .dockerignore: stat pythonCode1.py: file does not exist
I have not set up a .dockerignore file and I'm building in a directory that contains:
project4
|-- Dockerfile
|-- pythonCode1.py
|-- requirements.txt
I read some other posts that referred to the Docker context; mine just had the default:
default *   Current DOCKER_HOST based configuration   unix:///var/run/docker.sock   swarm
The problem is that by using
docker build - < Dockerfile
the build context does not include the whole directory, so the file pythonCode1.py is unknown to the Docker Engine.
Use the following docker build command instead.
# the . marks this directory as the directory with the Dockerfile and sets the build context
docker build -t myrepo/myimage .
More information on the build context, what it is, and why it is required can be found in this blog post.
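If you do want to name the Dockerfile explicitly, pass it with -f while still handing the directory over as the build context:

```shell
# -f names the Dockerfile; the trailing . is still the build context
docker build -t myrepo/myimage -f Dockerfile .
```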

"No module named x" for user-defined modules in Python deployment using Dockerfile and docker-compose

I'm trying to deploy a Python app as a Docker container using Dockerfile and docker-compose.
The project structure is this:
ms-request
- src
  - __init__.py
  - exceptions
    - __init__.py
    - ms_request_exceptions.py
  - messaging
    - __init__.py
    - receive_rabbit.py
    - send_rabbit.py
  - request
    - __init__.py
    - bsrequest.py
- test
  - __init__.py
  - test_bsrequest.py
- Dockerfile
- requirements.txt
In my receive_rabbit.py script, I am importing functions from the request and messaging packages like so:
from src.request import bsrequest
from src.messaging.send_rabbit import send_message
Executing this using PyCharm works fine. Running it from the command line initially didn't work, until I updated the PYTHONPATH using export PYTHONPATH=${PYTHONPATH}:..
I would like to deploy this as a Docker container, so I created a Dockerfile and an entry in my docker-compose.yml for the project.
Dockerfile:
FROM python:3
WORKDIR /bsreq
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY src/ ./src
COPY test/ ./test
RUN export PYTHONPATH=${PYTHONPATH}:.
CMD [ "python", "/bsreq/src/messaging/receive_rabbit.py" ]
docker-compose.yml:
version: "3.3"
services:
  rabbitmq: [...]
  bs-request:
    build: ./ms-request/
    depends_on:
      - rabbitmq
    env_file:
      - rabbit.env
    [...]
Running this using docker-compose up bs-request always ends in a crash with the error No module named 'src'.
I have tried multiple variations of inputs for the WORKDIR, COPY, PYTHONPATH and CMD lines in the Dockerfile. All lead to the same error. I've tried relative imports, which throw Attempted relative import with no known parent package.
I hope this is an issue others have encountered before. What do I need to do to get this deployment working?
Docker builds images in layers, and each RUN command runs in its own shell, so your export is gone right after the associated RUN command finishes.
FROM python:3
WORKDIR /bsreq
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY src/ ./src
COPY test/ ./test
RUN export PYTHONPATH=${PYTHONPATH}:.  # --> exported in this layer only
CMD [ "python", "/bsreq/src/messaging/receive_rabbit.py" ]  # --> the export has not persisted to here
As a workaround, set an environment variable that persists through the build AND into the final image with the ENV PYTHONPATH=${PYTHONPATH}:. instruction.
extra read: https://vsupalov.com/docker-build-time-env-values/
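Putting that together, a corrected version of the Dockerfile above might look like this (using the absolute path /bsreq rather than . so the value does not depend on the runtime working directory):

```dockerfile
FROM python:3
WORKDIR /bsreq
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY src/ ./src
COPY test/ ./test
# ENV persists into later layers and into the running container.
ENV PYTHONPATH "${PYTHONPATH}:/bsreq"
CMD [ "python", "/bsreq/src/messaging/receive_rabbit.py" ]
```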
Anyhow, the suggested method is to write a setup.py file and install your package with python setup.py install, so it is installed as a package and the imports work.
P.S. A better, more up-to-date way is to use a tool such as poetry, which uses pyproject.toml (per PEP 518 and PEP 517), the future of Python packaging!
Bonus read: https://python-poetry.org/
Good luck!
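For the pyproject.toml route, a minimal configuration for the layout above might look like this (names assumed from the tree shown; a sketch, not a tested build):

```toml
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "ms-request"
version = "0.1.0"

[tool.setuptools.packages.find]
include = ["src*"]
```

After pip install ., `from src.request import bsrequest` resolves without any PYTHONPATH tweaks.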

How to create a multistage dockerfile for a python app?

Below is the directory structure and the Dockerfile for my Python application. In order to run main.py, I need to create a data set by running generate_data.py, which is in the data directory. How can I create a multi-stage Dockerfile that first creates the data and then runs main.py? I'm new to using Docker and I feel overwhelmed.
FROM python:3.7.2-slim
WORKDIR /usr/src/app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . /usr/src/app
CMD ["python", "./src/main.py"]
You can create a shell script and then use it as the CMD.
start.sh:
#!/bin/bash
python generate_data.py
python ./src/main.py
Dockerfile:
FROM python:3.7.2-slim
WORKDIR /usr/src/app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . /usr/src/app
CMD ["sh", "start.sh"]
A key point of using Docker is to isolate your programs, so at first glance you might want to move them into separate containers and have them talk to each other through a shared volume or a Docker network. But if you really need them to run in the same container, you can achieve this with a bash script, replacing CMD with:
COPY run.sh .
RUN chmod a+x run.sh
CMD ["./run.sh"]
You can also include if statements into a bash script and pass arguments to the bash script through docker.
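One detail when passing arguments: anything after the image name on docker run replaces CMD entirely, so to forward arguments into the script, make it the ENTRYPOINT instead (run.sh contents and the flag below are illustrative):

```bash
#!/bin/bash
# run.sh: build the data set, then start the app, forwarding any arguments
python generate_data.py
python ./src/main.py "$@"
```

With ENTRYPOINT ["./run.sh"] in the Dockerfile, `docker run myimage --some-flag` appends --some-flag to run.sh's arguments, which "$@" hands on to main.py.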
