I have successfully built and run this Dockerfile:
# Use an official Python runtime as the base image
FROM python:3.9.12
# Set the working directory in the container
WORKDIR /app
# Copy the current directory contents into the container at /app
COPY . /app
# Install dependencies
RUN pip install --no-cache-dir -r requirements.txt
# Specify the command to run when the container starts
CMD ["python", "app.py"]
However, if I do this:
# Use an official Python runtime as the base image
FROM python:3.9.12-alpine
# Set the working directory in the container
WORKDIR /app
# Copy the current directory contents into the container at /app
COPY . /app
# Install dependencies
RUN pip install --no-cache-dir -r requirements.txt
# Specify the command to run when the container starts
CMD ["python", "app.py"]
It tells me that openblas, atlas, lapack... are not found in ['/usr/local/lib', '/usr/lib']. Why does the Alpine image not work, and how do I fix it?
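A likely explanation, offered as an assumption: Alpine uses musl libc, so pip skips the prebuilt manylinux wheels and compiles packages like numpy/scipy from source, which requires compilers plus the BLAS/LAPACK development headers. A minimal sketch of an Alpine variant that installs them first (Alpine package names assumed):
FROM python:3.9.12-alpine
# musl libc means manylinux wheels don't apply; numpy/scipy build from
# source and need compilers and BLAS/LAPACK headers
RUN apk add --no-cache build-base gfortran openblas-dev lapack-dev
WORKDIR /app
COPY . /app
RUN pip install --no-cache-dir -r requirements.txt
CMD ["python", "app.py"]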
I am using Linux under WSL on my Windows 10 system to build a Docker image, but I always encounter this error:
ERROR: Could not open requirements file: [Errno 2] No such file or directory: 'requirements.txt'
I have already included a COPY command in the Dockerfile to copy everything (including requirements.txt) into the /app directory. This always happens when I run docker build directly against the repository on the Windows 10 host (using the /mnt directory to locate the Dockerfile), without first copying the repository folder into WSL.
However, if I copy the repository folder into WSL first, it works without problems. I have attached the Dockerfile below.
#get python
FROM python:3.7
#install odbc unix distribution
RUN apt-get update && apt-get install -y --no-install-recommends \
unixodbc-dev \
unixodbc \
libpq-dev
#set working directory
WORKDIR /app
# Copy the rest of the working directory contents into the container at /app
COPY . .
# Install any needed packages specified in requirements.txt
RUN pip install -r requirements.txt
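For reference, docker build only sees files inside the build context, the path given as its last argument, so requirements.txt must sit somewhere under that path. A sketch of invoking the build from WSL against the Windows-side folder (paths hypothetical):
# run the build with the repository folder on /mnt as the context
cd /mnt/c/Users/me/my-repo
docker build -t my-image .
# or name the Dockerfile and the context explicitly
docker build -t my-image -f /mnt/c/Users/me/my-repo/Dockerfile /mnt/c/Users/me/my-repo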
I get this error: python: can't open file '/src/main.py': [Errno 2] No such file or directory when I try to run a container from an image built with the following Dockerfile:
FROM python:3.9-slim AS compile
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
WORKDIR /my-app
COPY requirements.txt .
RUN pip install -r requirements.txt
ADD src/ ./src
RUN pip install .
FROM python:3.9-slim AS build
COPY --from=compile /opt/venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
CMD ["python", "/src/main.py"]
I tried this as well and it still gives me the same type of error about not finding main.py: I tried ./src/main.py, /src/main.py, ./main.py. I tried everything; I'm starting to suspect the error is elsewhere.
The problem is that you have a multi-stage build (two FROM lines) and you only add your source files to the first stage.
FROM python:3.9-slim AS compile
[..]
ADD src/ ./src
ADD setup.py .
RUN pip install .
FROM python:3.9-slim AS build
[..]
You can fix this with a second COPY --from= statement in the second stage.
Additionally, your CMD is wrong. Either give it the full path, or start a relative path with ./ or the directory/file name (/my-app/src/main.py, ./src/main.py, src/main.py).
FROM python:3.9-slim AS compile
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
WORKDIR /my-app
COPY requirements.txt .
RUN pip install -r requirements.txt
ADD src/ ./src
ADD setup.py .
RUN pip install .
FROM python:3.9-slim AS build
COPY --from=compile /opt/venv /opt/venv
# ADDED: bring the application code into the final stage
COPY --from=compile /my-app /my-app
# ADDED: set the working directory in the final stage
WORKDIR /my-app
ENV PATH="/opt/venv/bin:$PATH"
# FIXED: use the full path to the script
CMD ["python", "/my-app/src/main.py"]
Lastly, you're only setting the WORKDIR in the stage you throw away; that only matters if you don't give the CMD a full path, or if you need a specific working directory.
/src/main.py is an absolute path from the system root.
To make it relative to your current directory, use ./src/main.py.
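A two-line sketch of how WORKDIR and a relative path interact (directory names hypothetical):
WORKDIR /my-app
# python resolves the relative script path against the working directory,
# so this runs /my-app/src/main.py
CMD ["python", "./src/main.py"]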
I would simplify your Dockerfile as below:
# base image
FROM amazonlinux:1
# Set the working directory
WORKDIR /app
# Copy the current directory contents into the container at /app
COPY . /app
# Install requirements
RUN pip install -r requirements.txt
# Define environment variable
ENV PYTHONPATH "${PYTHONPATH}:/app"
# Run main.py when the container launches
ENTRYPOINT ["python", "-u", "src/main.py"]
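One caveat, as an assumption: the bare amazonlinux:1 image does not ship pip, so an install step along these lines (package names assumed from the Amazon Linux 1 repos) would be needed before the pip line:
RUN yum install -y python36 python36-pip && yum clean all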
I have a basic Python Dockerfile like this:
FROM python:3.8
RUN pip install --upgrade pip
EXPOSE 8000
ENV PYTHONDONTWRITEBYTECODE=1
# Turns off buffering for easier container logging
ENV PYTHONUNBUFFERED=1
# Install pip requirements
COPY requirements.txt .
RUN python -m pip install -r requirements.txt
WORKDIR /app
COPY . /app
RUN useradd appuser && chown -R appuser /app
USER appuser
CMD ["gunicorn", "--bind", "0.0.0.0:8000", "app:app"]
I want to run my Flask application in a Docker container using this definition file. Locally I can create a fresh virtual env, install everything via pip install -r requirements.txt on Python 3.8, and nothing fails.
When building the Docker image, it fails to install all the packages from requirements.txt. For example, this package fails:
ERROR: Could not find a version that satisfies the requirement cvxopt==1.2.5.post1
ERROR: No matching distribution found for cvxopt==1.2.5.post1
When I comment out the package in requirements.txt, everything seems to work. The package itself claims to be compatible with Python >2.7. The same happens with the package pywin32==228.
Looking at the wheel files in the package, cvxopt 1.2.5.post1 only contains builds for Windows. For Linux (such as the Docker container), you should use cvxopt 1.2.5.
You should replace the version with 1.2.5 (pip install cvxopt==1.2.5)
The latest version, cvxopt 1.2.5.post1, is not compatible with all platforms: https://pypi.org/project/cvxopt/1.2.5.post1/#files
The previous one is compatible with far more platforms and should run on your Docker image: https://pypi.org/project/cvxopt/1.2.5/#files
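A sketch of the matching requirements.txt entries, assuming the pywin32 pin is only wanted on Windows; a PEP 508 environment marker lets pip skip it on Linux:
cvxopt==1.2.5
# only resolved when installing on Windows
pywin32==228; sys_platform == "win32"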
I'm new to Docker and I'm trying to package my pytest Python project into a Docker image; the tests are written with the pytest library.
I've created the requirements.txt file for Docker.
This is what my Dockerfile looks like right now:
FROM python:3.8.3
ADD tests/test_class.py /
COPY requirements.txt ./
RUN pip install -r requirements.txt
But when I run the container as follows:
docker run -it <image_id> py.test -s -v
It produces the following response:
platform linux -- Python 3.8.3, pytest-5.4.3, py-1.8.2, pluggy-0.13.1 -- /usr/local/bin/python
cachedir: .pytest_cache
metadata: {'Python': '3.8.3', 'Platform': 'Linux-4.19.76-linuxkit-x86_64-with-glibc2.2.5', 'Packages': {'pytest': '5.4.3', 'py': '1.8.2', 'pluggy': '0.13.1'}, 'Plugins': {'html': '2.1.1', 'metadata': '1.10.0'}}
rootdir: /
plugins: html-2.1.1, metadata-1.10.0
collecting ...
and it gets stuck at collecting ... without ever running the tests.
When I go to the Docker dashboard, I notice that the container is not running.
First, create a Dockerfile in the root directory as below:
FROM python:3.6
# Create app directory
WORKDIR /app
# copy the requirements file to the working directory
COPY requirements.txt ./
# then install the requirements before running the app
RUN pip install --no-cache-dir -r requirements.txt
# copy the rest of the file
COPY . /app
then build the image:
docker build -t my-pytest .   # here, . is the build context, i.e. the directory containing your Dockerfile
Once this is done, grab the image ID:
docker images | grep my-pytest
then execute the command:
docker run -it <IMAGE ID> py.test -s -v
Use something like this:
FROM python:3.6
# Create app directory
WORKDIR /app
# Install app dependencies
COPY src/requirements.txt ./
RUN pip install -r requirements.txt
# Bundle app source
COPY src /app
EXPOSE 8080
CMD [ "python", "server.py" ]
In the folder with your Python project code, create a Dockerfile similar to this:
# Starting from Python 3 base image
FROM python:3
# Set the WORKDIR to /app so all following commands run in /app
WORKDIR /app
# Adding (dev-)requirements files before installing requirements
COPY requirements.txt dev-requirements.txt ./
# Install requirements and dev requirements through pip. Those should include
# nose, pytest, or whichever test framework you use
RUN pip install -r requirements.txt -r dev-requirements.txt
# Adding the rest of your project code to the image
COPY . ./
Then, run your tests in the container like this:
docker run -it <image_id> pytest
I've got some packages that aren't published on PyPI,
which I've written into my requirements.txt as:
git+https://github.com/manahl/arctic.git
This seems to work OK on my local machine, but when I run docker build I get this:
Collecting git+https://github.com/manahl/arctic.git (from -r scripts/requirements.txt (line 11))
Cloning https://github.com/manahl/arctic.git to /tmp/pip-1gw7spz2-build
And it just seems to hang. It moves on silently after several minutes, but it doesn't look like it's worked at all. It seems to do this for every git based dependency.
What am I doing wrong?
Dockerfile:
FROM python:3.6.1
# Set the working directory to /app
WORKDIR /app
# Copy the current directory contents into the container at /app
ADD . /app
RUN apt-get update && apt-get install -y \
    git \
    build-essential
# Install any needed packages specified in requirements.txt
RUN pip install -r scripts/requirements.txt
# Run scheduler.py when the container launches
CMD ["python", "scheduler.py"]
If the scripts folder exists in the current directory, try RUN pip install -r /scripts/requirements.txt instead.
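Separately, if the path turns out to be correct and the clone only looks stuck, a hedged way to confirm progress is pip's verbose flag, which surfaces git's output during the build:
RUN pip install -v -r scripts/requirements.txt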