Dockerfile: /bin/sh: 1: apt-get: not found - python

When building a Docker image, I get the error
"/bin/sh: 1: apt-get: not found"
Dockerfile:
FROM python:3.8
FROM ubuntu:20.04
ENV PATH="/env/bin/activate"
RUN apt-get update -y && apt-get upgrade -y
WORKDIR /var/www/html/
COPY . .
RUN pip install -r requirements.txt
EXPOSE 8000
CMD ["python", "manage.py"]

You are setting PATH to /env/bin/activate, and that is then the only place where apt-get is searched for. There is no need to activate a virtual environment inside the container; just get rid of that line. pip can install the packages in requirements.txt into the "system" Python without issues.
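If you ever do want a virtual environment inside an image, the working pattern is to prepend its bin directory to PATH rather than pointing PATH at the activate script. A minimal sketch:
RUN python -m venv /env
ENV PATH="/env/bin:$PATH"
Every later RUN instruction and the final CMD then find the venv's python and pip first, while system binaries like apt-get stay reachable.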
You also cannot layer two images the way you are attempting to with multiple FROM statements. Just use FROM python:3.8 and drop the ubuntu line. Multiple FROM statements are for multi-stage builds, where intermediate images produce artifacts that are copied into the final image.
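For reference, a minimal multi-stage sketch (the stage name and paths here are illustrative):
FROM python:3.8 AS builder
COPY requirements.txt .
RUN pip install --prefix=/install -r requirements.txt

FROM python:3.8-slim
# copy only the installed packages, not the build tooling
COPY --from=builder /install /usr/local
This is only worth the complexity when the build stage needs tooling (compilers, headers) that the final image should not carry.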
So just do:
FROM python:3.8
RUN apt-get update -y && apt-get upgrade -y
WORKDIR /var/www/html/
COPY . .
RUN pip install -r requirements.txt
EXPOSE 8000
CMD ["python", "manage.py"]
... although why you would put Python code in /var/www/html beats me. Probably you don't really want to.
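One more thing to check: CMD ["python", "manage.py"] with no subcommand only prints Django's usage text and exits. If this is a Django project, you presumably want something like
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
(assuming the development server is what you intend to run on port 8000).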

Related

How can I run a Docker container with Python3.7 and Pipenv for a Flask app?

My Dockerfile is:
FROM ubuntu:18.04
RUN apt-get -y update
RUN apt-get install -y software-properties-common
RUN add-apt-repository ppa:deadsnakes/ppa
RUN apt-get update -y
RUN apt-get install -y python3.7 build-essential python3-pip
ENV LC_ALL C.UTF-8
ENV LANG C.UTF-8
RUN pip3 install pipenv
COPY . /app
WORKDIR /app
RUN pipenv install
EXPOSE 5000
CMD ["pipenv", "run", "python3", "application.py"]
When I do docker build -t flask-sample:latest ., it builds fine (I think).
I run it with docker run -d -p 5000:5000 flask-sample and it looks okay.
But when I go to http://localhost:5000, nothing loads. What am I doing wrong?
Why do you need a virtual environment? And why use Ubuntu as the base image?
A simpler approach would be:
Dockerfile:
FROM python:3
WORKDIR /usr/src/
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
ENTRYPOINT FLASK_APP=/usr/src/app.py flask run --host=0.0.0.0
Put the desired packages (e.g. flask) in your requirements.txt.
Build image:
docker build -t dejdej/flasky:latest .
Start container:
docker run -it -p 5000:5000 dejdej/flasky
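Note the --host=0.0.0.0 in the ENTRYPOINT above: the Flask development server binds to 127.0.0.1 by default, which is unreachable from outside the container and a common reason why http://localhost:5000 loads nothing even though the container runs. You can check what the server actually bound to (container ID from docker ps):
docker ps
docker logs <container-id>
The log should show Flask listening on 0.0.0.0, not 127.0.0.1.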
If using a virtual environment is mandatory, you can try it with virtualenv:
FROM python:2.7
# virtualenv is not preinstalled in the python:2.7 image
RUN pip install virtualenv && virtualenv /YOURENV
RUN /YOURENV/bin/pip install flask
# the application file still has to be copied into the image
COPY application.py /application.py
CMD ["/YOURENV/bin/python", "/application.py"]
Short answer:
Your CMD goes through pipenv rather than running the application directly. You need to fix the last line:
CMD ["pipenv", "run", "python3", "application.py"] should be just CMD ["python3", "application.py"]
Right answer:
I completely agree that there isn't any reason to use pipenv here. The better solution is to rewrite your Dockerfile to use a Python base image and forget pipenv. You are already in a container; there is no reason to add a virtual environment on top.
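That said, if you want to keep the Pipfile-based workflow in the image, a common compromise is to let pipenv install into the system interpreter instead of a nested virtualenv. A sketch, assuming Pipfile and Pipfile.lock sit at the build context root:
FROM python:3.7
RUN pip install pipenv
WORKDIR /app
COPY Pipfile Pipfile.lock ./
# --system installs to the image's Python; --deploy fails the build if the lock file is out of date
RUN pipenv install --system --deploy
COPY . .
CMD ["python3", "application.py"]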

Run python mysql client on slim python 3.6 docker image

I have a working service running on a python:3.6-jessie image.
I am trying to reduce the size of it to speed up serverless cold starts.
I have tried the images python:3.6-alpine, python:3.6-slim-buster and python:3.6-slim-jessie.
With all of them I end up having to install many additional packages, and I still hit the following error, which I cannot fix with more packages:
ImportError: libmysqlclient.so.18: cannot open shared object file: No such file or directory
My current Dockerfile is
FROM python:3.6-jessie as build
ENV PYTHONUNBUFFERED 0
ENV FLASK_APP "api/app.py"
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
WORKDIR /opt/venv
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
COPY . .
FROM python:3.6-slim-jessie
COPY --from=build /opt/venv /opt/venv
WORKDIR /opt/venv
RUN apt-get update
RUN apt-get --assume-yes install gcc
RUN apt-get --assume-yes install python-mysqldb
ENV PATH="/opt/venv/bin:$PATH"
RUN rm -rf configs tests draw_results env .idea .git .pytest_cache
EXPOSE 8000
CMD ["/opt/venv/run.sh"]
The relevant lines from requirements.txt:
mysqlclient==1.4.2.post1
PyMySQL==0.9.3
Flask-SQLAlchemy==2.3.2
SQLAlchemy==1.3.0
The run.sh is just my gunicorn start command.
Is there a package I can install to fix this last issue? Is there some other MySQL library I should be using, or some other way to fix this? Or should I just stick with the full python:3.6 images when I need a MySQL client?
I'm using python:3.7-slim with the following command:
RUN apt-get -y install default-libmysqlclient-dev
Try adding this line to the Dockerfile:
RUN apt-get install -y libmysqlclient-dev
For Python slim-buster (Debian-based) images you can run this command in your Dockerfile:
RUN apt-get update && apt-get install -y default-mysql-client
This worked for me with python:3.10.6-slim-buster.
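Note that with the multi-stage Dockerfile above, whatever provides libmysqlclient.so.18 has to be installed in the final stage, not the build stage, because only /opt/venv is copied across. A sketch of the relevant lines (package names vary by Debian release: jessie ships libmysqlclient18, while newer releases provide the library via default-libmysqlclient-dev, as the answers above use):
FROM python:3.6-slim-jessie
RUN apt-get update && \
    apt-get install -y --no-install-recommends libmysqlclient18 && \
    rm -rf /var/lib/apt/lists/*
COPY --from=build /opt/venv /opt/venv
Alternatively, PyMySQL (already in the requirements) is pure Python and needs no system library, if SQLAlchemy is pointed at it with a mysql+pymysql:// connection URL.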

Why can't my container find a pip installed package (via git)?

I have a Dockerfile
FROM ubuntu:xenial
LABEL maintainer="info@martin-thoma.com"
# Settings for the local user to create
ENV APP_USER docker
ENV APP_USER_UID 9999
ENV APP_USER_GROUP docker
ENV APP_USER_GROUP_GID 4711
ENV PYTHONIOENCODING utf-8
# Install and update software
RUN apt-get update -y && apt-get install -y --fix-missing git python-pip python-dev build-essential poppler-utils libmysqlclient-dev
RUN pip install pip --upgrade
# Copy projects code
COPY . /opt/app
WORKDIR /opt/app
RUN pip install -r requirements.txt
# Create user
RUN groupadd --gid ${APP_USER_GROUP_GID} ${APP_USER_GROUP} \
&& useradd --uid ${APP_USER_UID} --create-home -g ${APP_USER_GROUP} ${APP_USER} \
&& chown -R $APP_USER:$APP_USER_GROUP /opt/app
# Start app
USER docker
RUN mkdir -p /opt/app/filestorage
ENTRYPOINT ["python"]
CMD ["app.py"]
and a requirements.txt
-e git+https://github.com/ecederstrand/exchangelib.git@85eada6d59d0e2c757ef17c6ce143f3c976d2a90#egg=exchangelib
Flask==0.12.2
fuzzywuzzy==0.15.1
When I change the exchangelib line to just exchangelib (i.e. using the PyPI version rather than git), the import works, but my code doesn't, as I need some of the recent changes.
When I have this, I get:
web_1 | ImportError: No module named exchangelib
What is the problem? Why can't my container find a pip installed package (via git)? How do I fix it?
My intuition is that the problem is that I install it as the root user, but the application runs as another user. The PyPI packages seem to get installed for all users while the editable is only local. But I still don't know how to fix it.
Simply using
git+git://github.com/ecederstrand/exchangelib.git@85eada6d59d0e2c757ef17c6ce143f3c976d2a90#egg=exchangelib
as a line in the requirements.txt worked. No change in the Dockerfile was necessary.
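For reference, GitHub no longer serves the unauthenticated git:// protocol, so today the same pin should use https:
git+https://github.com/ecederstrand/exchangelib.git@85eada6d59d0e2c757ef17c6ce143f3c976d2a90#egg=exchangelib
The part that fixed the import error is dropping -e: a non-editable install copies the package into site-packages, whereas an editable install leaves the code in a src/ checkout and merely links to it, which is easy to break, for example when a volume is later mounted over the application directory at runtime.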

Docker build seems to hang when installing non-pip Python packages

I've got some packages that aren't on PyPI, which I've written into my requirements.txt as:
git+https://github.com/manahl/arctic.git
This seems to work OK on my localhost, but when I do docker build I get this:
Collecting git+https://github.com/manahl/arctic.git (from -r scripts/requirements.txt (line 11))
  Cloning https://github.com/manahl/arctic.git to /tmp/pip-1gw7spz2-build
And it just seems to hang. It moves on silently after several minutes, but it doesn't look like it has worked at all. It seems to do this for every git-based dependency.
What am I doing wrong?
Dockerfile:
FROM python:3.6.1
# Set the working directory to /app
WORKDIR /app
# Copy the current directory contents into the container at /app
ADD . /app
RUN apt-get update && apt-get install -y \
    git \
    build-essential
# Install any needed packages specified in requirements.txt
RUN pip install -r scripts/requirements.txt
# Run app.py when the container launches
CMD ["python", "scheduler.py"]
If the scripts folder exists in the current directory, try RUN pip install -r /scripts/requirements.txt
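If you want to see whether the clone is actually progressing rather than hung, increase pip's verbosity and ask Docker for plain progress output. A sketch:
RUN pip install -v -r scripts/requirements.txt
and on the host:
DOCKER_BUILDKIT=1 docker build --progress=plain -t myimage .
(-v is standard pip verbosity; --progress=plain is available with BuildKit. The image tag here is illustrative.)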

Docker input/output outside the container

I created a Docker container with a Python script. The script takes an input file, does some processing, and saves an output file at a specified location.
docker run /app/script.py --input /data/input.csv --output /data/output.csv
Since the input file can be different every time I run the script, I want it to live outside the Docker container. I would also like to store the output somewhere outside the container.
docker run /app/script.py --input /my/local/location/outside/docker/input.csv --output /my/local/location/outside/docker/output.csv
Does Docker support this? If so, how can I achieve it?
My Dockerfile looks like the following:
FROM phusion/baseimage
RUN apt-get update
RUN apt-get install -y build-essential
RUN apt-get install -y python-dev
RUN apt-get install -y python-pip
RUN apt-get install -y python-numpy && \
apt-get install -y python-scipy
COPY ./requirements.txt /app/requirements.txt
COPY ./src/script.py /app/script.py
WORKDIR /app
COPY . /app
You could mount a directory with the file inside as a Docker data volume using the -v option: https://docs.docker.com/engine/tutorials/dockervolumes/
docker run -d -P --name myapp -v /path/on/host:/app mydir/app python script.py
This has the added benefit that you can stop the container, change the file on the host, and restart the container to see the change reflected inside it.
So you should add to your Dockerfile a line
ENTRYPOINT ["python", "/app/script.py"]
and a default argument list such as
CMD ["myinput"]
or something similar. See
What is the difference between CMD and ENTRYPOINT in a Dockerfile?
and the docs:
https://docs.docker.com/engine/reference/builder/#entrypoint
https://docs.docker.com/engine/reference/builder/#cmd
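Putting the two answers together, a minimal end-to-end sketch (the image name is illustrative):
docker build -t myscript .
docker run --rm -v /my/local/location/outside/docker:/data myscript --input /data/input.csv --output /data/output.csv
With ENTRYPOINT ["python", "/app/script.py"] baked into the image, everything after the image name is passed to the script as arguments, and /data inside the container is the mounted host directory, so the input and output files live outside the container as desired.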
