I'm using the python:3.7-slim-buster Docker image for my Django project. Now I want to use the Geo features of Django, but it seems I have to install GDAL. So I added RUN apt-get install -y gdal-bin and the build fails with "E: Unable to locate package gdal-bin".
Here is my Dockerfile:
FROM python:3.7-slim-buster
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# DB vars
ENV DB_USER_NAME ${DB_USER_NAME}
ENV DB_NAME ${DB_NAME}
ENV DB_HOST ${DB_HOST}
ENV DB_PORT ${DB_PORT}
ENV DB_PASSWORD ${DB_PASSWORD}
ENV DJANGO_SECRET_KEY ${DJANGO_SECRET_KEY}
RUN apt-get install -y gdal-bin python-gdal python3-gdal
RUN ["adduser", "${USER_NAME}", "--disabled-password", "--ingroup", "www-data", "--quiet"]
USER ${USER_NAME}
ADD ${PROJECT_NAME}/ /home/${USER_NAME}/${PROJECT_NAME}
WORKDIR /home/${USER_NAME}/${PROJECT_NAME}
ENV PATH="/home/${USER_NAME}/.local/bin:${PATH}:/usr/local/python3/bin"
RUN pip install --user -r requirements.txt
CMD python manage.py runserver 0.0.0.0:9000
#CMD gunicorn ${PROJECT_NAME}.wsgi:application --bind 0.0.0.0:8000
EXPOSE 8000
You need to do the following:
RUN apt-get update
RUN apt-get install -y software-properties-common && apt-get update
RUN apt-get install -y python3.7-dev
RUN add-apt-repository ppa:ubuntugis/ppa && apt-get update
RUN apt-get install -y gdal-bin libgdal-dev
ARG CPLUS_INCLUDE_PATH=/usr/include/gdal
ARG C_INCLUDE_PATH=/usr/include/gdal
RUN pip install GDAL
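One caveat with that last step: the GDAL wheel from pip has to match the libgdal version that apt installed, or the build fails. A common way to pin it (a sketch, assuming gdal-config is on the PATH after installing libgdal-dev):
RUN pip install GDAL==$(gdal-config --version)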
If you can use another base image, here is one with GDAL preinstalled:
FROM osgeo/gdal:ubuntu-small-3.2.0
That's because your image doesn't have a repository enabled that contains the gdal-bin package. So you have to add the repository (you can see the guideline here) and install it:
RUN add-apt-repository ppa:ubuntugis/ppa && apt-get update && apt-get install -y gdal-bin python-gdal python3-gdal
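For the original python:3.7-slim-buster image specifically, a PPA may not even be needed: slim-buster is Debian-based, and buster's own repositories carry GDAL, so the missing apt-get update is often the real cause of "Unable to locate package". A minimal sketch under that assumption:
RUN apt-get update \
    && apt-get install -y gdal-bin libgdal-dev \
    && rm -rf /var/lib/apt/lists/*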
I have started learning Docker, and I have developed a Python package (not published anywhere, it is just used internally) that installs and works fine locally (here I will call it mypackage). However, when trying to install it in a Docker container, Python inside the container fails to recognise it, even though no error was raised during the build of the image. The Dockerfile looks like this:
# install Ubuntu 20.04
FROM ubuntu:20.04
# update Ubuntu packages
ARG DEBIAN_FRONTEND=noninteractive
RUN apt update
RUN apt upgrade -y
RUN apt install -y apt-utils \
build-essential \
curl \
mysql-server \
libmysqlclient-dev \
libffi-dev \
libssl-dev \
libxml2-dev \
libxslt1-dev \
unzip \
zlib1g-dev
# install Python 3.9
RUN apt-get install -y software-properties-common gcc && \
add-apt-repository -y ppa:deadsnakes/ppa
RUN apt-get update && apt-get install -y python3.9 python3.9-dev python3.9-distutils python3-pip python3-apt python3.9-venv
# make symlink (overriding default Python3.8 installed with Ubuntu)
RUN rm /usr/bin/python3
RUN ln -s /usr/bin/python3.9 /usr/bin/python3
# copy package files and source code
RUN mkdir mypackage
COPY pyproject.toml setup.cfg setup.py requirements.txt ./mypackage/
COPY src mypackage/src/
# add path
ENV PACKAGE_PATH=/mypackage/
ENV PATH="$PACKAGE_PATH/:$PATH"
# install mypackage
RUN pip3 install -e ./mypackage
CMD ["python3.9", "main.py"]
So the above builds successfully, but if I run sudo docker run -it test_image bin/bash and then pip3 list, the package is not there, and I get a ModuleNotFoundError when running code that depends on mypackage. Interestingly, if I create a virtual environment by replacing this:
ENV PACKAGE_PATH=/mypackage/
ENV PATH="$PACKAGE_PATH/:$PATH"
by this:
ENV VIRTUAL_ENV=/opt/venv
RUN python3.9 -m venv $VIRTUAL_ENV
ENV PATH="$VIRTUAL_ENV/bin:$PATH"
it works. Ideally, I want to know why I need to create a virtual environment, and how I can run local packages in a container without creating one.
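One way to narrow this down (diagnostic commands only, a sketch; mypackage is the placeholder name from the question): check which interpreter pip3 actually belongs to and where the symlinked python3.9 searches, since mixing the distribution's pip with a re-pointed python3 can leave the install in a site-packages directory the interpreter never looks at.
# inside the container
pip3 --version                             # which Python this pip belongs to
pip3 show mypackage                        # where pip placed the package, if anywhere
python3 -c "import sys; print(sys.path)"   # where the symlinked python3.9 searches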
My Python program creates a folder and puts some files in it. But when I run the program inside Docker via CMD, it creates the folder and puts the files there, yet upon completion the folder is somehow gone and doesn't show up inside the Docker image.
I have tried the following things:
Check the folder exists right after creating it - it shows the folder was created.
Check inside the docker image using bash - it doesn't show the folder or its contents.
The Dockerfile is:
FROM ubuntu:18.04
# Upgrade installed packages
RUN apt update
RUN apt upgrade -y
ENV TZ=Europe/London
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
RUN apt-get install -y libreadline-gplv2-dev libncursesw5-dev libssl-dev libsqlite3-dev tk-dev libgdbm-dev libc6-dev libbz2-dev
WORKDIR /code
RUN apt-get -y install python3-pip
RUN apt-get -y install python3-venv
RUN apt -y install python3-setuptools libffi-dev python3-dev
RUN apt install -y curl
RUN apt install -y unzip
RUN apt-get install -y build-essential swig
WORKDIR /code
RUN python3 -m venv .env
RUN . .env/bin/activate && pip install --upgrade pip && curl https://raw.githubusercontent.com/automl/auto-sklearn/master/requirements.txt | LC_ALL=C.UTF-8 xargs -n 1 -L 1 pip install
COPY requirements.txt requirements.txt
RUN . .env/bin/activate && pip install pyenchant && pip install -r requirements.txt
RUN apt install -y libgl1-mesa-glx
RUN apt-get install -y libglib2.0-0
RUN apt-get install -y libenchant1c2a
RUN mkdir embeddings
COPY . .
RUN curl -L http://nlp.stanford.edu/data/glove.6B.zip --output glove.zip
RUN unzip -o glove.zip -d embeddings/
RUN . .env/bin/activate && python nltk_install.py
CMD . .env/bin/activate && python main.py
Changes to the filesystem are not stored in the Docker image. They exist only in the container created from that image, and each 'docker run' command creates a fresh container.
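If you need to get at the files the program wrote, two standard options (a sketch; the image and container names are examples, and /code/embeddings comes from the Dockerfile above):
# run once with a fixed name, then copy the files out of the stopped container
docker run --name myrun myimage
docker cp myrun:/code/embeddings ./embeddings
# or mount a host directory so the files outlive the container
docker run -v "$PWD/embeddings:/code/embeddings" myimage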
I made the image from ubuntu:18.04 and installed Python.
However, when I put this in docker-compose:
command: python manage.py runserver
it shows a path error.
Maybe I didn't set the path?
But how do I set the path for the docker user?
ERROR: for django Cannot start service django: OCI runtime create failed: container_linux.go:349: starting container process caused "exec: \"python\": executable file not found in $PATH": unknown
ERROR: Encountered errors while bringing up the project.
FROM ubuntu:18.04
ENV PYTHONUNBUFFERED 1
RUN apt-get -y update
RUN apt-get -y install emacs wget
RUN apt-get -y install apache2-dev mysql-client
RUN apt-get -y install mysql-server libmysqlclient-dev
RUN apt-get install -y software-properties-common
RUN add-apt-repository -y ppa:deadsnakes/ppa
RUN apt-get install -y python3.7
RUN apt-get install -y python-pip
RUN pip install uwsgi django mysqlclient tensorflow_hub django-mysql django-extensions djangorestframework django-filter requests_oauthlib mecab-python3 neologdn gensim janome --no-input
RUN pip install keras tensorflow==1.14.0 --no-cache-dir --no-input
RUN mkdir /code
WORKDIR /code
ADD ./src /code/
You can solve this in two ways (both work for me):
in docker-compose add:
command: bash -c 'python manage.py runserver'
or you can add a CMD instruction in your Dockerfile:
CMD python manage.py runserver
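If python is still not found, note that the Dockerfile above only installs python3.7 from the deadsnakes PPA, so a bare python executable may simply not exist on the PATH (an assumption drawn from that Dockerfile, not a verified diagnosis). Pointing at the binary explicitly, or adding a symlink, sidesteps that:
command: python3.7 manage.py runserver 0.0.0.0:8000
# or, in the Dockerfile:
RUN ln -s /usr/bin/python3.7 /usr/local/bin/python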
I have my Python project with tesseract running locally, and it works in PyCharm.
I used docker-compose.yml, having two containers (app and t4re) as follows:
version: '3'
services:
app:
build: .
image: ocr_app:latest
depends_on:
- tesseract
tesseract:
image: tesseractshadow/tesseract4re
container_name: t4re
and my Dockerfile is as follows:
FROM python:3.6.1
# Create app directory
WORKDIR /app
# Bundle app source
COPY venv/src ./src
COPY venv/data ./data
# Install app dependencies
RUN pip install -r src/requirements.txt
CMD python src/ocr.py
and I keep getting these errors:
FileNotFoundError: [Errno 2] No such file or directory: 'tesseract'
pytesseract.pytesseract.TesseractNotFoundError: tesseract is not installed or it's not in your path
I am new to Docker and have read tons of documentation, but I still cannot manage to fix this error. I've read the following answers. I guess I have to link tesseract to the Python app with an environment variable, but I do not know how.
Use Tesseract 4 - Docker Container from uwsgi-nginx-flask-docker
TesseractNotFoundError: tesseract is not installed or it's not in your path
You need to install tesseract in your Docker image before using it. By default the python:3.6.1 image does not have tesseract in it. You can take an Ubuntu base image, install tesseract and Python in it, and then continue your work.
Here is the Dockerfile for the solution:
FROM ubuntu:18.04
RUN apt-get --fix-missing update && apt-get --fix-broken install && apt-get install -y poppler-utils && apt-get install -y tesseract-ocr && \
apt-get install -y libtesseract-dev && apt-get install -y libleptonica-dev && ldconfig && apt-get install -y python3.6 && \
apt-get install -y python3-pip && apt install -y libsm6 libxext6
Please adjust the Python version to your requirements.
I had this issue on one of my projects that runs on Docker (an Ubuntu container).
To solve it, I had to:
- install pytesseract via requirements.txt; so your requirements.txt should contain:
pytesseract
- install tesseract-ocr; to do that, include the following lines in your Dockerfile:
FROM ubuntu:18.04
ENV PYTHONUNBUFFERED 1
RUN apt-get update && apt-get install -y software-properties-common && add-apt-repository -y ppa:alex-p/tesseract-ocr
RUN apt-get update && apt-get install -y tesseract-ocr-all
RUN apt-get install -y python3-pip python3-minimal libsm6 libxext6
# To make sure that tesseract-ocr is installed, uncomment the following line.
# RUN tesseract --version
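If tesseract is installed but pytesseract still cannot find it, you can also point pytesseract at the binary explicitly (a sketch; /usr/bin/tesseract is where apt usually puts it, verify with which tesseract):
import pytesseract
# tell pytesseract where the tesseract binary lives
pytesseract.pytesseract.tesseract_cmd = "/usr/bin/tesseract"
print(pytesseract.get_tesseract_version())  # should print the installed version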
I'm trying to dockerize a Django/Python project: badgr-server from here:
I succeeded in deploying it on localhost on Ubuntu 18.04 without Docker.
Then I tried to dockerize it; the build went well. When I ran:
docker container run -it -p 8000:8000 badgr python root/badgr/code/manage.py runserver
there was nothing on localhost:8000.
Note: docker container run -it -p 8000:8000 badgr python ./manage.py won't work.
output:
?: (rest_framework.W001) You have specified a default PAGE_SIZE pagination rest_framework setting,without specifying also a DEFAULT_PAGINATION_CLASS.
HINT: The default for DEFAULT_PAGINATION_CLASS is None. In previous versions this was PageNumberPagination. If you wish to define PAGE_SIZE globally whilst defining pagination_class on a per-view basis you may silence this check.
System check identified 1 issue (0 silenced).
August 06, 2019 - 10:01:22
Django version 1.11.21, using settings 'mainsite.settings_local'
Starting development server at http://127.0.0.1:8000/
Quit the server with CONTROL-C.
In settings_local.py I changed ALLOWED_HOSTS to:
ALLOWED_HOSTS = ['*']
Thanks!
**Extra advice is more than welcome!**
This is the Dockerfile:
FROM ubuntu:18.04
# Preparation
RUN apt-get update
# Install server dependencies
RUN apt-get install -y curl git git-core python-virtualenv gcc python-pip python-dev libjpeg-turbo8 libjpeg-turbo8-dev zlib1g-dev libldap2-dev libsasl2-dev swig libxslt-dev automake autoconf libtool libffi-dev libcairo2-dev libssl-dev
RUN pip install virtualenv --upgrade
#RUN apt install libjpeg8-dev zlib1g-dev -y libcairo2
RUN pip install pillow
# Install database
RUN apt-get install -y libmariadbclient-dev zlib1g-dev libssl-dev
# Install main dependencies
RUN apt-get install -y libffi-dev libxslt-dev libsasl2-dev libldap2-dev
RUN apt-get install -y libmariadbclient-dev zlib1g-dev python-dev libssl-dev python-virtualenv
# Install other useful tools
RUN apt-get install -y git vim sudo curl unzip
RUN apt-get install -y sqlite3
# Cleaning
RUN apt-get clean
RUN apt-get purge
# ADD settings.py /root/settings.py
ADD settings_local.py /root/settings_local.py
# Install the backend
RUN mkdir ~/badgr \
&& cd ~/badgr \
&& git clone https://github.com/concentricsky/badgr-server.git code \
&& cd code \
&& pip install -r requirements.txt \
&& cp /root/settings_local.py apps/mainsite/ \
&& ./manage.py migrate \
&& ./manage.py dist
EXPOSE 8000
I ran docker container run --net=host -it -p 8000:8000 badgrrr python root/badgr/code/manage.py runserver and it worked!
Does anyone know why it doesn't work on the default network?
Is it wrong to run it like this?
Thanks.
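A likely explanation, inferred from the startup log above (the dev server reports http://127.0.0.1:8000/): runserver binds to 127.0.0.1 inside the container by default, so on the default bridge network the published port has nothing to forward to; --net=host masks that by sharing the host's network namespace. Binding to 0.0.0.0 should make -p 8000:8000 work without --net=host:
docker container run -it -p 8000:8000 badgr \
    python root/badgr/code/manage.py runserver 0.0.0.0:8000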