How to make docker build run python manage.py migrate - python

I'm very new to Docker and am trying to use it with Django. Here is my Dockerfile:
FROM python:3.6
RUN mkdir /app
WORKDIR /app
ADD . /app/
ENV PYTHONUNBUFFERED 1
ENV LANG C.UTF-8
ENV DEBIAN_FRONTEND=noninteractive
ENV PORT=8000
RUN apt-get update && apt-get install -y --no-install-recommends \
        tzdata \
        python3-setuptools \
        python3-pip \
        python3-dev \
        python3-venv \
        git \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/*
RUN pip3 install --upgrade pip
RUN pip3 install pipenv
RUN pip install -r requirements.txt && python manage.py migrate
EXPOSE 8888
CMD gunicorn g_attend.wsgi:application --bind 0.0.0.0:$PORT
It builds and runs normally, but it never applies the migrations. Any help?
Note: Pardon me if this is a beginner question; it's my first time with Docker and I can't find clear documentation for Docker/Django.

First of all, you should not run migrations in your Dockerfile: at image build time there is usually no database reachable, and migrations need to run against the live database each time you deploy, not once at build. A good practice is to create an entrypoint.sh instead.
This is an example entrypoint file:
#!/bin/bash
set -e
echo "${0}: running migrations."
python manage.py makemigrations --merge
python manage.py migrate --noinput
echo "${0}: collecting static files."
python manage.py collectstatic --noinput
cp -rv static/* static_shared/
exec gunicorn yourapp.wsgi:application \
    --env DJANGO_SETTINGS_MODULE=yourapp.production_settings \
    --name yourapp \
    --bind 0.0.0.0:8000 \
    --timeout 600 \
    --workers 4 \
    --log-level=info \
    --reload
Additionally I recommend using docker-compose, which helps to organize your deployment in one place.
Example:
version: '3'
services:
  web:
    build:
      context: .
      dockerfile: Dockerfile
    command:
      - /bin/sh
      - '-c'
      - '/code/entrypoint.sh'
    ports:
      - '8000:8000'
    volumes:
      - '.:/code'
      - 'media_volume:/media'
volumes:
  media_volume:
And an example Dockerfile:
FROM python:3.6.8
RUN apt-get update;
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
ADD requirements.txt /code
ADD entrypoint.sh /code
WORKDIR /code
RUN chmod +x *.sh
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
ADD . /code

Based on @sebb's answer, I created a docker-compose.yml file, but the entrypoint.sh didn't work as expected. After some searching, I added the migration commands to the docker-compose file instead. Here is how the files looked in the end:
Dockerfile
FROM python:3
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
COPY requirements.txt /code/
COPY entrypoint.sh /code/
RUN pip install -r requirements.txt
COPY . /code/
docker-compose.yml
version: '3'
services:
  db:
    image: postgres
  web:
    build: .
    command: bash -c "python manage.py makemigrations && python manage.py migrate && python manage.py runserver 0.0.0.0:8000"
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
And finally, it worked.
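One caveat worth noting (my addition, not part of the original answer): depends_on only waits for the db container to start, not for PostgreSQL to accept connections, so the migrate step can still fail on a cold start. A hedged sketch of a fix, assuming a Compose version recent enough to support healthcheck conditions in depends_on:

```yaml
services:
  db:
    image: postgres
    healthcheck:
      # pg_isready ships with the official postgres image
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 2s
      timeout: 3s
      retries: 10
  web:
    build: .
    command: bash -c "python manage.py migrate && python manage.py runserver 0.0.0.0:8000"
    depends_on:
      db:
        # start web only after the healthcheck passes
        condition: service_healthy
```

With this in place, docker-compose up delays the web service until pg_isready succeeds inside the db container, so the migrations find a database that is actually accepting connections.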

Related

How to deploy a Docker image and the environment variables in a virtual machine?

I created two images with docker-compose on my laptop and pushed them to a private repo. Now I would like to deploy the images in a virtual machine, so I did docker pull to fetch them and docker run -d to run the containers. The problem is that a container can't start because the .env file with all the variables isn't found.
/entrypoint: line 12: STRATEGY_DB_NAME: unbound variable
Waiting for PostgreSQL to become available...
This file is present on my laptop at ./strategy/.env and I'm wondering where I'm supposed to copy this file on the virtual machine.
I'm new to deployment and it's not clear whether I need to upload all my project files to the virtual machine with git push and then docker run the Docker image. I assume that's not necessary because these files are embedded in the Docker image.
My question is: why can't the .env file be found, and where am I supposed to copy it?
docker-compose.yml
version: '3.8'
services:
  django:
    build:
      context: .
      dockerfile: ./compose/local/django/Dockerfile
    image: strategy
    command: /start
    volumes:
      - .:/app
    ports:
      - "8004:8004"
    env_file:
      - strategy/.env  # <--- file with secrets
    depends_on:
      - redis-4
    networks:
      - mynetwork
Dockerfile
FROM python:3.10
ENV PYTHONUNBUFFERED 1
ENV PYTHONDONTWRITEBYTECODE 1
WORKDIR /app
RUN apt-get update \
    && apt-get install -y gcc build-essential \
    && apt-get install -y libpq-dev \
    && apt-get install -y gettext \
    && apt-get install -y git \
    && apt-get install -y openssh-client \
    && apt-get install -y libcurl4-openssl-dev libssl-dev \
    && apt-get install -y build-essential \
    && apt-get install -y libpq-dev \
    && apt-get install -y procps telnet \
    && apt-get install -y nano \
    && apt-get install -y postgresql-client \
    && apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \
    && rm -rf /var/lib/apt/lists/*
RUN mkdir logs \
    && touch logs/flat_line.log \
    && touch logs/json.log
# Requirements are installed here to ensure they will be cached.
COPY ./requirements.txt /requirements.txt
RUN pip install -r /requirements.txt
RUN pip install ccxt --upgrade \
&& pip install numpy --upgrade \
&& pip install psycopg2 --upgrade
COPY ./compose/local/django/entrypoint /entrypoint
RUN chmod +x /entrypoint
COPY ./compose/local/django/start /start
RUN chmod +x /start
COPY ./compose/local/django/celery/worker/start /start-celeryworker
RUN chmod +x /start-celeryworker
COPY ./compose/local/django/celery/beat/start /start-celerybeat
RUN chmod +x /start-celerybeat
COPY ./compose/local/django/celery/flower/start /start-flower
RUN chmod +x /start-flower
RUN mkdir /app/strategy
COPY ssl /app/strategy
ENTRYPOINT ["/entrypoint"]
entrypoint
#!/bin/bash
# if any of the commands in your code fails for any reason, the entire script fails
# fail exit if one of your pipe command fails
# exits if any of your variables is not set
set -o errexit
set -o pipefail
set -o nounset
postgres_ready() {
python << END
import sys

import psycopg2

try:
    psycopg2.connect(
        dbname="${STRATEGY_DB_NAME}",
        user="${STRATEGY_DB_USER}",
        password="${STRATEGY_DB_PASSWORD}",
        host="${STRATEGY_DB_HOST}",
        port="${STRATEGY_DB_PORT}",
        sslmode="require",
    )
except psycopg2.OperationalError:
    sys.exit(-1)
sys.exit(0)
END
}
until postgres_ready; do
    >&2 echo 'Waiting for PostgreSQL to become available...'
    sleep 1
done
>&2 echo 'PostgreSQL is available'
exec "$@"
Supposing you want to run a container from your image with docker-compose, you need to specify your environment file with env_file:
# docker-compose.yml
version: '3.8'
services:
  container_name:
    image: image:1.0
    ...
    env_file: .env
    ...
As with plain docker run, you can pass the file with the --env-file option; note that it must come before the image name, otherwise it is passed as an argument to the container:
docker run --env-file .env image:1.0

Multistage docker build for Django

I am dockerizing my Django application with a Docker multi-stage build. Now I am facing an issue with dependencies.
Dockerfile
FROM python:3.8-slim-buster AS base
WORKDIR /app
RUN python -m venv venv
ENV PATH="/app/venv:$PATH"
COPY requirements.txt .
RUN pip install -r requirements.txt \
    && pip install gunicorn
COPY entrypoint.sh .
COPY . .
FROM python:3.8-slim-buster
WORKDIR /app
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
COPY --from=base /app /app/
ENV PATH="/app/venv:$PATH"
ENTRYPOINT sh entrypoint.sh
When running the container it raises import error.
ImportError: Couldn't import Django. Are you sure it's installed and available on your PYTHONPATH environment variable? Did you forget to activate a virtual environment?
I ran into the same situation a few months ago; it was a conflict between two packages, so Django was not installed during the pip install.
You can add the '--progress=plain' option to your docker build command to see whether everything is OK during the build:
$ docker build --no-cache --progress=plain -t my_service_name .
Baptiste.
Here is a working multistage Docker build for Django.
# builder stage
FROM python:3.8-slim-buster AS builder
ENV PYTHONUNBUFFERED 1
WORKDIR /opt/webapp/
ENV VIRTUAL_ENV=/opt/venv
RUN python3 -m venv $VIRTUAL_ENV
ENV PATH="$VIRTUAL_ENV/bin:$PATH"
# copy requirements.txt
COPY ./requirements.txt /opt/webapp/requirements.txt
RUN pip3 install -r requirements.txt --no-cache-dir
# runner stage
FROM python:3.8-slim-buster AS runner
ARG SECRET_KEY
ARG DEBUG
WORKDIR /opt/webapp
RUN groupadd -r django \
&& useradd -d /opt/webapp -r -g django django \
&& chown django:django -R /opt/webapp
USER django
COPY --from=builder /opt/venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
COPY --chown=django:django . /opt/webapp/
# run the server
CMD gunicorn conf.wsgi:application --bind 0.0.0.0:8000

Django database does not exist in PostgreSQL container?

How can I connect my PostgreSQL database container to my Django application?
How can I create a database in PostgreSQL while building the image, given that I have a separate container for PostgreSQL?
Dockerfile
FROM ubuntu
ENV PATH="/scripts:${PATH}"
RUN apt update -y
RUN apt-get install -y debconf-utils
RUN apt install python3.8 -y
RUN apt install python3-pip -y
RUN echo 'tzdata tzdata/Areas select Asia' | debconf-set-selections
RUN echo 'tzdata tzdata/Zones/Asia select Kolkata' | debconf-set-selections
RUN DEBIAN_FRONTEND="noninteractive" apt install -y tzdata
RUN apt-get install -y gdal-bin
RUN apt-get install -y libgdal-dev
COPY ./requirements.txt /requirements.txt
RUN pip install -r requirements.txt
RUN mkdir /app
COPY ./app /app
WORKDIR /app
COPY ./scripts /scripts
RUN chmod +x /scripts/*
# RUN mkdir -p /vol/web/media
# RUN mkdir -p /vol/web/static
# RUN mkdir -p /vol/web/media
# RUN adduser --disabled-password user
# RUN chown -R user:user /vol
# RUN chmod -R 755 /vol/web
# USER user
CMD ["entrypoint.sh"]
docker-compose.yml
version: '3.8'
services:
  app:
    build:
      context: .
    environment:
      - SECRET_KEY=changeme
      - ALLOWED_HOSTS=127.0.0.1,localhost
    depends_on:
      - db
  db:
    image: postgres
    restart: always
    volumes:
      - static_data:/static/db
    ports:
      - 5432:5432
    container_name: ae73234b58e8
  proxy:
    build:
      context: ./proxy
    volumes:
      - static_data:/vol/static
    ports:
      - 80:8080
    depends_on:
      - app
volumes:
  static_data:
So here I need to create a database when the image is built. How can I do that?
You can add environment variables to the db service.
Set the POSTGRES_DB environment variable to the name of the database your Django settings use; the postgres image creates it automatically the first time the container starts.
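A minimal sketch of that db service (the database name and credentials below are placeholders, not taken from the question; also note the official postgres image only creates the database on the first start of an empty data volume):

```yaml
services:
  db:
    image: postgres
    restart: always
    environment:
      # all three are read by the official postgres image on first start;
      # the POSTGRES_DB database is created automatically if missing
      - POSTGRES_DB=mydb
      - POSTGRES_USER=myuser
      - POSTGRES_PASSWORD=changeme
    ports:
      - 5432:5432
```

Django then connects with matching NAME/USER/PASSWORD values in its DATABASES setting, using db as the HOST, since both services share the Compose network.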

Directory not copied - Docker

Dockerfile:
# syntax=docker/dockerfile:1
FROM python:alpine3.14 AS cython-compile
WORKDIR /tmp/cython
COPY /data/python .
RUN pip3 install --upgrade pip && \
    pip3 install --no-cache-dir cython && \
    apk add --no-cache --virtual .build-dependencies gcc musl-dev && \
    python3 setup.py build
FROM alpine:latest
WORKDIR /data
COPY --from=cython-compile /tmp/cython .
docker-compose.yml:
version: "3.9"
services:
  testtest:
    container_name: ztz-test
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "7776:7776"
    volumes:
      - .:/data
When I run the command docker-compose build there is no error at all, but the file compiled by Cython is not copied. I have confirmed that the file is in /tmp/cython by commenting out these lines:
FROM alpine:latest
WORKDIR /data
COPY --from=cython-compile /tmp/cython .

Permission denied after creating django app inside docker container

So I am following this tutorial and have gotten all the way to the 'media' section, and when I run the command:
docker-compose exec web python manage.py startapp upload
it all works fine, but when I open the newly created views.py file, edit it, and try to save, I get a permission denied error. I can open and edit the file as root, but not through my Atom code editor. I don't know where I'm going wrong; can someone help me? Here's my code:
Dockerfile:
# pull official base image
FROM python:3.8.3-alpine
# set work directory
WORKDIR /usr/src/app
# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# install psycopg2 dependencies
RUN apk update \
    && apk add postgresql-dev gcc python3-dev musl-dev
# install dependencies
RUN pip install --upgrade pip
COPY ./requirements.txt .
RUN pip install -r requirements.txt
# copy entrypoint.sh
COPY ./entrypoint.sh .
# copy project
COPY . .
# run entrypoint.sh
ENTRYPOINT ["/usr/src/app/entrypoint.sh"]
docker-compose.yml:
services:
  web:
    build: ./app
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./app/:/usr/src/app/
    ports:
      - 8000:8000
    env_file:
      - ./.env.dev
    depends_on:
      - db
  db:
    image: postgres:12.0-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - POSTGRES_USER=hello_django
      - POSTGRES_PASSWORD=hello_django
      - POSTGRES_DB=hello_django_dev
volumes:
  postgres_data:
entrypoint.sh:
if [ "$DATABASE" = "postgres" ]
then
    echo "Waiting for postgres..."
    while ! nc -z $SQL_HOST $SQL_PORT; do
        sleep 0.1
    done
    echo "PostgreSQL started"
fi
# python manage.py flush --no-input
# python manage.py migrate
exec "$@"
Try issuing chmod -R 777 on the folder where the project is located.
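chmod 777 works, but it is very permissive. An alternative worth knowing (my addition, not from the answer above): the new files are root-owned because the container runs as root, so running the web service as your host user avoids the ownership mismatch entirely. A sketch, assuming your host UID and GID are both 1000:

```yaml
services:
  web:
    build: ./app
    # run as the host user so files created inside the bind mount
    # (e.g. by manage.py startapp) are owned by you instead of root;
    # replace 1000:1000 with the output of `id -u` and `id -g`
    user: "1000:1000"
    volumes:
      - ./app/:/usr/src/app/
```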
