Unable to build the image in docker - python

When trying to build the image, I'm getting the error below. I am also including the related project files.
Dockerfile
docker-compose.yml
__init__.py
manage.py
Error:
Building users-service
Step 1/7 : FROM python:3.6.1
ERROR: Service 'users-service' failed to build: Get https://registry-1.docker.io/v2/: dial tcp 52.206.156.207:443: getsockopt: connection refused
Here is my Dockerfile
FROM python:3.6.1
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
ADD ./requirements.txt /usr/src/app/requirements.txt
RUN pip install -r requirements.txt
ADD . /usr/src/app
CMD python manage.py runserver -h 0.0.0.0
Here is the docker-compose.yml
version: '2.1'
services:
  users-service:
    container_name: users-service
    build: .
    volumes:
      - '.:/usr/src/app'
    ports:
      - 5001:5000  # expose ports - HOST:CONTAINER
__init__.py
from flask import Flask, jsonify

# instantiate the app
app = Flask(__name__)

# set config
app.config.from_object('project.config.DevelopmentConfig')

@app.route('/ping', methods=['GET'])
def ping_pong():
    return jsonify({'status': 'success',
                    'message': "pong"})
manage.py
from flask_script import Manager
from project import app

# configure your app
manager = Manager(app)

if __name__ == '__main__':
    manager.run()

Related

How do I run uvicorn in a docker container that exposes the port?

I am developing a FastAPI app inside a Docker container on Windows/Ubuntu (code below). When I test the app outside the container by running python -m uvicorn app:app --reload in the terminal and then navigating to 127.0.0.1:8000/home, everything works fine:
{
    Data: "Test"
}
However, when I docker-compose up I can neither run python -m uvicorn app:app --reload in the container (due to the port already being used), nor see anything returned in the browser. I have tried 127.0.0.1:8000/home, host.docker.internal:8000/home and localhost:8000/home and I always receive:
{
    detail: "Not Found"
}
What step am I missing?
Dockerfile:
FROM python:3.8-slim
EXPOSE 8000
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
COPY requirements.txt .
RUN python -m pip install -r requirements.txt
WORKDIR /app
COPY . /app
RUN adduser -u nnnn --disabled-password --gecos "" appuser && chown -R appuser /app
USER appuser
CMD ["gunicorn", "--bind", "0.0.0.0:8000", "-k", "uvicorn.workers.UvicornWorker", "app:app"]
Docker-compose:
version: '3.9'
services:
  fastapitest:
    image: fastapitest
    build:
      context: .
      dockerfile: ./Dockerfile
    ports:
      - 8000:8000
    extra_hosts:
      - "host.docker.internal:host-gateway"
app.py:
from fastapi import FastAPI

app = FastAPI()

@app.get("/home")  # must be one line above the function for the route
def home():
    return {"Data": "Test"}

if __name__ == '__main__':
    import uvicorn
    uvicorn.run(app, host="127.0.0.1", port=8000)
The issue here is that when you specify host="127.0.0.1" to uvicorn, that means you can only access that port from that same machine. Now, when you run outside docker, you are on the same machine, so everything works. But since a docker container is (at least to some degree) a different computer, you need to tell it to allow connections from outside the container as well. To do this, switch to host="0.0.0.0", and then you should be able to access your dockerized API on http://localhost:8000.
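A minimal sketch of that change applied to the question's app.py (only the host argument differs from the code above):
if __name__ == '__main__':
    import uvicorn
    # bind to all interfaces so the API is reachable from outside the container
    uvicorn.run(app, host="0.0.0.0", port=8000)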

localhost using docker compose up not working

I am trying to test a simple server endpoint on my local machine when running docker compose up, but it does not seem that the ports are exposed when running Docker this way. If I just do a docker build and docker run, I can use localhost to get a successful endpoint call, but not when I use my docker compose file.
docker-compose.yml file:
version: '3'
services:
  simple:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: simple
    ports:
      - 3000:80
    environment:
      - SOMEKEY=ABCD
      - ANOTHERKEY=EFG
Dockerfile
FROM python:3.9.5
ARG VERSION
ARG SERVICE_NAME
ENV PYTHONPATH=/app
COPY requirements.txt /app/requirements.txt
RUN pip install -r /app/requirements.txt
COPY app /app/app
COPY main.py /app/
CMD ["python", "./app/main.py"]
And then my main.py file
import uvicorn
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def read_root():
    return {"Hello": "World"}

if __name__ == '__main__':
    uvicorn.run(app, port=3000, host="0.0.0.0")
docker compose up does not seem to expose the port to localhost.
What I use with build and run that does expose:
docker build -t test-test .
docker run -p 3000:3000 test-test
Is there a way to expose the port to localhost with docker compose up?
The syntax for ports is HOST:CONTAINER. The port on the container is 3000, so you've got it backwards.
version: '3'
services:
  simple:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: simple
    ports:
      - 80:3000
    environment:
      - SOMEKEY=ABCD
      - ANOTHERKEY=EFG
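If the goal is to keep calling the endpoint on localhost:3000, as with the docker run -p 3000:3000 command above, the equivalent mapping would be (a sketch; the rest of the file stays the same):
    ports:
      - 3000:3000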

Running Redis rq worker on Docker

I am trying to make a queue of tasks using Redis RQ. I was following a tutorial, but I am using Docker. Below is my code.
app.py
from flask import Flask, request
import redis
from rq import Queue
import time

app = Flask(__name__)
r = redis.Redis()
q = Queue(connection=r)

def background_task(n):
    """ Function that returns len(n) and simulates a delay """
    delay = 2
    print("Task running")
    print(f"Simulating a {delay} second delay")
    time.sleep(delay)
    print(len(n))
    print("Task complete")
    return len(n)

def index():
    if request.args.get("n"):
        job = q.enqueue(background_task, request.args.get("n"))
        return f"Task ({job.id}) added to queue at {job.enqueued_at}"
    return "No value for count provided"

if __name__ == "__main__":
    app.run()
Docker compose file-
version: "3.8"
services:
  web:
    build: .
    ports:
      - "5000:5000"
    volumes:
      - .:/code
    environment:
      FLASK_ENV: development
  redis:
    image: "redis:alpine"
Dockerfile
FROM python:3.7-alpine
WORKDIR /code
ENV FLASK_APP=app.py
ENV FLASK_RUN_HOST=0.0.0.0
RUN apk add --no-cache gcc musl-dev linux-headers
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
EXPOSE 5000
COPY . .
CMD ["flask", "run"]
Whenever I run docker-compose up --build and open http://localhost:5000/ I get "Url not found".
Where am I going wrong?
How is one supposed to use the rq worker command in Docker containers?
redis:
  image: "redis:alpine"
The issue is that the image specified in your Docker Compose YAML should be the image built by your Dockerfile. Because you have a Dockerfile you want to use for this image, you can specify it inline; see the documentation here:
https://docs.docker.com/compose/compose-file/compose-file-v3/
version: "3.9"
services:
  webapp:
    build:
      context: ./dir
      dockerfile: Dockerfile-alternate
      args:
        buildno: 1
As a good practice, instead of calling your service "redis" in the docker compose file, you should provide a custom name to represent your worker script.
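A hedged sketch of what that could look like: a separate worker service built from the same Dockerfile, with its command running the RQ worker against the redis service (the service name "worker" and the connection URL are assumptions, not part of the original question):
version: "3.8"
services:
  web:
    build: .
    ports:
      - "5000:5000"
  worker:
    build: .
    # assumed command: run an RQ worker connected to the redis service below
    command: rq worker --url redis://redis:6379
    depends_on:
      - redis
  redis:
    image: "redis:alpine"
For this to work, the app would also need to connect to that same host, e.g. redis.Redis(host="redis"), since the default of localhost points back at the container itself.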

Flask App runs in terminal but not in Docker Container

So, I've got this flask app that works fine when I run it via the terminal, but for some reason, if I start up a docker container, it instantly exits with 'exit code zero'.
Here is the folder structure:
https://imgur.com/a/BOGCt6S
docker-compose.yml:
version: '3.2'
services:
  flask-app:
    build: ./Flask_App
    volumes:
      - ./Flask_App/static:/Flask_App/static
      - ./Flask_App/db:/Flask_App/db
    ports:
      - '5001:5001'
    # restart: unless-stopped
Dockerfile:
FROM python:3
# Setup env
COPY requirements.txt ./
RUN pip install -r requirements.txt
# Setup App
RUN mkdir -p /Flask_App
COPY . /Flask_App
WORKDIR /Flask_App
EXPOSE 5001
ENTRYPOINT [ 'python3' ]
CMD [ 'app.py' ]
and the app.py file (I know it's mostly just imports, but it works fine when I run it via the terminal on the host, so it is probably working OK):
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_login import LoginManager
from flask_bootstrap import Bootstrap
from forms import *

app = Flask(__name__)
app.config.from_pyfile('config.py')
Bootstrap(app)
db = SQLAlchemy(app)
login_manager = LoginManager()
login_manager.init_app(app)
login_manager.login_view = 'login'

from views import *

if __name__ == '__main__':
    app.run(host=app.config['HOST'], port=app.config['PORT'], debug=app.config['DEBUG'])
and just in case, here is a part of the 'config.py' file:
DEBUG = False
HOST = '0.0.0.0'
PORT = 5001
As David Maze said in a comment, the single quotes (') I used in ENTRYPOINT and CMD should be changed to double quotes ("), since those instructions are supposed to be JSON arrays.
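With that change, the last two instructions of the Dockerfile above become:
ENTRYPOINT [ "python3" ]
CMD [ "app.py" ]
With single quotes the instructions are not valid JSON, so Docker falls back to the shell form; the shell then runs [ 'python3' ] as its test builtin, which succeeds and exits immediately, which would explain the instant exit with code zero.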

Unable to import flask in python

I have a React app which communicates with a Flask API and displays data. I had both of these projects in separate folders and everything worked fine.
Then I wanted to containerize the Flask + React app with docker-compose for practice, so I created a folder in which I have my middleware (Flask) and frontend (React) folders. Then I created a virtual environment and installed Flask. Now, when I import flask inside a Python file, I get an error.
I do not understand why simply moving the folder inside another folder would affect my project. You can see the project structure and error in the picture below.
Dockerfile react app
FROM node:latest
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json /usr/src/app/
RUN npm install
CMD [ "npm", "start" ]
Dockerfile flask api
FROM python:3.7.2
# set working directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# add requirements (to leverage Docker cache)
ADD ./requirements.txt /usr/src/app/requirements.txt
# install requirements
RUN pip install -r requirements.txt
# add app
ADD . /usr/src/app
# run server
CMD python app.py runserver -h 0.0.0.0
docker-compose.yml
version: '3'
services:
  middleware:
    build: ./middleware
    expose:
      - 5000
    ports:
      - 5000:5000
    volumes:
      - ./middleware:/usr/src/app
    environment:
      - FLASK_ENV=development
      - FLASK_APP=app.py
      - FLASK_DEBUG=1
  frontend:
    build: ./frontend
    expose:
      - 3000
    ports:
      - 3000:3000
    volumes:
      - ./frontend/src:/usr/src/app/src
      - ./frontend/public:/usr/src/app/public
    links:
      - "middleware:middleware"
When moving folders around, you should update the Python interpreter path in your VS Code settings (.vscode/settings.json). Otherwise you'll be using the wrong Python interpreter, one without Flask installed.
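A minimal sketch of that setting, assuming the virtual environment was created inside the middleware folder (the exact path is an assumption and depends on where the venv actually lives):
{
    // .vscode/settings.json - point VS Code at the interpreter that has Flask installed
    "python.defaultInterpreterPath": "${workspaceFolder}/middleware/venv/bin/python"
}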
