I am unable to connect my PostgreSQL database to AWS RDS while using Django.
I am not sure what I am doing wrong, but I have the configuration below:
Django
docker-compose
postgres
#DJANGO Settings
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': "dbtest",
        'USER': "mastername",
        'PASSWORD': "somepassword",
        'HOST': 'dbtest.zmrHnee2mhLunL.us-east-2.rds.amazonaws.com',
        'PORT': 5432
    }
}
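Since the compose file below already passes an .env file to the backend service, one option is to read these values from environment variables, so the same settings file can point at either the RDS endpoint or the local db container. This is only a minimal sketch; the DB_* variable names are my own, not part of the original project:
import os  # at the top of settings.py

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        # Hypothetical variables, e.g. defined in the .env file used by docker-compose:
        'NAME': os.environ.get('DB_NAME', 'dbtest'),
        'USER': os.environ.get('DB_USER', 'mastername'),
        'PASSWORD': os.environ.get('DB_PASSWORD', ''),
        # 'db' reaches the local container; set DB_HOST to the RDS endpoint to use RDS instead.
        'HOST': os.environ.get('DB_HOST', 'db'),
        'PORT': int(os.environ.get('DB_PORT', '5432')),
    }
}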
docker-compose
version: "3.7"
services:
  backend:
    build: ./backend
    container_name: backend
    stdin_open: true
    tty: true
    restart: "on-failure"
    env_file:
      .env
    volumes:
      - ./backend:/backend
      - ./backend/static:/backend/static
    ports:
      - 8000:8000
    depends_on:
      - db
    links:
      - db:db
    networks:
      - db_network
  db:
    image: postgres:11.5-alpine
    container_name: db
    restart: always
    environment:
      - POSTGRES_DB=dbtest
      - POSTGRES_USER=mastername
      - POSTGRES_PASSWORD=somepassword
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    ports:
      - 5432:5432
    networks:
      - db_network
networks:
  db_network:
    driver: bridge
volumes:
  postgres_data:
After starting the containers with docker-compose up -d --build, I check docker-compose logs and see the traceback below:
File "/usr/local/lib/python3.8/site-packages/psycopg2/__init__.py", line 127, in connect
backend | conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
backend | django.db.utils.OperationalError: could not translate host name "dbtest.zmrHnee2mhLunL.us-east-2.rds.amazonaws.com" to address: Name or service not known
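A quick way to see whether the backend container can resolve that hostname at all is to run a small lookup from inside it (for example in a Python shell started with docker-compose run backend python); a minimal check:
import socket

host = "dbtest.zmrHnee2mhLunL.us-east-2.rds.amazonaws.com"
try:
    # Succeeds only if the container's DNS can resolve the RDS endpoint.
    print(socket.getaddrinfo(host, 5432))
except socket.gaierror as exc:
    # Same root cause as the Django error above.
    print("DNS lookup failed:", exc)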
And after creating another RDS instance and allowing it to be publicly accessible:
File "/usr/local/lib/python3.8/site-packages/psycopg2/__init__.py", line 127, in connect
backend | conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
backend | django.db.utils.OperationalError: FATAL: database "dbtest" does not exist
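This second error suggests the new instance is now reachable but the dbtest database has not been created on it; the POSTGRES_DB variable in the compose file only creates dbtest inside the local container, not on RDS. Below is a minimal psycopg2 sketch that connects to the default postgres maintenance database on the instance and creates the missing one; the endpoint is a placeholder and the credentials are the ones from the question:
import psycopg2

# Connect to the default maintenance database that RDS provides.
conn = psycopg2.connect(
    dbname="postgres",
    user="mastername",
    password="somepassword",
    host="<your-rds-endpoint>.us-east-2.rds.amazonaws.com",
    port=5432,
)
conn.autocommit = True  # CREATE DATABASE cannot run inside a transaction
with conn.cursor() as cur:
    cur.execute("SELECT datname FROM pg_database;")  # list what already exists
    print(cur.fetchall())
    cur.execute("CREATE DATABASE dbtest;")           # then create the missing database
conn.close()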
security group
I have been trying to debug this for hours; any help would be really appreciated.
Related
I am trying to run a Python backend with a MySQL database in Docker containers locally, and I stumble upon this error when I run the docker-compose up command:
backend_1 | django.db.utils.OperationalError: (2002, "Can't connect
to MySQL server on 'db' (115)")
My Dockerfile:
FROM python:3.9
ENV PYTHONUNBUFFERED 1
WORKDIR /app
COPY requirements.txt /app/requirements.txt
RUN pip install -r requirements.txt
COPY . /app
CMD python manage.py runserver 0.0.0.0:8000
docker-compose.yml
version: '3.8'
services:
  backend:
    restart: always
    build:
      context: .
      dockerfile: Dockerfile
    command: 'python manage.py runserver 0.0.0.0:8000'
    ports:
      - 8000:8000
    volumes:
      - .:/app
    depends_on:
      - db
    links:
      - db
  db:
    image: mysql:5.7.22
    restart: always
    environment:
      MYSQL_DATABASE: mysql
      MYSQL_USER: admin
      MYSQL_PASSWORD: password
      MYSQL_ROOT_PASSWORD: password
    volumes:
      - .dbdata:/var/lib/mysql
    ports:
      - 33066:3306
In settings.py
# Database
# https://docs.djangoproject.com/en/3.1/ref/settings/#databases
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'mysql',
        'USER': 'admin',
        'PASSWORD': 'password',
        'HOST': 'db',
        'PORT': '3306',
    }
}
In docker-compose.yml I had to add
hostname: 'db'
like this:
version: '3.8'
services:
  backend:
    restart: always
    build:
      context: .
      dockerfile: Dockerfile
    command: 'python manage.py runserver 0.0.0.0:8000'
    ports:
      - 8000:8000
    volumes:
      - .:/app
    depends_on:
      - db
    links:
      - db
  db:
    image: mysql:5.7.22
    hostname: 'db'
    restart: always
    environment:
      MYSQL_DATABASE: mysql
      MYSQL_USER: admin
      MYSQL_PASSWORD: password
      MYSQL_ROOT_PASSWORD: password
    volumes:
      - .dbdata:/var/lib/mysql
    ports:
      - 33066:3306
and it worked well.
UPDATE:
Not really. When I run docker-compose a second time, I get the same error.
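If the error comes back because MySQL is still initializing when Django starts (depends_on only waits for the db container to start, not for the server to accept connections), a common workaround is to wait for the port before launching the app. Below is a minimal sketch of such a wait script; the file name wait_for_db.py and the command shown in the comment are my own additions, not part of the original setup.
# wait_for_db.py - run it before manage.py, for example:
#   command: sh -c "python wait_for_db.py && python manage.py runserver 0.0.0.0:8000"
import socket
import time

HOST, PORT = "db", 3306  # the service name and port used inside the compose network

while True:
    try:
        with socket.create_connection((HOST, PORT), timeout=2):
            print("database is accepting connections")
            break
    except OSError:
        print("waiting for database...")
        time.sleep(1)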
I am starting a Flask project and I want to run unit tests in PyCharm using the Docker remote interpreter, but I am not able to connect to the MySQL database container when running the tests. The application runs normally, so the database is reachable outside the container. In the past I managed to do this in PhpStorm, but the configuration in PyCharm is not the same and I am having some trouble setting everything up. I have already managed to use the remote interpreter to run tests; the only trouble is when I need to connect to the database.
I am getting the following error when trying to connect:
mysql.connector.errors.InterfaceError: 2003: Can't connect to MySQL server on 'localhost:3306' (111 Connection refused)
So the server is reachable, but for whatever reason it is not allowing me to connect.
Here is the docker-compose.yml
version: "2"
networks:
  learning_flask:
    name: learning_flask
    driver: bridge
services:
  mysql:
    image: mysql:5.7
    container_name: mysql
    restart: unless-stopped
    tty: true
    ports:
      - "127.0.0.1:3306:3306"
    environment:
      MYSQL_ROOT_PASSWORD: root
    volumes:
      - ./db:/docker-entrypoint-initdb.d/:ro
    networks:
      - learning_flask
  app:
    build: ./app
    container_name: learning_flask_app
    ports:
      - "5000:5000"
    volumes:
      [ './app:/app' ]
    depends_on:
      - mysql
    networks:
      - learning_flask
and then the code I am trying to execute:
import unittest

import mysql.connector


class TestCase(unittest.TestCase):
    def test_something(self):
        config = {
            'user': 'root',
            'password': 'root',
            'host': 'localhost',
            'port': '3306'
        }
        connection = mysql.connector.connect(**config)


if __name__ == '__main__':
    unittest.main()
If I try to change the host on the connection config to mysql, I get the following error:
mysql.connector.errors.InterfaceError: 2003: Can't connect to MySQL server on 'mysql:3306' (-2 Name or service not known)
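For what it's worth, the two errors are consistent with the networking: localhost inside the test container is not the MySQL container, and the name mysql only resolves if the interpreter's container is attached to the learning_flask network. One hedged workaround is to read the host from an environment variable so the same test works both on the host machine (localhost, since the port is published to 127.0.0.1:3306) and inside the compose network (mysql); the DB_HOST variable name below is illustrative, not something PyCharm sets for you.
import os
import unittest

import mysql.connector


class TestCase(unittest.TestCase):
    def test_something(self):
        config = {
            'user': 'root',
            'password': 'root',
            # 'localhost' when running on the host machine,
            # 'mysql' when the interpreter's container is on the learning_flask network.
            'host': os.environ.get('DB_HOST', 'localhost'),
            'port': 3306,
        }
        connection = mysql.connector.connect(**config)
        connection.close()


if __name__ == '__main__':
    unittest.main()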
I am developing a Django-React app and using a MongoDB cluster to store data. When I run the app without Docker, I am able to make requests to the database without issue. However, when I run the Docker containers (one for my backend and one for my frontend), I run into this error on the backend:
File "/usr/local/lib/python3.9/site-packages/pymongo/topology.py", line 215, in _select_servers_loop
raise ServerSelectionTimeoutError(
pymongo.errors.ServerSelectionTimeoutError: localhost:27017: [Errno 111] Connection refused, Timeout: 30s, Topology Description: <TopologyDescription id: 5f9ece0f7962ee81cb819b63, topology_type: Single, servers: [<ServerDescription ('localhost', 27017) server_type: Unknown, rtt: None, error=AutoReconnect('localhost:27017: [Errno 111] Connection refused')>]>
I have the MongoDB host in both mongo_client.py and settings.py. In settings.py I have:
DATABASES = {
    'default': {
        'ENGINE': 'djongo',
        'NAME': '<mydb>',
        'HOST': 'mongodb+srv://mike:<mypassword>#cluster0.5u0xf.mongodb.net/<mydb>?retryWrites=true&w=majority',
        'USER': 'mike',
        'PASSWORD': '<mypassword>',
    }
}
My docker-compose.yml looks like:
version: "3.2"
services:
  portalbackend:
    restart: always
    container_name: code
    command: bash -c "python manage.py makemigrations &&
                      python manage.py migrate &&
                      python manage.py runserver 0.0.0.0:8000"
    build:
      context: ./PortalBackend/
      dockerfile: Dockerfile
    ports:
      - "8000:8000"
    networks:
      - db-net
  portal:
    restart: always
    command: npm start
    container_name: front
    build:
      context: ./portal/
      dockerfile: Dockerfile
    ports:
      - "3000:3000"
    stdin_open: true
    depends_on:
      - portalbackend
    networks:
      - db-net
networks:
  db-net:
    driver: bridge
Do I need to create a container for MongoDB? I originally tried that with a local MongoDB instance but ran into the same issue, so I switched to a cluster. Still running into the same problem.
No you don't need to add a mongo container, as your database is in Atlas.
Please see my answer posted yesterday for a similar problem: Django + Mongo + Docker getting pymongo.errors.ServerSelectionTimeoutError
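Since the database lives in Atlas, the usual culprit for a localhost:27017 timeout is a connection hard-coded to localhost somewhere (for example in mongo_client.py). Below is a minimal pymongo sketch that keeps the Atlas URI in one place; the MONGO_URI variable name is my own, and the SRV URI is the poster's placeholder:
import os

from pymongo import MongoClient

# Hypothetical variable; could be set in the container environment or an .env file.
MONGO_URI = os.environ.get(
    "MONGO_URI",
    "mongodb+srv://mike:<mypassword>@cluster0.5u0xf.mongodb.net/<mydb>?retryWrites=true&w=majority",
)

client = MongoClient(MONGO_URI)        # connects to Atlas, not localhost
db = client["<mydb>"]
print(db.list_collection_names())      # simple connectivity check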
I am trying to use PostgreSQL with Python. I have used the following docker-compose file:
version: '3.1'
services:
  db:
    image: postgres
    restart: always
    environment:
      POSTGRES_PASSWORD: admin_123
      POSTGRES_USER: admin
  adminer:
    image: adminer
    restart: always
    ports:
      - 8080:8080
With the following code, I am trying to connect to the database:
conn = psycopg2.connect(
    database="db_test",
    user="admin",
    password="admin_123",
    host="db"
)
But I am getting this error:
OperationalError: could not translate host name "db" to address:
nodename nor servname provided, or not known
What am I doing wrong?
You need to publish the DB port in the docker-compose file like this:
db:
  image: postgres
  restart: always
  environment:
    POSTGRES_PASSWORD: admin_123
    POSTGRES_USER: admin
  ports:
    - "5432:5432"
And then connect using localhost:5432.
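With the port published, a connection from the host machine would look roughly like this. Note that the compose file above does not set POSTGRES_DB, so the default database created by the image is named after the user (admin) unless db_test is created separately:
import psycopg2

conn = psycopg2.connect(
    database="admin",   # default database created by the image when only POSTGRES_USER is set
    user="admin",
    password="admin_123",
    host="localhost",   # the published port on the host
    port=5432,
)
print(conn.get_dsn_parameters())
conn.close()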
Another possible scenario: check whether the port is already in use by another Docker container.
Use this command:
$ docker container ls --format "table {{.ID}}\t{{.Names}}\t{{.Ports}}" -a
Here is my docker-compose.yml
$ cat docker-compose.yml
version: '3.1'  # specify docker-compose version
services:
  dockerpgdb:
    image: postgres
    ports:
      - "5432:5432"
    restart: always
    environment:
      POSTGRES_PASSWORD: Password
      POSTGRES_DB: dockerpgdb
      POSTGRES_USER: abcUser
    volumes:
      - ./data:/var/lib/postgresql
Now in pgAdmin 4 you can set up a new server as below to test the connection:
host: localhost
port: 5432
maintenance database: postgres
username: abcUser
password: Password
I'm new to Docker, and I'm trying to put my Django REST API in containers with Nginx, Gunicorn and Postgres, using docker-compose and docker-machine. I'm following this tutorial: https://realpython.com/blog/python/django-development-with-docker-compose-and-machine/
Most of my code is the same as the tutorial's (https://github.com/realpython/dockerizing-django). with some minor name changes.
This is my docker-compose.yml (I changed the gunicorn command to runserver for debugging purposes):
web:
  restart: always
  build: ./web
  expose:
    - "8000"
  links:
    - postgres:postgres
    - redis:redis
  volumes:
    - /usr/src/app
    - /usr/src/app/static
  env_file: .env
  environment:
    DEBUG: 'true'
  command: /usr/local/bin/python manage.py runserver
nginx:
  restart: always
  build: ./nginx/
  ports:
    - "80:80"
  volumes:
    - /www/static
  volumes_from:
    - web
  links:
    - web:web
postgres:
  restart: always
  image: postgres:latest
  ports:
    - "5432:5432"
  volumes:
    - pgdata:/var/lib/postgresql/data/
redis:
  restart: always
  image: redis:latest
  ports:
    - "6379:6379"
  volumes:
    - redisdata:/data
And this is in my settings.py of Django:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'postgres',
        'PORT': '5432',
    }
}
Nginx and Postgres (and Redis) are up and running; however, my Django server won't start, with this error:
web_1 | django.db.utils.OperationalError: could not connect to server: Connection refused
web_1 | Is the server running on host "localhost" (::1) and accepting
web_1 | TCP/IP connections on port 5432?
web_1 | could not connect to server: Connection refused
web_1 | Is the server running on host "localhost" (127.0.0.1) and accepting
web_1 | TCP/IP connections on port 5432?
I've googled a lot and verified that Postgres is running on port 5432; I can connect to it using the psql command.
I am lost. What is my mistake?
EDIT: It appears that it is not using my settings.py file or something, since it's asking whether the server is running on localhost, while the settings should point it at postgres.
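One way to confirm which settings Django is actually loading is to inspect them from a Django shell inside the web container (for example docker-compose run web python manage.py shell) and check the database host; a minimal check:
import os

from django.conf import settings

print(os.environ.get("DJANGO_SETTINGS_MODULE"))    # which settings module is active
print(settings.DATABASES["default"].get("HOST"))   # should be 'postgres', not 'localhost' or empty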
For those who have this problem, please check your settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'dbname',
        'USER': 'user',
        'PASSWORD': 'password',
        'HOST': 'db'
    }
}
Your HOST value ('db') and the service name in your docker-compose file should be the same. If you want to rename it from db, make sure you change it in both the docker-compose file and settings.py:
db:
  restart: always
  image: postgres:latest
  ports:
    - "5432:5432"
  volumes:
    - pgdata:/var/lib/postgresql/data/
Check your manage.py; there should be a line
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
If there is no such line, add it, and set your DJANGO_SETTINGS_MODULE with respect to PYTHONPATH.
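For reference, a standard manage.py entry point looks roughly like this (myproject is a placeholder for the actual project package):
import os
import sys

if __name__ == "__main__":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
    from django.core.management import execute_from_command_line
    execute_from_command_line(sys.argv)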
UPDATE: I cloned your repo and launched the web service by changing the command in docker-compose.yml:
- command: /usr/local/bin/gunicorn docker_django.wsgi:application -w 2 -b :8000
+ command: python manage.py runserver 0.0.0.0:8000
I'm sure DJANGO_SETTINGS_MODULE is correct.
I'm facing exactly the same issue while running my Django app with Docker on an AWS EC2 instance.
I noticed that this error only happened the first time the Docker image was built, so to fix it I just pressed CTRL + C, ran docker-compose up again, and everything worked fine.