Cannot connect to Postgres from Python [duplicate]

This question already has answers here:
Connecting to Postgresql in a docker container from outside
(17 answers)
Closed 1 year ago.
I created a docker image with Postgres via Dockerfile:
FROM postgres:9.6-alpine
After I started this docker container, I'm checking that it's up and running by connecting from a different docker container that has psql pre-installed:
docker run -it --rm --link ml-postgres:postgres postgres:12.2-alpine
psql --dbname mlpython -h postgres -U postgres
The result is that I'm able to connect to the first container with postgres and perform all regular operations with the postgres DB.
Troubles begin when I want to connect to the container with postgres DB from a Python script that I created locally:
import psycopg2

conn = psycopg2.connect(
    host="127.0.0.1",
    database="mlpython",
    user="postgres",
    password="test",
    port="5431"
)
cur = conn.cursor()
cur.execute('SELECT COUNT(*) FROM mytable LIMIT 10')
cur.close()
conn.close()
Here is the error I get:
> psycopg2.OperationalError: server closed the connection unexpectedly
> This probably means the server terminated abnormally before or while
> processing the request.
What am I missing in this simple example where Python interacts with Postgres?

Please read through the README.md of the docker image you use; it should answer your questions.
I'm not sure you did, because:
I see you started psql, the client. Why, if you're going to connect from Python? And have you started the server?
I can't see that you published any container port to the host machine.
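If the container was indeed started without a published port, a run command along these lines would make it reachable from the host. This is a sketch: the image tag, container name, database name, and the 5431 host port are taken from the question, while the POSTGRES_PASSWORD value is an assumption to match the Python script.

```shell
# Publish container port 5432 on host port 5431,
# so psycopg2.connect(host="127.0.0.1", port="5431") can reach it
docker run -d --name ml-postgres \
  -e POSTGRES_PASSWORD=test \
  -e POSTGRES_DB=mlpython \
  -p 5431:5432 \
  postgres:9.6-alpine
```

With the port published, the script's `host="127.0.0.1", port="5431"` settings line up with the mapping.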

Related

Python psycopg2, no postgresql service running on machine... Where is the postgres driver? Where is pg_hba.conf? (FATAL: no pg_hba.conf entry)

I am using, in Linux, Python 3 and psycopg2 to connect to some postgres databases:
import psycopg2 as pg

connection = None
try:
    connection = pg.connect(
        user="username",
        password="...",
        host="host_ip",
        port="5432",
        database="db_name",
    )
    cursor = connection.cursor()
    # Print PostgreSQL connection properties
    print(connection.get_dsn_parameters(), "\n")
    # Print PostgreSQL version
    cursor.execute("SELECT version();")
    record = cursor.fetchone()
    print("You are connected to - ", record, "\n")
    cursor.close()
    connection.close()
    print("PostgreSQL connection is closed")
except (Exception, pg.Error) as error:
    print("Error while connecting to PostgreSQL", error)
For one of the DBs this works, but for the other I am getting:
Error while connecting to PostgreSQL FATAL: no pg_hba.conf entry for host "XXX.XXX.XXX.XXX", user "YYYY", database "ZZZZ", SSL off
I have checked the web and Stack Overflow, and there are a lot of similar questions,
e.g. Psycopg2 reporting pg_hba.conf error
However, I am not root on the machine where I used pip/anaconda, and there seems to be no
postgres service or anything similar running:
$ sudo systemctl status postgres*
$ sudo systemctl status postgres
Unit postgres.service could not be found.
$ sudo systemctl status postgresql
Unit postgresql.service could not be found.
$ sudo systemctl status post*
So none of the answers seem to be relevant, because those questions assume a running
postgres service, or the existence of pg_hba.conf, neither of which exists on my system. Though note that a sample is included in my envs/py3/share/ (where py3 is the name of my environment):
$ locate pg_hba.conf
/home/nick/anaconda3/envs/py3/share/pg_hba.conf.sample
My question here aims, apart from solving my immediate problem, to understand what psycopg2 is and how it ends up involving pg_hba.conf, a file seemingly used by a postgresql service that does not appear to exist on my system:
Is psycopg2 a driver, or does it use one? Why does it seem to include pg_hba.conf.sample, and what is one supposed to do with it? Where should pg_hba.conf be placed so that psycopg2 reads it?
Notes / Info based on comments:
The DB is not locally hosted. It is running on a different server.
I am able to access that DB using DBeaver and my local Ubuntu Python, but not from a container (with the same psycopg2 version), so I speculate it is not a DB server issue.
It seems pg_hba.conf is a file that should exist only on the server? (If so, that is actually part of the answer I am looking for...)
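For what it may be worth here: pg_hba.conf does live on the server, and the error above ends in "SSL off", which typically means the matching server rule is a `hostssl` entry that only accepts SSL connections. The client cannot edit that file, but it can request SSL via the `sslmode` parameter. A minimal sketch, where `connection_params` is a hypothetical helper and the credential names are placeholders from the question:

```python
def connection_params(host, user, password, database,
                      port=5432, sslmode="require"):
    """Build keyword arguments for psycopg2.connect().

    sslmode="require" asks libpq for an SSL-encrypted connection; if the
    server's pg_hba.conf only has a `hostssl` rule for this host/user/db,
    a plain connection is rejected with the FATAL "... SSL off" error.
    """
    return {
        "host": host,
        "user": user,
        "password": password,
        "database": database,
        "port": port,
        "sslmode": sslmode,
    }
```

Used as `pg.connect(**connection_params("host_ip", "username", "...", "db_name"))`, this is the same call as before plus the SSL request.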

Why am I getting "psql: server closed the connection unexpectedly" from a Python script?

I have a python script that first creates a postgres docker container using the docker-py library:
client = docker.from_env()
CONTAINER = 'test_cont'
container = client.containers.run("postgres", ports={'5432/tcp': 27432},
environment=["POSTGRES_PASSWORD=password",
"POSTGRES_DB=my_db"],
name=CONTAINER,
detach=True)
The docker container starts correctly. I'd like to run some commands from the python script, so I tried:
subprocess.call('psql -h localhost -p 27432 -U postgres -c "CREATE USER xxx WITH PASSWORD \'password\';"', shell=True, env={'PGPASSWORD': 'password'})
However, this command always returns the following error:
psql: server closed the connection unexpectedly
This probably means the server terminated abnormally
before or while processing the request.
If I run the psql command from the command line and not from the python code it works without problems.
Solved the problem: when a postgres docker container runs in detached mode, docker does not wait for the postgres server to finish its boot sequence, so the psql command ran before the server was ready to accept connections. To solve it, I added a sleep after the docker run command in the Python script to give the database time to boot.
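A fixed sleep works but is fragile: too short and it races, too long and it wastes time. A more robust sketch is to retry until a connection attempt actually succeeds. `wait_until_ready` is a hypothetical helper name; the callable passed in would in practice be something like `lambda: psycopg2.connect(...).close()`:

```python
import time

def wait_until_ready(connect, attempts=30, delay=1.0):
    """Retry `connect` until it succeeds or attempts run out.

    `connect` is any zero-argument callable that raises on failure,
    e.g. lambda: psycopg2.connect(host="localhost", port=27432,
                                  user="postgres",
                                  password="password").close()
    """
    last_error = None
    for _ in range(attempts):
        try:
            connect()        # succeeds only once the server accepts connections
            return True
        except Exception as exc:   # psycopg2.OperationalError in practice
            last_error = exc
            time.sleep(delay)
    raise TimeoutError(f"server never became ready: {last_error}")
```

Calling this right after `client.containers.run(...)` and before the `subprocess.call` to psql removes the race without a hard-coded delay.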

Running Python and MySQL with docker error - cannot connect

I am using mysql-connector; whenever I run the container using docker I get this error:
mysql.connector.errors.InterfaceError: 2003: Can't connect to MySQL server on 'db:3306' (-5 No address associated with hostname)
but when I run the project using Python only, it executes with no errors.
I want to use phpMyAdmin only for the database, please help.
To pull the image on your Linux machine:
docker pull mysql:latest
To run it and mount a persistent folder with port access on 3306 (std port for mysql):
docker run --name=mysql_dockerdb --env="MYSQL_ROOT_PASSWORD=<your_password>" -p 3306:3306 -v /home/ubuntu/sql_db/<your_dbasename>:/var/lib/mysql -d mysql:latest
To connect to the docker instance so that you can create the database within the docker:
docker exec -it mysql_dockerdb mysql -uroot -p<your_password>
My SQL code to establish the database:
CREATE DATABASE dockerdb;
CREATE USER 'newuser'@'%' IDENTIFIED BY 'newpassword';
GRANT ALL PRIVILEGES ON dockerdb.* TO 'newuser'@'%';
ALTER USER 'username'@'%' IDENTIFIED WITH mysql_native_password BY 'userpassword';
You will now have a docker running with a persistent SQL database. You connect to it from your Python code. I am running Flask mySql. You will want to keep your passwords in environment variables. I am using a Mac so therefore my ~/.bash_profile contains:
export RDS_LOGIN="mysql+pymysql://<username>:<userpassword>@<dockerhost_ip>/dockerdb"
Within Python:
import os
SQLALCHEMY_DATABASE_URI = os.environ.get('RDS_LOGIN')
And at that point you should be able to connect in your usual Python manner. Note that I've glossed over any security aspects on the presumption this is local behind a firewall.

sqlalchemy.exc.OperationalError : Can't connect to mysql in docker

I am working with sqlalchemy and mysql; the process works fine for mysql installed locally, but I am not able to connect to a mysql docker image. I am using pymysql as the driver. Below are the commands I run and the error I get.
Following are the portions of /docker-compose.yml and the python file. Also I have a script that creates a database named "sqlalchemy" in docker mysql which is not shown below.
/docker-compose.yml
services:
  db:
    build: ./database
    restart: always
    ports:
      - "3309:3306"
    environment:
      - MYSQL_USER=root
      - MYSQL_ROOT_PASSWORD=password
      - MYSQL_HOST=db
/sqlalchemy.py
msqldb_uri = 'mysql+pymysql://root:password@db:3309/sqlalchemy'
engine = create_engine(msqldb_uri)
sqlalchemy.exc.OperationalError: (pymysql.err.OperationalError) (2003, "Can't connect to MySQL server on 'db' ([Errno 111] Connection refused)")
If the script is running inside the container itself, then do not use the published port in the connection; use 3306 with localhost. The published port is for the outside world.
msqldb_uri = 'mysql+pymysql://root:password@localhost:3306/sqlalchemy'
engine = create_engine(msqldb_uri)
If the script is running in another container defined in the same docker-compose file, then you also do not need the published port; use the service name as the host.
msqldb_uri = 'mysql+pymysql://root:password@db:3306/sqlalchemy'
engine = create_engine(msqldb_uri)
If the script is running on the host and the DB is running inside a container, then you need to connect via the published port, and you should not use the container name as the host.
msqldb_uri = 'mysql+pymysql://root:password@localhost:3309/sqlalchemy'
engine = create_engine(msqldb_uri)
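The three cases above differ only in the host and port; a small helper (hypothetical, with the defaults taken from the question's compose file) makes the rule explicit:

```python
def mysql_uri(host, port, user="root", password="password", db="sqlalchemy"):
    """Build the SQLAlchemy URI for the three cases above:

    script inside the db container itself -> ("localhost", 3306)
    script in another compose service     -> ("db", 3306)
    script on the host machine            -> ("localhost", 3309)  # published port
    """
    return f"mysql+pymysql://{user}:{password}@{host}:{port}/{db}"
```

For example, `create_engine(mysql_uri("localhost", 3309))` would be the host-machine case.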

Connecting from psycopg2 on local machine to PostgreSQL db on Docker

I have used the following commands to create a Docker image with Postgres running on it:
docker pull postgres
docker run --name test-db -e POSTGRES_PASSWORD=my_secret_password -d postgres
I then created a table called test and inserted some random data into a couple of rows.
I am now trying to make a connection to this database table through psycopg2 in Python on my local machine.
I used the command docker-machine ip default to find out the IP address of the machine as 192.168.99.100 and am using the following to try and connect:
conn = psycopg2.connect("dbname='test-db' user='postgres' host='192.168.99.100' password='my_secret_password' port='5432'")
This is not working and fails with the error message "OperationalError: could not connect to server: Connection refused (0x0000274D/10061)".
Everything seems to be in order, so I can't think why this would be refused.
According to the documentation for this postgres image, (at https://hub.docker.com/_/postgres/) this image includes EXPOSE 5432 (the postgres port) and the default username is postgres.
I also tried to get the IP address of the container itself with docker inspect test-db | grep IPAddress | awk '{print $2}' | tr -d '",' which I found on SO in a slightly related answer, but that IP address didn't work either.
The EXPOSE instruction may not be doing what you expect. It is used for links and inter-container communication inside the Docker network. When connecting to a container from outside the Docker bridge network you need to publish the port with -p. Try adding -p 5432:5432 to your docker run command so that it looks like:
docker run --name test-db -e POSTGRES_PASSWORD=my_secret_password -d -p 5432:5432 postgres
Here is a decent explanation of the differences between publish and exposed ports: https://stackoverflow.com/a/22150099/684908. Hope this helps!
