Django cannot create superuser on PostgreSQL. "server closed the connection unexpectedly"

I'm having an issue with a Django site that cannot create a superuser with a PostgreSQL database.
From the development machine
python manage.py createsuperuser --username user1 --email email@email.com
returns:
django.db.utils.OperationalError: server closed the connection
unexpectedly
This probably means the server terminated abnormally
before or while processing the request.
Now, the PostgreSQL server is up. I can run python manage.py migrate and create tables over there. I can also tell the server is up by entering incorrect credentials and receiving a bad-credentials error. I've got DBeaver running on the same machine the Django site is being developed on, and it can connect and see the database as the same user the Django site's settings.py uses.
From the PostgreSQL Server machine
Running cat /var/lib/pgsql/data/pg_log/postgresql-Mon.log yields the following after an attempted superuser creation:
LOG: could not receive data from client: Connection reset by peer
LOG: unexpected EOF on client connection with an open transaction
Because I can connect to the server and have success with other queries, I'm not sure where this issue lies.
Where should I look next to troubleshoot this issue?
EDIT: UPDATE
I changed postgresql.conf to set log_statement = 'all'. Now, upon running the createsuperuser command, I see the following in the PostgreSQL log:
LOG: statement:
SELECT c.relname, c.relkind
FROM pg_catalog.pg_class c
LEFT JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
WHERE c.relkind IN ('r', 'v')
AND n.nspname NOT IN ('pg_catalog', 'pg_toast')
AND pg_catalog.pg_table_is_visible(c.oid)
LOG: statement: SELECT "django_migrations"."app", "django_migrations"."name" FROM "django_migrations"
LOG: statement: SELECT "auth_user"."id", "auth_user"."password", "auth_user"."last_login", "auth_user"."is_superuser", "auth_user"."username", "auth_user"."first_name", "auth_user"."last_name", "auth_user"."email", "auth_user"."is_staff", "auth_user"."is_active", "auth_user"."date_joined" FROM "auth_user" WHERE "auth_user"."username" = 'user1'
LOG: statement: BEGIN
LOG: could not receive data from client: Connection reset by peer
LOG: unexpected EOF on client connection with an open transaction

This turned out to be a networking issue (I'm not too surprised).
So, the issue goes a bit deeper than PostgreSQL and/or Django configuration. On that note, I've thought about deleting this question, since the solution is so far from the question asked. However, if someone else ever ends up in this same situation, this write-up could come in handy.
The solution(s):
Turning off port forwarding on the routing device for the network the PostgreSQL server is on,
OR having the database connection string point to the external IP address of the network instead of the LAN IP address.
Explanation:
The issue ended up having everything to do with the router that runs the LAN both the PostgreSQL server and the Django development machine are on. The router had port forwarding turned on for the PostgreSQL server's port. Because of this, the router interrupted internal LAN connections to the PostgreSQL server partially, but not completely. I verified this claim by changing the Django settings.py file so that HOST pointed to the external address of the network. Once I did that, I was able to create the superuser and see clean statements coming through the PostgreSQL log.
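For illustration, the change amounted to pointing HOST in the DATABASES setting at the external address; the names and addresses below are placeholders, not the real values from my project:

# settings.py - DATABASES entry (placeholder names and addresses)
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'mydb',          # hypothetical database name
        'USER': 'django_user',   # hypothetical user
        'PASSWORD': 'secret',
        'HOST': '203.0.113.7',   # external IP workaround; later switched back to the LAN address
        'PORT': '5432',
    }
}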
I doubled down on verifying this by disabling port forwarding, changing HOST back to the internal LAN address, and updating the password of the user I had just created. Everything worked great: the PostgreSQL log looked clean, and I could see the UPDATE statement and its COMMIT. Ultimately, I disabled port forwarding and stuck with the internal address for HOST in settings.py.
So this turned out to be a networking issue rather than a code issue, but the information could be helpful to someone in the future, so for now I'll leave this question up.

Related

Issue in connecting Python with MySQL on Google Cloud Platform

I have used the following code in Python:
import mysql.connector as mysql

HOST = "34.87.95.90"
DATABASE = "CAO_db"
USER = "root"
PASSWORD = "*********"

# Open the connection and get a cursor
db_connection = mysql.connect(user=USER, password=PASSWORD, host=HOST, database=DATABASE)
cur = db_connection.cursor()
When I run the above code, I get the following error messages:
TimeoutError: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond
InterfaceError: 2003: Can't connect to MySQL server on '34.87.95.90:3306' (10060 A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond)
I am not sure how to fix my code or resolve these errors. Please ask if you would like more details of the error messages. I would greatly appreciate any help towards resolving this.
One thing I'm not seeing here is whether or not you have configured your Cloud SQL instance to accept connections.
You can configure it to accept connections from within the GCP stratosphere using their "Private IP" internal networking magic, AND you can configure it to accept connections from other machines using a combination of Public IP and either an authorized external network (like if you were accessing your GCP Cloud SQL instance from, say, an Amazon EC2 instance), or their Cloud SQL Proxy tool (which is what I use to connect to my Cloud SQL instance from my laptop).
In the GCP Console, go to your project
From the hamburger menu, select SQL
Click on your Cloud SQL instance
In the left nav, click on Connections
If you have Private IP checked and you're running this code on a GCP Compute/GKE resource, confirm that the "Network" field is set to the network used by that resource.
If you're just trying to get a connection from your local machine and you don't have a static IP to whitelist, your best option is to use Public IP in combination with Cloud SQL Proxy.
Cloud SQL Proxy essentially creates a TCP tunnel that allows your laptop to connect to 'localhost' on a port you specify, and it then redirects your connection to the remote Cloud SQL instance.
Once you've established that your networking situation isn't the problem, you could use the same Python connection code that you wrote above, but change HOST to 127.0.0.1 and add an attribute for PORT=3308.
EDITED to add: I suggest using PORT=3308 for your cloud_sql_proxy connection so that it doesn't interfere with any existing port 3306 (MySQL default) connections that you may already be actually running on your local machine. If this isn't the case, you can either omit the PORT attribute or keep it explicit, but change it to 3306.
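Assuming the proxy is running, a minimal sketch of that adjusted connection code might look like this (the instance connection name is a placeholder):

import mysql.connector as mysql

# Start the proxy in another terminal first (v1 syntax, placeholder instance name):
#   ./cloud_sql_proxy -instances=my-project:my-region:my-instance=tcp:3308
db_connection = mysql.connect(
    user="root",
    password="*********",
    host="127.0.0.1",  # the proxy listens locally
    port=3308,         # matches tcp:3308 above; avoids clashing with a local MySQL on 3306
    database="CAO_db",
)
cur = db_connection.cursor()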

DB2 : Python ibm_db2 connecting, which port to use?

I am attempting to connect to a remote Db2 instance. I seem to be having port or protocol issues. Below is a sample connection setting. What is the default connection port using TCPIP and Python? I am reaching the server but unable to create a connection to the database. The database exists.
connection = ibm_db.connect("DATABASE=DATABASE_NAME;HOSTNAME=host;PORT=50000;PROTOCOL=TCPIP;UID=username;PWD=password;", "", "")
I'm receiving the following error:
Exception: [IBM][CLI Driver] SQL30061N The database alias or database name "DATABASE_NAME " was not found at the remote node. SQLSTATE=08004 SQLCODE=-30061
The error message seems clear, but the cause might vary: most likely either the database name or the port number is incorrect.
You get that message when a Db2-server responds that Db2 cannot find the specified database on HOSTNAME in the Db2-instance listening on the specified port number.
A Db2-LUW host might have more than one Db2-instance running concurrently (each listening on a different port number), depending on the hardware resources available.
A Db2-Linux/Unix/Windows instance can have many physical databases inside it, each with a distinct name and one or more aliases.
Ask your DBA or a colleague for the correct database name and port number per hostname.
Alternatively, ssh (or remote-desktop) to that hostname and find the owner (userid) of the process listening on port 50,000 (or whatever port you are using). Become that userid (on Linux/Unix: use su or sudo) and run the db2 list db directory command to show the local databases in that Db2-instance. For Db2-servers on Windows: Start > db2cwadmin.bat > db2 list db directory. On Linux/Unix, use ps -ef | grep db2sysc to see how many Db2-instances are running, and you can use that information (along with netstat) to discover the port on which each is listening.
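Once you have the confirmed values, a minimal connection sketch with error reporting might look like this (the hostname, port, and database name are placeholders to be replaced with what your DBA gives you):

import ibm_db

# Placeholder values - substitute the database name and port your DBA confirms
conn_str = (
    "DATABASE=MYDB;"
    "HOSTNAME=host.example.com;"
    "PORT=50000;"
    "PROTOCOL=TCPIP;"
    "UID=username;"
    "PWD=password;"
)
try:
    connection = ibm_db.connect(conn_str, "", "")
except Exception:
    # conn_errormsg() returns the driver's message for the failed connect attempt
    print(ibm_db.conn_errormsg())
    raise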

Remote access to mySQL using Python

I have trouble connecting to my MySQL database remotely through Python.
I use the following to connect to MySQL:
import mysql.connector
cnx = mysql.connector.connect(host='XXX.XXX.XXX.X',user='XXXX',password='XXXXXX',database='testdb')
But I get the following error:
2003: Can't connect to MySQL server on '%HOST%:3306' (10060 A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond)
The server is running, and when I run the same code on the computer the server runs on, using 'localhost',
import mysql.connector
cnx = mysql.connector.connect(host='localhost', user='XXXX', password='XXXXXX', database='testdb')
it works and I can modify the data in the database. I'm trying to connect remotely from another computer, though.
I've tried using GRANT ALL ON *.* TO User@Host IDENTIFIED BY 'password'; but with no result. I checked my firewall and allowed all incoming and outgoing connections through port 3306, which is used by default.
I'm new to MySQL and really have no clue what to do. I don't even know if I use the correct hostname :') I use the IP address of the computer I run the server from; I think that's right.
You don't need to grant the ALL privilege to the user. You need to tell MySQL that this user is allowed to log in from a remote location.
In fact, as you are now allowing remote access through this user account, you should make sure that it can access only the database(s) it needs to, and definitely cannot use GRANT.
For example
CREATE USER 'myuser'@'%' IDENTIFIED BY 'mypass';
This will allow the user to connect from any IP address. To be more secure, you should be more specific and specify an individual IP address if you can:
CREATE USER 'myuser'@'11.22.33.44' IDENTIFIED BY 'mypass';
Remember, you are creating a new user account here, because you already have
myuser@localhost
Either way, you need to make sure that the password is a strong one, especially if you use the % (any IP) option.
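As a sketch of the "only the database(s) it needs" point (the address, password, and privilege list here are illustrative, not prescriptive), the account can be limited to the single testdb database:

import mysql.connector

# Connect locally as an administrative user (placeholder credentials)
admin = mysql.connector.connect(host='localhost', user='root', password='XXXXXX')
cur = admin.cursor()

# Create the remote account and restrict it to the one database it needs
cur.execute("CREATE USER 'myuser'@'11.22.33.44' IDENTIFIED BY 'a-strong-password'")
cur.execute("GRANT SELECT, INSERT, UPDATE, DELETE ON testdb.* TO 'myuser'@'11.22.33.44'")

admin.close()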

python mysql database connection error

I am trying to access a remote database from one Linux server to another; the two are connected via LAN, but it is not working. After some time it raises this error:
_mysql_exceptions.OperationalError: (2003, "Can't connect to MySQL server on '192.168.0.101' (99)")
The error is random and can be raised at any time. I create a new db object in every method, and close the connection as well, so why is this error raised?
Can anyone please help me sort out this problem?
This issue is due to too many pending requests on the remote database; in this situation MySQL closes the connection to the running script.
To work around it, pause the script between requests:
import time
time.sleep(sec)  # sec is the number of seconds to pause the script
This resolves the issue without transferring the database to the local server or any other administrative task on MySQL.
My solution was to batch multiple queries into a single commit when they were INSERT queries.
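As a sketch of that batching idea (the table and column names here are hypothetical), many inserts can share one connection and one commit:

import MySQLdb  # the driver behind _mysql_exceptions

db = MySQLdb.connect(host='192.168.0.101', user='XXXX', passwd='XXXXXX', db='testdb')
cur = db.cursor()

# One executemany() and one commit instead of a new connection per insert
rows = [(1, 'a'), (2, 'b'), (3, 'c')]
cur.executemany("INSERT INTO mytable (id, val) VALUES (%s, %s)", rows)
db.commit()

cur.close()
db.close()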

Pymongo connection timeout from remote machine

I have a Bitnami MEAN Stack running on AWS EC2. I'm trying to connect from a remote machine using PyMongo.
from pymongo import MongoClient
conn = MongoClient('mongodb://username:password@ec2blah.us-east-1.compute.amazonaws.com:27017/dbname')
but I keep getting an error along the lines of pymongo.errors.ConnectionFailure: timed out
I have edited /opt/bitnami/mongodb/mongodb.conf to supposedly allow external connections by commenting out bind_ip = 127.0.0.1 and uncommenting bind_ip = 0.0.0.0, and I've tried all permutations of commenting/uncommenting those lines.
I've looked over the web for about 90 minutes now, trying different things, but without luck!
On the MongoDB server, do the port connection test and make sure the DB service is running well. If not, start the service.
telnet ec2blah.us-east-1.compute.amazonaws.com 27017
On the remote machine, do the same port connection test to make sure there is no firewall issue.
telnet ec2blah.us-east-1.compute.amazonaws.com 27017
If you still have an issue connecting, you need to check the security groups on this instance.
Click the EC2 instance name --> Description --> view rules; you should see which ports are opened.
If not, create a new security group, such as `mongoDB`, in which TCP port 27017 is opened for inbound traffic, then assign it to that instance.
You should be fine to connect it now.
When starting MongoDB, set the bind_ip argument to ::,0.0.0.0
mongod --bind_ip ::,0.0.0.0
Read more in the docs of MongoDB: IP Binding.
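Once the networking side checks out, a quick way to verify connectivity from Python is a ping with a short timeout, so failures surface immediately instead of hanging (the URI below is a placeholder):

from pymongo import MongoClient
from pymongo.errors import ConnectionFailure

# Placeholder URI - substitute the real host and credentials
uri = 'mongodb://username:password@ec2blah.us-east-1.compute.amazonaws.com:27017/dbname'
client = MongoClient(uri, serverSelectionTimeoutMS=5000)  # fail after 5 s instead of the default long wait
try:
    client.admin.command('ping')  # cheap round-trip that confirms the connection
    print('connected')
except ConnectionFailure as exc:
    print('connection failed:', exc)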
