I want to connect Python code to a PostgreSQL database hosted on Google Cloud Platform. The code has been running correctly, but I recently moved my server to a VPS provided by a hosting company. Now when I try to connect to the database with python3, I receive the following message:
psycopg2.OperationalError: could not connect to server: Connection timed out
Is the server running on host "35.199.90.49" and accepting
TCP/IP connections on port 5432?
I have authorized the server's IP in the Google Cloud Platform console, and I even tried opening it to all IPv4 addresses, but nothing is working. I changed the /XX suffix of my server's IPv4 range. I disabled the firewall on the server. Nothing works, yet when I connect with the psql command from the Ubuntu server, the database connection is fine.
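For reference, the connection code is essentially the following (database name and credentials below are placeholders, not my real values; only the host IP is the real one from the error):

import psycopg2

# The host is the Cloud SQL instance's public IP from the error above;
# everything else is a placeholder.
conn = psycopg2.connect(
    host="35.199.90.49",
    port=5432,
    dbname="mydb",
    user="myuser",
    password="mypassword",
    connect_timeout=10,  # fail fast instead of waiting on the default timeout
)
print(conn.status)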
The project is a Django web app, packaged in Docker containers. For some reason, I cannot run the server and launch the app.
When I run "manage.py runserver", this error occurs:
File "C:\Users\andre\AppData\Local\Programs\Python\Python310\lib\site-packages\psycopg2\__init__.py", line 122, in connect
conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
django.db.utils.OperationalError: connection to server at "158.160.17.21", port 5432 failed: Connection refused (0x0000274D/10061)
Is the server running on that host and accepting TCP/IP connections?
What do you think can be done about this? Thank you.
The first line of your error tells me that you're using a locally installed copy of psycopg2, a Python client library for PostgreSQL, to connect to the database. This happens behind the scenes when Django needs to make that connection, but it's interesting that it's installed locally rather than in a Docker container, given that you said your Django web app is running in Docker.
The third line of your error is very simple: it tells you that
the programme is expecting Postgres to be served on a machine with the IP address 158.160.17.21 on port 5432 (the default port for Postgres)
it isn't there
Now, this IP address is not a default and doesn't refer to your own machine, so it must be something you've provided. An IP lookup suggests it's in Venezuela (does that sound right to you?). Perhaps you are expecting Postgres to be served on a third party machine; if so, you'll need to check that you have the right IP address and that Postgres is being served there.
Otherwise, you'll need to reconfigure Django to seek Postgres elsewhere.
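In practice that means checking the HOST (and related values) in the DATABASES setting of your settings.py. A rough sketch of what it should look like, with placeholder values:

# settings.py -- values below are placeholders
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "mydb",
        "USER": "myuser",
        "PASSWORD": "mypassword",
        "HOST": "db",     # e.g. the docker-compose service name, or the correct server IP
        "PORT": "5432",
    }
}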
Open Task Manager > Services tab > Right-Click on Mongo > Start.
Now go re-run the server again.
I have an existing Postgres database that I want to connect to Apache Airflow on my host machine (Windows 10). I installed Apache Airflow on WSL running Ubuntu. The installation was smooth and everything worked fine, since I was able to get the Airflow webserver running on my localhost (port 8081).
I tried connecting Airflow to my existing database (carPrices), passing all the necessary parameters, which were all correct. I also confirmed my database is up and running on port 5432. Whenever I click the connect button it reports this error: "could not connect to server: Connection refused Is the server running on host "localhost" (127.0.0.1) and accepting TCP/IP connections on port 5432?"
I don't know exactly what the problem is, as I am new to Airflow.
I tried setting Airflow's connection parameters both through the airflow.cfg file and through the Airflow UI. In the first case I can't even run "airflow db init", as it reports the same connection-refused error. In the second case, Airflow sets up a default SQLite database so the UI can run; I then tried creating the connection through the UI, but got the same error message.
I checked whether Postgres is up and running using netstat -ab, and Postgres is up and listening.
I was expecting the connection to succeed, since I am sure all the database parameters I passed are correct, but instead I got this error.
It turned out the problem is with WSL 2: you can't reach services listening on the Windows host via localhost from WSL 2 without some complicated tweaks. The simplest thing to do is downgrade to WSL 1 by running this command in PowerShell:
wsl.exe --set-version Ubuntu-20.04 1
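After switching to WSL 1, you can sanity-check the connection from the Ubuntu side before re-running "airflow db init"; a minimal check with SQLAlchemy, using placeholder credentials for the carPrices database:

from sqlalchemy import create_engine, text

# Placeholder user/password; adjust to your Postgres setup
engine = create_engine("postgresql+psycopg2://myuser:mypassword@localhost:5432/carPrices")
with engine.connect() as conn:
    print(conn.execute(text("SELECT 1")).scalar())  # prints 1 once the connection works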
I launched an AWS Linux instance and installed and ran Mongo as instructed here. The mongo service is running and accepting connections on 27017. However, when I go to the server's public DNS with port 27017, the server does not respond and I don't see the default Mongo message.
I am trying to run a Python (Flask) server on another instance and connect to the Mongo server using the private IP, but the connection does not happen. I get this error message in the terminal:
pymongo.errors.ServerSelectionTimeoutError: xxx.xx.xx.xx:27017: [Errno 111] Connection refused
Is this not the right way to use MongoDB on AWS? If this approach is feasible, what is causing the connection to fail?
All inputs appreciated, much thanks!
It is possible that your MongoDB is configured to accept connections from localhost only. Edit the /etc/mongod.conf file and comment out the bindIp line, as in the example below:
# network interfaces
net:
  port: 27017
  # bindIp: 127.0.0.1  # Listen to local interface only; comment out to listen on all interfaces.
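After editing the file, restart mongod. A quick way to verify from the Flask instance (the private IP below is a placeholder) is something like:

from pymongo import MongoClient

# Placeholder private IP of the Mongo instance; 5-second timeout so failures show up quickly
client = MongoClient("mongodb://10.0.0.12:27017/", serverSelectionTimeoutMS=5000)
print(client.admin.command("ping"))  # raises ServerSelectionTimeoutError if still unreachable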
I'm having a problem unique to Windows Azure that I don't see on other providers. I've been running connections from remote VMs to a MySQL database running on a DigitalOcean VM. I've successfully connected from AWS, Rackspace, Google, and all other providers, but for some reason Microsoft Azure VMs don't seem to work.
VM OS: Ubuntu 14.04
I'm trying to connect using PyMySQL and SQLAlchemy.
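The connection attempt looks roughly like this (host, database name, and credentials below are placeholders, not my real values):

from sqlalchemy import create_engine

# Placeholder values; the real host is the DigitalOcean VM's public IP
engine = create_engine("mysql+pymysql://myuser:mypassword@203.0.113.10:3306/mydb")
conn = engine.connect()   # this is the call that never completes from the Azure VM
conn.close()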
What I've Tried:
The port is open and listening
The user definitely has permission to upload data into the DB (other remote connections with this user all work fine).
I have even tried "ufw disable" for the Firewall on the Windows Azure VM
I've set 3306 as an endpoint on the Azure VM
Despite all my attempts, the connection cannot be established. Is there something I'm missing in the setup?
Azure VMs disable ICMP, but we can use SSH tunnels to allow outside access to internal network resources. I don't have the resources to create a DigitalOcean VM, but I have created 2 Azure VMs in 2 cloud services to try to reproduce the issue.
I installed mysql-server in VM.1 and mysql-client in VM.2.
Then I tried to connect to the MySQL server directly from VM.2 and got the message "can't connect to MySQL…".
To work around this issue, I followed this post and created an SSH tunnel on VM.1, which hosts the MySQL server:
Open port 3306 so that a remote client can connect to your MySQL server. Run the following command to open TCP port 3306:
iptables -A INPUT -i eth0 -p tcp -m tcp --dport 3306 -j ACCEPT
Now let's check whether port 3306 is open by running this command:
sudo netstat -anltp|grep :3306
Create an SSH tunnel for port 3307:
sudo ssh -fNg -L 3307:127.0.0.1:3306 azurevmuser@servername
Create an endpoint for port 3307 in the dashboard of the VM in the Azure management portal. For more details, see how to add an endpoint to your virtual machine. Now your database host name is <vm_name>.cloudapp.net:3307
Then connect to the MySQL server from VM.2 using this command:
# mysql -h <vm_1_name>.cloudapp.net -P 3307 -u user -pPassword
and it should work fine. Feel free to let us know if we have misunderstood your issue.
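If you are connecting from Python rather than the mysql client, the same tunnelled endpoint can be used in the SQLAlchemy URL; a rough sketch with placeholder VM name and credentials:

from sqlalchemy import create_engine, text

# Placeholder VM name and credentials; port 3307 is the tunnelled endpoint created above
engine = create_engine("mysql+pymysql://user:password@vm_name.cloudapp.net:3307/mydb")
with engine.connect() as conn:
    print(conn.execute(text("SELECT VERSION()")).scalar())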
I am trying to connect to a MySQL database on a VPS from my PC. I use the SQLAlchemy framework, but I need to establish an SSH tunnel before connecting.
The usual way, when the web app runs on the VPS:
create_engine('mysql://user:pswd@localhost/dbname')
How can I connect to this database from another PC? Assume I have the connection credentials: IP, username, password.
Your MySQL server is listening for local connections only. To make it listen for outside connections:
Edit the /etc/mysql/my.cnf file
Comment out the line bind-address = 127.0.0.1
Restart mysqld
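Once MySQL accepts remote connections (and your MySQL user is allowed to connect from your PC's IP), the engine URL simply points at the VPS instead of localhost; a sketch with placeholder values:

from sqlalchemy import create_engine

# Placeholder VPS IP and credentials -- same URL form as before, host swapped for the VPS address
engine = create_engine('mysql://user:pswd@203.0.113.25/dbname')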