Pymongo connection timeout from remote machine - python

I have a Bitnami MEAN Stack running on AWS EC2. I'm trying to connect from a remote machine using PyMongo.
from pymongo import MongoClient
conn = MongoClient('mongodb://username:password@ec2blah.us-east-1.compute.amazonaws.com:27017/dbname')
but I keep getting an error along the lines of pymongo.errors.ConnectionFailure: timed out
I have edited /opt/bitnami/mongodb/mongodb.conf to (supposedly) allow external connections by commenting out bind_ip = 127.0.0.1 and uncommenting bind_ip = 0.0.0.0, and I have tried all permutations of commenting/uncommenting those lines.
I've looked over the web for about 90 minutes now trying different things but without luck!

On the MongoDB server, run a port connection test and make sure the DB service is running well. If not, start the service.
telnet ec2blah.us-east-1.compute.amazonaws.com 27017
On the remote machine, run the same port connection test to make sure there is no firewall issue.
telnet ec2blah.us-east-1.compute.amazonaws.com 27017
If you can't connect, you need to check the security groups on this instance.
Click the EC2 instance name --> Description --> view rules; you should see that the port is open.
If not, create a new security group, such as `mongoDB`, with TCP port 27017 open for inbound traffic, then assign it to that instance.
You should be able to connect now.
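If telnet isn't available on the remote machine, something like this short Python check does the same reachability test (the hostname is the placeholder from the question):
import socket

# TCP reachability check, equivalent to the telnet test above.
host = "ec2blah.us-east-1.compute.amazonaws.com"  # placeholder from the question
port = 27017
try:
    socket.create_connection((host, port), timeout=5).close()
    print("port %d is reachable" % port)
except OSError as exc:
    print("cannot reach %s:%d -> %s" % (host, port, exc))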

When starting MongoDB, set the --bind_ip argument to ::,0.0.0.0:
mongod --bind_ip ::,0.0.0.0
Read more in the MongoDB docs: IP Binding.
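Once mongod is listening on all interfaces, a quick way to confirm from Python is something like the sketch below, which uses a short server selection timeout so failures surface in seconds instead of hanging (hostname and credentials are the placeholders from the question):
from pymongo import MongoClient
from pymongo.errors import ServerSelectionTimeoutError

client = MongoClient(
    "mongodb://username:password@ec2blah.us-east-1.compute.amazonaws.com:27017/dbname",
    serverSelectionTimeoutMS=5000,  # fail fast instead of the default ~30s
)
try:
    print(client.server_info()["version"])  # forces a real round trip to the server
except ServerSelectionTimeoutError as exc:
    print("still unreachable:", exc)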

Related

Redshift new cluster used to be accessible but not anymore

Last week, I was able to successfully connect to Redshift clusters. This week I am unable to connect even though I used the same configs for the following:
Virtual Private Cloud VPC
Security Groups
Cluster subnet group
Publicly accessible Cluster permissions
But this week I get the error
Traceback (most recent call last):
  File "create_staging_tables.py", line 93, in <module>
    conn = psycopg2.connect(
  File "/Users/bsubramanian/.pyenv/versions/3.8.2/lib/python3.8/site-packages/psycopg2/__init__.py", line 122, in connect
    conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
psycopg2.OperationalError: could not connect to server: Operation timed out
    Is the server running on host "clustername.region.redshift.amazonaws.com" (54.243.82.201) and accepting
    TCP/IP connections on port 5439?
when running a Python script that connects to the Redshift cluster and creates some tables.
How do I debug what is wrong?
Typically these issues are network related. Checking connectivity from your client system to the database is a good start.
First off, check the connection information: go to the Redshift console and confirm that the IP address given in the error message is the IP address of the leader node. If these don't match, your code has some incorrect configuration. (Note that Redshift can also have a public IP if you configured the cluster as such. Most users don't do this for security reasons; if you do, you likely should be using that IP address.)
Next, a simple test of network connectivity is a good step. The Linux command telnet can do this: telnet <cluster endpoint> 5439. Telnet cannot actually talk to Redshift, but if you get any response other than a timeout, telnet was able to make the initial connection to Redshift. If this doesn't work, then a lot more information about your network configuration will be needed to debug.
Now, all of this assumes you don't have a connection pool server between your client and the DB. That looks to be the case here, but ...
If you can connect via IP address but not with the cluster DNS name, then a DNS issue is likely. We'd need more info on your DNS setup (and some on the network). This doesn't look to be the issue, but ...
If telnet can connect but your client cannot (with the same connection info), then it could be a security group configuration issue.
There are lots of possibilities. Start by checking the connection info and update the issue as you learn more.
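As a concrete starting point, something like this short Python check can confirm both DNS resolution and raw TCP reachability (the endpoint is the placeholder from the error message; 5439 is the default Redshift port):
import socket

host = "clustername.region.redshift.amazonaws.com"  # placeholder endpoint from the error above
port = 5439

# 1. Does the DNS name resolve, and does it match the IP shown in the console?
print("resolves to:", socket.gethostbyname(host))

# 2. Can a TCP connection be opened at all (the telnet test, in Python)?
try:
    socket.create_connection((host, port), timeout=5).close()
    print("TCP connection succeeded")
except OSError as exc:
    print("TCP connection failed:", exc)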
I was able to resolve this by creating new instances of the following:
Virtual Private Cloud(VPC)
VPC Security Group
Cluster Subnet group

Issue in connecting Python with MySQL on Google Cloud Platform

I have used the following code in Python:
import mysql.connector as mysql
import sys
HOST = "34.87.95.90"
DATABASE = "CAO_db"
USER = "root"
PASSWORD = "*********"
db_connection = mysql.connect(user=USER, password=PASSWORD, host=HOST, database=DATABASE)
cur = db_connection.cursor()
When I run the above code, I get the following error messages:
TimeoutError: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond
InterfaceError: 2003: Can't connect to MySQL server on '34.87.95.90:3306' (10060 A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond)
I am not sure how to fix my code and/or resolve these errors. Please ask me if you would like more details of the error messages. I would greatly appreciate all the help I can get towards resolving this.
One thing I'm not seeing here is whether or not you have configured your Cloud SQL instance to accept connections.
You can configure it to accept connections from within the GCP stratosphere using their "Private IP" internal networking magic, AND you can configure it to accept connections from other machines using a combination of Public IP and either an authorized external network (like if you were accessing your GCP Cloud SQL instance from, say, an Amazon EC2 instance), or their Cloud SQL Proxy tool (which is what I use to connect to my Cloud SQL instance from my laptop).
In the GCP Console, go to your project
From the hamburger menu, select SQL
Click on your Cloud SQL instance
In the left nav, click on Connections
If you have Private IP checked and you're running this code on a GCP Compute/GKE resource, confirm that the "Network" field is set to the network used by that resource.
If you're just trying to get a connection from your local machine and you don't have a static IP to whitelist, your best option is to use Public IP in combination with Cloud SQL Proxy.
Cloud SQL Proxy essentially creates a TCP tunnel that allows your laptop to connect to 'localhost' on a port you specify, and it then redirects your connection to the remote Cloud SQL instance.
Once you've established that your networking situation isn't the problem, you could use the same Python connection code that you wrote above, but change HOST to 127.0.0.1 and add an attribute for PORT=3308.
EDITED to add: I suggest using PORT=3308 for your cloud_sql_proxy connection so that it doesn't interfere with any existing port 3306 (MySQL default) connections that you may already be actually running on your local machine. If this isn't the case, you can either omit the PORT attribute or keep it explicit, but change it to 3306.
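For example, with cloud_sql_proxy listening locally on port 3308, the connection code from the question would only need the host and port changed; a sketch reusing the placeholder credentials and database name:
import mysql.connector as mysql

# Connect through the local Cloud SQL Proxy tunnel instead of the public IP.
db_connection = mysql.connect(
    user="root",
    password="*********",  # placeholder from the question
    host="127.0.0.1",      # the proxy, not the instance's public IP
    port=3308,             # whatever port you passed to cloud_sql_proxy
    database="CAO_db",
)
cur = db_connection.cursor()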

Redash - Change Postgres connection port

Is it possible to change the port Redash connects to Postgres?
I had initially set up Redash successfully and connected it to Postgres, but after a few days it became impossible to start Postgres on port 5432. I even tried reinstalling, but it forcibly sets the port to 5433. I have tried changing the port to 5433 in redash/query_runner/pg.py, but there is no change.
How can I change the port Flask uses to connect, to 5433 or any other for that matter?
The file 'query_runner/pg.py' is designed to connect to remote data sources, like mysql.py, oracle.py and so on. It is not for the backend database that saves users' information. So you should try to change the value like:
SQLALCHEMY_DATABASE_URI = os.environ.get(
    "REDASH_DATABASE_URL", os.environ.get("DATABASE_URL", "postgresql:///postgres"))
It is in 'redash/settings/__init__.py', but I'm sorry, I don't know the details of how to change it.
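If it helps, the usual way to override that default is the REDASH_DATABASE_URL environment variable that the snippet above reads, rather than editing the file; a minimal sketch with placeholder credentials and database name:
import os

# Placeholder user/password/database; in practice set this in the environment
# Redash starts under (its .env file or service definition), not at runtime.
os.environ["REDASH_DATABASE_URL"] = "postgresql://redash:secret@localhost:5433/redash"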

Connecting to MongoDb DB that is installed in different server

We have two servers. I have installed MongoDB on one of them (UBUNTU - Digital Ocean VPS).
When I run a script on that same server using localhost, I can retrieve data perfectly.
import pymongo
# SERVER = 'mongodb://localhost:27017/myproject'
SERVER = 'mongodb://root:password@x.x.x.x:27017/myproject'  # where x.x.x.x is the address of my server
connection = pymongo.MongoClient(SERVER)
db = connection.myproject
print(list(db.coll.find()))
The problem is that I can't connect to this DB remotely. Note that I can ssh in and run the script using localhost inside the server, but not from outside the server.
Do I need to go through some configuration?
You must allow remote access.
vi /etc/mongod.conf
By default it listens only on the local interface:
bind_ip = 127.0.0.1
You must add the IPs of your other servers. For example, to listen on the local interface and 192.168.0.100:
bind_ip = 127.0.0.1,192.168.0.100
Or comment the line out to listen on all interfaces.
Note: the values are comma separated, with no spaces.
I hope this helps.
For development purposes you can open an ssh tunnel like
ssh <UBUNTU - Digital Ocean VPS> -L27018:localhost:27017
and then connect to the remote db as
SERVER = 'mongodb://root:password@localhost:27018/myproject'
while the ssh connection remains open. You can use any free port instead of 27018.
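With that tunnel open, the client script from the question would look roughly like this (same placeholder credentials, pointed at the forwarded local port):
import pymongo

# Local port 27018 is forwarded to the server's 27017 by the ssh tunnel,
# so the host here is localhost rather than the droplet's address.
SERVER = 'mongodb://root:password@localhost:27018/myproject'
connection = pymongo.MongoClient(SERVER)
db = connection.myproject
print(list(db.coll.find()))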
Otherwise, you need to reconfigure mongodb to listen on all interfaces. Comment out the bindIp line in the mongodb config and restart the server. This will make the DB publicly accessible, so make sure you use strong passwords and don't allow anonymous access.
Finally, if you are using a VPN, you need to uncomment the bindIp line in the mongodb config and add the VPN interface there, e.g.:
bindIp = 127.0.0.1,10.0.1.12
where 10.0.1.12 should be replaced with the VPN interface address of your Ubuntu box. You can find the exact value with ifconfig. Important: there are no spaces around the comma.

Using Pymongo to connect to MongoDB on AWS instance from Windows

An error is repeatedly being thrown at this line:
client = MongoClient('ec2-12-345-67-89.us-east-2.compute.amazonaws.com', 27017,
                     ssl=True, ssl_keyfile='C:\\mongo.pem')
(Paths and instance name changed for obvious reasons)
Port 27017 for mongo allows inbound connections in my AWS security group. First, I allowed only my IP; now I'm allowing all traffic via that port. I have tried preceding the connection string with "mongodb://" and removing the SSL arguments (I'm fairly certain I don't need them).
The error IntelliJ keeps throwing me is:
pymongo.errors.ConnectionFailure: [WinError 10061] No connection could be made because the target machine actively refused it
It works if I move the script to the AWS instance, replace the DNS name with 'localhost', and remove the SSL parameters, but I need this to work remotely.
Three ideas:
Ensure "bind_ip" is set to "0.0.0.0" in your mongod.conf and restart mongod, as #ajduke suggests.
Make sure mongod is running.
Try to connect to the mongod from your client machine using the "mongo" shell to see if it gives you a more informative error.
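On the client side, something like this small Python check can also distinguish "actively refused" from a timeout, which narrows down the cause (the hostname is the placeholder from the question):
import socket

host = "ec2-12-345-67-89.us-east-2.compute.amazonaws.com"  # placeholder
try:
    socket.create_connection((host, 27017), timeout=5).close()
    print("TCP connection succeeded; look higher up (TLS/auth) for the problem")
except ConnectionRefusedError:
    # Refused usually means nothing is listening on the public interface:
    # mongod not running, or bind_ip still set to 127.0.0.1.
    print("connection refused: check that mongod is running and bound to 0.0.0.0")
except socket.timeout:
    # A timeout usually means the security group or OS firewall is dropping traffic.
    print("timed out: check the security group / firewall rules")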
