RDS Postgres, can connect from local but not from EC2 - python

Okay, this is a little weird: I have set up an RDS Postgres instance that I'm able to connect to from pgAdmin and command-line psql running locally, but not from command-line psql or the web server (Flask) on the EC2 instance.
I'm guessing it has something to do with outbound connections from EC2, but my security group settings allow all outbound traffic.
FYI: I have set up Flask using Elastic Beanstalk, if that makes any difference.
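A minimal connectivity check run from the EC2 instance itself can help separate a security-group problem from an application problem. The sketch below assumes psycopg2 is installed; the endpoint, database name, and credentials are placeholders. If the raw TCP connection times out, the RDS security group most likely lacks an inbound rule on port 5432 allowing the EC2 instance's security group.
# Hedged sketch: run on the EC2 instance; endpoint and credentials are placeholders.
import socket
import psycopg2

host = "mydb.xxxxxxxx.us-east-1.rds.amazonaws.com"

# Raw TCP check on the Postgres port; a timeout here points at security
# groups / network ACLs rather than at Flask or psql.
sock = socket.create_connection((host, 5432), timeout=5)
sock.close()

# Actual Postgres login.
conn = psycopg2.connect(host=host, port=5432, dbname="mydb",
                        user="myuser", password="mypassword",
                        connect_timeout=5)
with conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone())
conn.close()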

Related

How to connect to my PostgreSQL database from RDP?

I have a discord.py bot that I host on Google Cloud RDP (Windows). I was developing and testing the PostgreSQL database on my local computer and it was working like a charm; now that I've tried the same code and the same connection method on my RDP instance, I get an error.
This is how I connect to the database on my local PC:
How do I connect it from the RDP instance now? Do I need to make any changes to the database, like whitelisting the IP, and if so, how do I do it?
Thanks
If the database is running on your local computer inside your firewall/router, then it cannot be reached from the Internet. If you control your router, you can try forwarding port 5432 on your public IP to port 5432 on your Windows computer. However, it might be better if you just moved the Postgres instance to your cloud instance.
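If the port-forwarding route is taken, the only change on the RDP side is pointing the client at the router's public IP instead of localhost. A minimal sketch using psycopg2; the IP address and credentials are placeholders:
import psycopg2

# Placeholder values: the public IP that forwards 5432 to the Windows
# machine running Postgres, plus whatever credentials the bot already uses.
conn = psycopg2.connect(
    host="203.0.113.10",
    port=5432,
    dbname="botdb",
    user="botuser",
    password="secret",
)
with conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone())
conn.close()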

Connect Python to kafka on AWS EC2 from local machine

I am trying to connect my Python application to Kafka running on AWS EC2. I am able to reach the EC2 instance from a terminal; I checked with telnet <ec2 ip> 9092 and that connects, but my Python application cannot.
Even though my Python application starts without any errors using the EC2 IP address, I am not able to receive any data from my Kafka topic on my local machine.
When I add my public IP address to:
advertised.listeners=PLAINTEXT://<local ip address>:9092
the Debezium connector with kafka-connect won't start, but without setting advertised.listeners it works.
How do I configure Kafka and Kafka Connect so that I can consume a Kafka topic from the EC2 instance on my local machine?
You need to set advertised.listeners to be the EC2 Public DNS/IP, restart the broker, then open the VPC / firewall connection on the listening port.
Debezium's rest.advertised.listener property is different from the Kafka broker's, and you wouldn't need it set on your local machine.
Python and Kafka Connect should share the same bootstrap.servers protocol.
You can test your listeners better using kafkacat -L -b <bootstrap>:9092
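Once advertised.listeners carries the EC2 public DNS and the port is reachable, a consumer on the local machine just uses that same address as its bootstrap server. A minimal sketch using the kafka-python package; the broker address and topic name are placeholders:
from kafka import KafkaConsumer

# Placeholder broker address: it must match what the broker advertises in
# advertised.listeners, because that is the address returned to clients
# after the initial bootstrap connection.
consumer = KafkaConsumer(
    "my-topic",
    bootstrap_servers="ec2-xx-xx-xx-xx.compute-1.amazonaws.com:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=10000,  # give up after 10 s of silence
)

for message in consumer:
    print(message.topic, message.partition, message.offset, message.value)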

Connecting to Redis from localhost

I am using ElastiCache for Redis on AWS, but I'm not able to connect to Redis from localhost. I have used the endpoint from AWS, and it always shows connection timed out. Is there a way to make it work?
The first thing to make sure is that the VPC security group attached to the ElastiCache Redis cluster allows Custom TCP traffic from your address on port 6379 (the default for Redis).
For me, the connection timed out issue arose because the VPC where the Redis cluster resides was not accessible.
Check the security group of your ElastiCache cluster. It should allow port 6379 from the EC2 instance; you can reference your EC2 instance's security group in the ElastiCache security group to grant access on port 6379.
You can't connect to AWS ElastiCache Redis from outside your VPC network or from your laptop. You have to go through a tunnel to connect to Redis from your laptop.
Since ElastiCache Redis is not available to outside networks, you get a timeout error because the client cannot reach the AWS ElastiCache host.
You can easily connect to AWS ElastiCache Redis from within the AWS VPC, without any timeout error.
My suggestion is to use a local Redis on your laptop for development and the AWS ElastiCache Redis host/port when connecting from inside AWS.
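For reference, a small redis-py sketch covering both cases described above; the cluster endpoint and the EC2 host are placeholders. From inside the VPC the endpoint is used directly, while from a laptop the connection goes through an SSH tunnel to an EC2 instance in that VPC:
import redis

# Inside the VPC (e.g. on an EC2 instance); the endpoint is a placeholder.
r = redis.Redis(
    host="my-cluster.xxxxxx.0001.use1.cache.amazonaws.com",
    port=6379,
    socket_connect_timeout=5,
)
print(r.ping())

# From a laptop, first open an SSH tunnel through an EC2 host in the VPC, e.g.
#   ssh -N -L 6379:my-cluster.xxxxxx.0001.use1.cache.amazonaws.com:6379 ec2-user@<ec2-public-ip>
# and then point the client at the local end of the tunnel.
r_tunnel = redis.Redis(host="127.0.0.1", port=6379, socket_connect_timeout=5)
print(r_tunnel.ping())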

Python Flask server only connects to MySQL in docker host network_mode

I have a simple Python Flask web server, running in a docker container along with an Nginx reverse proxy, and a MySQL database.
When the docker compose is run in network_mode: 'host', everything (nginx-python-mysql) works fine.
But when it is switched to bridge networking, Nginx and the python endpoint app seem to work fine, but the python app is unable to connect to the MySQL instance.
I am exposing the MySQL port 3306. Everything is running on a CentOS-7 vm.
The Python Flask app is running on a gunicorn (Python WSGI HTTP server) instance. The MySQL database is based on tutum/mysql.
I have tried both pymysql and SQLAlchemy clients in the Python endpoint, and I can only get them to connect to the MySQL server in host network_mode.
As far as I can tell from the Docker instance info, the MySQL container is fine and is on the same network as the Python endpoint container.
Unfortunately, this is all running on an air-gapped server, so it's difficult to post the config files here. The only difference between the host and bridge mode settings is that the Nginx SSL proxy_pass points at https://endpoint instead of https://localhost, and connecting via HTTPS to the Python app works in both host and bridge modes. The Python app connects to the MySQL database on host 0.0.0.0.
What could cause the Python app's connection to the MySQL database to work only in host mode?
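In bridge mode each container has its own network namespace, so 0.0.0.0 (or localhost) inside the Flask container no longer reaches the MySQL container; connecting by the Compose service name, which Docker's embedded DNS resolves on a user-defined bridge network, is the usual approach. A hedged pymysql sketch, assuming the MySQL service is named mysql in docker-compose.yml and using placeholder credentials:
import pymysql

# Assumption: the MySQL service is called "mysql" in docker-compose.yml and
# shares a bridge network with the Flask container. The service name resolves
# through Docker's DNS; "0.0.0.0" only happens to work in host network_mode.
conn = pymysql.connect(
    host="mysql",
    port=3306,
    user="appuser",
    password="apppass",
    database="appdb",
    connect_timeout=5,
)
with conn.cursor() as cur:
    cur.execute("SELECT 1")
    print(cur.fetchone())
conn.close()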

How can uwsgi re-establish connection to remote mysql database whenever mysql services are restarted

I have a Python web application that connects to a remote database.
Application: Flask + uWSGI + Nginx.
Database: MySQL (remote).
The application exposes a REST API whose data is served from the remote database.
Every day after the database restoration, the MySQL service is restarted on the remote database server. The connection between my application and the remote database breaks, and it starts throwing the error message
MySQL server has gone away.
until I manually restart the uwsgi service in my application:
sudo service uwsgi restart
The time between the MySQL service restart on the remote database and the uwsgi service restart on my system is downtime.
Can my application re-establish the connection as soon as the MySQL service is restarted?
Please suggest any solutions.
It really depends on the way you connect to your database
In case you're using the popular ORMs:
SQLAlchemy ORM
from sqlalchemy import create_engine

engine = create_engine('mysql+mysqldb://...', pool_recycle=3600)
as stated in the documentation here
Peewee
# "app" is the Flask application and "database" is the peewee database object.
@app.before_request
def _db_connect():
    database.connect()

@app.teardown_request
def _db_close(exc):
    if not database.is_closed():
        database.close()
as stated in the documentation here
The issue is explained in depth in the mysql documentation here
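For completeness, a sketch of how the SQLAlchemy suggestion above might look in practice; the connection URL is a placeholder, pool_recycle is the option from the answer, and pool_pre_ping is an additional create_engine option (not mentioned above) that tests each pooled connection before use, so a connection killed by the MySQL restart is replaced instead of raising "MySQL server has gone away":
from sqlalchemy import create_engine, text

# Placeholder URL and credentials.
engine = create_engine(
    "mysql+mysqldb://user:password@db-host/appdb",
    pool_recycle=3600,   # recycle connections older than an hour
    pool_pre_ping=True,  # probe a connection before handing it out
)

def fetch_one():
    # A stale connection is detected by the pre-ping and transparently
    # re-established, so no uwsgi restart is needed after the DB restart.
    with engine.connect() as conn:
        return conn.execute(text("SELECT 1")).scalar()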
