I have an application running on Google App Engine and I want it to use the associated PostgreSQL database. I'm using psycopg2 for my SQL queries. However, I am not sure how to set up the connection. This is what I currently have:
con = psycopg2.connect(
    host=HOST_NAME,          # the IP address of the SQL database
    database=DATABASE_NAME,  # the name of the database (I'm using the default, so this is "postgres")
    user=USER_NAME,          # the user name that I created
    password=PASSWORD        # the password associated with that user
)
However, when I try to make a request, creating this connection fails with the error psycopg2.OperationalError: could not connect to server: Connection timed out. Is there something I am missing?
It is a little bit tricky, but here is what worked for me. I will walk you through setting up the Quickstart App Engine with psycopg2, and from there you will get the idea.
Use the Quickstart for Python in the App Engine Flexible Environment documentation to set up and deploy your app.
Use the Connecting from App Engine documentation to connect your App Engine app to your Cloud SQL PostgreSQL instance.
I made slight modifications in order to make that work:
In app.yaml add:
beta_settings:
  cloud_sql_instances: [INSTANCE_CONNECTION_NAME]=tcp:5432
#[INSTANCE_CONNECTION_NAME] = [PROJECT_NAME]:[INSTANCE_ZONE]:[INSTANCE_NAME]
#[INSTANCE_CONNECTION_NAME] can be found in the Google Cloud Console, on the Cloud SQL instance page, under "Instance connection name".
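For context, a minimal app.yaml sketch for the flexible environment might look like this (the entrypoint line is an assumption based on the Quickstart, not something from your project):

runtime: python
env: flex
entrypoint: gunicorn -b :$PORT main:app

runtime_config:
  python_version: 3

beta_settings:
  cloud_sql_instances: [PROJECT_NAME]:[INSTANCE_ZONE]:[INSTANCE_NAME]=tcp:5432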
In requirements.txt add:
psycopg2
psycopg2-binary
In main.py add:
@app.route('/connect')
def connect():
    try:
        # host='172.17.0.1' is the default IP of the Docker container that is
        # created during the deployment of the App Engine app
        conn = psycopg2.connect("dbname='postgres' user='postgres' host='172.17.0.1' password='test'")
        return "Connection was established!"
    except psycopg2.Error:
        return "I am unable to connect to the database"
Use the gcloud app deploy command to deploy your app.
After the deployment, use the gcloud app browse command to open the app in the browser.
When you access https://[PROJECT_ID].appspot.com/connect, it should respond with Connection was established!
I am trying to migrate my database on Heroku using heroku run python manage.py db migrate on my Flask app. But I am getting this error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not connect to server: Connection refused
Is the server running on host "localhost" (127.0.0.1) and accepting
TCP/IP connections on port 5432?
Here is the code for my manage.py file:
import os
from flask_script import Manager
from flask_migrate import Migrate, MigrateCommand
from app import create_app, db

app = create_app()

with app.app_context():
    migrate = Migrate(app, db)

manager = Manager(app)
manager.add_command('db', MigrateCommand)

if __name__ == '__main__':
    manager.run()
Here is the configuration file for the database:
import os

class Config:
    SECRET_KEY = os.environ.get('SECRET_KEY')
    SQLALCHEMY_DATABASE_URI = os.environ.get('SQLALCHEMY_DATABASE_URI')
    MAIL_SERVER = 'smtp.googlemail.com'
    MAIL_PORT = 587
    MAIL_USE_TLS = True
    MAIL_USERNAME = os.environ.get('EMAIL_USER')
    MAIL_PASSWORD = os.environ.get('EMAIL_PASS')
I have set SQLALCHEMY_DATABASE_URI to a PostgreSQL database in this format: postgres://YourUserName:YourPassword@YourHost:5432/YourDatabase.
This error has been bugging me and I cannot find a solution anywhere.
Why isn't this working?
You can't connect your Heroku application to your local database without some major drawbacks. It's a huge pain to set up (you'll need to deal with port forwarding and likely at least one firewall, probably dynamic IP addresses, ...) and your application won't run if your laptop is off.
A better solution would be to have a database in the cloud, and given that you are already using Heroku and PostgreSQL, the most natural choice is Heroku's own Postgres service. If you're depending on psycopg2, there's a good chance that one has already been provisioned for you.
If it has, you'll see a DATABASE_URL environment variable containing your connection string. Simply set your SQLAlchemy database URI from that:
SQLALCHEMY_DATABASE_URI = os.environ.get('DATABASE_URL')
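In context, your Config class might look like this minimal sketch (the local fallback URI is a hypothetical example for development, not something Heroku provides):

import os

class Config:
    SECRET_KEY = os.environ.get('SECRET_KEY')
    # Heroku injects DATABASE_URL when a Postgres add-on is attached;
    # fall back to a hypothetical local database when running elsewhere.
    SQLALCHEMY_DATABASE_URI = os.environ.get(
        'DATABASE_URL',
        'postgresql://localhost:5432/yourdatabase'
    )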
If a database hasn't been provisioned for you (check by running heroku addons), you can easily provision one on the free tier:
heroku addons:create heroku-postgresql:hobby-dev
Note that you shouldn't be running manage.py db migrate on Heroku at all. This generates migration files, which will be lost on Heroku's ephemeral filesystem. Generate migrations locally, and commit the migration files. You'll want to run manage.py db upgrade in both places, though.
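As a sketch of that workflow (assuming the manage.py shown above and a git remote named heroku):

# Locally: generate migration scripts, apply them, and commit them
python manage.py db migrate -m "describe the change"
python manage.py db upgrade
git add migrations
git commit -m "add migration scripts"
git push heroku master

# On Heroku: only apply the committed migrations
heroku run python manage.py db upgrade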
I have a basic Flask application that connects to a local Postgres server.
The problem is that the application works fine when I run it using the Python manager, but when I use uWSGI to run the application, the query doesn't execute, i.e., it times out without throwing any error message.
I have a relatively simple uWSGI configuration file:
[uwsgi]
module = wsgi
master = true
processes = 1
http = 0.0.0.0:5000
buffer-size = 32768
vacuum = true
die-on-term = true
My Flask application uses psycopg2 to connect to localhost:
connection = psycopg2.connect(
    host='127.0.0.1',
    port=5432,
    database='db',
    user='user',
    password='password'
)
I think this might be happening because the PostgreSQL socket is not accessible to uWSGI (reference). I believe this can be solved by giving uWSGI permission on the Postgres socket, but I am unable to figure out how. If there is another solution, that will also work.
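For what it's worth, here is a minimal sketch of connecting through the unix socket instead of TCP, in case you want to test that theory; the socket directory below is the Debian/Ubuntu default and is an assumption, so adjust it for your system:

# psycopg2/libpq treats a host value starting with '/' as the directory
# that contains the Postgres unix socket file.
import psycopg2

connection = psycopg2.connect(
    host='/var/run/postgresql',  # assumed socket directory
    database='db',
    user='user',
    password='password'
)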
I'm trying to connect to Google Cloud MySQL from Google App Engine but I am getting OperationalError: (2002, "Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)")
I've tried using following settings:
app.yaml:
runtime: python37
env: standard

handlers:
- url: /static
  static_dir: static

runtime_config:
  python_version: 3
db connection string: mysql+pymysql://{user}:{password}@localhost/{db}?unix_socket=/cloudsql/{conn_name}
My concern is that it was working before, but when I changed the db user/password it started giving me the above error, and reverting my last change did not solve the problem. Is there some issue with my settings, or could it be some cache issue on App Engine?
The best way to connect to your Google Cloud SQL instance from the App Engine standard environment is described in the Google documentation here.
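In short, the documented pattern for the standard environment connects through the unix socket that App Engine provides under /cloudsql. A minimal sketch with PyMySQL (the placeholders match the ones in your connection string):

import pymysql

# App Engine standard exposes the Cloud SQL instance as a unix socket at
# /cloudsql/PROJECT:REGION:INSTANCE; no host or port is needed.
connection = pymysql.connect(
    unix_socket='/cloudsql/{conn_name}',
    user='{user}',
    password='{password}',
    db='{db}'
)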
I am new to GCP App Engine. I am trying to make a web server using Flask on App Engine. I tested my code on localhost and it works fine, but when I try to deploy it to App Engine on GCP, it gives me a strange error (see the error logs).
Here is my Flask code:
app.run(threaded = True, host='127.0.0.1', port=80)
Thanks!!
You don't need to call app.run() in your main.py file -- App Engine will do that for you. Simply initialize the app variable in this file instead.
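For instance, a minimal main.py sketch along those lines (the route is just an illustration):

from flask import Flask

app = Flask(__name__)  # App Engine serves this module-level app object

@app.route('/')
def index():
    return 'Hello, App Engine!'

# Keep app.run() behind a __main__ guard so it only runs for local testing;
# App Engine's runtime never executes this branch.
if __name__ == '__main__':
    app.run(host='127.0.0.1', port=8080, debug=True)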
I have a Flask app that uses SQLAlchemy (Flask-SQLAlchemy) and Alembic (Flask-Migrate). The app runs on Google App Engine. I want to use Google Cloud SQL.
On my machine, I run python manage.py db upgrade to run my migrations against my local database. Since GAE does not allow arbitrary shell commands to be run, how do I run the migrations on it?
Whitelist your local machine's IP: https://console.cloud.google.com/sql/instances/INSTANCENAME/access-control/authorization?project=PROJECTNAME
Create an user: https://console.cloud.google.com/sql/instances/INSTANCENAME/access-control/users?project=PROJECTNAME
Assign an external IP address to the instance: https://console.cloud.google.com/sql/instances/INSTANCENAME/access-control/ip?project=PROJECTNAME
Use the following SQLAlchemy connection URI: SQLALCHEMY_DATABASE_URI = 'mysql://user:pw@ip:3306/DBNAME'
Remember to release the IP later, as you are charged for every hour it's not used; a sketch of running the migration this way follows below.
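Putting the steps together, a minimal sketch of running the migration from your local machine (user, password, IP, and database name are placeholders):

# Point the app at the Cloud SQL instance via its external IP, then migrate
export SQLALCHEMY_DATABASE_URI='mysql://user:pw@ip:3306/DBNAME'
python manage.py db upgrade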
It's all just code you can run, so you can create an admin endpoint with which to effect an upgrade:
@app.route('/admin/dbupgrade')
def dbupgrade():
    # NOTE: protect this route (e.g. with authentication) before deploying.
    from flask_migrate import upgrade, Migrate
    migrate = Migrate(app, db)
    upgrade(directory=migrate.directory)
    return 'migrated'
(Dropwizard, for instance, caters nicely for such admin things via tasks)
You can whitelist the IP of your local machine for the Google Cloud SQL instance, then run the script on your local machine.