Run Alembic migrations on Google App Engine - python

I have a Flask app that uses SQLAlchemy (Flask-SQLAlchemy) and Alembic (Flask-Migrate). The app runs on Google App Engine. I want to use Google Cloud SQL.
On my machine, I run python manage.py db upgrade to run my migrations against my local database. Since GAE does not allow arbitrary shell commands to be run, how do I run the migrations on it?

Whitelist your local machine's IP: https://console.cloud.google.com/sql/instances/INSTANCENAME/access-control/authorization?project=PROJECTNAME
Create an user: https://console.cloud.google.com/sql/instances/INSTANCENAME/access-control/users?project=PROJECTNAME
Assign an external IP address to the instance: https://console.cloud.google.com/sql/instances/INSTANCENAME/access-control/ip?project=PROJECTNAME
Use the following SQLAlchemy connection URI: SQLALCHEMY_DATABASE_URI = 'mysql://user:pw@ip:3306/DBNAME'
Remember to release the IP address later, as you are charged for every hour it sits unused.
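With those steps done, a minimal sketch of a local migration run could look like this (create_app is a hypothetical application factory that initializes Flask-Migrate; user, pw, ip, and DBNAME are the values from the steps above):
# migrate_remote.py -- run locally against the whitelisted Cloud SQL instance
import os
os.environ['SQLALCHEMY_DATABASE_URI'] = 'mysql://user:pw@ip:3306/DBNAME'

from flask_migrate import upgrade
from app import create_app  # hypothetical app factory that sets up Migrate(app, db)

app = create_app()
with app.app_context():
    upgrade()  # applies everything under ./migrations to the remote database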

It's all just code you can run, so you can create an admin endpoint with which to effect an upgrade:
@app.route('/admin/dbupgrade')
def dbupgrade():
    from flask_migrate import upgrade, Migrate
    migrate = Migrate(app, db)
    upgrade(directory=migrate.directory)
    return 'migrated'
(Dropwizard, for instance, caters nicely for such admin things via tasks)
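If you do expose such an endpoint, it is worth gating it behind a secret. A minimal sketch, assuming a MIGRATE_TOKEN value that you add to your config yourself:
@app.route('/admin/dbupgrade/<token>')
def dbupgrade_protected(token):
    from flask import abort
    from flask_migrate import upgrade, Migrate
    if token != app.config.get('MIGRATE_TOKEN'):  # hypothetical shared secret
        abort(403)
    migrate = Migrate(app, db)
    upgrade(directory=migrate.directory)
    return 'migrated'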

You can whitelist your local machine's IP address for the Google Cloud SQL instance, then run the migration script from your local machine.

Related

Django on Google App Engine: how to connect to cloud database to run makemigrations?

I have been following this tutorial to create a Django cloud app. I am stuck on the 'Run the app on your local computer' part of the tutorial. Before running the cloud_sql_proxy.exe command, I created a .env file and pasted its contents into Permissions on Google Cloud, so theoretically, after running set GOOGLE_CLOUD_PROJECT=PROJECT_ID, I could delete the .env file from the repository and it would still be recognized. For now, I have left it. Also, the env is activated correctly in the project dir: when I run gcloud sql instances describe INSTANCE_NAME there, it works and displays the database info.
Then I opened a new Cloud SDK shell and ran the command: cloud_sql_proxy.exe -instances="PROJECT_ID:REGION:INSTANCE_NAME"=tcp:5434.
The result is:
2021/11/08 17:11:11 Listening on 127.0.0.1:5434 for PROJECT_ID:REGION:INSTANCE_NAME
2021/11/08 17:11:11 Ready for new connections
2021/11/08 17:11:11 Generated RSA key in 116.9931ms
The reason it is 5434, and not 5432 or 5433, is that those ports were busy. I should add that I have also installed PostgreSQL locally and specified its connection information.
After running in env (Google SDK) respectively:
set GOOGLE_CLOUD_PROJECT=PROJECT_ID
set USE_CLOUD_SQL_AUTH_PROXY=true
python manage.py makemigrations
this error occurs:
C:\Users\User\Desktop\cloud\python-docs-samples\appengine\standard_python3\django\env\lib\site-packages\django\core\management\commands\makemigrations.py:105: RuntimeWarning: Got an error checking a consistent migration history performed for database connection 'default': FATAL: database "DATABASE_NAME" does not exist
warnings.warn(
No changes detected
I believe this is because I must somehow know which port to use locally instead of 5434 to connect to the cloud through the proxy. How do I find this and fix the issue?
PS: note that I replaced the real names with PROJECT_ID, REGION, INSTANCE_NAME, and DATABASE_NAME.
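For reference, a minimal sketch of the Django database settings the tutorial's proxy branch expects; the key points are that PORT must match the tcp: port passed to cloud_sql_proxy.exe (5434 here, not the default 5432), and NAME must be a database that already exists on the instance (the FATAL error above suggests it does not). USER and PASSWORD are placeholders in the same spirit as the question's:
# settings.py (sketch)
import os

if os.getenv('USE_CLOUD_SQL_AUTH_PROXY'):
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'HOST': '127.0.0.1',
            'PORT': '5434',  # must match the proxy's tcp: port
            'NAME': 'DATABASE_NAME',
            'USER': 'USER',
            'PASSWORD': 'PASSWORD',
        }
    }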

Running manage.py on Heroku for Flask app gives "could not connect"

I am trying to migrate my database on Heroku using heroku run python manage.py db migrate on my Flask app. But I am getting this error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not connect to server: Connection
refused
Is the server running on host "localhost" (127.0.0.1) and accepting
TCP/IP connections on port 5432?
Here is the code to my manage.py file:
import os
from flask_script import Manager
from flask_migrate import Migrate, MigrateCommand
from app import create_app, db
app = create_app()
with app.app_context():
    migrate = Migrate(app, db)
    manager = Manager(app)
    manager.add_command('db', MigrateCommand)

if __name__ == '__main__':
    manager.run()
Here is the configuration file to the database:
import os
class Config:
    SECRET_KEY = os.environ.get('SECRET_KEY')
    SQLALCHEMY_DATABASE_URI = os.environ.get('SQLALCHEMY_DATABASE_URI')
    MAIL_SERVER = 'smtp.googlemail.com'
    MAIL_PORT = 587
    MAIL_USE_TLS = True
    MAIL_USERNAME = os.environ.get('EMAIL_USER')
    MAIL_PASSWORD = os.environ.get('EMAIL_PASS')
I have set SQLALCHEMY_DATABASE_URI to a PostgreSQL database in this format: postgres://YourUserName:YourPassword@YourHost:5432/YourDatabase.
This error has been bugging me and I cannot find a solution anywhere.
Why isn't this working?
You can't connect your Heroku application to your local database without some major drawbacks. It's a huge pain to set up (you'll need to deal with port forwarding and likely at least one firewall, probably dynamic IP addresses, ...) and your application won't run if your laptop is off.
A better solution would be to have a database in the cloud, and given that you are already using Heroku and PostgreSQL the most natural choice is Heroku's own Postgres service. If you're depending on psycopg2, there's a good chance that one has already been provisioned for you.
If it has, you'll see a DATABASE_URL environment variable containing your connection string. Simply set your SQLAlchemy database URI from that:
SQLALCHEMY_DATABASE_URI = os.environ.get('DATABASE_URL')
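One caveat: SQLAlchemy 1.4 and later no longer accept the postgres:// scheme that Heroku uses in DATABASE_URL, so depending on your versions you may need to normalize it first:
uri = os.environ.get('DATABASE_URL', '')
if uri.startswith('postgres://'):
    uri = uri.replace('postgres://', 'postgresql://', 1)  # scheme SQLAlchemy 1.4+ accepts
SQLALCHEMY_DATABASE_URI = uri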
If a database hasn't been provisioned for you (check by running heroku addons) you can provision one using the free tier easily:
heroku addons:create heroku-postgresql:hobby-dev
Note that you shouldn't be running manage.py db migrate on Heroku at all. This generates migration files, which will be lost on Heroku's ephemeral filesystem. Generate migrations locally, and commit the migration files. You'll want to run manage.py db upgrade in both places, though.
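The workflow then looks something like this (assuming the manage.py from the question and a main branch on the Heroku remote):
python manage.py db migrate -m "describe change"  # locally: generate the migration file
git add migrations/
git commit -m "add migration"
git push heroku main
heroku run python manage.py db upgrade  # apply it on Heroku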

Alembic migrations - script persistence between two deployments

I have a problem running automated migrations with the Alembic library (I use the raw Alembic library).
This is the setup of the application:
I have a scheduler (a Python script which calculates something and then stores it in the database)
and a Flask REST API (which uses the data stored in the database by the scheduler to return an adequate response)
I then deploy the app with a script that runs these three commands:
alembic revision --autogenerate
alembic upgrade head
python run_scheduler.py
After the initial deployment, the alembic_version table is created in the PostgreSQL database with an identifier value under the version_num column, and a migration script (let's call it xx.py) is created in alembic/versions/.
When I redeploy the app (again running migrations and the scheduler), I get:
Can't locate revision identified by 'xxxxxxx'
Why?
Because there is no xx.py script anymore (the Docker image is built from the source control repository, which doesn't contain the autogenerated script), and xx is the value under the version_num column in the alembic_version table.
How should I approach and solve this problem?
The author's quick fix: drop the alembic_version table with the code below (inside the alembic/env.py script):
target_metadata = Base.metadata  # for context
with engine.connect() as conn:
    conn.execute(sqlalchemy.text('DROP TABLE IF EXISTS alembic_version'))
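A more durable fix than dropping the table is to stop autogenerating revisions at deploy time: run alembic revision --autogenerate once during development, commit the generated alembic/versions/xx.py to source control so the Docker build includes it, and let the deploy script run only:
alembic upgrade head
python run_scheduler.py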

How to setup psycopg2 with Google App Engine PostgreSQL database

I have an application that is being run on Google's App Engine and I want it to use the associated PostgreSQL database. I'm using psycopg2 to help me with my SQL queries. However I am not sure how I can set up the connection. This is what I currently have:
import psycopg2

con = psycopg2.connect(
    host=HOST_NAME,          # the IP address of the SQL database
    database=DATABASE_NAME,  # the database name (I'm using the default, "postgres")
    user=USER_NAME,          # the user name that I created
    password=PASSWORD        # the password for that user
)
However, when I try to make a request, creating this connection fails with the error psycopg2.OperationalError: could not connect to server: Connection timed out. Is there something I am missing?
It is a little bit tricky, but here is what worked for me. I will help you set up the Quickstart App Engine app with psycopg2, and after that you will get the idea.
Use the Quickstart for Python in the App Engine Flexible Environment documentation to set up and deploy your app.
Use the Connecting from App Engine documentation to connect to your App Engine app to the Cloud SQL Postgre SQL.
I made slight modifications in order to make it work:
In app.yaml add:
beta_settings:
  cloud_sql_instances: [INSTANCE_CONNECTION_NAME]=tcp:5432
# [INSTANCE_CONNECTION_NAME] = [PROJECT_NAME]:[INSTANCE_ZONE]:[INSTANCE_NAME]
# [INSTANCE_CONNECTION_NAME] can be found on the Cloud SQL instance page in the Google Cloud Console, under "Instance connection name".
In requirements.txt add:
psycopg2
psycopg2-binary
In main.py add:
@app.route('/connect')
def connect():
    try:
        # host='172.17.0.1' is the default IP of the Docker bridge created
        # during the App Engine deployment
        conn = psycopg2.connect("dbname='postgres' user='postgres' host='172.17.0.1' password='test'")
        return "Connection was established!"
    except Exception:
        return "I am unable to connect to the database"
Use the gcloud app deploy command to deploy your app.
After the deployment, use the gcloud app browse command to open the app in the browser.
When accessing the link https://[PROJECT_ID].appspot.com/connect
It should respond with Connection was established!
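For completeness: on App Engine standard (and as an alternative on flexible), the documented pattern is to connect over the Unix socket that App Engine mounts under /cloudsql rather than the Docker bridge IP. A sketch, with [INSTANCE_CONNECTION_NAME] the same placeholder as above:
conn = psycopg2.connect(
    dbname='postgres',
    user='postgres',
    password='test',
    host='/cloudsql/[INSTANCE_CONNECTION_NAME]'  # socket directory mounted by App Engine
)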

How to use Flask-Migrate with Google App Engine?

Since I moved to Google App Engine, I cannot run the Flask-Migrate command python manage.py db migrate because I get exceptions from GAE-related imports (No module named google.appengine.ext, for example).
Is there a way to run this, or an alternative, to upgrade my database on GAE?
Yes, there is a way to run it, though it's not as straightforward as you might like.
You need to configure your Google Cloud SQL instance, add yourself as an authorized user (by entering your IP address), and request an IPv4 address. Deal with SSL as appropriate.
Using a script:
Replace user, password, instance_id, db_name, and path below:
# migrate_prod.py
DB_MIGRATION_URI = "mysql+mysqldb://user:password@instance_id/db_name?ssl_key=path/client-key.pem&ssl_cert=path/client-cert.pem&ssl_ca=path/server-ca.pem"
from flask import Flask
from flask_sqlalchemy import SQLAlchemy  # flask.ext.* imports are long deprecated
from flask_script import Manager
from flask_migrate import Migrate, MigrateCommand
from models import *  # not needed if the migration file is already generated

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = DB_MIGRATION_URI
app.config["SQLALCHEMY_TRACK_MODIFICATIONS"] = False
db = SQLAlchemy(app)
migrate = Migrate(app, db)
manager = Manager(app)
manager.add_command('db', MigrateCommand)

if __name__ == '__main__':
    manager.run()
Run the script as you would to migrate locally: python migrate_prod.py db upgrade, assuming your migration file is already there.
Release the IPv4, so that you're not charged for it.
I give much credit to other answers: how to connect via SSL and run alembic migrations on GAE (of which this is probably a duplicate).
