I'm using the Flask-SQLAlchemy extension:
https://pythonhosted.org/Flask-SQLAlchemy/index.html
and I want to set up the engine with a custom configuration, using the parameters described here:
http://docs.sqlalchemy.org/en/rel_0_9/core/engines.html
I'm setting things up the way the Flask-SQLAlchemy documentation describes:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:////tmp/test.db'
db = SQLAlchemy(app)

class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(80), unique=True)
    email = db.Column(db.String(120), unique=True)

    def __init__(self, username, email):
        self.username = username
        self.email = email

    def __repr__(self):
        return '<User %r>' % self.username
EDIT:
How can I pass engine parameters such as isolation_level='AUTOCOMMIT', encoding='latin1', echo=True with this kind of configuration? The SQLAlchemy() constructor doesn't take them as arguments.
It's an open issue: https://github.com/mitsuhiko/flask-sqlalchemy/issues/166
You can try this:

class SQLiteAlchemy(SQLAlchemy):
    def apply_driver_hacks(self, app, info, options):
        options.update({
            'isolation_level': 'AUTOCOMMIT',
            'encoding': 'latin1',
            'echo': True
        })
        super(SQLiteAlchemy, self).apply_driver_hacks(app, info, options)

db = SQLiteAlchemy(app)
It's just a config option (SQLALCHEMY_ENGINE_OPTIONS, supported since Flask-SQLAlchemy 2.4). Here's ours:
SQLALCHEMY_ENGINE_OPTIONS = {
    "pool_pre_ping": True,
    "pool_recycle": 300,
}
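On versions that support it, the engine keywords from the original question can go through the same key; a hedged sketch (note that the encoding argument was removed in SQLAlchemy 2.0, so it only applies to 1.x engines):

```python
# Forwarded verbatim to create_engine() by Flask-SQLAlchemy 2.4+.
SQLALCHEMY_ENGINE_OPTIONS = {
    "isolation_level": "AUTOCOMMIT",
    "echo": True,            # log all generated SQL
    # "encoding": "latin1",  # SQLAlchemy 1.x only; removed in 2.0
}
```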
I set {'pool_pre_ping': True} as above, but got:
TypeError: Invalid argument(s) 'pool_pre_ping' sent to create_engine(), using configuration MySQLDialect_pymysql/QueuePool/Engine. Please check that the keyword arguments are appropriate for this combination of components.
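That TypeError is the usual symptom of an old SQLAlchemy: pool_pre_ping was only added in SQLAlchemy 1.2. A small sketch (the helper name is mine) that gates the option on the installed version, which you could pass as sqlalchemy.__version__:

```python
def engine_options_for(sqlalchemy_version):
    """Build an engine-options dict, dropping keywords that the
    installed SQLAlchemy is too old to accept (pool_pre_ping
    needs SQLAlchemy >= 1.2)."""
    major, minor = (int(p) for p in sqlalchemy_version.split(".")[:2])
    options = {"pool_recycle": 300}
    if (major, minor) >= (1, 2):
        options["pool_pre_ping"] = True
    return options
```

Upgrading SQLAlchemy is the real fix; the gate just keeps the app importable on older installs.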
You can define different engine options for different binds by overriding apply_driver_hacks and defining the options for each of your databases. For example, if you want to use different pool classes for different databases:
app.config['SQLALCHEMY_DATABASE_URI'] = "monetdb://..//.."
app.config['SQLALCHEMY_BINDS'] = {
    'pg': 'postgres+psycopg2://..//..'
}
app.config['POOL_CLASS'] = {'monetdb': StaticPool, 'postgres+psycopg2': QueuePool}
class MySQLAlchemy(SQLAlchemy):
    def apply_driver_hacks(self, app, info, options):
        super().apply_driver_hacks(app, info, options)
        try:
            options['poolclass'] = app.config['POOL_CLASS'][info.drivername]
        except KeyError:
            # No pool class defined for this driver: fall back to the default
            pass

db = MySQLAlchemy(app)
I need help with an Enum field type: it is not accepted by Swagger and I am getting the error message **TypeError: Object or Type eGameLevel is not JSON serializable**. Below is the complete code for the table, along with the DB setup and SQLAlchemy settings. I already tried the marshmallow-enum Flask package and it didn't work. I'm looking for help with some explanation of the solution so I can learn from it. :-)
I am using MySQL with Flask. In Postgres it's pretty easy to manage choice fields. All I need is a working example or a link to a repository where MySQL choice fields show up in a Swagger dropdown.
My Model:
import enum
from typing import List

from app import db

class eGameLevel(enum.Enum):
    BEGINNER = 'Beginner'
    ADVANCED = 'Advanced'

class Game(Base):
    __tablename__ = 'game_stage'
    id = db.Column(db.Integer(), primary_key=True)
    game_level = db.Column(db.Enum(eGameLevel), default=eGameLevel.BEGINNER, nullable=False)
    user_id = db.Column(db.Integer(), db.ForeignKey('users.id', ondelete='CASCADE'), nullable=False)
    user = db.relationship('User', backref='game__level_submissions', lazy=True)

    def __init__(self, game_level, user_id):
        self.game_level = game_level
        self.user_id = user_id

    def __repr__(self):
        return 'Game(game_level=%s, user_id=%s)' % (self.game_level, self.user_id)

    def json(self):
        return {'game_level': self.game_level, 'user_id': self.user_id}

    @classmethod
    def by_game_id(cls, _id):
        return cls.query.filter_by(id=_id)

    @classmethod
    def find_by_game_level(cls, game_level):
        return cls.query.filter_by(game_level=game_level)

    @classmethod
    def by_user_id(cls, _user_id):
        return cls.query.filter_by(user_id=_user_id)

    @classmethod
    def find_all(cls) -> List["Game"]:
        return cls.query.all()

    def save_to_db(self) -> None:
        db.session.add(self)
        db.session.commit()

    def delete_from_db(self) -> None:
        db.session.delete(self)
        db.session.commit()
My Schema:
from app import ma
from app.models import Gode

class GameSchema(ma.SQLAlchemyAutoSchema):
    game = ma.Nested('GameSchema', many=True)

    class Meta:
        model = Game
        load_instance = True
        include_fk = True
My Resources:
from flask_restx import Resource, fields, Namespace

from app.models import Game
from app import db
from app.schemas import GameSchema

GAME_REQUEST_NOT_FOUND = "Game request not found."
GAME_REQUEST_ALREADY_EXSISTS = "Game request '{}' Already exists."

game_ns = Namespace('Game', description='Available Game Requests')
games_ns = Namespace('Game Requests', description='All Games Requests')

game_schema = GameSchema()
games_list_schema = GameSchema(many=True)

gamerequest = game_ns.model('Game', {
    'game_level': fields.String('Game Level: Must be one of: BEGINNER, ADVANCED.'),
    'user_id': fields.Integer,
})

class GameRequestsListAPI(Resource):
    @games_ns.doc('Get all Game requests.')
    def get(self):
        return games_list_schema.dump(Game.find_all()), 200

    @games_ns.expect(gamerequest)
    @games_ns.doc("Create a Game request.")
    def post(self):
        game_json = request.get_json()
        game_data = game_schema.load(game_json)
        game_data.save_to_db()
        return game_schema.dump(game_data), 201
Instead of trying to manage Enum fields in a MySQL schema, I suggest using another table with a backref in place of eGameLevel. You get rid of this whole fuss, and if you need to add another option to your choice field in the future, you won't have to hardcode it.
Simply create a main table Game and a sub-table eGameLevel (with only one string field). You will be able to access the choices from your Game table.
Whenever I get stuck I go back to basics, as described here.
I made a small example just to test the serialization of an Enum
from enum import Enum

import sqlalchemy as sa
from flask import Flask
from flask_restx import Api, Namespace, Resource
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import scoped_session, sessionmaker

class eGameLevel(str, Enum):
    BEGINNER = "Beginner"
    ADVANCED = "Advanced"

engine = sa.create_engine("sqlite:///:memory:")
session = scoped_session(sessionmaker(bind=engine))
Base = declarative_base()

class Game(Base):
    __tablename__ = "game"
    id = sa.Column(sa.Integer, primary_key=True, autoincrement=True)
    level = sa.Column(sa.Enum(eGameLevel), default=eGameLevel.BEGINNER, nullable=False)

    def __repr__(self):
        return f"Game(id={self.id}, level={self.level})"

    def json(self):
        return {"id": self.id, "level": self.level}

Base.metadata.create_all(engine)

g1 = Game(level=eGameLevel.BEGINNER)
g2 = Game(level=eGameLevel.ADVANCED)
session.add_all([g1, g2])
session.commit()

query_content = session.query(Game).all()

games_ns = Namespace("Game Requests", description="All Games Requests")

app = Flask(__name__)
api = Api(app)

@api.route("/game")
class GameRequestsListAPI(Resource):
    @games_ns.doc("Get all Game requests.")
    def get(self):
        data = [x.json() for x in query_content]
        return data, 200

app.run(debug=True)
This example works and I think the serialization is possible due to the str in the Enum declaration: class eGameLevel(str, Enum).
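The effect of the str mixin can be checked with the standard json module alone; a plain Enum member raises TypeError, while the str-mixin member serializes as its string value (class names here are mine, for illustration):

```python
import json
from enum import Enum

class PlainLevel(Enum):
    BEGINNER = "Beginner"

class StrLevel(str, Enum):
    BEGINNER = "Beginner"

# A plain Enum member is rejected by the json module...
try:
    json.dumps({"level": PlainLevel.BEGINNER})
    plain_fails = False
except TypeError:
    plain_fails = True

# ...while the str-mixin member serializes as its string value.
serialized = json.dumps({"level": StrLevel.BEGINNER})
```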
Instead of using an Enum:

class eGameLevel(enum.Enum):
    BEGINNER = 'Beginner'
    ADVANCED = 'Advanced'

you can make use of a dictionary:

eGameLevel = {"BEGINNER": 1, "ADVANCED": 2}

Then you can change the column in the SQL data model from the Enum type to an Integer:

game_level = db.Column(db.Integer(),
                       default=eGameLevel["BEGINNER"], nullable=False)

And make the appropriate checks using the defined dictionary throughout the application. This also resolves issues with Alembic when making DB migrations.
You would also need to modify some of your Python files. I'll do it here directly, and you can look these over to modify yours:
#Import at Resources
from flask import request
from app.models import Game, eGameLevel
Post Part:
# For post part
payload = request.json
game_obj = Game(game_level=eGameLevel[payload["game_level"]], user_id=payload["user_id"])
db.session.add(game_obj)
db.session.commit()
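The "appropriate checks" can live in one small helper; a sketch (the function name is mine) that validates the incoming string before it touches the database:

```python
# The choice mapping suggested above.
eGameLevel = {"BEGINNER": 1, "ADVANCED": 2}

def parse_game_level(raw):
    """Map an incoming string onto the stored integer,
    rejecting anything outside the defined choices."""
    try:
        return eGameLevel[raw]
    except KeyError:
        raise ValueError("game_level must be one of %s" % sorted(eGameLevel))
```

In the POST handler you would then call parse_game_level(payload["game_level"]) instead of indexing the dict directly, so a bad value becomes a clean 400 instead of a KeyError.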
Furthermore, I did not understand what the from app.models import Gode meant.
I have a Flask-RESTful app connected to a MySQL database, and I am using SQLAlchemy. We can connect to the MySQL server using the following:
app.config['SQLALCHEMY_DATABASE_URI'] = f"mysql+pymysql://root:password@127.0.0.1:3306"
I am working on a use case where the database name will be provided on a real-time basis through a GET request. Based on the database name provided, the app will connect to the respective database and perform the operations. For this purpose, I would like a way to tell the Flask app to talk to the provided database (the app is already connected to the MySQL server). Currently, I am creating the connection again in the API class.
API: Calculate.py

from flask_restful import Resource, reqparse
from app import app

class Calculate(Resource):
    def get(self):
        parser = reqparse.RequestParser()
        parser.add_argument('schema', type=str, required=True, help='Provide schema name.')
        args = parser.parse_args()
        session['schema_name'] = args.get('schema')
        app.config['SQLALCHEMY_DATABASE_URI'] = f"mysql+pymysql://root:password@127.0.0.1:3306/{session['schema_name']}"
        from db_models.User import User
        ...
DB Model: User.py

from flask import session
from flask_sqlalchemy import SQLAlchemy
from app import app

db = SQLAlchemy(app)

class User(db.Model):
    __tablename__ = 'user'
    __table_args__ = {"schema": session['schema_name']}
    User_ID = db.Column(db.Integer, primary_key=True)
    Name = db.Column(db.String(50))

db.create_all()
The above thing works for me. But I would want to understand if there is an alternative to this or a better way of doing this.
Edit: The above code does not work. It references the first schema name that was provided even if I provide a new schema name in the same running instance of the app.
You can write the SQLAlchemy path like this:
SQLALCHEMY_DATABASE_URI = 'mysql+pymysql://root:password@localhost:3306/database name'
According to the docs, not all values can be updated after initialization (first paragraph). In your use case you should use the SQLALCHEMY_BINDS variable (a dict) and create a Model for each schema. Example:
Db Model:

from flask import Flask
from flask_sqlalchemy import SQLAlchemy

SQLALCHEMY_DATABASE_URI = f"mysql+pymysql://root:password@127.0.0.1:3306/schema_name1"
SQLALCHEMY_BINDS = {
    'db1': SQLALCHEMY_DATABASE_URI,  # default
    'db2': f"mysql+pymysql://root:password@127.0.0.1:3306/schema_name2"
}

app = Flask(__name__)
db = SQLAlchemy(app)
Then create a model for each schema:

class UserModeldb1(db.Model):
    __tablename__ = 'user'
    __bind_key__ = 'db1'  # set according to the database
    id = db.Column(db.Integer, primary_key=True)
    ...

class UserModeldb2(db.Model):
    __tablename__ = 'user'
    __bind_key__ = 'db2'
    id = db.Column(db.Integer, primary_key=True)
    ...

Finally, in your get method, add some logic to capture the schema and use the corresponding model. You should look at this question, it is really helpful: Configuring Flask-SQLAlchemy to use multiple databases with Flask-Restless
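The "logic to capture the schema" can be a plain lookup; a sketch (the mapping and function names are mine) that resolves a request-supplied schema name to one of the configured bind keys:

```python
# Hypothetical mapping from request-supplied schema names to the
# SQLALCHEMY_BINDS keys configured above.
SCHEMA_BINDS = {"schema_name1": "db1", "schema_name2": "db2"}

def bind_key_for(schema):
    """Resolve a schema name from the GET request to a configured
    bind key, failing loudly for schemas that were never configured."""
    try:
        return SCHEMA_BINDS[schema]
    except KeyError:
        raise ValueError("unknown schema: %r" % schema)
```

The get method would then branch on the returned key ('db1' → UserModeldb1, 'db2' → UserModeldb2) instead of rewriting SQLALCHEMY_DATABASE_URI at request time.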
I am using SQLAlchemy and I have the following code:
Model:

class User(db.Model):
    __tablename__ = 'user'
    __table_args__ = {'schema': 'task', 'useexisting': True}
    id = Column(Integer, primary_key=True, autoincrement=True)
    firstname = Column(String)
.env:

SQLALCHEMY_DATABASE_URI = os.getenv('SQLALCHEMY_DATABASE_URI')

app.py:

def create_app(config_file):
    """Create a Flask application using the app factory pattern."""
    app = Flask(__name__)

    # Load configuration.
    app.config.from_pyfile(config_file)

    # Init app extensions.
    from .extensions import db
    db.init_app(app)
This creates the SQLite file if it does not exist, but not the tables of each model.
The question is: what can I do in order to create the tables for each model?
Just add:
db.create_all()
in app.py at the end of create_app().
create_all() creates the tables only when they don't exist; it does not change tables that were created before.
If you want to create the database and the tables from the command line, you can just type:

python
from app import db
db.create_all()
exit()
The working example:

from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.secret_key = "Secret key"
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///my_database.sqlite3"
app.config["SQLALCHEMY_TRACK_MODIFICATIONS"] = False
db = SQLAlchemy(app)

class Data(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(50))
    email = db.Column(db.String(50))
    phone = db.Column(db.String(50))

db.create_all()

# add a row
# comment out after the 1st run
table_row = Data(name="My Name", email="myemail@mail.com", phone="123456")
db.session.add(table_row)
db.session.commit()
print "A row was added to the table"

# read the data
row = Data.query.filter_by(name="My Name").first()
print "Found:", row.email, row.phone

if __name__ == "__main__":
    app.run(debug=True)
This is for Python 2.7; to run it with Python 3.x, just change the print statements to calls to the print() function.
NOTE:
When using the automatically generated model constructor, the arguments passed to it must be keyword arguments, or there will be an error. Otherwise, you can override __init__() inside the Data class like this:

def __init__(self, name, email, phone, **kwargs):
    super(Data, self).__init__(**kwargs)
    self.name = name
    self.email = email
    self.phone = phone

In that case you don't have to use keyword arguments.
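The constraint can be illustrated without Flask at all; a minimal sketch (class names are mine) where the base class, like the generated model constructor, accepts only keyword arguments, and the subclass override restores positional ones:

```python
class KeywordOnlyBase(object):
    """Stand-in for the constructor Flask-SQLAlchemy generates:
    it only understands keyword arguments."""
    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            setattr(self, key, value)

class Data(KeywordOnlyBase):
    # Overriding __init__ restores positional arguments.
    def __init__(self, name, email, phone, **kwargs):
        super(Data, self).__init__(**kwargs)
        self.name = name
        self.email = email
        self.phone = phone

# Positional arguments now work on the subclass...
row = Data("My Name", "myemail@mail.com", "123456")
```

...while passing a positional argument to the base class raises TypeError, which is the error the note above refers to.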
You first need to use a shell context processor to load all your Model objects automatically.
In app.py add:

# import all models from all blueprints you have
from .users.models import User

@app.shell_context_processor
def make_shell_context():
    return {'db': db, 'User': User, ...}

and then use the Flask shell command:

(venv) $ flask shell
>>> db
<SQLAlchemy engine=sqlite:///data-dev.sqlite>  # something similar to that
>>>
>>> User
<class 'api.users.models.User'>
>>>
>>> # to create the database (if it does not exist) and all tables, run the command below
>>> db.create_all()
Maybe you'll need Flask-Migrate for advanced operations (migrations) on your database: creating new tables, updating tables/fields, and so on.
I'm trying a tutorial to make a database connection with Flask and a PostgreSQL database, using JSON.
These are the code lines in models.py:

from app import db
from sqlalchemy.dialects.postgresql import JSON

class Result(db.Model):
    __tablename__ = 'results'
    id = db.Column(db.Integer, primary_key=True)
    url = db.Column(db.String())
    result_all = db.Column(JSON)
    result_no_stop_words = db.Column(JSON)

    def __init__(self, url, result_all, result_no_stop_words):
        self.url = url
        self.result_all = result_all
        self.result_no_stop_words = result_no_stop_words

    def __repr__(self):
        return '<id {}>'.format(self.id)
Code in config.py:

import os

basedir = os.path.abspath(os.path.dirname(__file__))

class Config(object):
    DEBUG = False
    TESTING = False
    CSRF_ENABLED = True
    SECRET_KEY = 'this-really-needs-to-be-changed'
    SQLALCHEMY_DATABASE_URI = 'postgresql://postgresql:bat123@localhost/DatabaseFirst'

class ProductionConfig(Config):
    DEBUG = False

class StagingConfig(Config):
    DEVELOPMENT = True
    DEBUG = True

class DevelopmentConfig(Config):
    DEVELOPMENT = True
    DEBUG = True

class TestingConfig(Config):
    TESTING = True
Code in manage.py:

import os

from flask_script import Manager
from flask_migrate import Migrate, MigrateCommand
from app import app, db

app.config.from_object(os.environ['APP_SETTINGS'])

migrate = Migrate(app, db)
manager = Manager(app)
manager.add_command('db', MigrateCommand)

if __name__ == '__main__':
    manager.run()
Code in app.py:

from flask import Flask
from flask_sqlalchemy import SQLAlchemy
import os

app = Flask(__name__)
app.config.from_object(os.environ['APP_SETTINGS'])
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
db = SQLAlchemy(app)

from models import Result

@app.route('/')
def hello():
    return "Hello World!"

@app.route('/<name>')
def hello_name(name):
    return "Hello {}!".format(name)

if __name__ == "__main__":
    app.run()
I want to know: before running these code lines, should the database be created in PostgreSQL along with the table and columns? Or do these code lines create the table and columns in PostgreSQL?

class Result(db.Model):
    __tablename__ = 'results'
    id = db.Column(db.Integer, primary_key=True)
    url = db.Column(db.String())
    result_all = db.Column(JSON)
    result_no_stop_words = db.Column(JSON)

Basically, I want to know the function or purpose served by the above code lines.
manager.add_command('db', MigrateCommand) registers a command called db, so you can run migration commands such as flask db migrate and flask db upgrade, which create the tables and columns.
Note: in order to use this command you first need to define FLASK_APP in the environment variables.
Eg:
export FLASK_APP=app.py
flask db migrate
flask db upgrade
Also, the model:

class Result(db.Model):
    __tablename__ = 'results'
    id = db.Column(db.Integer, primary_key=True)
    url = db.Column(db.String())
    result_all = db.Column(JSON)
    result_no_stop_words = db.Column(JSON)

defines the class representation of the table. It won't create the table in the database; it's just the representation. The MigrateCommand is responsible for creating the tables in the database.
class Result(db.Model):
    __tablename__ = 'results'
    id = db.Column(db.Integer, primary_key=True)

class Result(db.Model):
This line defines the model class Result in the Flask application, which is used to pass those values to the database (PostgreSQL, or whatever database you are using).
__tablename__ = 'results':
Here we name the table results in the database (in my case, DatabaseFirst).
id = db.Column(db.Integer, primary_key=True):
Here we create a column called id in the results table, which can hold only integer data; the id column is the primary key of the results table.
Through the three code lines mentioned above, the database tables and columns are created via the Flask application (once db.create_all() or a migration runs), and we can see the respective results in the PostgreSQL database.
Here are the sqlamp docs.
I don't understand how I can connect sqlamp with a predefined Flask-SQLAlchemy session. The docs say something like:
from history_meta import VersionedMeta, VersionedListener

app = Flask(__name__)
db = SQLAlchemy(app, session_extensions=[VersionedListener()])

class User(db.Model):
    __metaclass__ = VersionedMeta
    username = db.Column(db.String(80), unique=True)
    pw_hash = db.Column(db.String(80))
but there is no session_extensions argument in the latest version of Flask-SQLAlchemy. Maybe I have to use session_options instead, but it is unclear how to use it.
Slightly different syntax now:
db = SQLAlchemy(session_options={'extension': [VersionExtension()]})