Attach the same SQLAlchemy table to two models with different binds - python

I want to add two MySQL databases to my Flask app. Unfortunately, these databases are almost identical:
they have the same table and column names, but different data.
I am using SQLALCHEMY_BINDS in my config.py:
SQLALCHEMY_BINDS = {
    'old': 'mysql://[OLD_DB_HERE]',
    'new': 'mysql://[NEW_DB_HERE]'
}
And then in my models.py:
class CallOld(db.Model):
    __bind_key__ = 'old'
    __table__ = db.Model.metadata.tables['ConferenceCall2']

class CallNew(db.Model):
    __bind_key__ = 'new'
    __table__ = db.Model.metadata.tables['ConferenceCall2']
The problem is that when I query the two models I get the same results.
For example, both CallOld.query.with_entities(CallOld.TenantName.distinct()).all() and CallNew.query.with_entities(CallNew.TenantName.distinct()).all()
return the same rows.
Interestingly, the output always comes from the second of the two model classes. Apparently the second class (CallNew in this case) overwrites the first (CallOld).
How do I attach the same table definition to two models with different binds?

You should use a mixin for this:
A common need when using declarative is to share some functionality, such as a set of common columns...
The reason why the output always comes from the second (new) model's bound database is that when you manually define __table__ for the two models, Flask-SQLAlchemy's declarative metaclass works its black magic:
def __init__(self, name, bases, d):
    bind_key = d.pop('__bind_key__', None) or getattr(self, '__bind_key__', None)
    DeclarativeMeta.__init__(self, name, bases, d)
    if bind_key is not None and hasattr(self, '__table__'):
        self.__table__.info['bind_key'] = bind_key
As can be seen, __table__.info['bind_key'] is overwritten by each declarative class that the table is passed to. Since both models share the single Table object, the second assignment (from CallNew) wins.
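As a minimal sketch of the mixin approach (the primary key and column types below are assumptions, since the question doesn't show the full table definition): declare the shared columns once on a mixin, so each subclass copies them into its own Table object and each table keeps its own bind_key. Note this assumes each bind gets its own MetaData, as in Flask-SQLAlchemy 3.x; with a single shared MetaData, two tables with the same name would collide.
class ConferenceCallMixin(object):
    __tablename__ = 'ConferenceCall2'
    # Columns are assumptions for illustration; mixin columns are copied
    # into each subclass, so every model gets its own Table object.
    id = db.Column(db.Integer, primary_key=True)
    TenantName = db.Column(db.String(255))

class CallOld(ConferenceCallMixin, db.Model):
    __bind_key__ = 'old'

class CallNew(ConferenceCallMixin, db.Model):
    __bind_key__ = 'new'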

Related

Completely restart/reload declarative class with dynamic functionality in SQLAlchemy

I am using SQLAlchemy + SQLite3 for creating multiple databases based on user input. When initializing a new database, the user defines any number of arbitrary features and their types. I wrote a DBManager class to serve as an interface between user input and database creation/access.
Dynamically "injecting" these arbitrary features in the declarative model (the Features class) is working as expected. The problem I have is when the user wants to create a second/different database: I can't figure out how to completely "clear" or "refresh" the model or the declarative_base so that the user is able to create a new database (with possibly different features).
Below is a minimal reproducible example of my situation:
src.__init__.py:
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
Session = sessionmaker()
Base = declarative_base()
src.features.py:
from sqlalchemy import Column, ForeignKey, Integer
from sqlalchemy.orm import relationship
from src import Base

class Features(Base):
    __tablename__ = "features"
    features_id = Column(Integer, primary_key=True)

    @classmethod
    def add_feature(cls, feature_name, feature_type):
        setattr(cls, feature_name, Column(feature_type))
src.db_manager.py:
from typing import Optional, Dict
from sqlalchemy import create_engine
from src import Base, Session
from src.features import Features

class DBManager:
    def __init__(self, path: str, features: Optional[Dict] = None) -> None:
        self.engine = create_engine(f'sqlite:///{path}')
        Session.configure(bind=self.engine)
        self.session = Session()
        self.features = features
        if self.features:  # user passed in some arbitrary features
            self.bind_features_to_features_table()
        Base.metadata.create_all(bind=self.engine)

    def bind_features_to_features_table(self):
        for feature_name, feature_type in self.features.items():
            Features.add_feature(feature_name=feature_name, feature_type=feature_type)
I'd like to be able to do something like this:
from sqlalchemy import String, Float, Integer
from src.db_manager import DBManager

# User wants to create a database with these features
features = {
    'name': String,
    'height': Float,
}
db_manager = DBManager(path='my_database.db', features=features)

# ... User does some stuff with database here ...

# Now the user wants to create another database with these features
other_features = {
    'age': Integer,
    'weight': Float,
    'city_of_residence': String,
    'name': String,
}
db_manager = DBManager(path='another_database.db', features=other_features)
After executing the last line, I'm met with: InvalidRequestError: Implicitly combining column features.name with column features.name under attribute 'name'. Please configure one or more attributes for these same-named columns explicitly. The error wouldn't occur if the feature name did not appear on both databases, but then the feature height would be brought over to the second database, which is not desired.
Things I tried but didn't work:
call Base.metadata.clear() between DBManager instances: same error
call sqlalchemy.orm.clear_mappers() between DBManager instances: results in AttributeError: 'NoneType' object has no attribute 'instrument_attribute'
call delattr(Features, feature_name): results in NotImplementedError: Can't un-map individual mapped attributes on a mapped class..
This program will be running inside a GUI, so I can't really afford to exit/restart the script in order to connect to the second database. The user should be able to load/create different databases without having to close the program.
I understand that the error stems from the fact that the underlying Base object has not been "refreshed" and is still keeping track of the features created in my first DBManager instance. However I do not know how to fix this. What's worse, any attempt to overwrite/reload a new Base object will need to be applied to all modules that imported that object from __init__.py, which sounds tricky. Does anyone have a solution for this?
My solution was to define the Features declarative class inside a function, get_features, that takes a declarative base instance as an argument. The function returns the Features class object, so that every call essentially creates a new Features class as a whole.
The DBManager class is then responsible for calling that function, and Features becomes an instance attribute of DBManager. Creating a new instance of DBManager means creating an entirely new Features class, to which I can then add any arbitrary features I'd like.
The code looks something like this:
def get_features(base):
    class Features(base):
        __tablename__ = "features"
        features_id = Column(Integer, primary_key=True)

        @classmethod
        def add_feature(cls, feature_name, feature_type):
            setattr(cls, feature_name, Column(feature_type))
    return Features

class DBManager:
    def __init__(self, path, features=None):
        self.engine = create_engine(f'sqlite:///{path}')
        Session.configure(bind=self.engine)
        self.session = Session()
        self.features = features
        # Each manager gets its own declarative base, so models from a
        # previous instance can't leak into this one
        base = declarative_base()
        self.features_table = get_features(base)
        if self.features:  # user passed in some arbitrary features
            self.bind_features_to_features_table()
        base.metadata.create_all(bind=self.engine)

    def bind_features_to_features_table(self):
        for feature_name, feature_type in self.features.items():
            self.features_table.add_feature(feature_name=feature_name, feature_type=feature_type)
It definitely feels a bit convoluted, and I have no idea if there are any caveats I'm not aware of, but as far as I can tell this approach solved my problem.
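For what it's worth, a quick sketch of how the factory version behaves with the example from the question (same assumed feature dicts):
from sqlalchemy import String, Float, Integer
from src.db_manager import DBManager

# Each DBManager now builds its own declarative base and Features class,
# so reusing a feature name ('name') across databases no longer raises
# InvalidRequestError, and 'height' doesn't leak into the second database
db_manager = DBManager(path='my_database.db',
                       features={'name': String, 'height': Float})
db_manager = DBManager(path='another_database.db',
                       features={'name': String, 'age': Integer})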

SQLAlchemy - defining a foreign key relationship in a different database

I'm using SQLAlchemy declarative and Python 2.7 to read asset information from an existing database. The database uses a number of foreign keys for constant values. Many of the foreign keys exist on a different database.
How can I specify a foreign key relationship where the data exists on a separate database?
I've tried to use two separate Base classes, with the models inheriting from them separately.
I've also looked into specifying the primaryjoin keyword in relationship, but I've been unable to understand how it would be done in this case.
I think the problem is that I can only bind one engine to a session object. I can't see any way to ask sqlalchemy to use a different engine when making a query on a nested foreign key item.
OrgBase = declarative_base()
CommonBase = declarative_base()

class SomeClass:
    def __init__(self, sql_user, sql_pass, sql_host, org_db, common_host, common):
        self.engine = create_engine("{type}://{user}:{password}@{url}/{name}".format(type=db_type,
                                                                                     user=sql_user,
                                                                                     password=sql_pass,
                                                                                     url=sql_host,
                                                                                     name=org_db))
        self.engine_common = create_engine("{type}://{user}:{password}@{url}/{name}".format(type=db_type,
                                                                                            user=sql_user,
                                                                                            password=sql_pass,
                                                                                            url=common_host,
                                                                                            name="common"))
        self.session = sessionmaker(bind=self.engine)()
        OrgBase.metadata.bind = self.engine
        CommonBase.metadata.bind = self.engine_common
models.py:
class FrameRate(CommonBase):
    __tablename__ = 'content_frame_rates'
    __table_args__ = {'autoload': True}

class VideoAsset(OrgBase):
    __tablename__ = 'content_video_files'
    __table_args__ = {'autoload': True}

    frame_rate_id = Column(Integer, ForeignKey('content_frame_rates.frame_rate_id'))
    frame_rate = relationship(FrameRate, foreign_keys=[frame_rate_id])
Error with this code:
NoReferencedTableError: Foreign key associated with column 'content_video_files.frame_rate_id' could not find table 'content_frame_rates' with which to generate a foreign key to target column 'frame_rate_id'
If I run:
asset = self.session.query(self.VideoAsset).filter_by(uuid=asset_uuid).first()
My hope is that the VideoAsset model can nest frame_rate properly, finding the value in the separate database.
Thank you!
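The thread doesn't include an answer, but as a hedged sketch of one common workaround: drop the database-level ForeignKey (a constraint cannot span two engines), annotate the join condition with foreign(), and let a single Session route each base class to its own engine via the binds parameter. A lazy load of frame_rate then runs as a separate SELECT against the common engine. Names follow the question (engine and engine_common stand in for self.engine and self.engine_common); viewonly=True is an assumption to keep the un-enforced relationship read-only.
from sqlalchemy import Column, Integer
from sqlalchemy.orm import relationship, foreign, sessionmaker

class VideoAsset(OrgBase):
    __tablename__ = 'content_video_files'
    __table_args__ = {'autoload': True}

    # Plain column, no ForeignKey: the target table lives on another engine
    frame_rate_id = Column(Integer)
    frame_rate = relationship(
        FrameRate,
        primaryjoin=lambda: foreign(VideoAsset.frame_rate_id) == FrameRate.frame_rate_id,
        viewonly=True,
    )

# One session, two engines: queries against each base are routed to its bind
Session = sessionmaker(binds={OrgBase: engine, CommonBase: engine_common})
session = Session()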

How to filter database query by context information from request? [duplicate]

I have a model with a field is_deleted; now I want every query for this model to always filter by is_deleted=False, in addition to whatever filtering arguments are passed to .filter and .filter_by.
In Django, I would normally override the manager and add my own filtering but I need help for SQLAlchemy.
UPDATE:
I ended-up doing the following:
class CustomQuery(Query):
    def __new__(cls, *args, **kwargs):
        if args and hasattr(args[0][0], "is_deleted"):
            return Query(*args, **kwargs).filter_by(is_deleted=False)
        else:
            return object.__new__(cls)

session = scoped_session(sessionmaker(query_cls=CustomQuery))
It works, but if I have more fields later on I imagine I'll have to add more conditions; there must be a way to do this at the model level.
This is a very old question so I'm sure the OP solved their issue, but as it remains unanswered (in 2021) here's how we've approached applying a custom query class to all models:
Define the custom query as above
class CustomQuery(Query): ...
Then set this query class as the query_class attribute on your base model class:
class BaseModel(Model):
    __abstract__ = True
    query_class = CustomQuery
    ...
Then any models implementing BaseModel will obviously inherit this behaviour:
class MyModel(BaseModel):
    __tablename__ = 'my_model'
    ....
Note, in our case not all of the tables follow the soft delete pattern (we don't care about the history of every single table). Here, you could implement a separate BaseModel that uses the default query class.
class ImmutableBaseModel(Model):
    __abstract__ = True
    query_class = CustomQuery
    ...

class MutableBaseModel(Model):
    __abstract__ = True
If you find yourself here and you've not read it yet, check out the excellent blog post from Miguel Grinberg on implementing the soft delete pattern, and the accompanying repo.
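Pulling those fragments together, a minimal sketch of the wiring (assuming Flask-SQLAlchemy's model_class hook and the OP's CustomQuery as written; the model and column names are illustrative):
from flask_sqlalchemy import SQLAlchemy
from flask_sqlalchemy.model import Model
from sqlalchemy.orm import Query

class CustomQuery(Query):
    def __new__(cls, *args, **kwargs):
        # Same check as the OP's update: filter soft-deleted rows by default
        if args and hasattr(args[0][0], "is_deleted"):
            return Query(*args, **kwargs).filter_by(is_deleted=False)
        return object.__new__(cls)

class BaseModel(Model):
    __abstract__ = True
    query_class = CustomQuery

db = SQLAlchemy(model_class=BaseModel)

class MyModel(db.Model):
    __tablename__ = 'my_model'
    id = db.Column(db.Integer, primary_key=True)
    is_deleted = db.Column(db.Boolean, default=False, nullable=False)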
 

Change SqlAlchemy declarative model table schema at runtime

I am trying to build a declarative table that runs in both Postgres and SQLite. The only difference between the two is that the Postgres table is going to live within a specific schema and the SQLite one will not. So far I've gotten the tables to build without a schema with the code below.
metadata = MetaData()

class Base(object):
    __table_args__ = {'schema': None}

Base = declarative_base(cls=Base, metadata=metadata)

class Configuration(Base):
    """
    Object representation of a row in the configuration table
    """
    __tablename__ = 'configuration'
    name = Column(String(90), primary_key=True)
    value = Column(String(256))

    def __init__(self, name="", value=""):
        self.name = name
        self.value = value

def build_tables(conn_str, schema=None):
    global metadata
    engine = create_engine(conn_str, echo=True)
    if schema:
        metadata.schema = schema
    metadata.create_all(engine)
However, whenever I try to set a schema in build_tables(), the schema doesn't appear to be set in the newly built tables. It only seems to work if I set the schema initially at metadata = MetaData(schema='my_project') which I don't want to do until I know which database I will be running.
Is there another way to set the table schema dynamically using the declarative model? Is changing the metadata the wrong approach?
Although this is not 100% the answer to what you are looking for, I think @Ilja Everilä was right: the answer is partly in https://stackoverflow.com/a/9299021/3727050.
What I needed to do was to "copy" a model to a new declarative_base. As a result I faced a similar problem to yours. I needed to:
Change the base class of my model to the new Base
Change the autogenerated __table__ attribute of the model to point to the new metadata; otherwise I was getting a lot of errors when looking up PKs in that table
The solution that seems to be working for me is to clone the model the following way:
def rebase(klass, new_base):
    new_dict = {
        k: v
        for k, v in klass.__dict__.items()
        if not k.startswith("_") or k in {"__tablename__", "__table_args__"}
    }
    # Associate the new table with the new metadata instead
    # of the old/other pool
    new_dict["__table__"] = klass.__table__.to_metadata(new_base.metadata)
    # Construct and return a new type
    return type(klass.__name__, (new_base,), new_dict)
This in your case can be used as:
...
# Your old base
Base = declarative_base(cls=Base, metadata=metadata)

# New metadata and base
metadata2 = MetaData(schema="<new_schema>")
Base2 = declarative_base(cls=Base, metadata=metadata2)

# Register Model/Table in the new base and meta
NewConfiguration = rebase(Configuration, Base2)
metadata2.create_all(engine)
Notes/Warnings:
The above code is not tested
It looks too verbose and hacky to me; there has to be a better solution for what you need (maybe via Pool configs?)
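As a possibly simpler alternative (not part of the original answer): SQLAlchemy 1.1+ can rewrite schema names at execution time with schema_translate_map, so the declarative models can stay schema-less and the choice happens per engine. A sketch, assuming the metadata and conn_str from the question:
from sqlalchemy import create_engine

# Postgres: map "no schema" to the target schema at execution time
pg_engine = create_engine(
    conn_str,
    execution_options={"schema_translate_map": {None: "my_project"}},
)

# SQLite: no mapping needed, tables stay schema-less
sqlite_engine = create_engine("sqlite:///my_project.db")

metadata.create_all(pg_engine)  # DDL emitted by create_all is translated too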

Flask + SQLAlchemy - custom metaclass to modify column setters (dynamic hybrid_property)

I have an existing, working Flask app that uses SQLAlchemy. Several of the models/tables in this app have columns that store raw HTML, and I'd like to inject a function on a column's setter so that the incoming raw html gets 'cleansed'. I want to do this in the model so I don't have to sprinkle "clean this data" all through the form or route code.
I can currently already do this like so:
from application import db, clean_the_data
from sqlalchemy.ext.hybrid import hybrid_property

class Example(db.Model):
    __tablename__ = 'example'
    normal_column = db.Column(db.Integer,
                              primary_key=True,
                              autoincrement=True)
    _html_column = db.Column('html_column', db.Text,
                             nullable=False)

    @hybrid_property
    def html_column(self):
        return self._html_column

    @html_column.setter
    def html_column(self, value):
        self._html_column = clean_the_data(value)
This works like a charm: outside of the model definition the _html_column name is never seen, the cleaner function is called, and the cleaned data is used. Hooray.
I could of course stop there and just eat the ugly handling of the columns, but why do that when you can mess with metaclasses?
Note: the following all assumes that 'application' is the main Flask module, and that it contains two children: 'db' - the SQLAlchemy handle and 'clean_the_data', the function to clean up the incoming HTML.
So, I went about trying to make a new base Model class that spotted a column that needs cleaning when the class is being created, and juggled things around automatically, so that instead of the above code, you could do something like this:
from application import db

class Example(db.Model):
    __tablename__ = 'example'
    __html_columns__ = ['html_column']  # Our oh-so-subtle hint
    normal_column = db.Column(db.Integer,
                              primary_key=True,
                              autoincrement=True)
    html_column = db.Column(db.Text,
                            nullable=False)
Of course, the combination of trickery with metaclasses going on behind the scenes with SQLAlchemy and Flask made this less than straight-forward (and is also why the nearly matching question "Custom metaclass to create hybrid properties in SQLAlchemy" doesn't quite help - Flask gets in the way too). I've almost gotten there with the following in application/models/__init__.py:
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.ext.hybrid import hybrid_property

# Yes, I'm importing _X stuff...I tried other ways to avoid this
# but to no avail
from flask_sqlalchemy import (Model as BaseModel,
                              _BoundDeclarativeMeta,
                              _QueryProperty)

from application import db, clean_the_data

class _HTMLBoundDeclarativeMeta(_BoundDeclarativeMeta):
    def __new__(cls, name, bases, d):
        # Move any fields named in __html_columns__ to a
        # _field/field pair with a hybrid_property
        if '__html_columns__' in d:
            for field in d['__html_columns__']:
                if field not in d:
                    continue
                hidden = '_' + field
                fget = lambda self: getattr(self, hidden)
                fset = lambda self, value: setattr(self, hidden,
                                                   clean_the_data(value))
                d[hidden] = d[field]    # clobber...
                d[hidden].name = field  # So we don't have to explicitly
                                        # name the column. Should probably
                                        # force a quote on the name too
                d[field] = hybrid_property(fget, fset)
            del d['__html_columns__']   # Not needed any more
        return _BoundDeclarativeMeta.__new__(cls, name, bases, d)

# The following copied from how flask_sqlalchemy creates its Model
Model = declarative_base(cls=BaseModel, name='Model',
                         metaclass=_HTMLBoundDeclarativeMeta)
Model.query = _QueryProperty(db)

# Need to replace the original Model in flask_sqlalchemy, otherwise it
# uses the old one, while you use the new one, and tables aren't
# shared between them
db.Model = Model
Once that's set, your model class can look like:
from application import db
from application.models import Model

class Example(Model):  # Or db.Model really, since it's been replaced
    __tablename__ = 'example'
    __html_columns__ = ['html_column']  # Our oh-so-subtle hint
    normal_column = db.Column(db.Integer,
                              primary_key=True,
                              autoincrement=True)
    html_column = db.Column(db.Text,
                            nullable=False)
This almost works, in that there's no errors, data is read and saved correctly, etc. Except the setter for the hybrid_property is never called. The getter is (I've confirmed with print statements in both), but the setter is ignored totally and the cleaner function is thus never called. The data is set though - changes are made quite happily with the un-cleaned data.
Obviously I've not quite completely emulated the static version of the code in my dynamic version, but I honestly have no idea where the issue is. As far as I can see, the hybrid_property should be registering the setter just like it has the getter, but it's just not. In the static version, the setter is registered and used just fine.
Any ideas on how to get that final step working?
Maybe use a custom type ?
from sqlalchemy import TypeDecorator, Text

class CleanedHtml(TypeDecorator):
    impl = Text

    def process_bind_param(self, value, dialect):
        return clean_the_data(value)
Then you can just write your models this way:
class Example(db.Model):
    __tablename__ = 'example'
    normal_column = db.Column(db.Integer, primary_key=True, autoincrement=True)
    html_column = db.Column(CleanedHtml)
More explanations are available in the documentation here: http://docs.sqlalchemy.org/en/latest/core/custom_types.html#augmenting-existing-types
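One behavioural difference worth noting (my observation, not from the original answer): process_bind_param runs when the value is sent to the database, not when the attribute is assigned, so within a session the Python attribute still holds the raw HTML until a flush happens. A small usage sketch:
e = Example(html_column='<p onclick="evil()">hi</p>')
db.session.add(e)
print(e.html_column)   # still the raw value at this point
db.session.commit()    # clean_the_data() runs as the INSERT is emitted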
