Refresh the data within an SQLAlchemy table model quicker - Python

Due to complications with how Django and SQLAlchemy deal with JSON data, I've had to create a system that uses models from both of them as well as using standard SQLAlchemy.
The problem I'm having is that when I update information in a table via the table.update() method, there's a considerable delay before my SQLAlchemy table model picks up the change.
Is there a way to force the model to update?
My code is along these lines:
# Database connection (note: the URL needs '@' before the host and '/' before the database name)
engine = create_engine('mysql+pymysql://' + dbusername + ':' + dbuserpass + '@' + dbserver + '/' + dbname,
                       pool_recycle=3600, echo=False)
con = scoped_session(sessionmaker(autocommit=True, autoflush=False, bind=engine))
Session = sessionmaker(bind=engine)
sess = Session()
meta = MetaData(engine)
insp = inspect(engine)
Base = declarative_base()
con.close()
engine.dispose()
# SQLAlchemy table model
class ContactsTable(Base):
    __tablename__ = 'contacts_tbl'

    db_id = Column(Integer, primary_key=True)
    per_contact_id = Column(JSON)
    createdDateTime = Column(JSON)
    lastModifiedDateTime = Column(JSON)
    distributionLists = Column(JSON)
# There's a lot of code missing here, but you can see the basics of what I'm doing: adding data and then reading it back
def add_to_dist(contact, dist, tbl=ContactsTable.__table__):
    con.execute(tbl.update().values(distributionLists=dist).where(tbl.c.per_contact_id == contact))

def get_dist_members(name):
    data = sess.query(ContactsTable).filter(ContactsTable.distributionLists.contains(name)).all()
    return data
Everything works. It's just that the query data is out of date and seems to take anywhere up to 10 minutes to refresh. This is annoying as it's running through a web page that displays the data. It really needs to reflect the changes instantly.
If in fact I'm doing this whole thing incorrectly then feel free to school me!

Solved it by adding isolation_level="READ UNCOMMITTED" to the engine:
engine = create_engine('mysql+pymysql://' + dbusername + ':' + dbuserpass + '@' + dbserver + '/' + dbname,
                       pool_recycle=3600, echo=False, isolation_level="READ UNCOMMITTED")
Thanks to Ilja for pointing me in the right direction.
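If you would rather not read uncommitted data, another option (a minimal sketch, assuming the long-lived sess from the question is what serves the web page) is to end the session's current transaction and expire its cached state before each read, so MySQL's default REPEATABLE READ isolation starts a fresh snapshot that includes the committed changes:
# Sketch: make the ORM session see freshly committed rows without READ UNCOMMITTED.
# sess and ContactsTable are the objects defined in the question.
def get_dist_members(name):
    sess.rollback()      # end the current transaction so the next query gets a new snapshot
    sess.expire_all()    # mark any cached instances as stale
    return (sess.query(ContactsTable)
                .filter(ContactsTable.distributionLists.contains(name))
                .all())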

Related

SQLAlchemy in Flask: do I need to close the database session?

This is my first time writing a web application and using SQLAlchemy and I'm not sure I completely understand the concept of sessions. Currently I am loading a new session whenever the db needs to be queried. Is it sufficient to close it with sql_session.close() as I have done below?
Does not closing it cause many problems?
engine = create_engine('sqlite:///database.db', echo=True)
Base = declarative_base(engine)

class Kinases(Base):
    __tablename__ = 'Kinase'

    full_name = Column(String)
    uniprot_code = Column(String, primary_key=True)  # the mapping needs a primary key; .get() below looks rows up by it

def loadSession():
    metadata = Base.metadata
    Session = sessionmaker(bind=engine)
    session = Session()
    return session

@app.route("/search/kinases/<query>")
def kinase_results(query):
    sql_session = loadSession()
    kinase = sql_session.query(Kinases).get(query)
    if kinase is None:
        return redirect(url_for('user_message', query=query))
    name = kinase.full_name
    sql_session.close()
In most cases, creating a session in the scope of a view is a bad idea. Please read the session basics documentation for SQLAlchemy.
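A minimal sketch of the usual "session per request" setup for plain SQLAlchemy in Flask (not the author's code): one scoped_session shared by all views and removed automatically when each request's app context tears down.
from flask import Flask
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

app = Flask(__name__)
engine = create_engine('sqlite:///database.db', echo=True)
db_session = scoped_session(sessionmaker(bind=engine))

@app.teardown_appcontext
def remove_session(exception=None):
    # scoped_session.remove() closes the current session and returns its
    # connection to the pool, so views never have to call close() themselves.
    db_session.remove()

@app.route("/search/kinases/<query>")
def kinase_results(query):
    # Kinases is the mapped class from the question.
    kinase = db_session.query(Kinases).get(query)
    return kinase.full_name if kinase else "not found"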

Python SQLAlchemy: access huge DB data without creating models

I am using Flask (Python) and SQLAlchemy to connect to a huge DB where a lot of stats are saved. I need to create some useful insights from these stats, so I only need to read the data and never modify it.
The issue I have now is the following:
Before I can access a table I need to replicate the table in my models file. For example, I see the table Login_Data in the DB, so I go into my models and recreate the exact same table:
class Login_Data(Base):
    __tablename__ = 'login_data'

    id = Column(Integer, primary_key=True)
    date = Column(Date, nullable=False)
    new_users = Column(Integer, nullable=True)

    def __init__(self, date=None, new_users=None):
        self.date = date
        self.new_users = new_users

    def get(self, id):
        if self.id == id:
            return self
        else:
            return None

    def __repr__(self):
        return '<%s(%r, %r, %r)>' % (self.__class__.__name__, self.id, self.date, self.new_users)
I do this because otherwise I can't query it using:
some_data = Login_Data.query.limit(10)
But this feels unnecessary; there must be a better way. What's the point in recreating the models if they are already defined? What should I use here:
some_data = [SOMETHING HERE SO I DONT NEED TO RECREATE THE TABLE].query.limit(10)
Simple question but I have not found a solution yet.
Thanks to Tryph for the right sources.
To access the data of an existing DB with SQLAlchemy you need to use automap. In the configuration file where you load/declare your DB type, use automap_base(). After that you can create your models from the existing table names in the DB without specifying everything yourself:
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
from sqlalchemy import create_engine
import stats_config
Base = automap_base()
engine = create_engine(stats_config.DB_URI, convert_unicode=True)
# reflect the tables
Base.prepare(engine, reflect=True)
# mapped classes are now created with names by default
# matching that of the table name.
LoginData = Base.classes.login_data
db_session = Session(engine)
After this is done you can use all the usual SQLAlchemy functions on it:
some_data = db_session.query(LoginData).limit(10)
You may be interested in reflection and automap.
Unfortunately, since I have never used either of those features, I am not able to tell you more about them. I just know that they allow you to use the database schema without explicitly declaring it in Python.
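For completeness, a minimal sketch of the reflection route (assuming SQLAlchemy 1.4+; the table name login_data and stats_config.DB_URI are taken from the question): it loads the column definitions from the database into a Core Table object, with no model class at all.
from sqlalchemy import MetaData, Table, create_engine, select
import stats_config

engine = create_engine(stats_config.DB_URI)
metadata = MetaData()

# autoload_with makes SQLAlchemy read the table's columns from the database itself
login_data = Table('login_data', metadata, autoload_with=engine)

with engine.connect() as conn:
    some_data = conn.execute(select(login_data).limit(10)).fetchall()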

SQLAlchemy with Flask - deleted data is still shown

Description
I have a Flask application with plain SQLAlchemy. The application is intended to be used internally in a company for easier saving of measurement data in MySQL.
On one page I have a table with all devices used for measurement and a form that is used to add, remove or modify measurement devices.
Problem
The problem is that when I enter a new device into the database, the page is automatically refreshed to fetch new data from the DB, and the new device is sometimes shown and sometimes not when I refresh the page. In other words, the added row keeps appearing and disappearing from the table even though it is visible in the database. The same goes when I try to delete a device: the row is sometimes shown and sometimes not when refreshing the page, even though it has been deleted from the DB.
The same problem appears for all operations similar to this one (adding, deleting and modifying data).
What I have tried
Below is the code for the table model:
class DvDevice(Base):
    __tablename__ = "dvdevice"

    id = Column("device_id", Integer, primary_key=True, autoincrement=True)
    name = Column("device_name", String(50), nullable=True)
    code = Column("device_code", String(10), nullable=True, unique=True)
    hw_ver = Column("hw_ver", String(10), nullable=True)
    fw_ver = Column("fw_ver", String(10), nullable=True)
    sw_ver = Column("sw_ver", String(10), nullable=True)
And here is the code that inserts/deletes data from the table.
# Insertion
device = DvDevice()
device.code = self.device_code
device.name = self.device_name
device.hw_ver = self.hw_ver
device.fw_ver = self.fw_ver
device.sw_ver = self.sw_ver
ses.add(device)
ses.commit()
ses.expire_all()  # Should this be here?

# Deletion
ses.query(DvDevice).filter_by(id=self.device_id).delete()
ses.commit()
ses.expire_all()  # Should this be here?
I have read in some posts on Stack Overflow that I should include the following decorated function in models.py:
@app.teardown_appcontext
def shutdown_session(exception=None):
    ses.expire_all()  # ses being the database session object
I tried this and it still doesn't work as it should. Should I put the decorated function somewhere else?
The second thing I tried was to put ses.expire_all() after all commits, and it still doesn't work.
What should I do to prevent this from happening?
Edit 1
from sqlalchemy import create_engine, update
from sqlalchemy.orm import sessionmaker
from sqlalchemy.pool import NullPool
from config import MYSQLCONNECT
engine = create_engine(MYSQLCONNECT)
Session = sessionmaker(bind=engine)
session = Session()
I solved the problem by using the following context manager from http://docs.sqlalchemy.org/en/latest/orm/session_basics.html#when-do-i-construct-a-session-when-do-i-commit-it-and-when-do-i-close-it:
from contextlib import contextmanager

@contextmanager
def session_scope():
    """Provide a transactional scope around a series of operations."""
    session = Session()
    try:
        yield session
        session.commit()
    except:
        session.rollback()
        raise
    finally:
        session.close()

with session_scope() as session:
    ...  # code that uses the session
The problem was that I created the session object at the beginning and then never closed it.
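For illustration, the insert from the question could then look like this (a sketch reusing the question's names, not the original application code):
# Sketch: the insertion rewritten to use session_scope(); commit, rollback and
# close are handled by the context manager, so no stale session lingers.
with session_scope() as ses:
    device = DvDevice()
    device.code = self.device_code
    device.name = self.device_name
    device.hw_ver = self.hw_ver
    device.fw_ver = self.fw_ver
    device.sw_ver = self.sw_ver
    ses.add(device)
# The next request opens a fresh session and sees the committed row.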

Using multiple databases with a single SQLAlchemy model

I want to use multiple database engines with a single SQLAlchemy database model.
Following situation:
I have photo album software (Python), and the different albums are stored in different folders. In each folder is a separate SQLite database with additional information about the photos. I don't want to use a single global database, because this way I can simply move, delete and copy albums on a per-folder basis.
Opening a single album is fairly straightforward:
Creating a db session:
maker = sessionmaker(autoflush=True, autocommit=False,
                     extension=ZopeTransactionExtension())
DBSession = scoped_session(maker)
Base class and metadata for db model:
DeclarativeBase = declarative_base()
metadata = DeclarativeBase.metadata
Defining database model (shortened):
pic_tag_table = Table('pic_tag', metadata,
                      Column('pic_id', Integer,
                             ForeignKey('pic.pic_id'),
                             primary_key=True),
                      Column('tag_id', Integer,
                             ForeignKey('tag.tag_id'),
                             primary_key=True))

class Picture(DeclarativeBase):
    __tablename__ = 'pic'

    pic_id = Column(Integer, autoincrement=True, primary_key=True)
    ...

class Tags(DeclarativeBase):
    __tablename__ = 'tag'

    tag_id = Column(Integer, autoincrement=True, primary_key=True)
    ...
    pictures = relation('Picture', secondary=pic_tag_table, backref='tags')
And finally open the connection:
engine = engine_from_config(config, '...')
DBSession.configure(bind=engine)
metadata.bind = engine
This works well for opening one album. Now I want to open multiple albums (and DB connections) at the same time. Every album has the same database model, so my hope is that I can reuse it. My problem is that the model class definition inherits from the declarative base, which is connected to the metadata and the database engine. I want to connect the classes to different metadata with different engines. Is this possible?
P.S.: I also want to query the databases via the ORM, e.g. DBSession.query(Picture).all() (or DBSession[0], ... for multiple sessions on different databases), so not one query for all pictures in all databases, but one ORM-style query against one database.
You can achieve this with multiple engines and sessions (you don't need multiple metadata):
engine1 = create_engine("sqlite:///tmp1.db")
engine2 = create_engine("sqlite:///tmp2.db")
Base.metadata.create_all(bind=engine1)
Base.metadata.create_all(bind=engine2)
session1 = Session(bind=engine1)
session2 = Session(bind=engine2)
print(session1.query(Picture).all()) # []
print(session2.query(Picture).all()) # []
session1.add(Picture())
session1.commit()
print(session1.query(Picture).all()) # [Picture]
print(session2.query(Picture).all()) # []
session2.add(Picture())
session2.commit()
print(session1.query(Picture).all()) # [Picture]
print(session2.query(Picture).all()) # [Picture]
session1.close()
session2.close()
For scoped_session, you can create multiple of those as well.
engine1 = create_engine("sqlite:///tmp1.db")
engine2 = create_engine("sqlite:///tmp2.db")
Base.metadata.create_all(bind=engine1)
Base.metadata.create_all(bind=engine2)
Session1 = scoped_session(sessionmaker(bind=engine1))
Session2 = scoped_session(sessionmaker(bind=engine2))
session1 = Session1()
session2 = Session2()
...
If you have a variable number of databases you need to have open, scoped_session might be a little cumbersome. You'll need some way to keep track of them.
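One possible way to keep track of them (a sketch, not part of the original answer) is a small registry of session factories keyed by album folder; the album.db file name is only illustrative:
import os
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

_session_factories = {}

def session_for_album(folder):
    """Return a new Session bound to the SQLite database inside the given album folder."""
    factory = _session_factories.get(folder)
    if factory is None:
        engine = create_engine("sqlite:///" + os.path.join(folder, "album.db"))
        Base.metadata.create_all(bind=engine)  # Base/Picture are the declarative model from above
        factory = sessionmaker(bind=engine)
        _session_factories[folder] = factory
    return factory()

# Usage:
# session = session_for_album("/photos/holiday_2017")
# pictures = session.query(Picture).all()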

Raw MySQL with SQLAlchemy using Pyramid framework

I have recently decided to start using Pyramid (a Python web framework) for my projects from now on.
I have also decided to use SQLAlchemy, and I want to use raw MySQL (for personal reasons) but still keep the ORM features.
The first part of the code in models.py reads:
DBSession = scoped_session(sessionmaker(extension=ZopeTransactionExtension()))
Base = declarative_base()
Now, from here, how do I execute a CREATE TABLE query using raw MySQL?
The traditional SQLAlchemy way would be:
class Page(Base):
    __tablename__ = 'pages'

    id = Column(Integer, primary_key=True)
    name = Column(Text, unique=True)
    data = Column(Text)

    def __init__(self, name, data):
        self.name = name
        self.data = data

# What I want instead is to run the raw SQL directly, something like:
DBSession.execute('CREATE TABLE ....')
Have a look at sqlalchemy.text() for parametrized queries.
My own biased suggestion would be to use http://pypi.python.org/pypi/khufu_sqlalchemy to set up the SQLAlchemy engine.
Then inside a pyramid view you can do something like:
from khufu_sqlalchemy import dbsession
db = dbsession(request)
db.execute("select * from table where id=:id", {'id':7})
Inside views.py, if you are adding form elements, first create an object of the model.
In your snippet, do it as
pg = Page(name, data)
and add it with
DBSession.add(pg)
for the form elements you want to store, e.g. name and data from your snippet.
The final code would be similar to:
name = request.params['name']
data = request.params['data']
pg = Page(name, data)
DBSession.add(pg)
