Sqlalchemy class is not iterable / converting relationship to a set - python

I'm trying to convert the one-to-many part of a many-to-many relationship to a Python set(). See the following toy example for code:
message_tos = Table('message_tos', Base.metadata,
    Column('message_id', Integer, ForeignKey('message.id')),
    Column('address_id', Integer, ForeignKey('address.id'))
)

class Address(Base):
    address = Column(String, nullable=False, unique=True)
    emails = relationship('Message')

    def __init__(self, address):
        self.address = address

class Email(Base):
    from = relationship('Address', backref='from')
    to = relationship('Address', secondary=message_tos, backref='messages_to')
    message_id = Column(String, nullable=False, unique=True)
    ...

def whatever(*args, **kwargs):
    """
    ...
    """
    email = session.query(Email).filter(Email.message_id==message_id).first()
    blah = set(email.to).union(email.from)  # This line throws an error that Address is not iterable
Is there any way to run the set(email.to) code (which places Address objects into a set), or am I going about this completely the wrong way? Obviously, I could just do set([email.to]), but that adds an extra layer of wrapping (and this function may be called multiple times with potentially very long .to or .from lists), which I'd rather avoid.

Your error most probably occurs not in the set(email.to) part, but in .union(email.from), since email.from is not iterable. According to your code, email.from is a single instance of Address.
This should work though: blah = set(email.to).union([email.from]). I assume you did not really call the property from, as it is a reserved keyword in Python; sender would be a better name.
Also note that in SA, searching by primary key can be done more cleanly with Query.get:
email = session.query(Email).get(message_id)
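For illustration, a minimal sketch of the corrected model, assuming from is renamed to sender and backed by a sender_id foreign key column (the column names, the FK, and the backref name are assumptions, not part of the original snippet):

class Email(Base):
    __tablename__ = 'message'
    id = Column(Integer, primary_key=True)
    message_id = Column(String, nullable=False, unique=True)
    sender_id = Column(Integer, ForeignKey('address.id'))  # assumed column backing the relationship
    sender = relationship('Address', backref='messages_sent')
    to = relationship('Address', secondary=message_tos, backref='messages_to')

email = session.query(Email).filter(Email.message_id == message_id).first()
recipients_and_sender = set(email.to).union([email.sender])  # union with a one-element list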

Related

What is the best way to store runtime information in a SQLAlchemy model?

What is the best way to store runtime information in a model?
And is it a good idea to store state (like online/offline, etc.) in a model?
For example:
class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    username = Column(String, unique=True, nullable=False)
    fullname = Column(String, default='')
    password = Column(String, nullable=False)
    role = Column(String, nullable=False)
    status = {0: "Offline", 1: "Online", -1: "Unknown"}

    def __init__(self, **kwargs):
        Base.__init__(self, **kwargs)
        self.init_status()

    @orm.reconstructor
    def init_status(self):
        self._online = 0

    @property
    def online(self):
        if self._online is None:
            self._online = 0
        if self.enable:
            return self._online
        return -1

    @online.setter
    def online(self, value):
        if value != self.online:
            dispatcher.send(sender=self, signal="state", value=value)
        self._online = value
If I get an object from the session like
    user = session.query(User).get(1)
and change its state
    user.online = 1
then after session.close() I have a detached object.
Do I always have to expunge(user) after commit() and before close(),
and then, if I want to change it again, add it to a new session and once more commit, expunge, and close?
Is there any other way?
P.S.
What is the more common practice: to create a DAO layer, or does the session itself act as the DAO layer?
I need access to this state for the whole life of the app, but as I understand it, using one session all the time is not a good approach.
The proper way is to open a session, do all my work with the DB, then close the session. But then I lose my state.
In Java I have a DAO layer and business objects that store all my DB fields and all my state regardless of the session.
But with SA I already have the session, DBO objects and Manager objects. I don't want to create so many layers; I don't think that's very Pythonic.
Thanks.
You should store the status in the DB as well, instead of only in memory.
Since it is not really user data, preferably put it in a different table, e.g. UserSession, which has a user id FK.
If you do so, you can store other data there as well, e.g. last_login_time.
You can even make intelligent decisions with it: if last_login_time is more than 30 minutes ago, you can flip the status back to offline, for example.
Storing such state in memory is not a good idea.
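A minimal sketch of what that could look like; the table and column names here are assumptions for illustration, not from the original post:

import datetime

class UserSession(Base):
    __tablename__ = 'user_sessions'  # hypothetical table holding runtime state
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey('users.id'), nullable=False)
    status = Column(Integer, default=0)  # 0=Offline, 1=Online, -1=Unknown
    last_login_time = Column(DateTime, default=datetime.datetime.utcnow)

    def effective_status(self):
        # Treat a stale session as offline, as suggested above.
        if datetime.datetime.utcnow() - self.last_login_time > datetime.timedelta(minutes=30):
            return 0
        return self.status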

SQLAlchemy with single table inheritance expunge error

I'm having an issue with Python SQLAlchemy single table inheritance.
Model:
class User(Base):
    __tablename__ = 'user'
    userID = Column(String(64), primary_key=True)
    name = Column(String(64))
    type = Column('type', String(50))
    __mapper_args__ = {'polymorphic_on': type}

class PasswordUser(User):
    __mapper_args__ = {'polymorphic_identity': 'puser'}
    password = Column(String(64))

    def validatePassword(self, password):
        return (self.password == password)
In userManager.py I have this:
def userGet(userID):
    with DBStore.Session(db) as sess:
        user = sess.query(User).filter(User.userID==userID).one()
        sess.expunge(user)
        return user
In a test main method:
myUser = userManager.userGet('123')
myUser.validatePassword("password321")
This produces an error:
sqlalchemy.orm.exc.DetachedInstanceError: Instance is not bound to a Session; attribute refresh operation cannot proceed
I have verified that myUser is of type 'PasswordUser', and it calls the correct 'validatePassword' method.
The strange thing is that when I step through the code slowly (PyDev), it works without error.
It also works if my userGet method does a sess.query(PasswordUser). But I want this method to be generic so it can return any type of 'User'.
Any ideas?
The problem is that when you query just for User, only the attributes/columns of the User (base) class are loaded from the database. In order to also load the other columns (like password) for sub-classes, you need to instruct the query to do so. You can do this by using with_polymorphic, in which case your code might look like:
def userGet(userID):
    with DBStore.Session(db) as sess:
        user = sess.query(User).with_polymorphic('*').filter(User.userID==userID).one()
        sess.expunge(user)
        return user
If you do not do that, SQLAlchemy will try to load the missing attribute (in your case, password) lazily using the session, and this is why it complains that it cannot do so for a detached object.
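In newer SQLAlchemy versions (0.8+), the same thing can be expressed with the standalone with_polymorphic() construct; this is a rough sketch, not the answer's original code:

from sqlalchemy.orm import with_polymorphic

def userGet(userID):
    with DBStore.Session(db) as sess:
        # Load User rows together with all subclass columns (e.g. PasswordUser.password).
        UserPoly = with_polymorphic(User, '*')
        user = sess.query(UserPoly).filter(UserPoly.userID == userID).one()
        sess.expunge(user)
        return user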

Can I store a dictionary as the property of an object?

Using Python/Flask/SQLAlchemy/Heroku.
I want to store dictionaries of objects as properties of an object.
TO CLARIFY:
class SoccerPlayer(db.Model):
    name = db.Column(db.String(80))
    goals_scored = db.Column(db.Integer())
How can I set name and goals_scored as one dictionary?
UPDATE: The user will input the name and goals_scored, if that makes any difference.
Also, I have been searching online for an appropriate answer, but as a noob I haven't been able to understand/implement the stuff I find on Google for my Flask web app.
I would second the approach provided by Sean: following it, you get a properly normalized DB schema and can more easily let the RDBMS do the hard work for you. If, however, you insist on using a dictionary-like structure inside your DB, I'd suggest trying out the hstore data type, which allows you to store key/value pairs as a single value in Postgres. I'm not sure if the hstore extension is created by default in the Postgres DBs provided by Heroku; you can check that by typing the \dx command inside psql. If there are no lines with hstore in them, you can create it by typing CREATE EXTENSION hstore;.
Since hstore support in SQLAlchemy only arrives in version 0.8, which is not released yet (but hopefully will be in the coming weeks), you need to install it from its Mercurial repository:
pip install -e hg+https://bitbucket.org/sqlalchemy/sqlalchemy#egg=SQLAlchemy
Then define your model like this:
from sqlalchemy.dialects.postgresql import HSTORE
from sqlalchemy.ext.mutable import MutableDict

class SoccerPlayer(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80), nullable=False, unique=True)
    stats = db.Column(MutableDict.as_mutable(HSTORE))

# Note that hstore only allows text for both keys and values (and None for
# values only).
p1 = SoccerPlayer(name='foo', stats={'goals_scored': '42'})
db.session.add(p1)
db.session.commit()
After that you can do the usual stuff in your queries:
from sqlalchemy import func, cast

q = db.session.query(
    SoccerPlayer.name,
    func.max(cast(SoccerPlayer.stats['goals_scored'], db.Integer))
).group_by(SoccerPlayer.name).first()
Check out the HSTORE docs for more examples.
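As a small follow-up to the model above: because the column is wrapped in MutableDict.as_mutable, in-place changes to the dict are tracked and flushed like ordinary attribute changes. A sketch:

# In-place updates are detected thanks to MutableDict; hstore values must be text.
p1.stats['assists'] = '7'
db.session.commit()  # persisted without reassigning p1.stats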
If you are storing such information in a database, I would recommend another approach:
class SoccerPlayer(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80))
    team_id = db.Column(db.Integer, db.ForeignKey('Team.id'))
    stats = db.relationship("Stats", uselist=False, backref="player")

class Team(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80))
    players = db.relationship("SoccerPlayer")

class Stats(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    player_id = db.Column(db.Integer, db.ForeignKey('SoccerPlayer.id'))
    goals_scored = db.Column(db.Integer)
    assists = db.Column(db.Integer)
    # Add more stats as you see fit
With this model setup you can do crazy things like this:
from sqlalchemy import and_
from sqlalchemy.sql import func

max_goals_by_team = db.session.query(
    Team.id,
    func.max(Stats.goals_scored).label("goals_scored")
).join(SoccerPlayer).join(Stats).group_by(Team.id).subquery()

players = db.session.query(
    Team.name.label("Team Name"),
    SoccerPlayer.name.label("Player Name"),
    max_goals_by_team.c.goals_scored
).join(SoccerPlayer).join(Stats).join(
    max_goals_by_team,
    and_(SoccerPlayer.team_id == max_goals_by_team.c.id,
         Stats.goals_scored == max_goals_by_team.c.goals_scored)
)
thus making the database do the hard work of pulling out the players with the highest goals per team, rather than doing it all in Python.
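For completeness, a small usage sketch under the same models (the object values are made up for illustration):

# Hypothetical usage of the normalized models above.
team = Team(name='FC Example')
player = SoccerPlayer(name='foo', stats=Stats(goals_scored=42, assists=7))
team.players.append(player)
db.session.add(team)   # cascades to the player and its stats
db.session.commit()

assert player.stats.goals_scored == 42
assert player.stats.player is player  # backref set on the Stats side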
Not even Django (a bigger Python web framework than Flask) supports this by default, but in Django you can install an add-on for it, called a JSONField (https://github.com/bradjasper/django-jsonfield).
What I'm trying to tell you is that not all databases know how to store binaries, but they do know how to store strings, and a JSONField for Django is actually a string that contains the JSON dump of a dictionary.
So, in short, you can do the same in Flask:
import simplejson

class SoccerPlayer(db.Model):
    _data = db.Column(db.String(1024))

    @property
    def data(self):
        return simplejson.loads(self._data)

    @data.setter
    def data(self, value):
        self._data = simplejson.dumps(value)
But beware: this way you can only assign the entire dictionary at once:
player = SoccerPlayer()
player.data = {'name': 'Popey'}
print player.data  # Works as expected
{'name': 'Popey'}
player.data['score'] = '3'
print player.data
# Will not show the score, because the setter doesn't know how to handle assignment by key
{'name': 'Popey'}
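If in-place updates matter, one option (a sketch building on the same JSON-as-string idea, not part of the answer above) is to wrap the JSON string in a TypeDecorator and pair it with MutableDict, so that key assignments are tracked:

import simplejson
from sqlalchemy.ext.mutable import MutableDict
from sqlalchemy.types import TypeDecorator, String

class JSONEncodedDict(TypeDecorator):
    # Stores a dict as a JSON string; the class name is illustrative.
    impl = String(1024)

    def process_bind_param(self, value, dialect):
        return None if value is None else simplejson.dumps(value)

    def process_result_value(self, value, dialect):
        return None if value is None else simplejson.loads(value)

class SoccerPlayer(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    data = db.Column(MutableDict.as_mutable(JSONEncodedDict))

# With this variant, player.data['score'] = '3' is detected and persisted on commit.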

Use of M2M Table and relationship to get specific data in sqlalchemy

I have this table:
# File: MyRelations.py
ACC_ADD_TABLE = Table('acc_add_rel', METADATA,
    Column('acc_id', ForeignKey('acc.id'), nullable=False),
    Column('add_id', ForeignKey('address.id'), nullable=False),
    PrimaryKeyConstraint('add_id', 'acc_id'),
)

# File: Address.py
class Address(Base):
    id = Column(Integer, primary_key=True)
    type = Column(String(length=10), nullable=False)

# File: Account.py
class Account(Base):
    id = Column(Integer, primary_key=True)
    addresses = relationship('Address',
        secondary=ACC_ADD_TABLE
    )
    # default_address = relationship('Address',
    #     secondary=ACC_ADD_TABLE,
    #     primaryjoin=and_("ACC_ADD_TABLE.add_id==Address.id",
    #                      "ACC_ADD_TABLE.acc_id==Account.id",
    #                      "Address.type='default'")
    # )
As per the example, I want to access all the default addresses of an account. I could use declared_attr or write a function, but is there any way to combine the Table and class attributes in a single and_ operation?
Note: Address.py and Account.py are different files, and due to a cyclic dependency I can't import either model into the other.
Thanks for your help.
This works without requiring an import:
default_address = relationship('Address',
    secondary=ACC_ADD_TABLE,
    primaryjoin="acc.c.id==acc_add_rel.c.acc_id",
    secondaryjoin="and_(address.c.id==acc_add_rel.c.add_id, address.c.type=='default')",
    # uselist=False,
)
If you are certain that there is only one default address, you might use uselist=False for convenience, so that default_address is a scalar rather than a list.
Sometimes I prefer a different structure for such situations though: add a default_address_id column to the Account table and build a 1-[0..1] relationship based on this column, while still checking that the referenced Address is also part of the Account.addresses M-N relationship.
On a side note, a typo: in your (commented-out) code you should use == instead of = in "Address.type='default'". This does not solve the problem though.
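A rough sketch of that alternative structure; the new column and relationship configuration are assumptions for illustration, not the answer's code:

# File: Account.py -- hypothetical variant using a direct FK to the default address.
class Account(Base):
    __tablename__ = 'acc'
    id = Column(Integer, primary_key=True)
    default_address_id = Column(Integer, ForeignKey('address.id'))  # assumed new column
    default_address = relationship('Address', foreign_keys=[default_address_id])
    addresses = relationship('Address', secondary=ACC_ADD_TABLE)
    # The application would still need to ensure default_address is one of addresses.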

Auto updating properties in sqlalchemy

I've got an SQLAlchemy model that is set up like this:
class Entry(Base):
    __tablename__ = 'entries'
    __table__ = Table('entries', Base.metadata,
        Column('id', Integer, primary_key=True, unique=True),
        Column('user_id', Integer, ForeignKey('users.id', onupdate="CASCADE", ondelete="RESTRICT")),
        Column('title', String(128)),
        Column('slug', String(128), index=True),
        Column('url', String(256), index=True),
        Column('entry', Text),
        Column('cached_entry', Text),
        Column('created', DateTime, server_default=text('current_timestamp')),
        Column('modified', DateTime, server_onupdate=text('current_timestamp')),
        Column('pubdate', DateTime),
    )
What I would like is that when I update entry, cached_entry gets regenerated; cached_entry is the markdown-parsed version of entry. Basically I am caching the output of the markdown parsing so that I don't have to do it on each display of the entry. I've tried using @hybrid_method, however that didn't seem to work, as it is not stored in the database at all. I've got this working on Google App Engine, but I can't figure out how to do the same thing using SQLAlchemy.
I would really prefer not to have to add a function to the class that is used instead of the attribute names in the model, because that is harder to enforce from an application standpoint; I don't want to accidentally miss something.
The @hybrid_property descriptor certainly does it, using the form described at http://www.sqlalchemy.org/docs/orm/mapper_config.html#using-descriptors. You assign to the database-mapped attribute, which you can map under a different name; since you're using __table__, you can use a form like:
from sqlalchemy.ext.hybrid import hybrid_property

class Entry(Base):
    __table__ = ...
    _entry = __table__.c.entry

    @hybrid_property
    def entry(self):
        return self._entry

    @entry.setter
    def entry(self, value):
        self._entry = value
        self.cached_entry = markdown(value)
Another option is to use the before_insert and before_update events to populate the column at flush time. This is a simple approach, but it has the disadvantage that you have to wait for a flush() before it happens.
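A minimal sketch of that event-based approach, assuming the Entry model above and a markdown() function in scope:

from sqlalchemy import event

def _render_cached_entry(mapper, connection, target):
    # Regenerate the cached markdown output just before the row is written.
    target.cached_entry = markdown(target.entry)

event.listen(Entry, 'before_insert', _render_cached_entry)
event.listen(Entry, 'before_update', _render_cached_entry)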
I think the quickest way to do it "on set" is to use @validates:
from sqlalchemy.orm import validates

class Entry(Base):
    __table__ = ...

    @validates('entry')
    def _set_entry(self, key, value):
        self.cached_entry = markdown(value)
        return value
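For illustration, with the @validates variant the cache is refreshed on every assignment; a sketch, assuming the same markdown() helper is available:

e = Entry(title='Hello')
e.entry = '# Hello world'
# The validator has already run, so the cache is in sync even before any flush.
assert e.cached_entry == markdown('# Hello world')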
