I have a model of the form:
class MyModel(base):
    # ...
    datetime = Column(DateTime)
    # ...
and would like to create a .date and a .time property which correspond to the .datetime column. The SQLA documentation shows a couple of examples of combining properties (such as firstname + lastname => fullname) but nothing for decomposing.
I think that using @hybrid_property and friends I can do the initial decomposition, but I am unsure about assignment (so if n = MyModel.query...one() I wish to be able to do n.date = d and have it update the .datetime field).
My primary RDBMS is MySQL.
(For those that are interested in my motivation for wanting to do this: I have a lot of client-side code which is duck-typed to expect .date and .time fields however many stored procedures and triggers on the server expect a single .datetime column.)
The docs say explicitly that you can do this; you need to use the @hybrid_property and @value.setter decorators plus your own code to return the date or time in the expected format:
from sqlalchemy.ext.hybrid import hybrid_property

class SomeClass(object):
    @hybrid_property
    def value(self):
        return self._value

    @value.setter
    def value(self, value):
        self._value = value
Full disclosure: I have used the property feature but not the setter feature.
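Applied to the model in the question, the decomposition could be sketched like this (untested; it assumes the column is non-null and that base is the declarative base from the question; no SQL-side expression is defined, so filtering on MyModel.date in a query would additionally need an @date.expression built from a database date function):

import datetime as dt
from sqlalchemy import Column, DateTime
from sqlalchemy.ext.hybrid import hybrid_property

class MyModel(base):
    # ...
    datetime = Column(DateTime)
    # ...

    @hybrid_property
    def date(self):
        return self.datetime.date()

    @date.setter
    def date(self, value):
        # rebuild the stored datetime from the new date and the existing time
        self.datetime = dt.datetime.combine(value, self.datetime.time())

    @hybrid_property
    def time(self):
        return self.datetime.time()

    @time.setter
    def time(self, value):
        # rebuild the stored datetime from the existing date and the new time
        self.datetime = dt.datetime.combine(self.datetime.date(), value)

With this, n.date = d rewrites n.datetime, which is what the stored procedures and triggers see.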
I want to get a field's value the way we use self in Django models.
class UserModel(Model):
    id = IDField()
    uid = TextField()

    @classmethod
    def get_user(cls):
        return cls.uid
The classmethod keeps returning None instead of the string value of the uid field. Did I miss something?
This is from the Firestore Python wrapper https://octabyte.io/FireO/quick-start/
If you use @classmethod and cls you can only get empty values. That is because cls refers to the bare class schema from which you create objects (i.e. instances of that class).
To get the value of the current object you have to go through self, i.e. a standard instance method. Then you get the value of that particular object instance.
I didn't even find a mention of @classmethod in the FireO documentation. Most likely you don't need that decorator for now.
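A minimal sketch of the instance-method version (the fetch call is only illustrative; check the FireO docs for the exact retrieval API):

class UserModel(Model):
    id = IDField()
    uid = TextField()

    def get_user(self):
        # self is a concrete fetched document, so uid holds an actual value
        return self.uid

# usage, with a hypothetical document key:
# user = UserModel.collection.get(user_key)
# print(user.get_user())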
I am using SQLAlchemy through flask_sqlalchemy. A model receives input from HTML forms. I would like this input to be stripped of any tags. Instead of doing this several times in the code before assignment, I thought it might be better to implement this somehow in the model object.
The possibilities I could think of were:
Derive my own column types
Wrap a proxy class around the column types
Define some kind of decorator that does the above
Modify the model object to intercept assignments
The first three solutions seem more elegant, but I don't understand how I would need to implement them. The main reason is that I don't understand exactly how SQLAlchemy extracts the table structure and column types from the column variables, and how assignment to them is handled, in particular when accessed through the flask_sqlalchemy class.
I played around with the last option in the list above, and came up with this (partial) solution:
import bleach

class Example(db.Model):
    __tablename__ = 'examples'
    id = db.Column(db.Integer, primary_key=True)
    field1 = db.Column(db.Text)
    field2 = db.Column(db.String(64))

    _bleach_columns = ('field1', 'field2')

    def __init__(self, **kwargs):
        if kwargs is not None:
            for key in Example._bleach_columns:
                kwargs[key] = bleach.clean(kwargs[key], tags=[], strip=True)
        super(Example, self).__init__(**kwargs)
This works when creating objects using Example(field1='foo', field2='bar'). However, I am uncertain how to handle the assignment of individual fields. I was thinking of something along these lines, but am unsure about the parts marked as ASSIGN:
def __setattr__(self, attr, obj):
    if attr in Example._bleach_columns:
        ASSIGN(....., bleach.clean(obj, tags=[], strip=True))
    else:
        ASSIGN(....., obj)
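For reference, one way the ASSIGN parts could be filled in is by delegating to the parent class, which keeps SQLAlchemy's instrumented attributes working; this is only a sketch of the interception idea, not the approach ultimately used below:

def __setattr__(self, attr, obj):
    if attr in Example._bleach_columns:
        obj = bleach.clean(obj, tags=[], strip=True)
    # hand the (possibly cleaned) value to the normal attribute machinery
    super(Example, self).__setattr__(attr, obj)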
More generally, my impression is that this is not the best way to handle tag filtering. I'd therefore appreciate any hint on how to best implement this behaviour, ideally with a decorator or new column types.
It looks like this could be done with a TypeDecorator that applies bleach in process_bind_param. However, I could not figure out how to apply this decorator to the flask_sqlalchemy-based column definition in the db.Model-derived class above.
I finally managed to solve this... which was easy, as usual, once one understands what it all is about.
The first thing was to understand that db.Column is the same as SQLAlchemy's Column; I could thus use the same syntax. To implement variable-length strings, I used a class factory to return the decorators. If there is another solution for handling the length, I'd be interested to hear about it. Anyway, here is the code:
import bleach
from sqlalchemy import types

def bleachedStringFactory(length):
    class customBleachedString(types.TypeDecorator):
        impl = types.String(length)

        def process_bind_param(self, value, dialect):
            # clean the value as it is bound to an INSERT/UPDATE statement
            return bleach.clean(value, tags=[], strip=True)

        def process_result_value(self, value, dialect):
            return value

    return customBleachedString
class Example(db.Model):
    __tablename__ = 'examples'
    id = db.Column(db.Integer, primary_key=True)
    field1 = db.Column(bleachedStringFactory(64), unique=True)
    field2 = db.Column(bleachedStringFactory(128))
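Usage then looks like any other column; bleach runs when the value is bound to an INSERT or UPDATE, not on plain Python assignment. A small illustrative example (assuming default session settings, where attributes are refreshed from the database after commit):

example = Example(field1='<b>hello</b>', field2='plain')
db.session.add(example)
db.session.commit()  # process_bind_param strips the tags here
print(Example.query.get(example.id).field1)  # prints 'hello'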
I am using SQLAlchemy as a readable way to model my database; I'm only interested in generating the database definition for several engines from my model.
Some of the columns in my tables have type Enum, which works fine in engines such as MySQL, since it has native enum support. However, for SQL Server it generates the column as VARCHAR and adds a constraint to check that the values are within the expected enum values I specify.
I'd like to replace this with a numeric fallback, so that the column type is actually numeric and the constraint checks that the values are within the range of the enum's size (assuming sequential values starting at 0).
I have tried creating a TypeDecorator with Enum as impl, but this was not enough or I did not know how to make it work. I also tried to just copy the code for the Boolean type and mix it with the Enum type to create my own type, but it seems that database compiler support is required too.
Is there a way in which I can achieve this without having to patch sqlalchemy itself?
Note that I am not interested in querying the database with Python; once the schema is generated, I'm done, so that might simplify things.
Here's what you need:
import sqlalchemy as sa
class IntEnum(sa.types.TypeDecorator):
    impl = sa.Integer

    def __init__(self, enumtype, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._enumtype = enumtype

    def process_bind_param(self, value, dialect):
        # store the enum member's integer value
        return value.value

    def process_result_value(self, value, dialect):
        # convert the stored integer back into an enum member
        return self._enumtype(value)
And then you use it like this:
from enum import Enum
from sqlalchemy.ext.declarative import declarative_base

class MyEnum(Enum):
    one = 1
    two = 2
    three = 3

engine = sa.create_engine('sqlite:///:memory:')
session = sa.orm.sessionmaker(bind=engine)()
Base = declarative_base()

class Stuff(Base):
    __tablename__ = 'stuff'
    id = sa.Column('id', sa.Integer, primary_key=True)
    thing = sa.Column('num', IntEnum(MyEnum))

Base.metadata.create_all(engine)

session.add(Stuff(thing=MyEnum.one))
session.add(Stuff(thing=MyEnum.two))
session.add(Stuff(thing=MyEnum.three))
session.commit()

# raw insert that bypasses the ORM, to show the conversion on read
engine.execute(sa.text('insert into stuff values(4, 2);'))

for thing in session.query(Stuff):
    print(thing.id, thing.thing)
Really, your only problem was that impl needed to be sa.Integer, as that's what actually backs the column, not enum.Enum.
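If you also want the range check mentioned in the question, a table-level CheckConstraint is one option; this is a sketch with the bounds hard-coded for MyEnum rather than derived from the enum:

class CheckedStuff(Base):
    __tablename__ = 'checked_stuff'
    id = sa.Column('id', sa.Integer, primary_key=True)
    thing = sa.Column('num', IntEnum(MyEnum))
    __table_args__ = (
        # MyEnum values are 1..3; compute the bounds from the enum if you prefer
        sa.CheckConstraint('num >= 1 AND num <= 3', name='ck_checked_stuff_num'),
    )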
I'm currently doing something like this in SQLA:
class Base(object):
    query = DBSession.query_property()

    @classmethod
    def _get_fulltext_query(cls, terms):
        # search() runs a fulltext search in Whoosh for objects of the given
        # type, returning all ids which match the given terms
        ids = search(terms, cls.__name__)
        return cls.query.filter(cls.id.in_(ids))
I want to set up a custom query class and do something like:
class BaseQuery(Query):
    def fulltext(self, terms):
        # Need a way to find out what class we're querying so that I can run
        # the fulltext search and return the proper query

class Base(object):
    query = DBSession.query_property(BaseQuery)
It just seems cleaner to me. I also have other use cases for needing to know what class is being queried -- for instance, a series of Notification classes in which I need to know what notification type to return.
Is there any way to find out what class is being queried from inside BaseQuery.fulltext?
This should work:
class BaseQuery(Query):
    def fulltext(self, terms):
        # assuming the query is always created with `cls.query` or `DBSession.query(cls)`
        cls = self._entities[0].type
        ...
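A fuller sketch using the public column_descriptions accessor instead of the private _entities attribute; it assumes search() is the Whoosh helper from the question, that DBSession is the question's scoped session, and that the query targets a single mapped class:

from sqlalchemy.orm import Query

class BaseQuery(Query):
    def fulltext(self, terms):
        # for DBSession.query(SomeClass), the first description's 'type' is the class
        cls = self.column_descriptions[0]['type']
        ids = search(terms, cls.__name__)
        return self.filter(cls.id.in_(ids))

class Base(object):
    query = DBSession.query_property(BaseQuery)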
I'm using Pylons with SQLAlchemy. I have several models, and I found myself writing code like this again and again:
question = Session.query(Question).filter_by(id=question_id).one()
answer = Session.query(Answer).filter_by(id=answer_id).one()
...
user = Session.query(User).filter_by(id=user_id).one()
Since the models all extend class Base, is there any way to define a common get_by_id() method?
So I can use it as:
question = Question.get_by_id(question_id)
answer = Answer.get_by_id(answer_id)
...
user = User.get_by_id(user_id)
If id is your primary key column, you just do:
session.query(Foo).get(id)
which has the advantage of not querying the database if that instance is already in the session.
Unfortunately, SQLAlchemy doesn't allow you to subclass Base without a corresponding table declaration. You could define a mixin class with get_by_id as a classmethod, but then you'd need to specify it for each class.
A quicker-and-dirtier solution is to just monkey-patch it into Base:
def get_by_id(cls, id, session=session):
    return session.query(cls).filter_by(id=id).one()

Base.get_by_id = classmethod(get_by_id)
This assumes you've got a session object available at definition-time, otherwise you'll need to pass it as an argument each time.
class Base(object):
    @classmethod
    def get_by_id(cls, session, id):
        q = session.query(cls).filter_by(id=id)
        return q.one()
Question.get_by_id(Session, question_id)