GAE require model property iff another has a certain value? - python

In GAE's db.Model properties, we have a required parameter that prevents an entity of that model from being created without a value for that property.
e.g.:
class user(db.Model):
    isFromUK = db.BooleanProperty(required=True)
    fromCounty = db.StringProperty()
How can I do, essentially, required = True on fromCounty iff isFromUK == True?
I am aware this may not be possible directly in GAE's implementation (I have not found a way in the docs) - but I wondered if there may be some simple way to implement this, perhaps with a @classmethod?
I have not had cause to use one before, so I am not sure if that would offer a solution.

This is how you would override .put() to do your special validation before continuing with the regular behaviour (i.e. the superclass's .put()):
class user(db.Model):
    ...
    def put(self, *args, **kwargs):
        if self.isFromUK and not self.fromCounty:
            raise ValueError("Need fromCounty if isFromUK..")
        super(user, self).put(*args, **kwargs)
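For illustration, a quick sketch of how the override behaves ('Kent' is just an example value):
u = user(isFromUK=True)                     # fromCounty missing
u.put()                                     # raises ValueError("Need fromCounty if isFromUK..")

u = user(isFromUK=True, fromCounty='Kent')  # condition satisfied
u.put()                                     # validates, then stores the entity as usual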

Related

Process fields in SQLAlchemy model (using flask_sqlalchemy)

I am using SQLAlchemy through flask_sqlalchemy. A model receives input from HTML forms. I would like this input to be stripped of any tags. Instead of doing this several times in the code before assignment, I thought it might be better to implement this somehow in the model object.
The possibilities I could think of were:
Derive own column types
Wrap a proxy class around the column types
Define some kind of decorator that does the above
Modify the model object to intercept assignments
The first three solutions seem more elegant, but I don't understand how I need to implement them. The main reason is that I don't understand how exactly SQLAlchemy extracts the table structure and column types from the column variables, and how assignment to these is handled, in particular when accessed through the flask_sqlalchemy class.
I played around with the last option in the list above, and came up with this (partial) solution:
import bleach

class Example(db.Model):
    __tablename__ = 'examples'
    id = db.Column(db.Integer, primary_key=True)
    field1 = db.Column(db.Text)
    field2 = db.Column(db.String(64))

    _bleach_columns = ('field1', 'field2')

    def __init__(self, **kwargs):
        for key in Example._bleach_columns:
            if key in kwargs:  # skip columns that were not passed in
                kwargs[key] = bleach.clean(kwargs[key], tags=[], strip=True)
        super(Example, self).__init__(**kwargs)
This works when creating objects using Example(field1='foo', field2='bar'). However, I am uncertain how to handle the assignment of individual fields. I was thinking of something along these lines, but am unsure about the parts marked as ASSIGN:
def __setattr__(self, attr, obj):
    if attr in Example._bleach_columns:
        ASSIGN(....., bleach.clean(obj, tags=[], strip=True))
    else:
        ASSIGN(....., obj)
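(For completeness, one way to fill in the ASSIGN parts is to delegate to super().__setattr__, which still routes the value through SQLAlchemy's instrumented attributes -- a sketch, not necessarily the cleanest route:)
def __setattr__(self, attr, obj):
    if attr in Example._bleach_columns:
        # clean first, then hand off to the normal descriptor machinery
        super(Example, self).__setattr__(attr, bleach.clean(obj, tags=[], strip=True))
    else:
        super(Example, self).__setattr__(attr, obj)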
More generally, my impression is that this is not the best way to handle tag filtering. I'd therefore appreciate any hint on how to best implement this behaviour, ideally with a decorator of new column types.
It looks like this could be done with a TypeDecorator that applies bleach in process_bind_param. However, I could not figure out how to apply this decorator to the flask_sqlalchemy-based column definition in the db.Model-derived class above.
I finally managed to solve this... which was easy, as usual, once one understands what it's all about.
The first thing was to understand that db.Column is the same as SQLAlchemy's Column, so I could use the same syntax. To implement variable-length strings, I used a class factory to return the decorators. If there is another solution for implementing the length, I'd be interested to hear about it. Anyway, here is the code:
import bleach
from sqlalchemy import types

def bleachedStringFactory(length):  # 'length' avoids shadowing the len() builtin
    class customBleachedString(types.TypeDecorator):
        impl = types.String(length)

        def process_bind_param(self, value, dialect):
            if value is None:  # let NULLs through untouched
                return value
            return bleach.clean(value, tags=[], strip=True)

        def process_result_value(self, value, dialect):
            return value

    return customBleachedString
class Example(db.Model):
    __tablename__ = 'examples'
    id = db.Column(db.Integer, primary_key=True)
    field1 = db.Column(bleachedStringFactory(64), unique=True)
    field2 = db.Column(bleachedStringFactory(128))
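A quick usage sketch (assuming a configured Flask-SQLAlchemy db session): the cleaning happens at bind time, so the row is stored without markup.
ex = Example(field1='<b>hello</b> world')
db.session.add(ex)
db.session.commit()
# reloading the row yields field1 == 'hello world' -- the tags were
# stripped by process_bind_param before the INSERT was issued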

Bulk update using model's method in SQLAlchemy

I'm developing an application with SQLAlchemy and I've run into a bit of an issue. I would like to run a method on all models returned by a query, and do all of that in a single SQL query, while preserving the readability the ORM offers.
The method in question is very simple and doesn't depend on any external data nor makes any more queries. It's also fine if all the models in the bulk update use the same exact value, so the value itself needs to be evaluated only once.
Here's my model:
class Item(db.Model):
    last_fetch = db.Column(db.DateTime)

    def refresh(self):
        self.last_fetch = datetime.utcnow()
I would like to call the refresh() function on all models returned by a query - for the sake of example let's assume it's Item.query.all().
I can iterate through them and run the method on each model but that would run a separate query for each one of them:
items = Item.query.all()
for item in items:
    item.refresh()
Or I could do the following, which works, however I've now moved my refresh() logic from the model to the code that would otherwise just call that method:
Item.query.update({Item.last_fetch: datetime.utcnow()})
Is there a better solution? A way to define a "smart" method on the model that would somehow allow the ORM to run it in bulk while still keeping it a model method?
Regards.
SQLAlchemy doesn't recreate queried objects using __init__, but __new__.
You could either override __new__ on your model, or try and see if the @orm.reconstructor decorator explained in Constructors and Object Initialization would work.
Using the reconstructor:
import datetime
from sqlalchemy import orm

class Item(db.Model):
    last_fetch = db.Column(db.DateTime)

    @orm.reconstructor
    def set_last_fetch_on_load(self):
        self.last_fetch = datetime.datetime.now()
Overriding __new__:
class Item(db.Model):
    last_fetch = db.Column(db.DateTime)

    def __new__(cls, *args, **kwargs):
        obj = object.__new__(cls)  # object.__new__ takes no extra arguments
        obj.last_fetch = datetime.datetime.now()
        return obj
Note: Haven't tested it.
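One middle ground, sketched here assuming Flask-SQLAlchemy's db.session: wrap the bulk update in a classmethod so the logic stays on the model while still issuing a single UPDATE.
from datetime import datetime

class Item(db.Model):
    last_fetch = db.Column(db.DateTime)

    def refresh(self):
        self.last_fetch = datetime.utcnow()

    @classmethod
    def refresh_all(cls):
        # one UPDATE statement covering every row
        cls.query.update({cls.last_fetch: datetime.utcnow()})
        db.session.commit()
Callers then just run Item.refresh_all().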

Is it safe to override __hash__ on a peewee.Model object?

I recently noticed that a bunch of my peewee model objects that referred to the same data were not being recognized as equivalent, even though they contained the same data.
Is it safe to override __hash__ on these guys? It appears to work, but I don't want this to come back and bite me unexpectedly in the future -- does hashing mess with anything internal, like the state-to-record mapping, that I should worry about?
class User(PowertailMeta):
    name = CharField(unique=True)
    password = CharField(null=False)
    balance = FloatField(default=10.0)
    cap = FloatField(default=60, constraints=[Check('cap >= 0')])
    is_admin = BooleanField(default=False)
    last_login = DateTimeField(default=datetime.now)
    picture = CharField(default="porp")

    def __hash__(self):
        return hash(self.name)  # since name is unique...
This passes trivial tests but I'm not sure what I might need to be looking for.
__hash__ has recently been added to peewee.Model, see https://github.com/coleifer/peewee/issues/879. It's not yet released, but I assume it will be in 2.8.1.
The implementation is rather simple:
def __hash__(self):
    return hash((self.__class__, self._get_pk_value()))
(in https://github.com/coleifer/peewee/commit/4de894aeebf7245d4fb6c4f412c7a09a2c039d8a#diff-eb0556c6b1b9232ba053c4cea13ff075R4786)
So, it relies on the model class and your primary key. If you have other needs, I don't see any problem in overriding it. However, with your suggested solution, consider the following:
class User(Model):
    name = CharField(unique=True)

    def __hash__(self):
        return hash(self.name)

class Tag(Model):
    label = CharField(unique=True)

    def __hash__(self):
        return hash(self.label)
These are two different models, even using different fields, but they would produce the same hash value.
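If that cross-model collision matters, a simple variant (a sketch following the same pattern as peewee's own implementation above) is to mix the model class into the hash:
class User(Model):
    name = CharField(unique=True)

    def __hash__(self):
        # pairing the class with the unique field keeps hashes
        # distinct across different models
        return hash((self.__class__, self.name))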

Django abstract base model with custom QuerySet class

I am using an approach similar to T. Stone's answer on this question. However, I have added an abstract base class, so my models.py looks like this:
class CustomQuerySetManager(models.Manager):
    """A re-usable Manager to access a custom QuerySet"""
    def __getattr__(self, attr, *args):
        try:
            return getattr(self.__class__, attr, *args)
        except AttributeError:
            return getattr(self.get_query_set(), attr, *args)

    def get_query_set(self):
        return self.model.QuerySet(self.model)
class MyModel(models.Model):
    class Meta:
        abstract = True

    class QuerySet(QuerySet):
        def user(self, pub, *args, **kwargs):
            return self.filter(publisher=pub, *args, **kwargs)
        # ...some more methods here

class Book(MyModel):
    title = models.CharField(max_length=100)
    authors = models.ManyToManyField(Author, related_name='book_author')
    publisher = models.ForeignKey(Publisher)
    publication_date = models.DateField()

    objects = models.Manager()
    obj = CustomQuerySetManager()  # for testing purposes only, this will override objects later
This allows me to get all of the books for a given publisher, like so:
p = Publisher.objects.get(pk=1)
Book.obj.user(p).all()
I would like to extend this so I can define a custom query in the Book model and then pass a Q object to the QuerySet class, so the query "publisher=pub" can be different for different models. I still want to be able to call this like Book.obj.user(p).all(). Somewhere in the Book model I need:
pubQ = Q(publisher=pub)
Where can I put this, and how do I pass it to the QuerySet defined in the abstract base class, while keeping the code as DRY as possible?
That answer is clever, but it breaks the Python principle of "explicit is better than implicit". My first reaction to your code was to tell you that you can't declare a custom queryset inside your model, but I decided to check the mentioned SO answer to see where you got that idea from. Again, it's clever -- not discounting that -- but well-written code is self-documenting and should be able to be picked up by any random Django developer and run with. That's where peer code reviews come in handy -- had you had one, you'd have instantly got a WTF over that.
The Django core team does it the following way:
class MyQuerySet(models.query.QuerySet):
    def some_method(self, an_arg, another_arg, a_kwarg='some_value'):
        # do something
        return a_queryset

class MyManager(models.Manager):
    def get_query_set(self):
        return MyQuerySet(self.model)

    def some_method(self, *args, **kwargs):
        return self.get_query_set().some_method(*args, **kwargs)
It's DRY in the sense that you don't repeat the actual method definition in the manager. But it's also explicit -- you know exactly what's going on. It's not as DRY as the method you're referencing, but "explicit is better than implicit". Besides, if it's done that way in the actual Django codebase, you can be reasonably assured that it's good practice to do so in your own code. And it has the side effect of making it much easier to extend and override in subclasses.
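As for the original pubQ question, sticking with the question's abstract-base pattern: one possible sketch (user_q is an illustrative name, not an established API) is to have each concrete model expose a classmethod returning its Q object, which the shared QuerySet looks up via self.model:
class MyModel(models.Model):
    class Meta:
        abstract = True

    class QuerySet(QuerySet):
        def user(self, pub, *args, **kwargs):
            # ask the concrete model for its own Q object
            return self.filter(self.model.user_q(pub), *args, **kwargs)

class Book(MyModel):
    publisher = models.ForeignKey(Publisher)

    @classmethod
    def user_q(cls, pub):
        return Q(publisher=pub)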

Automatic GUID key_name in model

I want my model to get a GUID as key_name automatically and I'm using the code below. Is that a good approach to solve it? Does it have any drawbacks?
class SyncModel(polymodel.PolyModel):
    def __init__(self, key_name=None, key=None, **kwargs):
        super(SyncModel, self).__init__(
            key_name=str(uuid.uuid1()) if not key else None,
            key=key, **kwargs)
Overriding __init__ on a Model subclass is dangerous, because the constructor is used by the framework to reconstruct instances from the datastore, in addition to being used by user code. Unless you know exactly how the constructor is used to reconstruct existing entities - something which is an internal detail and may change in future - you should avoid overriding it.
Instead, define a factory method, like this:
class MyModel(db.Model):
    @classmethod
    def new(cls, **kwargs):
        return cls(key_name=str(uuid.uuid4()), **kwargs)
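A quick usage sketch of the factory method:
m = MyModel.new()        # key_name is set to a fresh GUID
m.put()
print(m.key().name())    # the uuid4 string used as the key name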
There is an article by Nick about pre- and post-put hooks which can be used to set the key_name. I don't know if your current method is valid or not, but at least you should be aware of other options.
