By default, collection.find() or collection.find_one() returns results as dictionaries, and if you pass the parameter as_class=SomeUserClass it will try to parse each result into that class.
But it seems this class must also be derived from dict (since __setitem__ needs to be defined so that keys can be added to it).
Here I want to set the properties of the class instead. How can I achieve this?
Also, my collection class contains some child classes as properties, so how can I set the properties of the child classes as well?
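For context, here is a minimal sketch of the behavior being described, using the pymongo 2.x API that the question refers to (the database and collection names are hypothetical):

from pymongo import Connection

class MyDoc(dict):
    # any dict subclass works: as_class only needs a __setitem__ method
    pass

collection = Connection().mydb.mycollection
doc = collection.find_one(as_class=MyDoc)  # the result is parsed into a MyDoc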
It sounds like you want something like an object-relational mapper. I am the primary author of one, Ming, but several others exist for Python as well. In Ming, you might do the following to set up your mapping:
from ming import schema, collection, Field
from ming.orm import (mapper, Mapper, RelationProperty,
                      ForeignIdProperty)

WikiDoc = collection('wiki_page', session,
    Field('_id', schema.ObjectId()),
    Field('title', str, index=True),
    Field('text', str))
CommentDoc = collection('comment', session,
    Field('_id', schema.ObjectId()),
    Field('page_id', schema.ObjectId(), index=True),
    Field('text', str))

class WikiPage(object): pass
class Comment(object): pass

ormsession.mapper(WikiPage, WikiDoc, properties=dict(
    comments=RelationProperty('Comment')))
ormsession.mapper(Comment, CommentDoc, properties=dict(
    page_id=ForeignIdProperty('WikiPage'),
    page=RelationProperty('WikiPage')))
Mapper.compile_all()
Then you can query for some particular page via:
pg = WikiPage.query.get(title='MyPage')
pg.comments # loads comments via a second query from MongoDB
The various ODMs I know of for MongoDB in Python are listed below.
Ming
MongoKit
MongoEngine
I have solved this by adding __setitem__ to the class. Then I do:

result = as_class()
for key, value in dict_expr.items():
    result[key] = value

and in my class __setitem__ looks like:

def __setitem__(self, key, value):
    try:
        attr = getattr(self, key)
        if attr is not None:
            if isinstance(value, dict):
                # nested dict: forward each child key into the child object
                for child_key, child_value in value.items():
                    attr[child_key] = child_value
                setattr(self, key, attr)
            else:
                setattr(self, key, value)
    except AttributeError:
        pass
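A short sketch of how this __setitem__ behaves with a nested document; Person and Address are hypothetical stand-ins for the collection class and its child class:

class Address(object):
    def __init__(self):
        self.city = None
    def __setitem__(self, key, value):
        setattr(self, key, value)

class Person(object):
    def __init__(self):
        self.name = ''            # non-None so __setitem__ will accept it
        self.address = Address()  # child object held as a property
    def __setitem__(self, key, value):
        try:
            attr = getattr(self, key)
            if attr is not None:
                if isinstance(value, dict):
                    for child_key, child_value in value.items():
                        attr[child_key] = child_value
                    setattr(self, key, attr)
                else:
                    setattr(self, key, value)
        except AttributeError:
            pass

p = Person()
p['name'] = 'Alice'               # plain value: set directly
p['address'] = {'city': 'Paris'}  # nested dict: routed into the Address
print(p.name, p.address.city)     # Alice Paris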
Cornice's documentation mentions how to validate your schema using a colander MappingSchema subclass. How should we use a ColanderAlchemy schema for the same purpose? If we create a schema using ColanderAlchemy as stated in its documentation, the schema object is already an instance of colander's class rather than a class itself, and I think this is what causes an error.
To be more precise, here is my sample code:
from sqlalchemy.ext.declarative import declarative_base
from cornice.resource import resource, view
from colanderalchemy import SQLAlchemySchemaNode
from sqlalchemy import (
Column,
Integer,
Unicode,
)
Base = declarative_base()
'''
SQLAlchemy part
'''
class DBTable(Base):
__tablename__ = 'mytable'
id = Column(Integer, primary_key=True,
info={'colanderalchemy': {'exclude': True}})
name = Column(Unicode(70), nullable=False)
description = Column(Unicode(256))
'''
ColanderAlchemy part
'''
ClndrTable = SQLAlchemySchemaNode(DBTable)
'''
Cornice part
'''
PRF='api'
@resource(collection_path='%s/' % PRF, path='%s/{fid}' % PRF)
class TableApi(object):
def __init__(self, request):
self.request = request
@view(schema=ClndrTable, renderer='json')
def put(self):
# do my stuff here
pass
Where ClndrTable is my auto-generated schema. Now, when trying to deploy this code, I get the following error:
NotImplementedError: Schema node construction without a typ argument or a schema_type() callable present on the node class
As I mentioned earlier, I suspect the problem is that ClndrTable (given as an argument to the view decorator) is an instance of the schema automatically generated by ColanderAlchemy, not a class.
Does anyone know how to resolve this?
Thanks all in advance!
This appears to be due to colander having both a typ property and a schema_type property. They're both supposed to tell you the schema's type, but they can actually hold different values. I filed an issue with colander, but if there's a fix it'll likely not make it to PyPI any time soon.
So what's happening is: ColanderAlchemy ignores schema_type and uses typ, while Cornice ignores typ and uses schema_type.
You can hack a fix with the following: ClndrTable.schema_type = lambda: ClndrTable.typ
However, that just leads you to the next exception:
cornice.schemas.SchemaError: schema is not a MappingSchema: <class 'colanderalchemy.schema.SQLAlchemySchemaNode'>
This is due to Cornice not duck typing, but instead expecting every schema to be a subclass of MappingSchema. However, MappingSchema is just a Schema whose typ/schema_type is Mapping (which is what ColanderAlchemy returns).
I'll see if I can enact some changes to fix this.
Update
Despite the names, 'typ' and 'schema_type' have two different purposes. 'typ' always tells you the type of a schema instance. 'schema_type' is a method that's called to give a SchemaNode a default type when it's instantiated (so it's called in the __init__ if you don't pass a typ in, but other than that it's not supposed to be used).
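A small illustration of that distinction with plain colander (the schema here is hypothetical):

import colander

class PageSchema(colander.MappingSchema):
    # MappingSchema is just SchemaNode with schema_type = Mapping
    title = colander.SchemaNode(colander.String())

node = PageSchema()
# schema_type() was only consulted in __init__ to build the default typ;
# after instantiation, typ is the authoritative type of the instance
assert isinstance(node.typ, colander.Mapping)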
Cornice has been patched to properly use typ now (though, as of this message, it's not part of the latest release).
For a Django model, I'm using the django-import-export package.
If you need to export more than just the available model fields, such as properties or custom fields, new ones can be added with the import_export.fields.Field class and optionally a dehydrate_<field> method:
from import_export import resources, fields
class ProductResource(resources.ModelResource):
categories = fields.Field()
price = fields.Field(attribute='unit_price')
class Meta:
model = Product
def dehydrate_categories(self, product):
return ';'.join(
'/%s' % '/'.join([c.name for c in cat.parents()] + [cat.name])
for cat in product.category.iterator() )
It works well, but only for exporting. What about importing, the reverse process? Is there some counterpart to the dehydrate_<field> method?
So far I've overridden the get_or_init_instance method:
class ProductResource(resources.ModelResource):
def get_or_init_instance(self, instance_loader, row):
row['unit_price'] = row['price']; row.pop('price')
return super(ProductResource, self).get_or_init_instance(instance_loader, row)
but I doubt this is the right way.
I would appreciate any hint on how to handle imports of custom fields.
You can override import_obj instead. See Import workflow for more details.
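A rough sketch of that approach for the price/unit_price mapping from the question; this assumes the import_obj(self, obj, data, dry_run) signature of older django-import-export releases, so check it against your installed version:

class ProductResource(resources.ModelResource):
    class Meta:
        model = Product

    def import_obj(self, obj, data, dry_run):
        # rename the exported 'price' column back to the model's real
        # attribute before the standard field import runs
        if 'price' in data:
            data['unit_price'] = data['price']
        super(ProductResource, self).import_obj(obj, data, dry_run)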
Another approach is to subclass Field and override its export and save methods, doing all required data manipulation in the field itself.
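A hedged sketch of that second approach (the conversion logic is hypothetical, and the save signature gained extra arguments in later releases):

from import_export import fields

class PriceField(fields.Field):
    def export(self, obj):
        # export path: format the model attribute for the output file
        return str(self.get_value(obj))

    def save(self, obj, data):
        # import path: write the row value onto the model instance
        setattr(obj, self.attribute, data.get(self.column_name))

# declared on the resource, e.g.:
# price = PriceField(attribute='unit_price', column_name='price')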
I know this is very old, but I came across the same problem and this is how I fixed it (based on the direction the original asker was heading).
First, you can add any custom/modified fields you need by overriding the before_import_row method, like so:
def before_import_row(self, row, **kwargs):
row['extra_info'] = 'Some Info'
return super(RetailLocationResource, self).before_import_row(row, **kwargs)
Then you can pass this into your instance by overriding get_or_init_instance like so:
def get_or_init_instance(self, instance_loader, row):
instance, bool = super(RetailLocationResource, self).get_or_init_instance(instance_loader, row)
instance.extra_info = row['extra_info']
return instance, bool
Hope this helps anyone!
I'd like to be able to automatically insert an entity with a reference to another entity directly from a message, using Google Endpoints.
To transmit a ReferenceProperty in a message, I use the encoded string value of the Key. That is fine for sending messages, but when receiving a message and creating an entity from it, I cannot just pass the encoded string as a parameter to the constructor.
For instance, say I have two classes that inherit from BaseModel, which itself inherits from db.Model:
class TestModel2(models.BaseModel):
test_string = db.StringProperty(required=True)
class TestModel(models.BaseModel):
test2 = db.ReferenceProperty(TestModel2)
test2_id = property(models.BaseModel._get_attr_id_builder('test2'),
models.BaseModel._set_attr_id_builder('test2'))
And a message class
class TestModelMessage(messages.Message):
test2_id = messages.StringField(4)
I want to be able to create a TestModel entity directly from the TestModelMessage.
I managed to do it the other way around (from entity to message) using a property. But this direction doesn't work, since the db.Model constructor appears to set only attributes that inherit from db.Property, so the setter for my property is never called.
How could I do this?
I thought of overriding __init__ in BaseModel, but then calling the __init__ of db.Model would probably override the ReferenceProperty.
So, I added a _ref_properties field to the BaseModel class.
For the previous example, it would be _ref_properties = {'test2': 'test2_id'}
Then I added this class method
@classmethod
def from_message(cls, message, *args):
attributes = {attr: getattr(message, attr) for attr in args}
for attribute, property in cls._ref_properties.items():
attributes[attribute] = db.Key(encoded=getattr(message, property))
entity = cls(**attributes)
return entity
And it seems to work, though it's probably not the best approach. Any remarks or better solutions?
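For what it's worth, a hypothetical usage sketch of that classmethod, where message is a received TestModelMessage and TestModel declares _ref_properties = {'test2': 'test2_id'} as described:

entity = TestModel.from_message(message)  # builds the db.Key from test2_id
entity.put()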
Is there a way to create custom methods on the query object so you can do something like this?
User.query.all_active()
Where all_active() is essentially .filter(User.is_active == True)
And be able to filter off of it?
User.query.all_active().filter(User.age == 30)
You can subclass the base Query class to add your own methods:
from sqlalchemy.orm import Query
class MyQuery(Query):
def all_active(self):
return self.filter(User.is_active == True)
You then tell SQLAlchemy to use this new query class when you create the session (see the SQLAlchemy docs on sessionmaker). From your code it looks like you might be using Flask-SQLAlchemy, so you would do it as follows:
db = SQLAlchemy(session_options={'query_cls': MyQuery})
Otherwise you would pass the argument directly to the sessionmaker:
sessionmaker(bind=engine, query_cls=MyQuery)
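A quick usage sketch under those assumptions (engine and the User model are stand-ins for your own):

Session = sessionmaker(bind=engine, query_cls=MyQuery)
session = Session()
# chaining works because all_active() itself returns a query
active_30 = session.query(User).all_active().filter(User.age == 30).all()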
As of right now, this new query object isn't that interesting because we hardcoded the User class in the method, so it won't work for anything else. A better implementation would use the query's underlying class to determine which filter to apply. This is slightly tricky but can be done as well:
from sqlalchemy.orm import Mapper

class MyOtherQuery(Query):
    def _get_models(self):
        """Returns the query's underlying model classes."""
        if hasattr(self, 'attr'):
            # we are dealing with a subquery
            return [self.attr.target_mapper]
        else:
            return [
                d['expr'].class_
                for d in self.column_descriptions
                if isinstance(d['expr'], Mapper)
            ]
def all_active(self):
model_class = self._get_models()[0]
return self.filter(model_class.is_active == True)
Finally, this new query class won't be used by dynamic relationships (if you have any). To let those also use it, you can pass it as argument when you create the relationship:
users = relationship(..., query_class=MyOtherQuery)
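For example, a dynamic relationship wired up this way might look like the following (Group and users are hypothetical names, and Base is your declarative base):

from sqlalchemy import Column, Integer
from sqlalchemy.orm import relationship

class Group(Base):
    __tablename__ = 'groups'
    id = Column(Integer, primary_key=True)
    # lazy='dynamic' exposes the relationship as a query object,
    # and query_class makes that query a MyOtherQuery
    users = relationship('User', lazy='dynamic', query_class=MyOtherQuery)

# afterwards, some_group.users.all_active() works like any other query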
This works for me:

from flask import g
from flask_sqlalchemy import BaseQuery

class ParentQuery(BaseQuery):
    def _get_models(self):
        if hasattr(self, 'attr'):
            return [self.attr.target_mapper]
        else:
            return self._mapper_zero().class_

    def FilterByCustomer(self):
        model_class = self._get_models()
        return self.filter(model_class.customerId == int(g.customer.get('customerId')))

# using it like this
class AccountWorkflowModel(db.Model):
    query_class = ParentQuery
    # ...
To provide a custom method that will be used by all your models that inherit from a particular parent, first, as mentioned before, inherit from the BaseQuery class:
from flask_sqlalchemy import SQLAlchemy, BaseQuery
from sqlalchemy.inspection import inspect
class MyCustomQuery(BaseQuery):
def all_active(self):
# get the class
modelClass = self._mapper_zero().class_
# get the primary key column
ins = inspect(modelClass)
# get a list of passing objects
passingObjs = []
for modelObj in self:
if modelObj.is_active == True:
# add to passing object list
passingObjs.append(modelObj.__dict__[ins.primary_key[0].name])
# change to tuple
passingObjs = tuple(passingObjs)
# run a filter on the query object
return self.filter(ins.primary_key[0].in_(passingObjs))
# add this to the constructor for your DB object
myDB = SQLAlchemy(query_class=MyCustomQuery)
This is for flask-sqlalchemy, for which people will still get here when looking for this answer.
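A one-line usage sketch under the same assumptions (MyModel is a hypothetical model registered on myDB, with an is_active column):

active_rows = MyModel.query.all_active().all()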
What would be the best way to get the latest inserted object using App Engine?
I know in Django this can be done using:
MyObject.objects.latest()
In App Engine I'd like to be able to do this:
class MyObject(db.Model):
time = db.DateTimeProperty(auto_now_add=True)
# Return latest entry from MyObject.
MyObject.all().latest()
Any idea?
Your best bet will be to implement a latest() classmethod directly on MyObject and call it like
latest = MyObject.latest()
Anything else would require monkeypatching the built-in Query class.
Update
I thought I'd see how ugly it would be to implement this functionality. Here's a mixin class you can use if you really want to be able to call MyObject.all().latest():
class LatestMixin(object):
"""A mixin for db.Model objects that will add a `latest` method to the
`Query` object returned by cls.all(). Requires that the ORDER_FIELD
contain the name of the field by which to order the query to determine the
latest object."""
# What field do we order by?
ORDER_FIELD = None
@classmethod
def all(cls):
# Get the real query
q = super(LatestMixin, cls).all()
# Define our custom latest method
def latest():
if cls.ORDER_FIELD is None:
raise ValueError('ORDER_FIELD must be defined')
return q.order('-' + cls.ORDER_FIELD).get()
# Attach it to the query
q.latest = latest
return q
# How to use it
class Foo(LatestMixin, db.Model):
ORDER_FIELD = 'timestamp'
timestamp = db.DateTimeProperty(auto_now_add=True)
latest = Foo.all().latest()
MyObject.all() returns an instance of the Query class.
Order the results by time:
MyObject.all().order('-time')
So, assuming there is at least one entry, you can get the most recent MyObject directly by:
MyObject.all().order('-time')[0]
or
MyObject.all().order('-time').fetch(limit=1)[0]