Is it possible to store a mongo_id as an ObjectId object in a MongoAlchemy field? I've been able to store an ObjectId inside of a document I defined manually, but it seems as though I'm restricted to storing the string value of the id in the context of the MongoAlchemy ORM.
Here's some of my code:
class Group(db.Document):
    name = db.StringField()
    trial_id = db.StringField(required=False)
    participants = db.ListField(
        db.DictField(db.AnythingField()), default_empty=True, required=False)

    def add_participant(self, participant):
        self.participants.append({
            'participant_id': participant.mongo_id,
            'start': datetime.utcnow(),
        })
class Trial(db.Document):
    name = db.StringField()
    groups = db.ListField(
        db.DocumentField(Group), default_empty=True, required=False)

    def add_group(self, group):
        group.trial_id = str(self.mongo_id)
        group.save()

    def get_group(self, group):
        return Group.query.filter(
            {'name': group, 'trial_id': str(self.mongo_id)}).first()
You'll see that I'm able to store a mongo_id as an ObjectId object in the Group method add_participant (since it creates the document manually, not through the MongoAlchemy ORM), but I'm forced to convert the mongo_id to a string in order to store it in a db.StringField.
I tried storing the original ObjectId in a db.AnythingField, but was then unable to filter by it.
Does anyone know if it's possible to store an ObjectId in a MongoAlchemy field and then filter by it in a database query?
Thank you!
You want an ObjectIdField: http://www.mongoalchemy.org/api/schema/fields.html#mongoalchemy.fields.ObjectIdField
This is the type of field used for mongo_id itself (although that one is special-cased).
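For example, the asker's Group/Trial pair could store the parent id as a real ObjectId and filter on it directly. A minimal sketch (the models and the dict-style filter are the asker's; treating ObjectIdField as a drop-in replacement for the StringField is the assumption here):

class Group(db.Document):
    name = db.StringField()
    trial_id = db.ObjectIdField(required=False)  # was db.StringField()

class Trial(db.Document):
    name = db.StringField()

    def add_group(self, group):
        group.trial_id = self.mongo_id  # store the ObjectId as-is, no str()
        group.save()

    def get_group(self, group):
        return Group.query.filter(
            {'name': group, 'trial_id': self.mongo_id}).first()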
Try:
    id = db.ObjectIdField().gen()
This would automatically generate the ObjectId for each instance of the MongoDB object/document, much as auto-generated ids do in relational databases.
I'm wondering whether it's possible to query MongoDB documents by computed properties using mongoengine in Python.
Currently, my model looks like this:
class SnapshotIndicatorKeyValue(db.Document):
    meta = {"collection": "snapshot_indicator_key_values"}

    snapshot_id = db.ObjectIdField(nullable=False)
    indicator_key_id = db.ObjectIdField(nullable=False)
    value = db.FloatField(nullable=False)
    created_at = db.DateTimeField()
    updated_at = db.DateTimeField()

    @property
    def snapshot(self):
        return Snapshot.objects(id=self.snapshot_id).first()

    @property
    def indicator_key(self):
        return IndicatorKey.objects(id=self.indicator_key_id).first()
When I do, for example, SnapshotIndicatorKeyValue.objects().first().snapshot, I can access the snapshot property.
But when I try to query it, it doesn't work. For example:
SnapshotIndicatorKeyValue.objects(snapshot__date_time__lte=current_date_time)
I get the error `mongoengine.errors.InvalidQueryError: Cannot resolve field "snapshot"`.
Is there any way to get this working with queries?
I need to query SnapshotIndicatorKeyValue based on a property of snapshot.
In order to query the snapshot property directly through mongoengine, you can reference the related snapshot object rather than the snapshot_id in your SnapshotIndicatorKeyValue document definition.
An amended model using a Reference field would be like this:
from mongoengine import Document, ReferenceField
class Snapshot(Document):
    property_abc = RelevantPropertyHere()  # anything you need

class SnapshotIndicatorKeyValue(Document):
    snapshot = ReferenceField(Snapshot)
You would then successively save an instance of Snapshot and an instance of SnapshotIndicatorKeyValue like this:
sample_snapshot = Snapshot(property_abc=relevant_value_here) # anything you need
sample_snapshot.save()
sample_indicatorkeyvalue = SnapshotIndicatorKeyValue()
sample_indicatorkeyvalue.snapshot = sample_snapshot
sample_indicatorkeyvalue.save()
You can then refer to any of the snapshot's properties through:
SnapshotIndicatorKeyValue.objects.first().snapshot.property_abc
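Note that a ReferenceField gives you attribute access but not server-side joins, so to filter SnapshotIndicatorKeyValue by a Snapshot attribute you would still query in two steps. A sketch, assuming Snapshot has the date_time field implied by the question:

# Find the matching snapshots first, then filter by reference.
matching = Snapshot.objects(date_time__lte=current_date_time)
results = SnapshotIndicatorKeyValue.objects(snapshot__in=matching)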
Hello community,
I want to insert a mongoengine document into a specific collection.
I know there is the save method on the document. However, that requires using the connect method from mongoengine, which I don't want to use. Instead, I need to pass in the collection and save the document to that collection. Below is some example code which illustrates my problem.
Is there a way to do this?
Thanks in advance!
from mongoengine import Document, StringField
from pymongo import MongoClient
from pymongo.collection import Collection

# document which must be stored in the measurements collection
class Example(Document):
    value = StringField()

def insert(value: str, collection: Collection):
    # create document
    example = Example(value=value)
    # TODO: insert document in specified collection

# connect to db
client = MongoClient('mongodb://localhost:27017/')
db = client.test
collection: Collection = db["measurements"]

# insert document into collection
insert(value="123", collection=collection)
I would not recommend building a complex application on this pattern (mixing MongoEngine and pymongo), but to answer your question, you can achieve what you want with:
def insert(value: str, collection: Collection):
    example = Example(value=value)
    # to_mongo() serializes the document into a BSON-ready SON dict
    bson_data = example.to_mongo()
    collection.insert_one(bson_data)
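One caveat: because this bypasses save(), MongoEngine's validation never runs. If you want it, the document's validate() method can be called explicitly before the insert:

def insert(value: str, collection: Collection):
    example = Example(value=value)
    example.validate()  # save() would normally run this for you
    collection.insert_one(example.to_mongo())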
Working with Google App Engine for Python, I am trying to create and then update an ndb entity. To update a single property, you can just access the property using a dot, e.g.
post.body = body
But I would like to know if there is a simple way to update multiple fields within an ndb entity. The following code:
class Create(Handler):
    def post(self):
        ## code to get params
        post = Post(author=author,
                    title=title,
                    body=body)
        post.put()

class Update(Handler):
    def post(self, post_id):
        post = Post.get_by_id(int(post_id))
        fields = ['author', 'title', 'body']
        data = get_params(self.request, fields)
        for field in fields:
            post[field] = data[field]
        post.put()
The "Create" handler works fine, but the "Update" handler results in:
TypeError: 'Post' object does not support item assignment
So it seems I would need to access the properties using a dot, but that is not going to work when I have a list of properties I want to access.
Can someone provide an alternative way to update multiple properties of an NDB entity after it has been created?
You should use setattr.
for field in fields:
    setattr(post, field, data[field])
(Note that GAE objects do actually provide a hidden way of updating them via a dict, but you should use the public interface.)
You can use the populate method:
post.populate(**data)
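populate() assigns several properties in one call (and raises for names that aren't defined properties), so the Update handler from the question could be written as this sketch, reusing the question's names:

class Update(Handler):
    def post(self, post_id):
        post = Post.get_by_id(int(post_id))
        data = get_params(self.request, ['author', 'title', 'body'])
        post.populate(**data)  # sets all matching properties at once
        post.put()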
I have a peewee model like so:
class User(peewee.Model):
    name = peewee.CharField(unique=True)
    some_json_data = peewee.CharField()
    requested_at = peewee.DateTimeField(default=datetime.now)  # pass the callable, not datetime.now()
I know that peewee doesn't support a JSONField for a MySQL DB, but I thought that if I could just convert the data to string form and save it to the db, I could retrieve it as-is.
Let's say, for example, this is the JSON data that I am writing to the DB:
[
    {
        'name': 'abcdef',
        'address': 'abcdef',
        'lat': 43176757,
        'lng': 42225601
    }
]
When I fetch this (JSONField) data, the output is like so:
u'[{u\'name\': u\'abcdef\', u\'address\': u\'abcdef\', u\'lat\': 43176757, u\'lng\': 42225601\'}]'
Trying a simplejson load of this is giving me an error like so:
JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 3 (char 2)
I've tried json dumps of the data before writing it to the DB to see if that would work, but still no luck.
I am looking for a solution that uses peewee's custom field options, and I want to stick with MySQL. Can someone guide me?
What's probably happening in your code is Peewee is calling str() (or unicode()) on the value instead of dumping it to JSON, so the Python string representation is being saved to the database. To do JSON manually, just import json and then call json.dumps(obj) when you're setting the field and json.loads(db_value) when you fetch the field.
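A minimal sketch of that manual round-trip, using the asker's User model and a hypothetical places value:

import json

places = [{'name': 'abcdef', 'address': 'abcdef', 'lat': 43176757, 'lng': 42225601}]
User.create(name='alice', some_json_data=json.dumps(places))

row = User.get(User.name == 'alice')
places_back = json.loads(row.some_json_data)  # back to a Python list of dicts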
It looks like there's a Peewee playhouse extension for certain databases (SQLite, PostgreSQL?) that defines a JSONField type -- see the JSONField docs.
Alternatively, I don't think it'd be hard to define a custom JSONField type which does the json loads/dumps automatically. There's a simple example of this in playhouse/kv.py:
class JSONField(TextField):
    def db_value(self, value):
        return json.dumps(value)

    def python_value(self, value):
        if value is not None:
            return json.loads(value)
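Since the asker wants to stay on MySQL, this custom field can simply replace the CharField in the model. A sketch (the database name and connection settings are placeholders):

import json

from peewee import Model, CharField, TextField, MySQLDatabase

db = MySQLDatabase('mydb', user='root', password='')  # hypothetical settings

class JSONField(TextField):
    def db_value(self, value):
        return json.dumps(value)

    def python_value(self, value):
        if value is not None:
            return json.loads(value)

class User(Model):
    name = CharField(unique=True)
    some_json_data = JSONField()  # stored as TEXT, converted automatically

    class Meta:
        database = db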
Why not use the JSONField from Peewee's playhouse?
from playhouse.sqlite_ext import *

db = SqliteExtDatabase(':memory:')

class KV(Model):
    key = TextField()
    value = JSONField()

    class Meta:
        database = db

KV.create_table()
It takes care of converting Python objects to JSON, and vice versa:
KV.create(key='a', value={'k1': 'v1'})
KV.get(KV.key == 'a').value # print {'k1': 'v1'}
You can query using the JSON keys:
KV.get(KV.value['k1'] == 'v1').key # print 'a'
You can easily update JSON keys:
KV.update(value=KV.value.update({'k2': 'v2', 'k3': 'v3'})).execute() # add keys to JSON
I have a mongoengine schema like this:
class Page(Document):
    title = StringField(max_length=200, required=True)
    date_modified = DateTimeField(default=datetime.datetime.now)
    meta = {"db_alias": "page", "collection": "page_detail"}
As you can see, my collection will be saved with the name "page_detail".
So my problem is this:
I already have data in my database, but some records are useless and I need to filter them out. I want to save the filtered data in another collection with this same schema. What options do I have?
You can use the aggregation framework with the $out operator, which stores the result of your query in the new collection named in $out.
db.yourOldCollection.aggregate([
    <your filtering pipeline, e.g. $match>,
    {$out: "yourNewCollection"}
])
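Since the schema here is mongoengine, the same pipeline can also be run from Python. A sketch (the $match criteria and target collection name are placeholders; _get_collection() is mongoengine's underscored accessor for the underlying pymongo collection):

# Copy only the records you want to keep into a new collection.
Page._get_collection().aggregate([
    {"$match": {"title": {"$exists": True}}},  # your filtering criteria here
    {"$out": "page_detail_filtered"},          # hypothetical target name
])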