POST List of Objects w/ endpoints-proto-datastore - python

tl;dr: is it possible, with endpoints-proto-datastore, to receive a list of objects from a POST and insert them into the datastore?
Following the samples, while building my API I couldn't figure out how to let users POST a list of objects, so that I could be more efficient about putting a bunch of data into the datastore with ndb.put_multi, for example.
From this comment in endpoints_proto_datastore.ndb.model I imagine that it is not possible with the way it is designed. Am I right, or am I missing something?
Extending the sample provided by endpoints, I achieved the desired result with:
import endpoints
from protorpc import messages

class Greeting(messages.Message):
    message = messages.StringField(1)

class GreetingCollection(messages.Message):
    items = messages.MessageField(Greeting, 1, repeated=True)

# then inside the endpoints.api class
@endpoints.method(GreetingCollection, GreetingCollection,
                  path='hellogretting', http_method='POST',
                  name='greetings.postGreeting')
def greetings_post(self, request):
    result = [item for item in request.items]
    return GreetingCollection(items=result)
-- edit --

See the docs about POSTing into the datastore; your only issue is that your models aren't EndpointsModels. Instead, define a datastore model for both your Greeting and GreetingCollection:
from google.appengine.ext import ndb
from endpoints_proto_datastore.ndb import EndpointsModel

class Greeting(EndpointsModel):
    message = ndb.StringProperty()

class GreetingCollection(EndpointsModel):
    items = ndb.StructuredProperty(Greeting, repeated=True)
Once you've done this, you can use
class MyApi(remote.Service):
    # ...
    @GreetingCollection.method(path='hellogretting', http_method='POST',
                               name='greetings.postGreeting')
    def greetings_post(self, my_collection):
        ndb.put_multi(my_collection.items)
        return my_collection
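For reference, here is a rough sketch of the request body such a method would accept, written as a Python dict; the field names come from the GreetingCollection model above, while the values are just made-up examples:
# Sketch: shape of the JSON body POSTed to greetings.postGreeting,
# expressed as a Python dict (values are illustrative only).
sample_request_body = {
    'items': [
        {'message': 'Hello'},
        {'message': 'World'},
    ],
}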

Related

Update multiple fields on Google NDB entity

Working with Google App Engine for Python, I am trying to create and then update an ndb entity. To update a single property, you can just access the property using a dot, e.g.
post.body = body
But I would like to know if there is a simple way to update multiple fields within an ndb entity. The following code:
class Create(Handler):
    def post(self):
        ## code to get params
        post = Post(author=author,
                    title=title,
                    body=body)
        post.put()

class Update(Handler):
    def post(self, post_id):
        post = Post.get_by_id(int(post_id))
        fields = ['author', 'title', 'body']
        data = get_params(self.request, fields)
        for field in fields:
            post[field] = data[field]
        post.put()
The "Create" handler works fine, but the "Update" handler results in:
TypeError: 'Post' object does not support item assignment
So it seems I would need to access the properties using a dot, but that is not going to work when I have a list of properties I want to access.
Can someone provide an alternative way to update multiple properties of an NDB entity after it has been created?
You should use setattr.
for field in fields:
    setattr(post, field, data[field])
(Note that GAE objects do actually provide a hidden way of updating them via a dict, but you should use the public interface.)
You can use the populate method:
post.populate(**data)
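Putting either fix into the question's Update handler, a minimal sketch (reusing the get_params helper from the question) would look like this:
class Update(Handler):
    def post(self, post_id):
        post = Post.get_by_id(int(post_id))
        fields = ['author', 'title', 'body']
        data = get_params(self.request, fields)
        # populate() sets several properties in one call; a setattr() loop
        # over the fields works just as well.
        post.populate(**data)
        post.put()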

1 to many relationship in Parse.com using Python Restful API

I am using Parse to build my database.
I have two tables: Article and Comment, where an Article has one or many Comments.
I am using the Parse RESTful API wrapper ParsePy to add items:
from parse_rest.datatypes import Object

class Article(Object):
    pass

class Comment(Object):
    pass

articleItem = Article(title='Test', author='John Doe')
articleItem.save()  # we have to save it before it can be referenced
I don't know how to achieve this 1:N relationship. Can anyone show a way to make this work?
from parse_rest.datatypes import Object

class Article(Object):
    pass

class Comment(Object):
    pass

articleItem = Article(title='Test', author='John Doe')

# for example, you get the data from JSON
comments = []
for data in incoming_json['comments']:
    # Note: a one-liner like
    # [Comment(**data).save() for data in incoming_json['comments']]
    # didn't work for me, so build the list explicitly.
    comment = Comment(**data)
    comment.save()
    comments.append(comment)

articleItem.comments = comments
articleItem.save()  # save the article with its list of comments attached
If you then query them back, you will receive something like:
[<Comment:v6PvbsPbQT>, <Comment:V5JpqPRuS6>, <Comment:HNocXqmUeJ>]

Django Rest Framework - Read nested data, write integer

So far I'm extremely happy with Django Rest Framework, which is why I almost can't believe there's such a large omission in the codebase. Hopefully someone knows of a way to support this:
class PinSerializer(serializers.ModelSerializer):
    item = ItemSerializer(read_only=True, source='item')
    item = serializers.IntegerSerializer(write_only=True)

    class Meta:
        model = Pin
The goal here is to read:
{"pin": {"item": {"name": "a", "url": "b"}}}
but to write using an id:
{"pin": {"item": 10}}
An alternative would be to use two serializers, but that looks like a really ugly solution:
django rest framework model serializers - read nested, write flat
Django lets you access the Item on your Pin with the item attribute, but actually stores the relationship as item_id. You can use this strategy in your serializer to get around the fact that a Python object cannot have two attributes with the same name (a problem you would encounter in your code).
The best way to do this is to use a PrimaryKeyRelatedField with a source argument. This will ensure proper validation gets done, converting "item_id": <id> to "item": <instance> during field validation (immediately before the serializer's validate call). This allows you to manipulate the full object during validate, create, and update methods. Your final code would be:
class PinSerializer(serializers.ModelSerializer):
    item = ItemSerializer(read_only=True)
    item_id = serializers.PrimaryKeyRelatedField(write_only=True,
                                                 source='item',
                                                 queryset=Item.objects.all())

    class Meta:
        model = Pin
        fields = ('id', 'item', 'item_id',)
Note 1: I also removed source='item' on the read-field as that was redundant.
Note 2: I actually find it rather unintuitive that Django Rest is set up such that a Pin serializer without an Item serializer specified returns the item_id as "item": <id> and not "item_id": <id>, but that is beside the point.
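For illustration, here is a rough sketch of how writes and reads flow through this serializer; the pk values and the output dict are made-up examples, assuming an Item with pk 10 already exists:
# Write: the client only sends the related object's id.
serializer = PinSerializer(data={'item_id': 10})
serializer.is_valid(raise_exception=True)
# After validation, validated_data holds the Item instance under 'item'.
pin = serializer.save()

# Read: the nested representation comes back via ItemSerializer
# (item_id is write_only, so it is not included in the output).
PinSerializer(pin).data
# e.g. {'id': 1, 'item': {'name': 'a', 'url': 'b'}}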
This method can even be used with forward and reverse "Many" relationships. For example, you can use an array of pin_ids to set all the Pins on an Item with the following code:
class ItemSerializer(serializers.ModelSerializer):
    pins = PinSerializer(many=True, read_only=True)
    pin_ids = serializers.PrimaryKeyRelatedField(many=True,
                                                 write_only=True,
                                                 source='pins',
                                                 queryset=Pin.objects.all())

    class Meta:
        model = Item
        fields = ('id', 'pins', 'pin_ids',)
Another strategy that I previously recommended is to use an IntegerField to set the item_id directly. Assuming you are using a OneToOneField or ForeignKey to relate your Pin to your Item, you can set item_id to an integer without using the item field at all. This weakens the validation and can result in DB-level errors from constraints being violated. If you want to skip the validation DB call, have a specific need for the ID instead of the object in your validate/create/update code, or need simultaneously writable fields with the same source, this may be better, but I wouldn't recommend it anymore. The full line would be:
item_id = serializers.IntegerField(write_only=True)
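Seen in context, a minimal sketch of that variant (same Pin and Item models as above) might look like this:
class PinSerializer(serializers.ModelSerializer):
    item = ItemSerializer(read_only=True)
    # Writes go straight to the item_id column; no existence check is done,
    # so a bad id only surfaces as a DB integrity error on save.
    item_id = serializers.IntegerField(write_only=True)

    class Meta:
        model = Pin
        fields = ('id', 'item', 'item_id',)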
If you are using DRF 3.0 you can implement the new to_internal_value method to swap the nested item field for a PrimaryKeyRelatedField on writes, allowing flat writes. The to_internal_value method takes unvalidated incoming data as input and should return the validated data that will be made available as serializer.validated_data. See the docs: http://www.django-rest-framework.org/api-guide/serializers/#to_internal_valueself-data
So in your case it would be:
class ItemSerializer(ModelSerializer):
    class Meta:
        model = Item

class PinSerializer(ModelSerializer):
    item = ItemSerializer()

    # override the nested item field with a PrimaryKeyRelatedField on writes
    def to_internal_value(self, data):
        self.fields['item'] = serializers.PrimaryKeyRelatedField(queryset=Item.objects.all())
        return super(PinSerializer, self).to_internal_value(data)

    class Meta:
        model = Pin
Two things to note: the browsable web API will still think that writes are nested. I'm not sure how to fix that, but I only use the web interface for debugging, so it's not a big deal. Also, after a write the returned item will be flat instead of nested. To fix that, you can add this code to force reads to always use the Item serializer:
def to_representation(self, obj):
    self.fields['item'] = ItemSerializer()
    return super(PinSerializer, self).to_representation(obj)
I got the idea from this from Anton Dmitrievsky's answer here: DRF: Simple foreign key assignment with nested serializers?
You can create a customized serializer field (http://www.django-rest-framework.org/api-guide/fields).
This example is taken from that link:
class ColourField(serializers.WritableField):
    """
    Color objects are serialized into "rgb(#, #, #)" notation.
    """
    def to_native(self, obj):
        return "rgb(%d, %d, %d)" % (obj.red, obj.green, obj.blue)

    def from_native(self, data):
        data = data.strip('rgb(').rstrip(')')
        red, green, blue = [int(col) for col in data.split(',')]
        return Color(red, green, blue)
Then use this field in your serializer class.
I created a field type that tries to solve this problem: save requests take the ForeignKey as an integer, while read requests return the nested data.
This is the class:
class NestedRelatedField(serializers.PrimaryKeyRelatedField):
    """
    Identical to PrimaryKeyRelatedField, but its
    representation will be nested and its input will
    be a primary key.
    """
    def __init__(self, **kwargs):
        self.pk_field = kwargs.pop('pk_field', None)
        self.model = kwargs.pop('model', None)
        self.serializer_class = kwargs.pop('serializer_class', None)
        super().__init__(**kwargs)

    def to_representation(self, data):
        pk = super(NestedRelatedField, self).to_representation(data)
        try:
            return self.serializer_class(self.model.objects.get(pk=pk)).data
        except self.model.DoesNotExist:
            return None

    def to_internal_value(self, data):
        return serializers.PrimaryKeyRelatedField.to_internal_value(self, data)
And so it would be used:
class PostModelSerializer(serializers.ModelSerializer):
    message = NestedRelatedField(
        queryset=MessagePrefix.objects.all(),
        model=MessagePrefix,
        serializer_class=MessagePrefixModelSerializer
    )
I hope this helps you.

Returning extended fields in JSON

I have two tables (Ingredient_Step and Ingredient) in a relation, as you can see below:
models.py
class Ingredient_Step(models.Model):
    ingredient = models.ForeignKey(Ingredient)
    Step = models.ForeignKey(Step)

    def __unicode__(self):
        return u'{}'.format(self.Step)

class Ingredient(models.Model):
    IngredientName = models.CharField(max_length=200, unique=True)
    Picture = models.ImageField(upload_to='Ingredient')

    def __unicode__(self):
        return u'{}'.format(self.IngredientName)
In a view, I need to serialize to JSON the results of a query on Ingredient_Step, but I also need to send the field IngredientName, which comes from the Ingredient table.
I tried using "ingredient__IngredientName" but it fails.
views.py:
def IngredientByStep(request):
    if request.is_ajax() and request.GET and 'id_Step' in request.GET:
        if request.GET["id_Step"] != '':
            IngStp = Ingredient_Step.objects.filter(Step=request.GET["id_Step"])
            return JSONResponse(serializers.serialize('json', IngStp, fields=('pk', 'ingredient__IngredientName')))
How can I include fields from a related model in the serialized output?
Thanks
This "feature" of Django (and many ORM's like SQLAlchemy) are called Lazy Loading, meaning data is only loaded from related models if you specifically ask for them. In this case, build your IngStp as a list of results, and make sure to access the property for each result before serializing.
Here's an example of how to do that: Django: Include related models in JSON string?
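As a minimal sketch of that idea, using the question's models and its JSONResponse helper (json.dumps replaces serializers.serialize, since we now serialize plain dicts rather than model instances):
import json

def IngredientByStep(request):
    if request.is_ajax() and request.GET.get('id_Step'):
        ing_steps = Ingredient_Step.objects.filter(Step=request.GET['id_Step'])
        # Touching ing_step.ingredient loads the related Ingredient row,
        # so its name can be included in the output.
        data = [
            {'pk': ing_step.pk,
             'IngredientName': ing_step.ingredient.IngredientName}
            for ing_step in ing_steps
        ]
        # Alternative: ing_steps.values('pk', 'ingredient__IngredientName')
        # fetches both columns in a single query.
        return JSONResponse(json.dumps(data))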

Dereference Models from many to many relationship

In my schema, as described in the below test data generation example, I want to know a good way to:
Dereference all instances of Favourites that have reference keys to instances of Pictures that have been deleted. Just delete any Favourite that links to a deleted picture.
The Person class is a user
The Picture class is something that can be a Favourite
The Favourite class is an example of the Link-Model way of having many-to-many relationships.
Why this question?
First, I hope it doesn't fall out of scope here; second, because this can happen; and third, because it's interesting.
How?
Let's say that a person can have up to thousands of favourites, something like Likes on social networks or, to make it worse, orders, accounts or invalid data in a scientific application.
In our example, for some reason (and these reasons happen), a person has a lot of dead favourite links, or at least I know that there are dead favourites.
What would be a good way to do this, reducing ndb.get() operations and not iterating through every Favourite?
Let's not complicate things. Let's assume that we have only one user suffering from dead favourites. He is a Person with a stubbed user_id property of '123'.
In the following example you can use these handlers and their corresponding functions.
import time
import sys
import logging
import random
import cgi
import webapp2
from google.appengine.ext import ndb

class Person(ndb.Expando):
    pass

class Picture(ndb.Expando):
    pass

class Favourite(ndb.Expando):
    user_id = ndb.StringProperty(required=True)
    #picture = ndb.KeyProperty(kind=Picture, required=True)
    pass

class GenerateDataHandler(webapp2.RequestHandler):
    def get(self):
        try:
            number_of_models = abs(int(cgi.escape(self.request.get('n'))))
        except:
            number_of_models = 10
            logging.info("GET ?n=parameter not defined. Using default.")
            pass
        user_id = '123'  # stub
        person = Person.query().filter(ndb.GenericProperty('user_id') == user_id).get()
        if not person:
            person = Person()
            person.user_id = user_id  # stub
            person.put()
            logging.info("Created Person instance")
        if not self._gen_data(person, number_of_models):
            return
        self.response.write("Data generated successfully")

    def _gen_data(self, person, number_of_models):
        first, last = Picture.allocate_ids(number_of_models)
        picture_keys = [ndb.Key(Picture, id) for id in range(first, last + 1)]
        pictures = []
        favourites = []
        for picture_key in picture_keys:
            picture = Picture(key=picture_key)
            pictures.append(picture)
            favourite = Favourite(parent=person.key,
                                  user_id=person.user_id,
                                  picture=picture_key)
            favourites.append(favourite)
        entities = favourites
        entities[1:1] = pictures
        ndb.put_multi(entities)
        return True

class CorruptDataHandler(webapp2.RequestHandler):
    def get(self):
        if not self._corrupt_data(0.5):  # 50% corruption
            return
        self.response.write("Data corruption completed successfully")

    def _corrupt_data(self, n):
        picture_keys = Picture.query().fetch(99999, keys_only=True)
        random_picture_keys = random.sample(picture_keys, int(float(len(picture_keys)) * n))
        ndb.delete_multi(random_picture_keys)
        return True

class FixDataHandler(webapp2.RequestHandler):
    def get(self):
        user_id = '123'  # stub
        person = Person.query().filter(ndb.GenericProperty('user_id') == user_id).get()
        self._dereference(person)

    def _dereference(self, person):
        # Here is where you implement your answer
        pass
Separate handlers are used because of eventual consistency in the NDB Datastore. More info:
GAE put_multi() entities using backend NDB
Of course I am posting an answer as well to show that I tried something before posting this.
A ReferenceProperty is just a key, so if you have the key of the deleted Picture, you can use that to query the Favourites.
Otherwise, there's no easy way. You'll have to filter through all Favourites and find ones that have an invalid Picture. It's very simple in a mapreduce job, but could be an expensive query if you have a lot of Favourites.
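As a rough sketch of that filtering approach, restricted to the single Person from the question and batching the picture lookups so there is one get_multi call rather than one get() per Favourite:
def _dereference(self, person):
    # Fetch this person's favourites and look up all referenced pictures
    # in a single batch.
    favourites = Favourite.query(ancestor=person.key).fetch()
    picture_keys = [favourite.picture for favourite in favourites]
    pictures = ndb.get_multi(picture_keys)
    # get_multi returns None for deleted pictures; drop those favourites.
    dead_keys = [favourite.key
                 for favourite, picture in zip(favourites, pictures)
                 if picture is None]
    ndb.delete_multi(dead_keys)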
You could use a pre-delete hook (look here for a way to implement it).
Of course this could be done more easily if you use the NDB API instead of the Datastore API (hooks on NDB), but then you'll have to change the way you make the references.
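For illustration, a minimal sketch of such an NDB hook on the question's Picture model (querying via GenericProperty since Favourite.picture is not declared), so that dead favourites are cleaned up at the moment a picture is deleted:
class Picture(ndb.Expando):
    @classmethod
    def _pre_delete_hook(cls, key):
        # Runs before a Picture key is deleted: remove every Favourite
        # that still references it.
        dead_keys = Favourite.query(
            ndb.GenericProperty('picture') == key).fetch(keys_only=True)
        ndb.delete_multi(dead_keys)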
