Retrieve a numeric ID from a Model instance just created - python

From the Google documentation:
"A model instance's key includes the instance's entity kind along with a unique identifier. The identifier may be either a key name string, assigned explicitly by the application when the instance is created, or an integer numeric ID, assigned automatically by App Engine when the instance is written (put) to the Datastore."
So in the example:
name = "John"
idd = 11
person = Person(name, idd)
person.put()
How do I get the "integer numeric ID, assigned automatically by App Engine"?

If you are using ndb, put() returns the new key; call the id() method on that key:
name = "John"
idd = 11
person = Person(name, idd)
new_key = person.put()
auto_assigned_id = new_key.id()
from https://developers.google.com/appengine/docs/python/ndb/entities :
To store the object as a persistent entity in the Datastore, use the
put() method. This returns a key for retrieving the entity from the
Datastore later:
sandy_key = sandy.put()
and:
https://developers.google.com/appengine/docs/python/ndb/keyclass#Key_id
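Putting it together, here is a minimal ndb sketch; the Person model and its single field are illustrative, not taken from the question:
from google.appengine.ext import ndb

class Person(ndb.Model):
    name = ndb.StringProperty()

person = Person(name="John")        # no key name supplied
new_key = person.put()              # Datastore assigns a numeric ID on write
auto_assigned_id = new_key.id()     # the integer ID
# the same ID is also reachable from the entity itself:
assert person.key.id() == auto_assigned_id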

Have you tried
print person.key().id()
or, if you have provided the unique identifier,
print person.key().id_or_name()
Also, the put() method returns the key:
key = person.put()

Related

PonyORM retrieve object from class Entity problem

Let's say I have these two classes:
class TeamMember(db.Entity):
    member_id = PrimaryKey(int, auto=True)
    name = Required(str)
    team = Required('Team')

class Team(db.Entity):
    team_id = PrimaryKey(int, auto=True)
    name = Required(str)
    team_members = Set(TeamMember)
I want to select all TeamMembers that are in a specific team (e.g. team_id == 1). The query would look something like this (C1):
TeamMember.select(lambda member: member.team == 1)[:]
If I write it like that, I get the error below:
Incomparable types 'Team' and 'int' in expression: member.team == 1
On the other hand, I can write this and it will work (C2):
TeamMember.select(lambda member: member.team == Team[1])[:]
But I don't want to write it like that, because I want to create a generic function that will work for every Entity class:
def get_instances_from_db(classname, classname_var, var_value):
    """
    :param classname: name of class
    :param classname_var: name of class variable to search by
    :param var_value: value of class variable
    :return:
    """
    return classname.select(lambda v: getattr(v, classname_var) == var_value)[:]
The method above works for attributes that don't reference another Entity class, for example:
members = get_instances_from_db(TeamMember, "name", "some_team_member_name")
Finally, my question is: is it possible to write the query so that it searches by an integer rather than by an Entity object? Or is there a way to make line 'C1' work?
Hope I'm clear enough! :)
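One possible workaround, sketched under the assumption that the caller can tell the function which Entity class an attribute refers to (the related_class parameter is hypothetical, not part of the question): resolve the integer to the related entity with Pony's Entity[pk] lookup, which is exactly what makes C2 work, and keep the rest of the generic function unchanged.
def get_instances_from_db(classname, classname_var, var_value, related_class=None):
    """Generic lookup; related_class is a hypothetical extra argument used to
    resolve a raw primary key into the related entity before comparing."""
    if related_class is not None and isinstance(var_value, int):
        var_value = related_class[var_value]   # e.g. Team[1], as in C2
    return classname.select(lambda v: getattr(v, classname_var) == var_value)[:]

# usage
members = get_instances_from_db(TeamMember, "team", 1, related_class=Team)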

missing attribute get in datastore's object

This is my GAE datastore:
class Search(ndb.Model):
    city = ndb.StringProperty()
    counter = ndb.IntegerProperty(indexed=True)
    date = ndb.DateTimeProperty(auto_now_add=True)
When I run this part of code:
keys = Search.query(Search.city == city).fetch()
if len(keys) == 0:
    luogo = Search(city=city, counter=1)
    luogo.put()
else:
    for key in keys:
        luogo_1 = key.get()
        luogo_1.counter = luogo_1.counter + 1
        luogo_1.put()
my terminal says that the attribute get is missing on the Search object.
Do you know why?
Running
keys = Search.query(Search.city == city).fetch()
fetches a list of model instances
so
for key in keys:
    luogo_1 = key.get()
fails because instances don't have a get method.
You need to do:
keys = Search.query(Search.city == city).fetch(keys_only=True)
to fetch a list of keys, or treat keys as a list of instances rather than keys, and omit the key.get() call.
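For concreteness, a minimal sketch of the second option, treating the results as entities rather than keys; it reuses the variable names from the question:
entities = Search.query(Search.city == city).fetch()
if not entities:
    Search(city=city, counter=1).put()
else:
    for luogo_1 in entities:        # already entities, so no key.get() needed
        luogo_1.counter += 1
        luogo_1.put()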
Maybe it's because the objects you are iterating over don't have a get() method:
luogo_1 = key.get()
Use Python's dir() function to inspect an object's attributes; it helps me a lot.

"Cascading delete" on GAE, NDB, Python app

I have a GAE app using NDB datastore and python which assigns tasks to employees. I have Task Entities and Employee Entities which have arrays of Tasks (storing the tasks' keys). I am trying to implement a "cascading delete" where I can pass my delete function the key of the task to delete, and have it "cascade" to employee entities to clean up references to that task. Right now my delete task function works fine but it does not cascade correctly. When I delete a task and check out an employee who has been assigned that task, its key value still shows. I would greatly appreciate any pointers anyone can provide!
My entity definitions are in a db_models file, with Task entities (consisting only of name as a string) and Employee entities which have arrays of tasks:
class Employee(ndb.Model):
    name = ndb.StringProperty(required=True)
    title = ndb.StringProperty(required=True)
    tasks = ndb.KeyProperty(repeated=True)

    def to_dict(self):
        d = super(Employee, self).to_dict()
        d['tasks'] = [m.id() for m in d['tasks']]
        return d
My delete function, to which I pass 'did', the ID of the Task entity to delete:
class TaskDelete(webapp2.RequestHandler):
    def get(self, **kwargs):
        if 'application/json' not in self.request.accept:
            webapp2.abort(406, details="Not Acceptable, API only supports application/json MIME type")
            return
        if 'did' in kwargs:
            entity = ndb.Key(db_models.Task, int(kwargs['did'])).delete()
            q = db_models.Employee.query()
            key = q.fetch(keys_only=True)
            for x in key:
                employee = ndb.Key(db_models.Employee, int(x.id())).get()
                for task in employee.tasks:
                    if 'did' == task:
                        task.delete()
                employee.put()
First of all, you are requesting Employees one at a time and that is very slow. Instead of:
q = db_models.Employee.query()
key = q.fetch(keys_only=True)
for x in key:
    employee = ndb.Key(db_models.Employee, int(x.id())).get()
use:
for employee in db_models.Employee.query():
Now you simply need to update your employee.tasks property:
for task in employee.tasks:
    if 'did' == task:
        task.delete()
        employee.tasks.remove(task)  # add this line
        employee.put()
        break  # add this line too
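Note that the comparison above still tests the literal string 'did' against a key, so it will never match. Below is a hedged sketch of the whole deletion step with the deleted task's key compared directly; it uses the same model names as the question but is untested:
task_key = ndb.Key(db_models.Task, int(kwargs['did']))
task_key.delete()
for employee in db_models.Employee.query():
    if task_key in employee.tasks:
        employee.tasks.remove(task_key)   # drop the stale reference
        employee.put()
If there are many employees, the query can be narrowed to only those referencing the task with db_models.Employee.query(db_models.Employee.tasks == task_key).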

Datastore set property name from variable

I have an unknown variable that I want to use as a datastore property name. I'm using Expando because, as I understand it, you can dynamically create properties without first declaring them in the db class. However, I am unable to do this since the property names are not known in advance. I get the error: 'StoreNames' object does not support item assignment. Is there any way around this?
class StoreNames(db.Expando):
    index = db.FloatProperty()

name = "unknown"
value = "something"
store = StoreNames()
store[name] = value
store.index = 0
Solved by using the following code:
class StoreNames(db.Expando):
    index = db.FloatProperty()

name = "unknown"
value = "something"
store = StoreNames()
setattr(store, name, value)
I would have answered earlier, but Stack Overflow wouldn't let me. Thanks, Brent Washburne.
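As a quick illustration of the same idea with several dynamic names at once (the property names and values below are made up):
dynamic_values = {"color": "red", "size": "large"}   # hypothetical data
store = StoreNames()
store.index = 0.0
for prop_name, prop_value in dynamic_values.items():
    setattr(store, prop_name, prop_value)            # Expando accepts new attributes
store.put()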

Copy an entity in Google App Engine datastore in Python without knowing property names at 'compile' time

In a Python Google App Engine app I'm writing, I have an entity stored in the datastore that I need to retrieve, make an exact copy of it (with the exception of the key), and then put this entity back in.
How should I do this? In particular, are there any caveats or tricks I need to be aware of when doing this so that I get a copy of the sort I expect and not something else.
ETA: Well, I tried it out and I did run into problems. I would like to make my copy in such a way that I don't have to know the names of the properties when I write the code. My thinking was to do this:
# theThing = a particular entity we pull from the datastore with model Thing
copyThing = Thing(user=user)
for thingProperty in theThing.properties():
    copyThing.__setattr__(thingProperty[0], thingProperty[1])
This executes without any errors... until I try to pull copyThing from the datastore, at which point I discover that all of the properties are set to None (with the exception of the user and key, obviously). So clearly this code is doing something, since it's replacing the defaults with None (all of the properties have a default value set), but not at all what I want. Suggestions?
Here you go:
def clone_entity(e, **extra_args):
    """Clones an entity, adding or overriding constructor attributes.

    The cloned entity will have exactly the same property values as the original
    entity, except where overridden. By default it will have no parent entity or
    key name, unless supplied.

    Args:
        e: The entity to clone
        extra_args: Keyword arguments to override from the cloned entity and pass
            to the constructor.
    Returns:
        A cloned, possibly modified, copy of entity e.
    """
    klass = e.__class__
    props = dict((k, v.__get__(e, klass)) for k, v in klass.properties().iteritems())
    props.update(extra_args)
    return klass(**props)
Example usage:
b = clone_entity(a)
c = clone_entity(a, key_name='foo')
d = clone_entity(a, parent=a.key().parent())
EDIT: Changes if using NDB
Combining Gus' comment below with a fix for properties that specify a different datastore name, the following code works for NDB:
def clone_entity(e, **extra_args):
    klass = e.__class__
    props = dict((v._code_name, v.__get__(e, klass)) for v in klass._properties.itervalues() if type(v) is not ndb.ComputedProperty)
    props.update(extra_args)
    return klass(**props)
Example usage (note key_name becomes id in NDB):
b = clone_entity(a, id='new_id_here')
Side note: see the use of _code_name to get the Python-friendly property name. Without this, a property like name = ndb.StringProperty('n') would cause the model constructor to raise an AttributeError: type object 'foo' has no attribute 'n'.
If you're using NDB, you can simply copy with:
new_entity.populate(**old_entity.to_dict())
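For example (the model name and ID here are illustrative only; also note that to_dict() includes ComputedProperty values, which populate() cannot assign, so exclude those if your model has any):
old_entity = MyModel.get_by_id(1234)           # hypothetical model and ID
new_entity = MyModel()
new_entity.populate(**old_entity.to_dict())    # copies the plain property values
new_entity.put()                               # stored under a new, auto-assigned key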
This is just an extension to Nick Johnson's excellent code to address the problems highlighted by Amir in the comments:
1. The db.Key value of the ReferenceProperty is no longer retrieved via an unnecessary roundtrip to the datastore.
2. You can now specify whether you want to skip DateTime properties with the auto_now and/or auto_now_add flag.
Here's the updated code:
def clone_entity(e, skip_auto_now=False, skip_auto_now_add=False, **extra_args):
    """Clones an entity, adding or overriding constructor attributes.

    The cloned entity will have exactly the same property values as the original
    entity, except where overridden. By default it will have no parent entity or
    key name, unless supplied.

    Args:
        e: The entity to clone
        skip_auto_now: If True, all DateTimeProperty properties which have the 'auto_now' flag set to True will be skipped
        skip_auto_now_add: If True, all DateTimeProperty properties which have the 'auto_now_add' flag set to True will be skipped
        extra_args: Keyword arguments to override from the cloned entity and pass
            to the constructor.
    Returns:
        A cloned, possibly modified, copy of entity e.
    """
    klass = e.__class__
    props = {}
    for k, v in klass.properties().iteritems():
        if not (type(v) == db.DateTimeProperty and ((skip_auto_now and getattr(v, 'auto_now')) or (skip_auto_now_add and getattr(v, 'auto_now_add')))):
            if type(v) == db.ReferenceProperty:
                value = getattr(klass, k).get_value_for_datastore(e)
            else:
                value = v.__get__(e, klass)
            props[k] = value
    props.update(extra_args)
    return klass(**props)
The first if expression is not very elegant, so I'd appreciate it if you can share a better way to write it.
I'm neither a Python nor an App Engine guru, but couldn't one dynamically get/set the properties?
props = {}
for p in Thing.properties():
    props[p] = getattr(old_thing, p)
new_thing = Thing(**props).put()
A variation inspired by Nick's answer, which handles the case in which your entity has a (repeated) StructuredProperty, where the StructuredProperty itself has ComputedProperties. It could probably be written more tersely with a dict comprehension somehow, but here is the longer version that worked for me:
def removeComputedProps(klass, oldDicc):
    dicc = {}
    for key, propertType in klass._properties.iteritems():
        if type(propertType) is ndb.StructuredProperty:
            purged = []
            for item in oldDicc[key]:
                purged.append(removeComputedProps(propertType._modelclass, item))
            dicc[key] = purged
        else:
            if type(propertType) is not ndb.ComputedProperty:
                dicc[key] = oldDicc[key]
    return dicc

def cloneEntity(entity):
    oldDicc = entity.to_dict()
    klass = entity.__class__
    dicc = removeComputedProps(klass, oldDicc)
    return klass(**dicc)
This can be tricky if you've renamed the underlying keys for your properties... which some people opt to do instead of making mass data changes
say you started with this:
class Person(ndb.Model):
    fname = ndb.StringProperty()
    lname = ndb.StringProperty()
then one day you really decided that it would be nicer to use first_name and last_name instead... so you do this:
class Person(ndb.Model):
    first_name = ndb.StringProperty(name="fname")
    last_name = ndb.StringProperty(name="lname")
Now when you look at Person._properties (or .properties(), or person_instance._properties) you will get a dictionary whose keys match the underlying names (fname and lname) rather than the actual property names on the class, so it won't work to pass them to the constructor of a new instance or to the .populate() method (the examples above will break).
In NDB, at least, model instances have a ._values dictionary keyed by the underlying property names, and you can update it directly. I ended up with something like this:
def clone(entity, **extra_args):
    klass = entity.__class__
    clone = klass(**extra_args)
    original_values = dict((k, v) for k, v in entity._values.iteritems() if k not in clone._values)
    clone._values.update(original_values)
    return clone
This isn't really the safest way, as there are other private helper methods that do more work (like validation and conversion of computed properties via _store_value() and _retrieve_value()), but if your models are simple enough, and you like living on the edge :)
Here's the code provided by @zengabor with the if expression formatted for easier reading. It may not be PEP-8 compliant:
klass = e.__class__
props = {}
for k, v in klass.properties().iteritems():
    if not (type(v) == db.DateTimeProperty and ((
            skip_auto_now     and getattr(v, 'auto_now'    )) or (
            skip_auto_now_add and getattr(v, 'auto_now_add')))):
        if type(v) == db.ReferenceProperty:
            value = getattr(klass, k).get_value_for_datastore(e)
        else:
            value = v.__get__(e, klass)
        props[k] = value
props.update(extra_args)
return klass(**props)
