App Engine - Specify entity name at runtime - python

In the low-level Java API, I used to be able to make up entity names at runtime.
In Python, it seems that I have to pre-define a class whose name is also my entity name:
class SomeKindaData(db.Expando):
    pass

sKD = SomeKindaData(key_name='1')
...
Is there a way to make entity names up at run time in App Engine for Python?

I don't know much about App Engine, but you can define classes at run time like this:
def get_my_class(name):
    return type(name, (db.Expando,), {})
So type takes three arguments:
name of class
tuple of classes to inherit from
dictionary of class attributes
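You can try this pattern outside of App Engine with any base class; in the sketch below, object stands in for db.Expando, which would require the App Engine SDK:

```python
# Minimal sketch of runtime class creation with type().
# The base parameter defaults to object here; with the SDK available
# you would pass db.Expando instead.
def get_my_class(name, base=object):
    return type(name, (base,), {})

SomeKindaData = get_my_class('SomeKindaData')
print(SomeKindaData.__name__)              # SomeKindaData
print(isinstance(SomeKindaData(), SomeKindaData))  # True
```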

Entities themselves don't have names. In the datastore, entities are identified by keys, which can have names or IDs. You're setting your entity's key name to "1" in your sample code. Entities are also classified by kind, in this case SomeKindaData.
db.Model and db.Expando provide a local ORM abstraction around the datastore. When you use these, your entity's kind name is set to your model class name by default. If you don't want to define a model class before creating an entity, you can use the low-level datastore API:
from google.appengine.api import datastore
sKD = datastore.Entity(kind='SomeKindaData', name='1')
sKD['SomeProperty'] = 'SomeValue'
datastore.Put(sKD)

Related

set non-ORM object fields while querying

Assume I have a class Interaction. My program processes interactions, in a way that each interaction updates a score table. Interaction is declared as an exact mapping of the database table, but I also want it to have a reference to the relevant instance of ScoreTable. ScoreTable is a class that holds the scores, and controls the business logic to update scores:
class Interaction(Base):
    __tablename__ = 'interactions'

    # Mirror the table's structure
    anomaly_id = Column(Integer, ForeignKey('anomalies.id'), primary_key=True)
    user_id = Column(Integer, primary_key=True)  # ForeignKey('users.id'),
    feedback_score = Column(Integer)

    # scoreUpdater is another (non-ORM) object I want this instance to be aware of
    myScoreUpdater = ...
All instances of Interaction fetched by a query() will share the same ScoreUpdater. So when I run the query() to get all my instances of Interaction, can I somehow tell the query() to set the scoreUpdater to a certain value in the same pass? Alternatively, can I give query() a half-built template instance of Interaction to clone and populate with the ORM data?
I read that I could modify the standard constructor to perform certain tasks, but I don't know how to pass extra arguments (such as the instance of ScoreUpdater) to the constructor via the query().
I guess the other way is to run the query first, let it populate the ORM-related fields, and then in a second step iterate over the results to set the non-ORM fields (i.e. the right instance of ScoreUpdater)?
I'm new to SQLAlchemy and converting from Java to Python. So if you think my approach is fundamentally wrong, let me know!
The relevant documentation on constructing objects says:
The SQLAlchemy ORM does not call __init__ when recreating objects from database rows. The ORM’s process is somewhat akin to the Python standard library’s pickle module, invoking the low level __new__ method and then quietly restoring attributes directly on the instance rather than calling __init__.
If you need to do some setup on database-loaded instances before they’re ready to use, there is an event hook known as InstanceEvents.load() which can achieve this; it is also available via a class-specific decorator called orm.reconstructor(). When using orm.reconstructor(), the mapper will invoke the decorated method with no arguments every time it loads or reconstructs an instance of the class. This is useful for recreating transient properties that are normally assigned in __init__
So if I've understood you correctly, you could define a reconstructor for Interaction that populates the non-ORM fields:
from sqlalchemy.orm import reconstructor

class Interaction(Base):
    __tablename__ = 'interactions'

    # Mirror the table's structure
    anomaly_id = Column(Integer, ForeignKey('anomalies.id'), primary_key=True)
    user_id = Column(Integer, primary_key=True)  # ForeignKey('users.id'),
    feedback_score = Column(Integer)

    # myScoreUpdater is another (non-ORM) object I want this instance to be aware of
    @reconstructor
    def init_on_load(self):
        # Do what you must to populate the score updater
        self.myScoreUpdater = ...
Note that you'll probably want to share the logic between the reconstructor and __init__, so either just decorate __init__ as the reconstructor, if it can be called without arguments, or move initialization of score updater to a method.
Finally, if your ScoreUpdater does not actually need to know what instance it is bound to, you could just have it as a class attribute shared between all instances – like static attributes in Java.
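The logic-sharing suggestion above can be sketched with plain Python classes; ScoreUpdater and the method names here are illustrative assumptions, not from the original code:

```python
class ScoreUpdater:
    """Hypothetical stand-in for the business-logic object."""
    pass

class Interaction:
    def __init__(self, feedback_score=0):
        self.feedback_score = feedback_score
        self._init_score_updater()

    # With SQLAlchemy, this helper would additionally be decorated
    # with @reconstructor so it also runs when rows are loaded.
    def _init_score_updater(self):
        self.myScoreUpdater = ScoreUpdater()

interaction = Interaction(feedback_score=1)
print(interaction.myScoreUpdater is not None)  # True
```

This way the same setup runs whether the object was built by your code or rebuilt from a database row.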

Bi-directional relationship in mongoengine

I need to use a bi-directional relationship in Mongoengine which is something like the below.
from mongoengine import *

class Notification(Document):
    desc = StringField()
    from_user = ReferenceField('User')

class User(Document):
    Name = StringField()
    notifications = ListField(EmbeddedDocumentField(Notification))
I know we can put a single-quoted class name there when the class has not yet been defined.
from_user = ReferenceField('User')
However, we have a problem here: at runtime it seems to resolve the reference to mongoengine.django.auth.User instead of our custom User class. (This is just a guess, but while debugging at runtime I find that it misinterprets the reference as mongoengine.django.auth.User, even though the records in the collection should belong to the custom User class.)
So is there any way for me to specify a fully qualified class name there?
Thanks!
In this instance you'd need to declare the User class after the Notification class.
Internally mongoengine uses a class registry, which is populated via the Document metaclass. Unfortunately, namespacing isn't the same as in the Java world (I never thought I'd say that!), so as far as I know it's not possible to refer to a class by its fully qualified name, e.g. myapp.models.User.
Are you using the Django User class as well as another custom User class? That will cause issues with the registry, as currently you can only have one class per name.

In App Engine for Python, is it possible to persist a class with another object nested inside it?

In App Engine for Python, is there anything like Objectify (Java Library) where I can easily embed a class within another and save it to the datastore?
This class would be modeled like the following example, where a Venue contains a Location object. I want to persist this as one nested object, as well as be able to query by fields in the embedded object.
class Location():
    city = db.StringProperty()
    state = db.StringProperty()

class Venue(db.Model):
    name = db.StringProperty()
    location = Location()
Here is information on how it works in Objectify in App Engine for Java.
http://code.google.com/p/objectify-appengine/wiki/IntroductionToObjectify##Embedded
Is this possible using Python?
Consider using Reference properties. I.e. store a Location object as its own entity and incorporate that location into the Venue object by reference.
class Location(db.Model):  # must be a db.Model so it can be stored as its own entity
    city = db.StringProperty()
    state = db.StringProperty()

class Venue(db.Model):
    name = db.StringProperty()
    location = db.ReferenceProperty(Location)
Then, if you want to transact on a Location and Venue at the same time, use datastore transactions.
EDIT: To query fields in the 'contained' object, use datastore "back references". I.e. the fact that Venue contains a reference to Location means Location also contains references to Venues. See: http://code.google.com/appengine/docs/python/datastore/datamodeling.html#References
Not currently, but the NDB library supports embedding models within one another either by serializing them as Protocol Buffers, or by nesting their properties (Objectify fashion).

How can I decide which declarative model to instantiate, based on row information

I'm building a webapp that has optional Facebook Login. The users created through the Facebook API are handled differently at several points in my application. I want to encapsulate these differences in a subclass of Person that overrides methods.
class Person(Model):
    def get_profile_picture(self):
        return profile_pictures.url(self.picture)

class FacebookPerson(Person):
    def get_profile_picture(self):
        return 'http:/.../%s.jpg' % self.graph_id
I would like to avoid the nasty if self.graph_id and just query the Person model and get the right object for each user.
I've thought of hacking the metaclass to add the FacebookPerson as a base. Obviously I would like to avoid such voodoo.
I'm using Flask and Flask-SQLAlchemy.
The general idea would be to store the model's class name as metadata in each row, and when you instantiate the object, do something like:
def query(self):
    # stuff
    return model_class(data)
To do this in SQLAlchemy, you might look at making Person the base class of something like BasicPerson and FacebookPerson, and in Person.__init__(), use the metadata to initialize to the proper subclass.
For example, the idea would be that when this query returns, user will have been initialized to the proper subclass:
user = session.query(Person).filter_by(name='james').first()
You will probably need to modify this concept a bit for SQLAlchemy (I haven't used it in a while), but that's the general idea.
Or, you could do something like store the metadata in a cookie with the user_id, and then when they log in again, use the metadata to pass the proper class to the user query:
user = session.query(FacebookPerson).filter_by(name='james').first()
If you want this to be generic so that the metadata is meaningful to non-Python clients, instead of storing the model's class name, store the model's "object_type" and have something in each client library that maps object_types to classes.
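It's worth noting that SQLAlchemy can do this row-based dispatch for you: single-table inheritance with a discriminator column maps each row to the right subclass automatically. A minimal sketch, with illustrative column values and URLs that are not from the question:

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class Person(Base):
    __tablename__ = 'people'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    type = Column(String)  # discriminator stored in each row
    __mapper_args__ = {'polymorphic_on': type,
                       'polymorphic_identity': 'person'}

    def get_profile_picture(self):
        return 'default.jpg'

class FacebookPerson(Person):
    graph_id = Column(String)
    __mapper_args__ = {'polymorphic_identity': 'facebook'}

    def get_profile_picture(self):
        return 'http://example.invalid/%s.jpg' % self.graph_id

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(FacebookPerson(name='james', graph_id='42'))
    session.commit()
    # Querying the base class yields the correct subclass per row
    user = session.query(Person).filter_by(name='james').first()
    print(type(user).__name__)         # FacebookPerson
    print(user.get_profile_picture())  # http://example.invalid/42.jpg
```

With Flask-SQLAlchemy the same __mapper_args__ apply to db.Model subclasses, so no metaclass voodoo is needed.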

Dynamic Class Creation in SQLAlchemy

We need to create SQLAlchemy classes to access multiple external data sources, and their number will grow over time. We use the declarative base for our core ORM models, and I know we can manually define new ORM classes using autoload=True to auto-generate the mapping.
The problem is that we need to be able to generate them dynamically, taking something like this:
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
stored = {}
stored['tablename'] = 'my_internal_table_name'
stored['objectname'] = 'MyObject'
and turning it into something like this dynamically:
class MyObject(Base):
    __tablename__ = 'my_internal_table_name'
    __table_args__ = {'autoload': True}
We don't want the classes to persist longer than necessary: open a connection, perform the queries, and close the connection. Therefore, ideally, we can put the items in the "stored" variable above into a database and pull them as needed. The other challenge is that the object name (e.g. "MyObject") may be used on different connections, so we cannot define it once and keep it around.
Any suggestions on how this might be accomplished would be greatly appreciated.
Thanks...
You can dynamically create MyObject using the 3-argument call to type:
type(name, bases, dict)
Return a new type object. This is essentially a dynamic form of the
class statement...
For example:
mydict = {'__tablename__': stored['tablename'],
          '__table_args__': {'autoload': True}}
MyObj = type(stored['objectname'], (Base,), mydict)

print(MyObj)
# <class '__main__.MyObject'>
print(MyObj.__base__)
# <class '__main__.Base'>
print(MyObj.__tablename__)
# my_internal_table_name
print(MyObj.__table_args__)
# {'autoload': True}
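To keep the class from outliving the connection, one option is to build a fresh declarative base per call, so the class and its registry can be discarded when you're done. A hedged sketch for SQLAlchemy 1.4+, where autoload_with replaces autoload=True; load_mapped_class is an invented helper name, demonstrated here against a throwaway SQLite table:

```python
from sqlalchemy import create_engine, text
from sqlalchemy.orm import declarative_base

def load_mapped_class(engine, stored):
    # A fresh Base per call keeps the registry (and the class name)
    # local to this connection's lifetime, so the same object name
    # can be reused on a different connection later.
    Base = declarative_base()
    return type(stored['objectname'], (Base,), {
        '__tablename__': stored['tablename'],
        '__table_args__': {'autoload_with': engine},  # reflect columns
    })

engine = create_engine('sqlite://')
with engine.begin() as conn:
    conn.execute(text(
        'CREATE TABLE my_internal_table_name (id INTEGER PRIMARY KEY, name TEXT)'))

stored = {'tablename': 'my_internal_table_name', 'objectname': 'MyObject'}
MyObj = load_mapped_class(engine, stored)
print(MyObj.__name__)        # MyObject
print(MyObj.__tablename__)   # my_internal_table_name
```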
