I want to get a field value the way we use self in Django models.
class UserModel(Model):
    id = IDField()
    uid = TextField()

    @classmethod
    def get_user(cls):
        return cls.uid
The class method keeps returning None instead of the string value of the uid field. Did I miss something?
This is from the Firestore Python wrapper https://octabyte.io/FireO/quick-start/
If you use @classmethod and cls, you can only get empty values, because cls refers to the basic class schema from which objects (instances of that class) are created.
To get the value of the current object it has to come through self, i.e. a standard instance method. Then you get the value of that particular object instance.
I didn't even find a mention of @classmethod in the FireO docs. Most likely you don't need that decorator here.
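A plain-Python sketch of the difference (no FireO involved; the names are illustrative):

```python
class UserModel:
    uid = None  # class-level default, analogous to the field declaration

    def __init__(self, uid):
        self.uid = uid  # value stored on this particular instance

    @classmethod
    def get_uid_from_class(cls):
        return cls.uid   # only ever sees the class-level attribute

    def get_uid(self):
        return self.uid  # sees the per-instance value

user = UserModel(uid="abc123")
print(user.get_uid())                  # abc123
print(UserModel.get_uid_from_class())  # None
```

The classmethod can never see `abc123` because that value lives on the instance, not on the class.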
I'm trying to create a class which maps to a MongoDB collection.
My code looks like this:
class Collection:
    _collection = get_collection()  # This seems not to work

    @classmethod
    def get_collection(cls):
        collection_name = cls.Meta.collection_name if cls.Meta.collection_name \
            else cls.__name__.lower()
        collection = get_collection_by_name(collection_name)  # Pseudo code, please ignore
        return collection

    class Meta:
        collection_name = 'my_collection'
I came across a situation where I need to assign the return value of get_collection to the class variable _collection.
I also tried _collection = Collection.get_collection(), which also does not work.
As a work-around, I subclassed Collection and set value of _collection in the child class.
I'd like to know of any simpler solution for this.
Thanks in advance
As DeepSpace mentions, here:
class Collection:
    _collection = get_collection()  # This seems not to work

    @classmethod
    def get_collection(cls):
        # code that depends on `cls`
the get_collection method is not yet defined when you call it. But moving that line below the method definition won't work either, since the method depends on the Collection class (passed to it as cls), which itself isn't defined until the end of the class Collection: statement's body.
The solution here is to wait until the class is defined to set this attribute. Since it looks like a base class meant to be subclassed, the better solution would be to use a metaclass:
class CollectionType(type):
    def __init__(cls, name, bases, attrs):
        super(CollectionType, cls).__init__(name, bases, attrs)
        cls._collection = cls.get_collection()

# py3
class Collection(metaclass=CollectionType):
    # your code here

# py2.7
class Collection(object):
    __metaclass__ = CollectionType
    # your code here
Note however that if Collection actually inherits from another class that already has a custom metaclass (i.e. a Django Model class or equivalent), you will need to make CollectionType a subclass of that metaclass instead of a subclass of type.
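For illustration, here is a runnable sketch of that metaclass, with the database lookup (`get_collection_by_name`) replaced by a stand-in that just returns the resolved name:

```python
class CollectionType(type):
    def __init__(cls, name, bases, attrs):
        super().__init__(name, bases, attrs)
        # runs after the class body has executed, so get_collection exists here
        cls._collection = cls.get_collection()

class Collection(metaclass=CollectionType):
    class Meta:
        collection_name = 'my_collection'

    @classmethod
    def get_collection(cls):
        # stand-in for the real get_collection_by_name lookup
        name = cls.Meta.collection_name or cls.__name__.lower()
        return "<collection %s>" % name

print(Collection._collection)  # <collection my_collection>
```

Because the metaclass's `__init__` fires once for every class it creates, subclasses of `Collection` get their own `_collection` set automatically at definition time.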
There are some design/syntax errors in your code.
When the line `_collection = get_collection()` executes, `get_collection` is not yet defined. As a matter of fact, the whole `Collection` class is not yet defined at that point.
`get_collection_by_name` is not defined anywhere.
EDIT: the OP updated the question, so the points below may no longer be relevant.
`collection = get_collection(collection_name)` should be `collection = cls.get_collection(collection_name)`.
Sometimes you pass a parameter to `get_collection` and sometimes you don't, yet `get_collection`'s signature never accepts a parameter.
Calling `get_collection` will lead to infinite recursion.
You have to take a step back and reconsider the design of your class.
I am currently trying to implement a Python class that automatically synchronizes with a NoSQL database, with implicit buffering, much in the spirit of SQLAlchemy.
In order to do this, I need to track attribute updates issued by the user and, on each update, call functions that keep the object in sync with the database or buffer.
What is the best way of doing this in Python? If it goes through __setattr__ and __delattr__, how do I do it correctly, without interfering with the garbage collector?
One way to do it (the way I would recommend) is to use descriptors.
First you make a class for your properties, something like:
class Property:
    def __init__(self, *args, **kwargs):
        # initialize the property with any information it needs to do get and set
        ...

    def __get__(self, obj, type=None):
        # logic to get the value from the database or cache
        ...

    def __set__(self, obj, value):
        # logic to set the value and sync with the database if necessary
        ...
And then in your class entity class you have something like this:
class Student:
    student_id = Property(...)
    name = Property(...)
    classes = Property(...)
Of course in practice you may have multiple Property types. My guess is that SQLAlchemy does something like this, where Column types are descriptors.
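A minimal runnable version of that pattern, with a per-instance dict standing in for the database/cache logic:

```python
class Property:
    """Minimal descriptor; obj.__dict__ stands in for the real DB/cache."""
    def __set_name__(self, owner, name):
        self.name = name  # the attribute name this descriptor manages

    def __get__(self, obj, objtype=None):
        if obj is None:          # accessed on the class itself
            return self
        # a real implementation would fall back to the database on a miss
        return obj.__dict__.get(self.name)

    def __set__(self, obj, value):
        # a real implementation would also push the value to the DB/buffer here
        obj.__dict__[self.name] = value

class Student:
    student_id = Property()
    name = Property()

s = Student()
s.name = "Ada"
print(s.name)  # Ada
```

Every read and write on a `Property` attribute now funnels through `__get__`/`__set__`, which is exactly the hook point needed for synchronization.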
I'm building my own lightweight orm. I'd like to keep instantiated objects in a class variable (perhaps a dictionary). When I request an object (through a class method) like get(id), I'd like to return the object from the instantiated list, or create one if it does not exist.
Is there a way to prevent the instantiation of an object (if its id already exists in the cls list)?
There are two straightforward ways of doing it, and many other ways as well. One of them, as you suggest, is to write the __new__ method for your objects, which can return an already existing object or create a new instance.
Another way is to use a factory function for your objects - and call this factory function instead of the class - more or less like this:
class _MyClass(object):
    pass

def MyClass(*args, **kw):
    if "my_class_object" not in all_objects:
        all_objects["my_class_object"] = _MyClass(*args, **kw)
    return all_objects["my_class_object"]
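The `__new__` route mentioned first could look roughly like this (keying the cache on an `id` argument is an assumption for the sketch):

```python
class MyClass:
    _instances = {}  # id -> instance cache shared by the whole class

    def __new__(cls, id):
        if id in cls._instances:
            return cls._instances[id]   # reuse the existing object
        obj = super().__new__(cls)
        cls._instances[id] = obj
        return obj

    def __init__(self, id):
        # note: __init__ still runs on every call, even for cached objects,
        # so keep it idempotent
        self.id = id

a = MyClass(1)
b = MyClass(1)
print(a is b)  # True
```

The main caveat of `__new__` is that `__init__` re-runs on cache hits; the factory-function approach avoids that entirely.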
Just perform an explicit check; it is the cleanest method, I believe:
class OrmContainer(object):
    objects = {}

    @classmethod
    def get(cls, id):
        if id not in cls.objects:
            cls.objects[id] = SomeOtherClass(id)
        return cls.objects[id]
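With a stand-in for SomeOtherClass (an assumption for the demo), the cache behaves like this:

```python
class SomeOtherClass:
    """Stand-in for the real class whose instances are being cached."""
    def __init__(self, id):
        self.id = id

class OrmContainer(object):
    objects = {}

    @classmethod
    def get(cls, id):
        if id not in cls.objects:
            cls.objects[id] = SomeOtherClass(id)  # create only on first request
        return cls.objects[id]

a = OrmContainer.get(7)
b = OrmContainer.get(7)
print(a is b)  # True: the second call returns the cached object
```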
I have a list of Python objects mixed with Django model instances and need to overload the __eq__ operator on some of the fields for filtering. This works fine until I run into a ForeignKey, which raises an error because the attribute only accepts the specific model instance.
How can I override this attribute? Even deleting it would be fine, as I have no use for it in the template.
The reason for this pattern is that under certain conditions, I'd like certain attributes to always match the incoming search parameters on a per object basis (I have wildcards in the objects being searched, not the search query).
class AlwaysEqual(object):
    def __eq__(self, a):
        return True

for i in bag_of_objects:
    if certain_conditions_met:
        i.foo = AlwaysEqual()
        # ValueError: Cannot assign "AlwaysEqual": "ProductFile.option1" must be a "ProductOptionValue" instance.

        # is there a way to delete this attribute?
        del i.foo
        # AttributeError: __delete__
Why use a non-model instance in this case? Just create a dummy model instance with an overridden __eq__ method:
class MyDjangoModel(models.Model):
    match_always = False

    def __eq__(self, obj):
        if self.match_always:
            return True
        return self.id == obj.id

    # ...

always_equal = MyDjangoModel()
always_equal.match_always = True

for i in bag_of_objects:
    if certain_conditions_met:
        i.foo = always_equal
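The same idea in plain Python (no Django), to show the equality behaviour in isolation:

```python
class Record:
    """Plain-Python stand-in for a model with a 'match everything' switch."""
    match_always = False

    def __init__(self, id=None):
        self.id = id

    def __eq__(self, other):
        if self.match_always:
            return True
        return self.id == getattr(other, "id", None)

wildcard = Record()
wildcard.match_always = True
print(wildcard == Record(id=42))     # True: the wildcard matches anything
print(Record(id=1) == Record(id=2))  # False: normal id comparison
```

Because the wildcard is a real instance of the model class, assigning it to a ForeignKey-backed attribute passes Django's instance check.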
I'm looking to implement a property class for appengine, very similar to the existing db.ReferenceProperty. I am implementing my own version because I want some other default return values. My question is, how do I make the property remember its returned value, so that the datastore query is only performed the first time the property is fetched? What I had is below, and it does not work. I read that the Property classes do not belong to the instances, but to the model definition, so I guess that the return value is not cached for each instance, but overwritten on the model every time. Where should I store this _resolved variable?
class PageProperty(db.Property):
    data_type = Page

    def get_value_for_datastore(self, model_instance):
        page = super(PageProperty, self).get_value_for_datastore(model_instance)
        self._resolved = page
        return page.key().name()

    def make_value_from_datastore(self, value):
        if not hasattr(self, '_resolved'):
            self._resolved = Page.get_by_name(value)
        return self._resolved
Edit
Alex's answer is certainly usable. But it seems that the built-in db.ReferenceProperty does store the _RESOLVED variable on the model instance, as evidenced by:
[...]
setattr(model_instance, self.__resolved_attr_name(), value)
[...]

def __resolved_attr_name(self):
    return '_RESOLVED' + self._attr_name()
The get_value_for_datastore method is passed the model instance, but make_value_from_datastore is not, so how do they find the _RESOLVED property from that method?
Edit 2
From the code I gather that Google is using the __get__() and __set__() methods, both of which do receive the model instance as an argument. Are those usable in custom classes? What is the difference with get_value_for_datastore and its counterpart?
A PageProperty instance exists per-model, not per-entity (where an entity is an instance of the model class). So I think you need a dictionary that maps pagename -> Page entity, instead of a single attribute per PageProperty instance. E.g., maybe something like...:
class PageProperty(db.Property):
    data_type = Page

    def __init__(self, *a, **k):
        super(PageProperty, self).__init__(*a, **k)
        self._mycache = {}

    def get_value_for_datastore(self, model_instance):
        page = super(PageProperty, self).get_value_for_datastore(model_instance)
        name = page.key().name()
        self._mycache[name] = page
        return name

    def make_value_from_datastore(self, value):
        if value not in self._mycache:
            self._mycache[value] = Page.get_by_name(value)
        return self._mycache[value]
If you only want to change some small part of the behaviour of ReferenceProperty, you may want to simply extend it, overriding its default_value method. You may find the source for ReferenceProperty to be instructive.