Use model method as default value for Django Model?

I currently have this method in my Model:
def speed_score_compute(self):
    # Speed score:
    #
    # - 8 points for every % of time spent in
    #   the high-intensity running phase.
    # - Average speed in the high-intensity
    #   running phase (30 km/h = 50 points,
    #   0-15 km/h = 15 points)
    try:
        high_intensity_running_ratio = (
            ((self.h_i_run_time * 100) / self.training_length) * 8)
    except ZeroDivisionError:
        return 0
    high_intensity_running_ratio = min(50, high_intensity_running_ratio)
    if self.h_i_average_speed < 15:
        average_speed_score = 10
    else:
        average_speed_score = self.cross_multiplication(
            30, self.h_i_average_speed, 50)
    final_speed_score = high_intensity_running_ratio + average_speed_score
    return final_speed_score
I want to use it as default for my Model like this:
speed_score = models.IntegerField(default=speed_score_compute)
But this doesn't work (see the error message below). I've checked different topics like this one, but those only cover plain functions (which don't use self), not methods (and I must use a method, since I'm working with the object's own attributes).
The Django docs seem to talk about this, but I don't understand them clearly, maybe because I'm still a newbie to Django and programming in general.
Is there a way to achieve this?
EDIT:
My function is defined above my model, but here is my error message:
ValueError: Could not find function speed_score_compute in tournament.models.
Please note that due to Python 2 limitations, you cannot serialize unbound method functions (e.g. a method declared and used in the same class body). Please move the function into the main module body to use migrations.
For more information, see https://docs.djangoproject.com/en/1.8/topics/migrations/#serializing-values
The error message is clear: it seems I'm not able to do this. But is there another way to achieve it?

The problem
When we pass a model method as default=callable, it doesn't get called with the self argument (the model instance), so it cannot read the instance's attributes.
Overriding save()
I haven't found a better solution than to override MyModel.save()* like below:
class MyModel(models.Model):
    def save(self, *args, **kwargs):
        if self.speed_score is None:
            self.speed_score = self.speed_score_compute()
            # or encapsulate the assignment in a helper, e.g.:
            # self.calculate_speed_score()
        # Now we call the actual save method
        super(MyModel, self).save(*args, **kwargs)
This makes it so that if you try to save your model, without a set value for that field, it is populated before the save.
Personally, I prefer this approach of having everything that belongs to a model defined in the model itself (data, methods, validation, default values, etc.). I think this is referred to as the fat model approach in Django.
*If you find a better approach, I'd like to learn about it too!
Using a pre_save signal
Django provides a pre_save signal that runs before save() is run on the model. Signals run synchronously, i.e. the pre_save code needs to finish running before save() is called on the model. You'll get the same results (and order of execution) as with overriding save().
from django.db.models.signals import pre_save
from django.dispatch import receiver

from myapp.models import MyModel

@receiver(pre_save, sender=MyModel)
def my_handler(sender, **kwargs):
    instance = kwargs['instance']
    instance.populate_default_values()
If you prefer to keep the default values behavior separated from the model, this approach is for you!
When is save() called? Can I work with the object before it gets saved?
Good question! We'd like the ability to work with our object before subjecting it to saving and the population of default values.
To get an object, without saving it, just to work with it we can do:
instance = MyModel()
If you create it using MyModel.objects.create(), then save() will be called. It is essentially (see source code) equivalent to:
instance = MyModel()
instance.save()
If it's useful to you, you can also define a MyModel.populate_default_values() that you can call at any stage of the object's lifecycle (at creation, at save, or on demand; it's up to you).

About converting custom functions in models into annotations in custom managers/querysets

Being new to Django, I'm starting to care a bit about performance of my web application.
I'm trying to transform many of the custom functions/properties that were originally on my models into annotations on querysets within custom managers.
In my model I have:
class Shape(models.Model):
    @property
    def nb_color(self):
        return 1 if self.colors == '' else int(1 + sum(self.colors.upper().count(x) for x in 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'))

    def __str__(self):
        return self.name + "-" + self.constraints

    @property
    def image_url(self):
        return format_html(f'{settings.SVG_DIR}/{self}.svg')

    @property
    def image_src(self):
        return format_html('<img src="{url}"|urlencode />'.format(url=self.image_url))

    def image_display(self):
        return format_html(f'{self.image_src}')
But I'm not clear on a few points:
1/ Are there any pros or cons to declaring methods with the property decorator in a Django model?
2/ What is the cost of calling a function/property in terms of database calls,
and therefore, is there added value in using custom managers/querysets and defining annotations to replicate my functions at that level?
3/ How would you suggest transforming my image & nb_color functions into annotations?
Thanks in advance
PS: For the image-related functions, I mostly figured it out:
self.annotate(
    image_url=Concat(Value(join(settings.SVG_DIR, '')), F('fullname'), Value('.svg'), output_field=CharField()),
    image_src=Concat(Value('<img src="'), F('image_url'), Value('"|urlencode />'), output_field=CharField()),
    image_display=Concat(Value(''), F('image_src'), Value(''), output_field=CharField()),
)
I am however having an issue with the display of image_src through:
readonly_fields = ['image']

def image(self, obj):
    return format_html(obj.image_src)
It doesn't seem to find the image, even though the address is correct.
If anybody has an idea...
I figured out my image problem: I should simply use a relative path and let Django manage it:
self.annotate(image_url=Concat(Value('/static/SVG_shapes/'), F('fullname'), Value('.svg'), output_field=CharField()))
Now, with 1.5 more years of experience, I'll try to answer my newbie questions for the next people who may have the same questions popping into their minds.
1/ Are there any pros or cons to declaring methods with the property decorator in a Django model?
No cons that I could see so far.
It allows the data to be retrieved as a property of the model (my_shape.image_url), instead of having to call the corresponding method (my_shape.image_url()).
However, for different purposes, one may prefer to have a callable (the method) instead of a property.
2/ What is the cost of calling a function/property in terms of database calls?
There is no extra call to the database if the data it needs as input is already available, or is itself an attribute of the instance object (fields/properties/methods that don't require input from outside the instance object).
However, if external data are needed, a database call will be generated for each of them.
For this reason, it can be valuable to cache the result of such a property by using the @cached_property decorator instead of the @property decorator.
The only thing needed to use cached properties is the following import:
from django.utils.functional import cached_property
After being called for the first time, the cached property remains available at no extra cost for the entire lifetime of the object instance,
and its content can be manipulated like any other property/variable, as in the sketch below:
And therefore, is there added value in using custom managers/querysets and defining annotations to replicate my functions at that level?
In my understanding and practice so far, it is not uncommon to replicate the same functionality in both properties and managers.
The reason is that properties are easily available when we are interested in only one specific object instance,
while when you are interested in comparing or retrieving a given property for a range of objects, it is much more efficient to calculate and annotate this property for the whole queryset, for instance by using model managers.
My take-away would be:
For a given model,
(1) try to put all the business logic concerning a single object instance into model methods/properties,
(2) and all the business logic concerning a range of objects into model managers.
3/ How would you suggest transforming my image & nb_color functions into annotations?
Already answered in the previous answer; a consolidated sketch follows.

Generalizing deletion of placeholderfield django-cms with signals

I'm currently working in django-cms and utilizing a PlaceholderField in several of my models. As such, I'd like to generalize this process to avoid having to override every model's delete, and add a specialized manager for each type of object just to handle deletions.
A little backstory:
After working up the design of my application a little bit and using the (honestly impressive) PlaceholderFields, I noticed that if I deleted a model that contained one of these fields, it would leave behind its plugins/placeholder after deletion of the model instance that spawned it. This surprised me, so I contacted them, and according to django-cms's development team:
By design, the django CMS PlaceholderField does not handle deletion of the plugins for you.
If you would like to clear the placeholder content and remove the placeholder itself when the object that references it is deleted, you can do so by calling the clear() method on the placeholder instance and then the delete() method
So being that this is expected to happen prior to deletion of the model, my first thought was use the pre_delete signal provided by django. So I set up the following:
my problem
models.py
class SimplifiedCase(models.Model):
    # ... fields/methods ...
    my_placeholder_instance = PlaceholderField('reading_content')  # ****** the placeholder

    # define a local method for clearing placeholder fields for this object
    def cleanup_placeholders(self):
        # remove any child plugins of this placeholder
        self.my_placeholder_instance.clear()
        # remove the placeholder itself
        self.my_placeholder_instance.delete()

# link the receiver to the signal
signals.pre_delete.connect(clear_placeholderfields, sender=SimplifiedCase)
signals.py
# create a generalized receiver
# (expecting multiple models to contain placeholders, so generalizing the process)
def clear_placeholderfields(sender, instance, **kwargs):
    instance.cleanup_placeholders()  # calls the cleanup method defined on the model
I expected this to work without any issues, but I'm getting some odd behavior when calling the placeholder's delete() method from within the method invoked by the pre_delete receiver.
For some reason, calling the placeholder's delete() method in my cleanup_placeholders method fires the parent's pre_delete signal again, resulting in a recursion loop.
I'm relatively new to using django/django-cms, so possibly I'm overlooking something or fundamentally misunderstanding what's causing this loop, but is there a way to achieve what I'm trying to do here using the pre_delete signals? or am I going about this poorly?
Any suggestions would be greatly appreciated.
After several days of fighting this, I believe I've found a method of deleting Placeholders along with the 3rd party app models automatically.
Failed attempts:
- Signals failed to be useful due to the recursion mentioned in my question, which is caused by all related models of a placeholder triggering a pre_delete event during the handling of the connected model's pre_delete event.
Additionally, I needed to handle child FK objects that also contained their own placeholders. After much trial and error, the best course of action (that I could find) to ensure deletion of placeholders for child objects is as follows:
Define a queryset which performs an iterative deletion (non-ideal, but the only way to ensure the execution of the following steps):
class IterativeDeletion_Manager(models.Manager):
    def get_queryset(self):
        return IterativeDeletion_QuerySet(self.model, using=self._db)

    def delete(self, *args, **kwargs):
        return self.get_queryset().delete(*args, **kwargs)

class IterativeDeletion_QuerySet(models.QuerySet):
    def delete(self, *args, **kwargs):
        with transaction.atomic():
            # attempting to prevent 'bulk delete', which ignores an overridden delete()
            for obj in self:
                obj.delete()
Set the model containing the PlaceholderField to use the newly defined manager.
Override the deletion method of any model that contains a placeholder field to handle the deletion of the placeholder AFTER deletion of the connected model (i.e. an unofficial post_delete event).
class ModelWithPlaceholder(models.Model):
    objects = IterativeDeletion_Manager()

    # the placeholder
    placeholder_content = PlaceholderField('slotname_for_placeholder')

    def delete(self, *args, **kwargs):
        # ideally there would be a method to get all fields of
        # 'placeholderfield' type to populate this
        placeholders = [self.placeholder_content]

        # if there are any FK relations to this model, set the child's
        # queryset to use iterative deletion as well, and manually
        # call the queryset delete in the parent (as follows)
        # self.child_models_related_name.delete()

        # delete the object
        super(ModelWithPlaceholder, self).delete(*args, **kwargs)

        # clear and delete the placeholders for this object
        # (must be after the object's deletion)
        for ph in placeholders:
            ph.clear()
            ph.delete()
Using this method I've verified that the child PlaceholderField for each object is deleted along with the object, whether through the admin interface, queryset deletions, or direct deletions (at least in my usage cases).
Note: This seems unintuitive to me, but the deletion of placeholders needs to happen after deletion of the model itself, at least in the case of having child relations in the object being deleted.
This is due to the fact that calling placeholder.delete() will trigger a deletion of all models related to the deleted model containing the PlaceholderField.
I didn't expect that at all; I don't know if this is the intended behavior of deleting a placeholder, but it happens. Since the placeholder is still contained in the database prior to deleting the object (by design), there shouldn't be an issue with handling its deletion after calling super(...).delete().
if ANYONE has a better solution to this problem, feel free to comment and let me know my folly.
This is the best I could come up with after many hours running through debuggers and tracing through the deletion process.

Django: overriding insert, update, delete in a model

Sometimes you want to do something when there is an event on a model (object created, object updated, object deleted).
There is a method you can override in Model called save. It even has a parameter force_insert, which I first thought would always be set to a proper value indicating whether an object will be created or updated. But the parameter is optional, and you cannot expect it to be right.
Searching the source code of Model led me to the methods _do_update and _do_insert, but the underscore at the beginning tells me these methods are not recommended for use. They also have a lot of parameters, which pollutes the code when you override them.
The only solution left that I could find is using django.db.models.signals. But I believe they are meant for external purposes, like creating a UserProfile on every User create, whereas I have internal purposes, like updating fields on update. Also, using signals makes code look spread out and harder to understand.
What would be the right way to implement functionality for these model events?
Look at this simplified excerpt from Django's base Model._save_table method; the decision to update or insert depends on the model's pk and the force_insert value:
def _save_table(self, raw=False, cls=None, force_insert=False,
                force_update=False, using=None, update_fields=None):
    updated = False
    # ...
    if pk_set and not force_insert:
        updated = self._do_update()
    if not updated:
        result = self._do_insert()
    # ...
    return updated
And you can go the same way in your overridden save method if you want entirely custom update/insert operations:
def save(self, force_insert=False, **kwargs):
    updated = False
    if self.pk and not force_insert:
        updated = self.custom_update()
    if not updated:
        self.custom_insert()
    return updated

Is this the way to validate Django model fields?

As I understand it, when one creates a Django application, data is validated by the form before it's inserted into a model instance, which is then written to the database. But if I want to create an additional layer of protection at the data model layer, is what I've done below the current "best practice"? I'm trying to ensure that a reviewer's name can be neither omitted nor left blank. Should I be putting custom validation in the clean method as I've done here, and then have save call full_clean, which calls clean? If not, what's the preferred method? Thanks.
class Reviewer(models.Model):
    name = models.CharField(max_length=128, default=None)

    def clean(self, *args, **kwargs):
        if self.name == '':
            raise ValidationError('Reviewer name cannot be blank')
        super(Reviewer, self).clean(*args, **kwargs)

    def full_clean(self, *args, **kwargs):
        return self.clean(*args, **kwargs)

    def save(self, *args, **kwargs):
        self.full_clean()
        super(Reviewer, self).save(*args, **kwargs)
Firstly, you shouldn't override full_clean as you have done. From the django docs on full_clean:
Model.full_clean(exclude=None)
This method calls Model.clean_fields(), Model.clean(), and Model.validate_unique(), in that order and raises a ValidationError that has a message_dict attribute containing errors from all three stages.
So the full_clean method already calls clean, but by overriding it, you've prevented it calling the other two methods.
Secondly, calling full_clean in the save method is a trade off. Note that full_clean is already called when model forms are validated, e.g. in the Django admin. So if you call full_clean in the save method, then the method will run twice.
It's not usually expected for the save method to raise a validation error; somebody might call save and not catch the resulting error. However, I like that you call full_clean rather than doing the check in the save method itself - this approach allows model forms to catch the problem first.
Finally, your clean method would work, but you can actually handle your example case in the model field itself. Define your CharField as
name = models.CharField(max_length=128)
The blank option will default to False. If the field is blank, a ValidationError will be raised when you run full_clean. Putting default=None in your CharField doesn't do any harm, but it is a bit confusing when you don't actually allow None as a value.
After thinking about Alasdair's answer and doing additional reading, my sense now is that Django's models weren't designed to be validated on a model-only basis as I'm attempting to do. Such validation can be done, but at a cost, and it entails using validation methods in ways they weren't intended for.
Instead, I now believe that any constraints other than those that can be entered directly into the model field declarations (e.g. "unique=True") are supposed to be performed as a part of Form or ModelForm validation. If one wants to guard against entering invalid data into a project's database via any other means (e.g. via the ORM while working within the Python interpreter), then the validation should take place within the database itself. Thus, validation could be implemented on three levels: 1) First, implement all constraints and triggers via DDL in the database; 2) Implement any constraints available to your model fields (e.g. "unique=True"); and 3) Implement all other constraints and validations that mirror your database-level constraints and triggers within your Forms and ModelForms. With this approach, any form validation errors can be re-displayed to the user. And if the programmer is interacting directly with the database via the ORM, he/she would see the database exceptions directly.
Thoughts anyone?
Capturing the pre_save signal on my models ensured clean() will be called automatically:
from django.db.models.signals import pre_save

def validate_model(sender, **kwargs):
    if 'raw' in kwargs and not kwargs['raw']:
        kwargs['instance'].full_clean()

pre_save.connect(validate_model, dispatch_uid='validate_models')
Thanks @Kevin Parker for your answer, quite helpful!
It is common to have models in your app outside of the ones you define, so here is a modified version you can use to scope this behavior only to your own models or a specific app/module as desired.
from django.db.models.signals import pre_save
import inspect
import sys

MODELS = [obj for name, obj in
          inspect.getmembers(sys.modules[__name__], inspect.isclass)]

def validate_model(sender, instance, **kwargs):
    if 'raw' in kwargs and not kwargs['raw']:
        if type(instance) in MODELS:
            instance.full_clean()

pre_save.connect(validate_model, dispatch_uid='validate_models')
This code will run against any models defined inside the module where it is executed but you can adapt this to scope more strictly or to be a set of modules / apps if desired.

Is there a better way to create this model? (Django)

I have a model that pings a REST service and saves the results.
class StoreStatus(models.Model):
    store = models.OneToOneField(Store)
    status = models.TextField()

    def save(self, *args, **kwargs):
        self.status = get_store_information(self.store.code)
        self.pk = self.store.pk
        super(StoreStatus, self).save(*args, **kwargs)
I need to run it repeatedly, and I figure I can just call .save() in a view, since the Store object is in the majority of my views.
Is there a better way to do this? I had to set the pk manually because I was getting duplicate errors when I tried to save a second time.
Seems kind of dirty, and I'm trying to improve my coding.
Thanks
That looks quite bad.
First, associating the retrieval of the status information with the saving of the object is quite a bad idea. If 'updating' is the only action you'll ever perform on this model, then maybe a better idea would be to write an update() method that automatically saves once the status is updated, instead of doing it the other way around.
def update(self):
    self.status = get_store_information(self.store.code)
    self.save()
Second: how are you creating the first instance of this model? You'll get duplication errors if you are trying to save a new instance every time the model gets updated. Say, if you do something like this:
# this will crap out with a duplicate error the second time around
status = StoreStatus(store=mystore)
status.save()
What you should be doing is something like:
# this will work (provided we have 'update')
mystore.status.update()
Or:
# always retrieve the stored instance before saving
status, created = StoreStatus.objects.get_or_create(store=mystore)
status.update()
In case you are lazy enough, you can always add an "update_status" method to your Store model and perform the creation/update there. It's always better to be explicit about what you are doing. Remember: Django is based upon the principle of least surprise, and so should your code! :)
If I were you, I would have created a function that would:
1. Accept the Store object as a parameter,
2. Make the REST call, and
3. On receiving the response, update the status in StoreStatus.
This would be desirable to enable loose coupling which is required for architectures involving web-based services.
Moreover, if you just want to avoid duplicate PK errors, you can check for id to handle the update and create cases safely:
def save(self, *args, **kwargs):
    if self.id:
        # Update case
        pass
    else:
        # New object
        # Process for the new object
        pass
    # Save the changes
    super(StoreStatus, self).save(*args, **kwargs)
