I have a model with some customizations in the save method.
class SomeModel(models.Model):
    def save(self, *args, **kwargs):
        if not kwargs.pop('skip_expensive_processing', False):
            do_expensive_processing()
        return super(SomeModel, self).save(*args, **kwargs)
Basically, whenever the save method gets called, I want some expensive process to be executed
But when doing a bunch of saves together (a mass import), I don't want to do the expensive processing on each save. I want to do the expensive process once after all the objects are saved.
In the case of a mass save, the objects are being created through a ModelForm. I need to find some way to modify the form so that when the form calls the save method on SomeModel, it passes on that skip_expensive_processing keyword arg. How do I do this?
I looked through the source of the ModelForm.save() method, but it doesn't seem to be calling the model save method in a very straightforward manner...
You probably don't need to override the modelform's save method. You should just be able to pass commit=False, and then the model save won't be called at all; you can then call each instance's save yourself with whatever arguments you need.
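With commit=False, form.save() hands you an unsaved instance, and you call the model's save() yourself. Here is a minimal runnable sketch of that flow, with a plain-Python stand-in for the model base class (the class and flag names follow the question; everything else is illustrative):

```python
# Stand-in for models.Model so the control flow is runnable here;
# in a real project SomeModel would subclass models.Model.
class FakeModelBase:
    def save(self, *args, **kwargs):
        self.saved = True  # stands in for the actual DB write

class SomeModel(FakeModelBase):
    def __init__(self):
        self.expensive_runs = 0
        self.saved = False

    def save(self, *args, **kwargs):
        # Pop the custom flag so it never reaches the base save().
        if not kwargs.pop('skip_expensive_processing', False):
            self.expensive_runs += 1  # stands in for the expensive work
        return super().save(*args, **kwargs)

# Normal save: the expensive work runs once per call.
normal = SomeModel()
normal.save()

# Mass import: form.save(commit=False) would hand you unsaved instances
# like these; save each with the flag, then do the expensive work once.
objects = [SomeModel() for _ in range(3)]
for obj in objects:
    obj.save(skip_expensive_processing=True)

total_expensive_calls = sum(o.expensive_runs for o in objects)
```

The key point is that the custom keyword must be popped before calling the base save(), since Django's save() would reject an unknown keyword argument.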
Here is one piece from Django documentation:
from django.db import models

class Blog(models.Model):
    name = models.CharField(max_length=100)
    tagline = models.TextField()

    def save(self, *args, **kwargs):
        do_something()
        super(Blog, self).save(*args, **kwargs)  # Call the "real" save() method.
        do_something_else()
My hesitation is about the save method.
Why does the author separate do_something from do_something_else?
Given the comment 'Call the "real" save() method', what is the meaning of do_something(), which seems to be "fake"? I could even rearrange it like this:
def save(self, *args, **kwargs):
    super(Blog, self).save(*args, **kwargs)  # Call the "real" save() method.
    do_something()
    do_something_else()
Am I right?
Yes, you're right.
There is no deep meaning.
The sentence just means "you can run whatever code you want before super().save() or after super().save()."
That said, it may be worth reading up on Python's super() in detail if it is unfamiliar.
do_something() and do_something_else() aren't real functions. They aren't defined. They are just hints for you to do something there, then, if you need, do something else, implementing even functions and calling them there, if you need.
Usually you need some field to automatically update/get a value when another field is saved. One way to do this is by overriding the model save method. And usually you do this before you call super().save().
Well, it is just showing that you can call functions (or write code) before or after calling a parent or sibling method with super().
But first of all, you must know what super is...
In your example, you're calling save with super(), which triggers Django's own machinery and does many other things inside models.Model.save().
You are doing this to follow the standard Django save procedure.
But you may want to make some checks or do some pre-save work, so you can call a method that does whatever should be done before saving the model instance. Maybe you want to log the record time to a file: you just write a method and call it to log the timestamp before you call super().save().
The same is also valid as post-save actions.
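As a concrete sketch of that pre/post-save idea (pure Python, with a stand-in for models.Model so it runs anywhere; the timestamping is just one example of pre-save work):

```python
import datetime

# Stand-in for models.Model; in Django the base save() performs the DB write.
class FakeModelBase:
    def save(self, *args, **kwargs):
        self.saved = True  # stands in for the actual DB write

class Blog(FakeModelBase):
    def save(self, *args, **kwargs):
        # Pre-save work: record when the save was requested.
        self.saved_at = datetime.datetime.now()
        super().save(*args, **kwargs)  # the "real" save
        # Post-save work: anything that needs the write to have happened.
        self.post_save_done = True

b = Blog()
b.save()
```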
In a Django model, I want to avoid doubles so I wrote this:
class Points(TimeStampedModel):
    ....
    def save(self, *args, **kwargs):
        if self.pk:
            super(Points, self).save(*args, **kwargs)
        else:  # save() is a creation here, not an update
            if Points.objects.filter(benef_card=self.benef_card,
                                     spendable_at=self.spendable_at).exists():
                pass
            else:
                super(Points, self).save(*args, **kwargs)
I was very surprised to find this result in my database:
I suppose there is something wrong with my code, but I'd like to know how these doubles could exist in spite of the protection I wrote in my save() method?
I think what you want instead is:
class Points(TimeStampedModel):
    # ...
    class Meta:
        unique_together = ('benef_card', 'spendable_at')
Then you don't need to override save -- the uniqueness will be handled by a DB constraint and it is generally the way to go. This approach is better because save is not always called (example: bulk operations) so you might get different behavior across your app.
You might also want to check out update_or_create which just returns an object with attributes you need, creating it if it doesn't exist.
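The reason duplicates appeared despite the save() check is that exists() and the subsequent save are two separate steps: two concurrent requests can both pass the check before either writes. A unique constraint moves the check into the database, where it is atomic, and a violation surfaces as an IntegrityError. A runnable sketch of that behavior, using a plain dict as a stand-in for the table (the exception class just mimics Django's django.db.IntegrityError):

```python
class IntegrityError(Exception):
    """Stand-in for django.db.IntegrityError."""

table = {}  # keyed on (benef_card, spendable_at), like the unique constraint

def insert(benef_card, spendable_at):
    key = (benef_card, spendable_at)
    if key in table:  # the real DB enforces this check atomically
        raise IntegrityError(f"duplicate key {key}")
    table[key] = True

insert("card1", "2020-01-01")
try:
    insert("card1", "2020-01-01")  # same pair again
    duplicate_allowed = True
except IntegrityError:
    duplicate_allowed = False
```

In the application-level version, both requests can run the `if key in table` check before either insert happens; the database constraint closes that window.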
You could use Django signals instead to check before save.
I have a model:
class A(models.Model):
    number = models.IntegerField()
But when I call A.save(), I want to ensure that number is a prime (or other conditions), or the save instruction should be cancelled.
So how can I cancel the save instruction in the pre_save signal receiver?
@receiver(pre_save, sender=A)
def save_only_for_prime_number(sender, instance, *args, **kwargs):
    # how can I cancel the save here?
See my other answer: https://stackoverflow.com/a/32431937/2544762
This case is normal, if we just want to prevent the save, throw an exception:
from django.db.models.signals import pre_save, post_save
from django.dispatch import receiver

@receiver(pre_save)
def pre_save_handler(sender, instance, *args, **kwargs):
    # some case
    if case_error:
        raise Exception('OMG')
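A self-contained sketch of why raising in the handler cancels the save: the exception propagates out of save() before anything is written (the hand-wired hook below stands in for Django's signal dispatch, and the prime test matches the question's example):

```python
def is_prime(n):
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

def pre_save_handler(instance):
    # Raising here aborts the save before anything is written.
    if not is_prime(instance.number):
        raise ValueError(f"{instance.number} is not prime")

class A:
    saved_numbers = []  # stands in for the table contents

    def __init__(self, number):
        self.number = number

    def save(self):
        pre_save_handler(self)               # Django fires pre_save here
        A.saved_numbers.append(self.number)  # stands in for the DB write

A(7).save()      # prime: the save goes through
try:
    A(8).save()  # not prime: the handler raises, nothing is written
    rejected = False
except ValueError:
    rejected = True
```

In real Django you would raise a more specific exception (e.g. ValidationError) and catch it around the save() call, so the caller knows the save was rejected.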
I'm not sure you can cancel the save only using the pre_save signal. But you can easily achieve this by overriding the save method:
def save(self, *args, **kwargs):
    if some_condition:
        super(A, self).save(*args, **kwargs)
    else:
        return  # cancel the save
As mentioned by @Raptor, the caller won't know if the save was successful or not. If this is a requirement for you, take a look at the other answer which forces the caller to deal with the "non-saving" case.
If the data's always coming from a Form and you have a straightforward test for whether or not the save should occur, send it through a validator. Note, though, that validators aren't called for save() calls originating on the backend. If you want those to be guarded as well, you can make a custom Field, say class PrimeNumberField(models.SmallIntegerField). If you run your test and raise an exception in the to_python() method of that custom field, it will prevent the save. You can also hook into the validation of a specific field by overriding any of several other methods on the Field, Form, or Model.
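As a pure-Python sketch of the validator route (in Django you would raise django.core.exceptions.ValidationError and attach the function via validators=[validate_prime] on the form or model field; the exception class below just mimics Django's, and validate_prime is a hypothetical name):

```python
class ValidationError(Exception):
    """Stand-in for django.core.exceptions.ValidationError."""

def validate_prime(value):
    # Reject anything that is not prime.
    if value < 2 or any(value % d == 0 for d in range(2, int(value ** 0.5) + 1)):
        raise ValidationError(f"{value} is not prime")

validate_prime(13)  # prime: passes silently
try:
    validate_prime(12)
    rejected = False
except ValidationError:
    rejected = True
```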
Is there a difference between calling a model's save like:
self.model.save(*args, **kwargs)
and:
self.model.save()
The obvious difference is that in the first case you are passing positional and keyword arguments. If you wonder what arguments Model.save() takes and what they do, the simplest thing is to read the source code. Then you'll find something like:
def save(self, force_insert=False, force_update=False, using=None):
    """
    Saves the current instance. Override this in a subclass if you want to
    control the saving process.

    The 'force_insert' and 'force_update' parameters can be used to insist
    that the "save" must be an SQL insert or update (or equivalent for
    non-SQL backends), respectively. Normally, they should not be set.
    """
    if force_insert and force_update:
        raise ValueError("Cannot force both insert and updating in model saving.")
    self.save_base(using=using, force_insert=force_insert, force_update=force_update)
The third argument, using, is not documented; it specifies which DB connection you want to use (if you have more than one DB connection).
To make a long story short:
when you want to save your instance (in a view or form or whatever), you usually just want to let Django handle all this so you call my_model_instance.save() without arguments
when you override the save method in your Model class, you definitely want to accept and pass on the same arguments when calling the base class save, i.e.:
class MyModel(models.Model):
    def save(self, *args, **kw):
        do_something_here()
        super(MyModel, self).save(*args, **kw)
        do_something_else()
I have a field that inherits from ImageField with a custom storage like this:
image = PicasaImageField(upload_to=get_upload_to, blank=True, storage=pwcs())
When deleting an object with this field (using object.delete(), NOT bulk), the delete method of the custom storage doesn't get called. Trying to debug, I couldn't find where Django goes through the fields of an object to delete the file (or whatever is behind it in the actual storage). Should I delete the file manually, in a hook, or write a custom delete() method in my model that calls the underlying storage's delete()? I failed to find how this is handled with the standard ImageField + default filesystem storage, but I would assume regular files get deleted. Or am I getting it wrong?
Thanks for any insights.
Igor
From the Django 1.3 release notes:
when a model is deleted the FileField's delete() method won't be called. If you need cleanup of orphaned files, you'll need to handle it yourself
As you suggested you could handle this in a custom delete method:
def delete(self, *args, **kwargs):
    self.image.delete(save=False)
    super(Foo, self).delete(*args, **kwargs)
or use a receiver function which will be called even when object.delete() is not called:
from django.db.models.signals import post_delete
from django.dispatch import receiver

@receiver(post_delete, sender=Foo, weak=False)
def delete_image_on_file(sender, instance, **kwargs):
    instance.image.delete(save=False)