Can someone help me understand the update_field argument for Django signals?
According to the docs:
update_fields: The set of fields to update explicitly specified in the
save() method. None if this argument was not used in the save() call.
I'm not clear on what this means. I was trying to use it to prevent a signal function from executing unless certain fields were updated:
@receiver(post_save, sender=SalesRecord)
def spawn_SaleSource_record(sender, update_fields, created, instance, **kwargs):
    if created or update_fields is 'sale_item' or 'sales_qty':
        *do function*
However, it seems that it still executes during another signal process when an object is saved, even if an unspecified field is explicitly updated:
x = SalesRecord.objects.filter(paid_off=False, customer=instance.customer).first()
x.paid_off = True
x.save(update_fields=['paid_off'])
Am I going about this wrong?
Your condition does not do what you want: "update_fields is 'sale_item' or 'sales_qty'" parses as "(update_fields is 'sale_item') or ('sales_qty')", and the non-empty string 'sales_qty' is always truthy, so the branch always runs. You want membership tests instead:

if created or (update_fields and ('sale_item' in update_fields or 'sales_qty' in update_fields)):

Note that update_fields is None when save() was called without the argument, hence the extra guard before the "in" checks.
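Since update_fields is None when save() is called without it, a bare membership test can raise TypeError. A small plain-Python helper (hypothetical name should_spawn, not part of the original code) sketches a safe version of the check:

```python
def should_spawn(created, update_fields):
    # update_fields is None when save() was called without the
    # argument, so fall back to an empty tuple before testing
    watched = {'sale_item', 'sales_qty'}
    return created or bool(watched & set(update_fields or ()))
```

Inside the receiver you would call should_spawn(created, update_fields) and return early when it is False.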
Let's say there is a Django model called TaskModel which has a field priority, and we want to insert a new element, incrementing the priority of the existing element that already has that priority and of all the elements after it.
priority is just a numeric field without any special flags like unique or primary/foreign key:
queryset = models.TaskModel.objects.filter().order_by('priority')
Can this be done in a smart way with some methods on the Queryset itself?
I believe you can do this by using Django's F expressions and overriding the model's save method. I guess you could instead override the model's __init__ method as in this answer, but I think using the save method is best.
from django.db import models
from django.db.models import F

class TaskModel(models.Model):
    task = models.CharField(max_length=20)
    priority = models.IntegerField()

    # Override the save method so this runs whenever a new
    # TaskModel object is added
    def save(self, *args, **kwargs):
        # First get all TaskModels with priority greater than or
        # equal to the priority of the new task you are adding
        queryset = TaskModel.objects.filter(priority__gte=self.priority)
        # Use update() with an F expression to increase the priorities
        # of all the tasks at or above the one you're adding
        queryset.update(priority=F('priority') + 1)
        # Finally, call the super method to run the model's
        # actual save()
        super(TaskModel, self).save(*args, **kwargs)

    def __str__(self):
        return self.task
Keep in mind that this can create gaps in the priorities. For example, what if you create a task with priority 5, then delete it, then add another task with priority 5? I think the only way to handle that is to loop through the queryset, perhaps with a function like the one below, and call it from your view whenever a new task is created or its priority is modified:
# tasks is the queryset of all tasks, i.e. TaskModel.objects.all()
def reorder_tasks(tasks):
    for i, task in enumerate(tasks):
        task.priority = i + 1
        task.save()
This method is not nearly as efficient, but it will not create the gaps. For this method, you would not change the TaskModel at all.
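To see the difference between the two approaches without touching a database, here is a plain-Python simulation (no Django involved; insert_task mimics the F-expression shift plus save, renumber mimics the reorder_tasks loop above):

```python
def insert_task(priorities, new_priority):
    # mimic queryset.update(priority=F('priority') + 1) on tasks with
    # priority >= new_priority, then saving the new task
    shifted = [p + 1 if p >= new_priority else p for p in priorities]
    return sorted(shifted + [new_priority])

def renumber(priorities):
    # mimic the reorder_tasks loop: assign 1..n in order, closing gaps
    return list(range(1, len(priorities) + 1))
```

insert_task([1, 2, 3], 2) gives [1, 2, 3, 4], but insert_task([1, 2, 4], 3) gives [1, 2, 3, 5]: an existing gap survives the F-expression shift, and only renumber closes it.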
Or perhaps you can also override the delete method of the TaskModel as well, as shown in this answer, but I haven't had a chance to test this yet.
EDIT
Short Version
I don't know how to delete objects with a method similar to the save override while still preventing gaps in the priorities. I would just use a loop as I have shown above.
Long version
I knew there was something different about deleting objects like this:
def delete(self, *args, **kwargs):
    queryset = TaskModel.objects.filter(priority__gt=self.priority)
    queryset.update(priority=F('priority') - 1)
    super(TaskModel, self).delete(*args, **kwargs)
This will work, in some situations.
According to the docs on delete():
Keep in mind that this [calling delete()] will, whenever possible, be executed purely in
SQL, and so the delete() methods of individual object instances will
not necessarily be called during the process. If you’ve provided a
custom delete() method on a model class and want to ensure that it is
called, you will need to “manually” delete instances of that model
(e.g., by iterating over a QuerySet and calling delete() on each
object individually) rather than using the bulk delete() method of a
QuerySet.
So if you delete() a TaskModel object from the admin panel, the custom delete written above will never be called. It should work when deleting a single instance, for example in your view, but because the update acts directly on the database, an already-evaluated queryset in Python will not reflect the change until you re-run the query:
tasks = TaskModel.objects.order_by('priority')
for t in tasks:
    print(t.task, t.priority)

tr = TaskModel.objects.get(task='three')
tr.delete()

# Here I need to re-run the query
tasks = TaskModel.objects.order_by('priority')
# BEFORE printing again
for t in tasks:
    print(t.task, t.priority)
# to see the effect
If you still want to do it, I again refer to this answer to see how to handle it.
I'm currently working in django-cms and utilizing a PlaceholderField in several of my models. As such, I'd like to generalize this process to avoid having to override every model's delete, and add a specialized manager for each type of object just to handle deletions.
A little backstory:
After working up the design of my application a bit and using the (honestly impressive) PlaceholderFields, I noticed that if I deleted a model containing one of these fields, it would leave behind its plugins/placeholder after deletion of the model instance that spawned it. This surprised me, so I contacted the developers, and according to django-cms's development team:
By design, the django CMS PlaceholderField does not handle deletion of the plugins for you.
If you would like to clear the placeholder content and remove the placeholder itself when the object that references it is deleted, you can do so by calling the clear() method on the placeholder instance and then the delete() method
So being that this is expected to happen prior to deletion of the model, my first thought was use the pre_delete signal provided by django. So I set up the following:
my problem
models.py
class SimplifiedCase(models.Model):
    # ... fields/methods ...
    my_placeholder_instance = PlaceholderField('reading_content')  # the placeholder

    # define a local method for clearing placeholder fields on this object
    def cleanup_placeholders(self):
        # remove any child plugins of this placeholder
        self.my_placeholder_instance.clear()
        # remove the placeholder itself
        self.my_placeholder_instance.delete()

# link the receiver to the model
signals.pre_delete.connect(clear_placeholderfields, sender=SimplifiedCase)
signals.py
# create a generalized receiver
# (expecting multiple models to contain placeholders, so generalizing the process)
def clear_placeholderfields(sender, instance, **kwargs):
    instance.cleanup_placeholders()  # calls the cleanup method defined on the model
I expected this to work without any issues, but I'm getting some odd behavior when calling the [placeholder].delete() method from within the method invoked by the pre_delete receiver.
For some reason, calling the placeholder's delete() in my cleanup_placeholders method fires the parent model's pre_delete signal again, resulting in a recursion loop.
I'm relatively new to using django/django-cms, so possibly I'm overlooking something or fundamentally misunderstanding what's causing this loop, but is there a way to achieve what I'm trying to do here using the pre_delete signals? or am I going about this poorly?
Any suggestions would be greatly appreciated.
After several days of fighting this, I believe I've found a method of deleting Placeholders along with the 3rd party app models automatically.
Failed attempts:
- Signals failed to be useful due to the recursion mentioned in my question, caused by all related models of a placeholder triggering a pre_delete event while the connected model's pre_delete event is still being handled.
Additionally, I needed to handle child FK objects that contain their own placeholders. After much trial and error, the best course of action (that I could find) to ensure deletion of placeholders for child objects is as follows:
Define a queryset which performs an iterative deletion (non-ideal, but the only way to ensure the following steps execute):

from django.db import models, transaction

class IterativeDeletion_Manager(models.Manager):
    def get_queryset(self):
        return IterativeDeletion_QuerySet(self.model, using=self._db)

    def delete(self, *args, **kwargs):
        return self.get_queryset().delete(*args, **kwargs)

class IterativeDeletion_QuerySet(models.QuerySet):
    def delete(self, *args, **kwargs):
        with transaction.atomic():
            # prevent 'bulk delete', which skips the overridden delete()
            for obj in self:
                obj.delete()
Set the model containing the PlaceholderField to use the newly defined manager.
Override the delete method of any model that contains a placeholder field to handle deletion of the placeholder AFTER deletion of the connected model (i.e. an unofficial post_delete event).
class ModelWithPlaceholder(models.Model):
    objects = IterativeDeletion_Manager()

    # the placeholder
    placeholder_content = PlaceholderField('slotname_for_placeholder')

    def delete(self, *args, **kwargs):
        # ideally there would be a way to collect all fields of
        # PlaceholderField type to populate this list
        placeholders = [self.placeholder_content]

        # if there are any FK relations to this model, set the child's
        # queryset to use iterative deletion as well, and manually
        # call the queryset delete in the parent (as follows)
        # self.child_models_related_name.delete()

        # delete the object
        super(ModelWithPlaceholder, self).delete(*args, **kwargs)

        # clear and delete the placeholders for this object
        # (must be after the object's deletion)
        for ph in placeholders:
            ph.clear()
            ph.delete()
Using this method I've verified that the child PlaceholderField for each object is deleted along with the object, whether deleting via the Admin interface, queryset deletions, or direct deletions (at least in my usage cases).
Note: This seems unintuitive to me, but the deletion of placeholders needs to happen after deletion of the model itself, at least when the object being deleted has child relations.
This is because calling placeholder.delete() triggers deletion of all models related to the model containing the PlaceholderField.
I didn't expect that at all. I don't know if this is the intended behavior of deleting a placeholder, but it is what happens. Since the placeholder is still in the database before the object is deleted (by design), there shouldn't be an issue with handling its deletion after calling super(...).delete().
if ANYONE has a better solution to this problem, feel free to comment and let me know my folly.
This is the best I could come up with after many hours running through debuggers and tracing through the deletion process.
Sometimes you want to do something when there is an event for model (on create object, on update object, on delete object).
There is a method you can override on Model called save. It even has a parameter force_insert, which I first thought would always be set to a proper value indicating whether an object will be created or updated. But the parameter is optional and you cannot rely on it.
Searching the source code of Model led me to the methods _do_update and _do_insert, but the leading underscore tells me they are not recommended for use. They also take a lot of parameters, which pollutes the code when you override them.
The only solution left that I could find is using django.db.models.signals. But I believe those are meant for external purposes, like creating a UserProfile on every User create, whereas I have internal purposes like updating fields on update. Using signals also makes the code spread out and harder to understand.
What would be the right way to implement functionality on these Model events?
Look at this simplified condition from Django's base Model _save_table method: the decision to update or insert depends on the model's pk and the force_insert value:
def _save_table(self, raw=False, cls=None, force_insert=False,
                force_update=False, using=None, update_fields=None):
    updated = False
    # ...
    if pk_set and not force_insert:
        updated = self._do_update()
    if not updated:
        result = self._do_insert()
    # ...
    return updated
And you can go the same way in your overridden save method if you want entirely custom update/insert operations:

def save(self, force_insert=False, **kwargs):
    updated = False
    if self.pk and not force_insert:
        updated = self.custom_update()
    if not updated:
        self.custom_insert()
    return updated
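To check that this dispatch behaves as expected without a database, here is a minimal plain-Python sketch of the same pk/force_insert logic (the Record class and custom_* method bodies are made up for illustration, not Django API):

```python
class Record:
    _next_pk = 1

    def __init__(self):
        self.pk = None  # unsaved objects have no primary key yet

    def save(self, force_insert=False):
        # same decision as the overridden save() above:
        # existing pk and no force_insert -> update, otherwise insert
        updated = False
        if self.pk and not force_insert:
            updated = self.custom_update()
        if not updated:
            self.custom_insert()
        return updated

    def custom_update(self):
        return True  # pretend the row was updated

    def custom_insert(self):
        self.pk = Record._next_pk
        Record._next_pk += 1
```

The first save() takes the insert branch and returns False; the second finds a pk and returns True.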
I would like to know how the onchange function works with boolean and integer fields.
Suppose a boolean field gets changed to True; the value of a related integer field should then change.
Thanks in advance.
@api.onchange
This decorator triggers a call to the decorated function if any of the fields specified in the decorator is changed on the form:

@api.onchange('fieldx')
def do_stuff(self):
    if self.fieldx == x:
        self.fieldy = 'toto'
In the previous sample, self corresponds to the record currently being edited on the form. In the onchange context, all work is done in the cache, so you can alter the RecordSet inside your function without worrying about altering the database. That's the main difference from @api.depends.
When the function returns, the differences between the cache and the RecordSet are sent back to the form.
View management
One of the great improvements of the new API is that onchange methods are automatically inserted into the form for you. You no longer have to worry about modifying views.
Warning and Domain
To change a domain or send a warning, just return the usual dictionary. Be careful not to use @api.one in that case, as it will mangle the dictionary (wrap it in a list, which is not supported by the web client).
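As a sketch of "the usual dictionary", a small helper (hypothetical name onchange_result, not an Odoo API) builds the shape an onchange method returns to raise a warning or restrict a domain:

```python
def onchange_result(warning_title=None, warning_message=None, domain=None):
    # build the dictionary an onchange method can return:
    # {'warning': {...}} pops a dialog, {'domain': {...}} restricts a field
    result = {}
    if warning_title or warning_message:
        result['warning'] = {'title': warning_title,
                             'message': warning_message}
    if domain:
        result['domain'] = domain
    return result
```

Returning onchange_result('Check quantity', 'Quantity was reset to 1') from inside the decorated method would show the warning on the form.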
I had a problem with post_save being called twice, and I spent a lot of time checking my imports, as has been suggested elsewhere. I confirmed that the import happens only once, so there is no question of multiple registrations. Besides, I'm using a unique dispatch_uid in the signal registration, which per the documentation should have solved the problem. It did not. Looking more carefully, I saw that the signal handler gets called on .create() as well as on .save(). Why for create?
The only way I could get it to work is by relying on the hack below inside my signal handler
# Workaround for the signal being emitted on both create and save
created = kwargs.get('created', False)
# If the signal comes from object creation, return
if created:
    return
This is a follow up to question Django post save signal getting called twice despite uid
Because "creation" is instantiation plus saving.
create(**kwargs)
A convenience method for creating an object and saving it all in one step. Thus:
p = Person.objects.create(first_name="Bruce", last_name="Springsteen")
and:
p = Person(first_name="Bruce", last_name="Springsteen")
p.save(force_insert=True)
are equivalent.