What is the difference between using django signals and calling a method? - python

I am trying to implement a function which sends a notification to all employee records whenever a new document record is published. In the model, I still needed to import the receiver function, because my sender model lives in a different app of the project:
receiver function (lives in a different app in project):
def new_document_version_published(sender, instance, **kwargs):
    print("New version of document published!")
    print(sender)
    print(instance)
    # Get all employees
    employees = []
    # Send notifications to employees
    buttons = [NotificationButton(button_text="New version of document", value="Ok", style="primary")]
    notifyUsers("A new version of the document has been published", buttons, employees, [])
sender (lives in a different app in project):
from django.db.models.signals import post_save
from api.views import new_document_version_published

class DocumentVersion(models.Model):
    ...

    def save(self, *args, **kw):
        if self.pk is not None:
            orig = DocumentVersion.objects.get(pk=self.pk)
            if orig.date_published != self.date_published:
                print('date_published changed')
                notify_employees()
            if orig.date_approved != self.date_approved:
                print('date_approved changed')
        super(DocumentVersion, self).save(*args, **kw)

def notify_employees():
    post_save.connect(new_document_version_published, sender=DocumentVersion)
I know there is something wrong with my implementation, because I don't understand what the difference is between using the signal and just importing and calling the receiver function. All help appreciated!

What is the difference
Calling a function makes the caller dependent on (or at least aware of) the receiver function, while using Django signals makes the receiver function dependent on the signal being called by its caller(s).
From https://docs.djangoproject.com/en/3.2/topics/signals/:
... helps decoupled applications get notified when actions occur elsewhere in the framework. In a nutshell, signals allow certain senders to notify a set of receivers that some action has taken place. They’re especially useful when many pieces of code may be interested in the same events.
When to use Django signals
From https://www.django-antipatterns.com/antipattern/signals.html:
Signals have a variety of problems and unforeseen consequences.
...
Often it is better to avoid using signals. One can implement a lot of logic without signals.
...
Signals can still be a good solution if you want to handle events raised by a third party Django application.
...
What that difference looks like
Calling a function
Pre-save:
# notify_employees() # Replace this
new_document_version_published(self.__class__, self) # with this
Post-save:
class DocumentVersion(models.Model):
    ...

    def save(self, *args, **kw):
        orig = None                                                # Add this
        if self.pk is not None:
            orig = DocumentVersion.objects.get(pk=self.pk)
            if orig.date_published != self.date_published:
                print('date_published changed')
                # notify_employees()                               # Replace this
            if orig.date_approved != self.date_approved:
                print('date_approved changed')
        super(DocumentVersion, self).save(*args, **kw)
        if orig and orig.date_published != self.date_published:   # with this
            new_document_version_published(self.__class__, self)  #
Using Django Signals
Since Django's post_save signal doesn't pass orig, let's use a custom signal:
1. Define a signal.
2. Send the signal.
3. Implement the receiver function.
4. Connect the receiver function, usually where the receiver function is defined.
Post-save:
from django.dispatch import Signal

post_save_published = Signal()                                     # Add this

class DocumentVersion(models.Model):
    ...

    def save(self, *args, **kw):
        orig = None                                                # Add this
        if self.pk is not None:
            orig = DocumentVersion.objects.get(pk=self.pk)
            if orig.date_published != self.date_published:
                print('date_published changed')
                # notify_employees()                               # Replace this
            if orig.date_approved != self.date_approved:
                print('date_approved changed')
        super(DocumentVersion, self).save(*args, **kw)
        if orig and orig.date_published != self.date_published:   # with this
            post_save_published.send(sender=self.__class__, instance=self)  #

def new_document_version_published(sender, instance, **kwargs):
    ...

post_save_published.connect(new_document_version_published, sender=DocumentVersion)  # Add this

Related

Adding an item to many to many after creation in django

Recently I've been trying to do something with this.
Think of the family as a facebook group.
class Family(models.Model):
    name = models.CharField(max_length=50)
    owner = models.ForeignKey(User, on_delete=models.CASCADE, related_name='owned_families')
    users = models.ManyToManyField(User, related_name='families', blank=True)
Let's assume we have a family object called fm, for illustration purposes.
My problem is: the owner is one of the users, right? I mean, when someone creates a family, he's now the owner, right? He owns it, but he's still a user listed in its users list.
Now, when I create a new family fm, I want to add fm.owner to fm.users.
Let's talk about what I've tried.
1. post_save signal doesn't work with m2m. X
2. m2m_changed happens when the field is changed, not created. X
3. Overriding the save method - let me illustrate what I tried to achieve. ?
def save(self, *args, **kwargs):
    old = self.pk
    super(Family, self).save(*args, **kwargs)
    if old is None:
        print('This actually shows up')
        self.users.add(self.owner)
Basically, this saves the pk each time. The first time a family is created, before calling super(), it has no .pk, so I'm counting on this to check whether it had no pk (i.e. on creation).
The problem is that self.users.add(self.owner) doesn't work.
I've tried to clone the object as a whole and keep track of it, like:
def save(self, *args, **kwargs):
    old = self
    super(Family, self).save(*args, **kwargs)
    if old is None:
        print("This actually doesn't show up")
        self.users.add(self.owner)
This actually is terrible: it takes a reference to self, and when calling super(), both self and its reference old get mutated. I just wanted to show this, as this question itself might solve someone's problem.
So I solved this by:
import copy

def save(self, *args, **kwargs):
    old = copy.deepcopy(self)
    super(Family, self).save(*args, **kwargs)
    if old is None:
        print('This actually shows up')
        self.users.add(self.owner)
but self.users.add(self.owner) still doesn't work.
What am I missing?
The problem is probably that in the django admin, the instance is saved first, and only after that the inline formsets and m2m-fields are saved. If the owner is not in there, it will be removed.
You can override some functionality in the admin to remedy this:
class FamilyAdmin(ModelAdmin):

    def save_related(self, request, form, formsets, change):
        super(FamilyAdmin, self).save_related(request, form, formsets, change)
        form.instance.users.add(form.instance.owner)
Furthermore, you can try (note that there are other ways to remove the owner that are not picked up by any signal or other hook) to prevent code from removing the owner:
from django.db.models.signals import m2m_changed
from django.dispatch import receiver

@receiver(m2m_changed, sender=Family.users.through)
def family_users_changed(sender, **kwargs):
    family = kwargs['instance']
    pk_set = kwargs['pk_set']
    action = kwargs['action']
    if action == "pre_add":
        pk_set.add(family.owner_id)
    if action == "pre_remove":
        pk_set.remove(family.owner_id)
    if action == "post_clear":
        family.users.add(family.owner)
But generally speaking, you are jumping through these hoops because you are denormalizing your data (putting the owner in users makes that information redundant, forcing you to do extra work to keep the data consistent). Since you always know the owner is one of the users, why not wrap that in a method
from django.db.models import Q

class Family(...):
    # ...

    def members(self):
        return User.objects.filter(Q(pk__in=self.users.all()) | Q(pk=self.owner_id))
and access family members through that method?
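For example, a quick usage sketch (assuming the fm object from above and some_user as the owning User; the names are illustrative):
fm = Family.objects.create(name="The Smiths", owner=some_user)
fm.members()   # includes some_user, even though fm.users is still empty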

Django - pre-save VS post-save VS view save

I want to perform some action (sending an email) after updating an already existing object.
In order to do that, I need to compare values of the object before and after saving, and only if something specific has changed, do that action. From reading other related questions I understood that I can only do it in a pre_save signal, since I can't get the old version inside post_save. But what if there is some issue with saving and the item is not saved? I don't want to perform the action in that case. So I thought about implementing it by overriding save in the view, but I'm not sure that's the correct way to do it. What do you think?
This is the implementation in pre_save:
# check if there is a change that requires sending an email notification
@staticmethod
@receiver(pre_save, sender=Item)
def send_email_notification_if_needed(sender, instance, raw, *args, **kwargs):
    try:
        # if item was just created - don't do anything
        pre_save_item_obj = sender.objects.get(pk=instance.pk)
    except sender.DoesNotExist:
        pass  # Object is new, so field hasn't technically changed
    else:
        # check if state changed to Void
        if pre_save_item_obj.state_id != VOID and instance.state_id == VOID:
            content = {"item_name": instance.title, "item_description": instance.description}
            EmailNotificationService().send_email(
                "item_update",
                ["myemail@gmail.com"],
                str(instance.container.id) + str(instance.id) + " changed to Void",
                content)
There is nothing wrong with overriding the model's save method. After all, this is where you have all the information that you need:
class X(models.Model):

    def save(self, *args, **kwargs):
        pre_obj = X.objects.filter(pk=self.pk).first()
        super(X, self).save(*args, **kwargs)
        # no exception from save
        if pre_obj and pre_obj.state_id != VOID and self.state_id == VOID:
            # send mail
            ...

Detect field change using post_save instead of pre_save signal

I need to do some actions when one field has changed.
Since this action needs to work with already saved object, I can't use pre_save signal like this:
@receiver(pre_save, sender=reservation_models.Reservation)
def generate_possible_pairs(sender, instance, **kwargs):
    try:
        reservation_old = sender.objects.get(pk=instance.pk)
    except sender.DoesNotExist:
        pass  # Object is new, so field hasn't technically changed, but you may want to do something else here.
    else:
        if not reservation_old.datetime == instance.datetime:  # Field has changed
            do_something(instance)  # It would be better to be sure instance has been saved
Is it possible to use post_save signal for this?
I would like to avoid adding temporary attributes to this model.
Using the post_save signal you won't be able to retrieve the previous state from the db. But why use a signal at all?
class Reservation(models.Model):

    def save(self, *args, **kw):
        old = type(self).objects.get(pk=self.pk) if self.pk else None
        super(Reservation, self).save(*args, **kw)
        if old and old.datetime != self.datetime:  # Field has changed
            do_something(self)
You may also want to read this: https://lincolnloop.com/blog/django-anti-patterns-signals/
Yes, you can use post_save too. You should, however, remember that signals are synchronous.
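For completeness, a minimal sketch of what a post_save-based version could look like, assuming the Reservation model and do_something() helper from the question (note that it stashes the old value on the instance, which the question hoped to avoid):
from django.db.models.signals import pre_save, post_save
from django.dispatch import receiver

@receiver(pre_save, sender=Reservation)
def remember_old_datetime(sender, instance, **kwargs):
    # Look up the previous value before the row is overwritten.
    old = sender.objects.filter(pk=instance.pk).first()
    instance._old_datetime = old.datetime if old else None

@receiver(post_save, sender=Reservation)
def act_on_datetime_change(sender, instance, created, **kwargs):
    # Runs only after the save succeeded, so the instance is definitely persisted.
    if not created and getattr(instance, '_old_datetime', None) != instance.datetime:
        do_something(instance)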

Django post_save preventing recursion without overriding model save()

There are many Stack Overflow posts about recursion using the post_save signal, to which the comments and answers are overwhelmingly: "why not override save()" or a save that is only fired upon created == True.
Well I believe there's a good case for not using save() - for example, I am adding a temporary application that handles order fulfillment data completely separate from our Order model.
The rest of the framework is blissfully unaware of the fulfillment application and using post_save hooks isolates all fulfillment related code from our Order model.
If we drop the fulfillment service, nothing about our core code has to change. We delete the fulfillment app, and that's it.
So, are there any decent methods to ensure the post_save signal doesn't fire the same handler twice?
You can use update() instead of save() in the signal handler:
sender.objects.filter(pk=instance.pk).update(....)
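A minimal sketch of that idea, assuming a hypothetical Order model with a fulfillment_synced flag (the field and helper names are illustrative, not from the original post):
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=Order)
def sync_fulfillment(sender, instance, created, **kwargs):
    # push_to_fulfillment_service() is a stand-in for whatever the handler actually does.
    push_to_fulfillment_service(instance)
    # .update() issues a plain SQL UPDATE and never calls save(),
    # so post_save is not sent again and the handler cannot recurse.
    sender.objects.filter(pk=instance.pk).update(fulfillment_synced=True)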
What do you think about this solution?
@receiver(post_save, sender=Article)
def generate_thumbnails(sender, instance=None, created=False, **kwargs):
    if not instance:
        return
    if hasattr(instance, '_dirty'):
        return
    do_something()
    try:
        instance._dirty = True
        instance.save()
    finally:
        del instance._dirty
You can also create a decorator:
from functools import wraps

def prevent_recursion(func):

    @wraps(func)
    def no_recursion(sender, instance=None, **kwargs):
        if not instance:
            return
        if hasattr(instance, '_dirty'):
            return
        func(sender, instance=instance, **kwargs)
        try:
            instance._dirty = True
            instance.save()
        finally:
            del instance._dirty

    return no_recursion

@receiver(post_save, sender=Article)
@prevent_recursion
def generate_thumbnails(sender, instance=None, created=False, **kwargs):
    do_something()
Don't disconnect signals. If any new instance of the same model is saved while the signal is disconnected, the handler function won't be fired. Signals are global across Django, and several requests can be running concurrently, making some fail while others run their post_save handler.
I think creating a save_without_signals() method on the model is more explicit:
class MyModel(models.Model):

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)  # Call super here.
        self._disable_signals = False

    def save_without_signals(self):
        """
        This allows for updating the model from code running inside post_save()
        signals without going into an infinite loop.
        """
        self._disable_signals = True
        self.save()
        self._disable_signals = False

def my_model_post_save(sender, instance, *args, **kwargs):
    if not instance._disable_signals:
        # Execute the code here.
        ...
How about disconnecting then reconnecting the signal within your post_save function:
def my_post_save_handler(sender, instance, **kwargs):
    post_save.disconnect(my_post_save_handler, sender=sender)
    instance.do_stuff()
    instance.save()
    post_save.connect(my_post_save_handler, sender=sender)

post_save.connect(my_post_save_handler, sender=Order)
You should use queryset.update() instead of Model.save(), but you need to take care of something else: if you want to use the updated object afterwards, you should fetch it from the database again, because .update() will not change the in-memory object. For example:
>>> MyModel.objects.create(pk=1, text='')
>>> el = MyModel.objects.get(pk=1)
>>> MyModel.objects.filter(pk=1).update(text='Updated')
>>> print(el.text)
''
So, if you want to use the updated object, you should fetch it again:
>>> MyModel.objects.create(pk=1, text='')
>>> el = MyModel.objects.get(pk=1)
>>> MyModel.objects.filter(pk=1).update(text='Updated')
>>> el = MyModel.objects.get(pk=1)  # Do it again
>>> print(el.text)
'Updated'
You could also check the raw argument in post_save and then call save_base instead of save.
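A rough sketch of what that suggestion might look like, reusing the Article example from above (note that raw=True also tells Django to save the row "as is", e.g. skipping parent-model handling, so whether this fits depends on your model):
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=Article)
def generate_thumbnails(sender, instance, created, raw=False, **kwargs):
    if raw:
        # This save came from save_base(raw=True) below (or from loaddata),
        # so bail out instead of recursing.
        return
    do_something(instance)
    # Persist any changes made above without re-running the handler body.
    instance.save_base(raw=True)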
The model's .objects.update() method bypasses the post_save signal.
Try something like this:
from django.db import models
from django.db.models.signals import post_save

class MyModel(models.Model):
    name = models.CharField(max_length=200)
    num_saves = models.PositiveSmallIntegerField(default=0)

    @classmethod
    def post_save(cls, sender, instance, created, *args, **kwargs):
        MyModel.objects.filter(id=instance.id).update(num_saves=instance.num_saves + 1)

post_save.connect(MyModel.post_save, sender=MyModel)
In this example, an object has a name and each time .save() is called, the .num_saves property is incremented, but without recursion.
Check this out...
Each signal has its own benefits, as you can read about in the docs here, but I wanted to share a couple of things to keep in mind with the pre_save and post_save signals.
Both are sent every time .save() on a model is called. In other words, if you save the model instance, the signals are sent.
Running save() on the instance within a post_save handler can create a never-ending loop and therefore cause a maximum recursion depth exceeded error, but only if you don't use .save() correctly.
pre_save is great for changing just instance data, because you never have to call save(), which eliminates the possibility above. The reason you don't have to call save() is that a pre_save signal fires right before the instance is saved.
Signals can call other signals and/or run delayed tasks (e.g. with Celery), which can be huge for usability.
Source: https://www.codingforentrepreneurs.com/blog/post-save-vs-pre-save-vs-override-save-method/
Regards!!
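To illustrate the pre_save point above, a minimal sketch (a hypothetical Article model with title and slug fields; the names are illustrative):
from django.db.models.signals import pre_save
from django.dispatch import receiver
from django.utils.text import slugify

@receiver(pre_save, sender=Article)
def set_slug(sender, instance, **kwargs):
    # Mutating the instance here is enough; the save that triggered the
    # signal will persist the change, so no extra save() call is needed.
    if not instance.slug:
        instance.slug = slugify(instance.title)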
In a post_save signal in Django, an 'if created' check is required to avoid recursion:
from django.dispatch import receiver
from django.db.models.signals import post_save

@receiver(post_save, sender=DemoModel)
def _post_save_receiver(sender, instance, created, **kwargs):
    if created:
        print('hi..')
        instance.save()
I was using the save_without_signals() method by @Rune Kaagaard until I updated my Django to 4.1. On Django 4.1 this method started raising an IntegrityError on the database that gave me 4 days of headaches and that I couldn't fix.
So I started to use the queryset .update() method and it worked like a charm. It triggers neither pre_save() nor post_save(), and you don't need to override the save() method of your model. One line of code.
@receiver(pre_save, sender=Your_model)
def any_name(sender, instance, **kwargs):
    Your_model.objects.filter(pk=instance.pk).update(model_attribute=any_value)

How to use Django model inheritance with signals?

I have a few model inheritance levels in Django:
class WorkAttachment(models.Model):
    """ Abstract class that holds all fields that are required in each attachment """
    work = models.ForeignKey(Work)
    added = models.DateTimeField(default=datetime.datetime.now)
    views = models.IntegerField(default=0)

    class Meta:
        abstract = True

class WorkAttachmentFileBased(WorkAttachment):
    """ Another base class, but for file based attachments """
    description = models.CharField(max_length=500, blank=True)
    size = models.IntegerField(verbose_name=_('size in bytes'))

    class Meta:
        abstract = True

class WorkAttachmentPicture(WorkAttachmentFileBased):
    """ Picture attached to work """
    image = models.ImageField(upload_to='works/images', width_field='width', height_field='height')
    width = models.IntegerField()
    height = models.IntegerField()
There are many different models inherited from WorkAttachmentFileBased and WorkAttachment. I want to create a signal which updates an attachment_count field on the parent work when an attachment is created. It would be logical to think that a signal connected for the parent sender (WorkAttachment) would run for all inherited models too, but it does not. Here is my code:
@receiver(post_save, sender=WorkAttachment, dispatch_uid="att_post_save")
def update_attachment_count_on_save(sender, instance, **kwargs):
    """ Update file count for work when attachment was saved."""
    instance.work.attachment_count += 1
    instance.work.save()
Is there a way to make this signal work for all models inherited from WorkAttachment?
Python 2.7, Django 1.4 pre-alpha
P.S. I've tried one of the solutions I found on the net, but it did not work for me.
You could register the connection handler without specifying a sender, and filter for the needed models inside it.
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save)
def my_handler(sender, **kwargs):
    # Returns False if 'sender' is NOT a subclass of AbstractModel
    if not issubclass(sender, AbstractModel):
        return
    ...
Ref: https://groups.google.com/d/msg/django-users/E_u9pHIkiI0/YgzA1p8XaSMJ
The simplest solution is to not restrict on the sender, but to check in the signal handler whether the respective instance is a subclass:
@receiver(post_save)
def update_attachment_count_on_save(sender, instance, **kwargs):
    if isinstance(instance, WorkAttachment):
        ...
However, this may incur a significant performance overhead as every time any model is saved, the above function is called.
I think I've found the most Django-way of doing this: recent versions of Django suggest connecting signal handlers in a file called signals.py. Here's the necessary wiring code:
your_app/__init__.py:
default_app_config = 'your_app.apps.YourAppConfig'
your_app/apps.py:
import django.apps

class YourAppConfig(django.apps.AppConfig):
    name = 'your_app'

    def ready(self):
        import your_app.signals
your_app/signals.py:
from django.db.models.signals import post_save

from your_app.models import WorkAttachment

def get_subclasses(cls):
    result = [cls]
    classes_to_inspect = [cls]
    while classes_to_inspect:
        class_to_inspect = classes_to_inspect.pop()
        for subclass in class_to_inspect.__subclasses__():
            if subclass not in result:
                result.append(subclass)
                classes_to_inspect.append(subclass)
    return result

def update_attachment_count_on_save(sender, instance, **kwargs):
    instance.work.attachment_count += 1
    instance.work.save()

for subclass in get_subclasses(WorkAttachment):
    post_save.connect(update_attachment_count_on_save, subclass)
I think this works for all subclasses, because they will all be loaded by the time YourAppConfig.ready is called (and thus signals is imported).
You could try something like:
model_classes = [WorkAttachment, WorkAttachmentFileBased, WorkAttachmentPicture, ...]

def update_attachment_count_on_save(sender, instance, **kwargs):
    instance.work.attachment_count += 1
    instance.work.save()

for model_class in model_classes:
    post_save.connect(update_attachment_count_on_save,
                      sender=model_class,
                      dispatch_uid="att_post_save_" + model_class.__name__)
(Disclaimer: I have not tested the above)
I just did this using python's (relatively) new __init_subclass__ method:
from django.db import models

def perform_on_save(*args, **kw):
    print("Doing something important after saving.")

class ParentClass(models.Model):

    class Meta:
        abstract = True

    @classmethod
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        models.signals.post_save.connect(perform_on_save, sender=cls)

class MySubclass(ParentClass):
    pass  # signal automatically gets connected.
This requires Django 2.1 and Python 3.6 or better. Note that the @classmethod line seems to be required when working with the Django model and associated metaclass, even though it's not required according to the official Python docs.
post_save.connect(my_handler, ParentClass)

# connect all subclasses of base content item too
for subclass in ParentClass.__subclasses__():
    post_save.connect(my_handler, subclass)
have a nice day!
Michael Herrmann's solution is definitely the most Django-way of doing this.
And yes it works for all subclasses as they are loaded at the ready() call.
I would like to contribute the documentation references:
In practice, signal handlers are usually defined in a signals submodule of the application they relate to. Signal receivers are connected in the ready() method of your application configuration class. If you’re using the receiver() decorator, simply import the signals submodule inside ready().
https://docs.djangoproject.com/en/dev/topics/signals/#connecting-receiver-functions
And add a warning:
The ready() method may be executed more than once during testing, so you may want to guard your signals from duplication, especially if you’re planning to send them within tests.
https://docs.djangoproject.com/en/dev/topics/signals/#connecting-receiver-functions
So you might want to prevent duplicate signals with a dispatch_uid parameter on the connect function.
post_save.connect(my_callback, dispatch_uid="my_unique_identifier")
In this context I'll do:
for subclass in get_subclasses(WorkAttachment):
    post_save.connect(update_attachment_count_on_save, subclass, dispatch_uid=subclass.__name__)
https://docs.djangoproject.com/en/dev/topics/signals/#preventing-duplicate-signals
This solution resolves the problem when not all modules are imported into memory.
def inherited_receiver(signal, sender, **kwargs):
    """
    Decorator that connects a receiver to a signal for the given sender
    class and all of its subclasses.

    @inherited_receiver(post_save, sender=MyModel)
    def signal_receiver(sender, **kwargs):
        ...
    """
    parent_cls = sender

    def wrapper(func):
        def childs_receiver(sender, **kw):
            """
            The receiver makes sure that func executes for child
            (and the parent) classes only.
            """
            child_cls = sender
            if issubclass(child_cls, parent_cls):
                func(sender=child_cls, **kw)
        signal.connect(childs_receiver, **kwargs)
        return childs_receiver
    return wrapper
It's also possible to use content types to discover subclasses - assuming you have the base class and subclasses packaged in the same app. Something like this would work:
from django.contrib.contenttypes.models import ContentType

content_types = ContentType.objects.filter(app_label="your_app")

for content_type in content_types:
    model = content_type.model_class()
    post_save.connect(update_attachment_count_on_save, sender=model)
In addition to @clwainwright's answer, I adapted his answer to work for the m2m_changed signal instead. I had to post it as an answer for the code formatting to make sense:
@classmethod
def __init_subclass__(cls, **kwargs):
    super().__init_subclass__(**kwargs)
    for m2m_field in cls._meta.many_to_many:
        if hasattr(cls, m2m_field.attname) and hasattr(getattr(cls, m2m_field.attname), 'through'):
            models.signals.m2m_changed.connect(m2m_changed_receiver, weak=False,
                                               sender=getattr(cls, m2m_field.attname).through)
It does a couple of checks to ensure it doesn't break if anything changes in future Django versions.
