When is the "post_save" signal in django activated/called? - python

I have to update some database tables after saving a particular model, so I used the @receiver(post_save) decorator. But inside the decorated function, the values are still not saved in the database. I have a one-to-many relation, but when I get the instance currently being saved via kwargs['instance'], it has no child objects. Yet after saving, when I check from the shell, it does have child objects. This is the code I'm using:
@receiver(post_save, sender=Test)
def do_something(sender, **kwargs):
    test = kwargs['instance']
    users = User.objects.filter(tags__in=test.tags.values_list('id', flat=True))
    for user in users:
        other_model = OtherModel(user=user, test=test, is_new=True)
        other_model.save()

post_save is sent at the end of Model.save_base(), which is itself called by Model.save(). This means that if you override your model's save() method, post_save is sent when you call super(YourModel, self).save(*args, **kwargs).
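For illustration, here is a minimal sketch (the model and field names are invented) showing that the signal fires inside the super().save() call:

from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver

class Article(models.Model):  # hypothetical model, for illustration only
    title = models.CharField(max_length=100)

    def save(self, *args, **kwargs):
        # post_save receivers run inside this call...
        super(Article, self).save(*args, **kwargs)
        # ...so anything you do here happens *after* they have already run.

@receiver(post_save, sender=Article)
def article_saved(sender, instance, created, **kwargs):
    print(f'post_save fired for {instance.pk}, created={created}')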
If Tag has a ForeignKey to Test and the Test instance was just created, you can't expect any Tag instance to be related to your Test instance at this stage, since the Tag instances obviously need to know the Test instance's pk before they can be saved themselves.

The post_save for the parent instance is called when the parent instance is saved. If the children are added after that, then they won't exist at the time the parent post_save is called.
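If Tag really does have a ForeignKey to Test, one way around this is to hang the receiver on the child model instead, since a Tag can only be saved once its parent Test exists. A sketch under that assumption (the FK name test is guessed, the rest is taken from the question):

from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=Tag)
def tag_saved(sender, instance, **kwargs):
    test = instance.test  # the parent Test is guaranteed to exist here
    users = User.objects.filter(tags__in=test.tags.values_list('id', flat=True))
    for user in users:
        # get_or_create avoids duplicate rows when several tags are saved
        OtherModel.objects.get_or_create(user=user, test=test, defaults={'is_new': True})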

Related

Django: What is the simplest way to asynchronously execute some function when there is a change to the database?

I am familiar with the DBMS_ALERT feature in Oracle that sends an alert to the operating system when there is a change to the database, and am somewhat familiar with database triggers in general. I have browsed through the Django docs on signals and am inclined to use them if there is not a simpler way.
All I want to do is update or create an external file on the system's file system whenever a record is updated or created in the database. I would like this method to be defined and called right in models.py, as depicted below.
models.py
from django.db import models
from django.shortcuts import reverse

class Entry(models.Model):
    subject = models.CharField(max_length=20, unique=True)
    content = models.TextField(blank=False)

    class Meta:
        ordering = ['subject']

    def __str__(self):
        """String for representing the Model object."""
        return self.subject

    def get_absolute_url(self):
        """Returns the url to access a detail record for this entry."""
        return reverse('entry_detail', args=[int(self.id)])

    def ondbchange_updatecorrfile(self):
        # called upon a change in the Entry model, so it is already aware of the object
        util.save_entry(self.subject, self.content)  # file corresponding to row is updated or created
What would be the simplest method to implement the ondbchange_updatecorrfile(self) method above?
I have looked at the code below from this source:
models.py
class Entry(models.Model):
    ...

from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=Entry)
def queue_task(sender, instance, created, **kwargs):
    tasks.foo.delay(object=instance)
Then foo is some function in another class that would update the file. Since the model is the class that is aware of the change in its underlying database table, do we really need to use a signal if the desired function is already in the model and "self-aware" of the database change?
Any help deeply appreciated. Please be specific since the devil is in the details.
The post_save signal is sent by the Entry model after a save, but according to this source, the signal does not include the model changes in its update_fields parameter. These must be passed manually at the point where save() is called, which defeats the purpose. I initially assumed that the signal would automatically include this information.
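To make that concrete: the receiver only sees update_fields if the caller passes it to save() explicitly, otherwise it is None. A small sketch using the Entry model above:

@receiver(post_save, sender=Entry)
def entry_saved(sender, instance, created, update_fields, **kwargs):
    print(update_fields)

entry = Entry.objects.first()
entry.content = 'new text'
entry.save()                           # receiver prints None
entry.save(update_fields=['content'])  # receiver prints frozenset({'content'})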

Django CASCADE and post_delete interaction

I have the following model:
class A(models.Model):
    # refs to a database not managed by Django
    foriegn_id1 = models.CharField(max_length=100)  # max_length assumed
    foriegn_id2 = models.CharField(max_length=100)

class B(models.Model):
    a = models.OneToOneField(A, on_delete=models.CASCADE)
So I want A to be deleted as well when B is deleted:
@receiver(post_delete, sender=B)
def post_delete_b(sender, instance, *args, **kwargs):
    if instance.a:
        instance.a.delete()
And on the deletion of A, I want to delete the objects from the unmanaged databases:
@receiver(post_delete, sender=A)
def post_delete_a(sender, instance, *args, **kwargs):
    if instance.foriegn_id1:
        delete_foriegn_obj_1(instance.foriegn_id1)
    if instance.foriegn_id2:
        delete_foriegn_obj_2(instance.foriegn_id2)
Now, if I delete an object B, it works fine. But if I delete an object A, then object B is deleted by the cascade, which emits a post_delete signal that triggers the deletion of A again. Django knows how to manage that on its end, so it works fine until it reaches delete_foriegn_obj_*, which is then called twice and fails on the second attempt.
I thought about validating that the object exists in delete_foriegn_obj, but it adds 3 more calls to the DB.
So the question is: is there a way to know during post_delete_b that object a has been deleted?
Both instance.a and A.objects.get(id=instance.a.id) return the object (I guess Django defers the DB update until all of the deletions are done).
The problem is that the cascaded deletions are performed before the requested object is deleted, hence when you query the DB (A.objects.get(id=instance.a.id)) the related A instance is still present there. instance.a can even return a cached result, so there's no way it would show otherwise.
So while deleting a B model instance, the related A instance will always exist (if there actually is one). Hence, from the B model's post_delete receiver, you can get the related A instance and check whether the related B still exists in the DB (there's no way to avoid the DB here to get the actual picture underneath):
@receiver(post_delete, sender=B)
def post_delete_b(sender, instance, *args, **kwargs):
    try:
        a = instance.a
    except AttributeError:
        return
    try:
        a._state.fields_cache = {}
    except AttributeError:
        pass
    try:
        a.b  # one extra query
    except AttributeError:
        # this is a cascaded delete
        return
    a.delete()
We also need to make sure we're not getting any cached result by making a._state.fields_cache empty. The fields_cache (which is actually a descriptor that returns a dict upon first access) is used by the ReverseOneToOneDescriptor (accessor to the related object on the opposite side of a one-to-one) to cache the related field name-value. FWIW, the same is done on the forward side of the relationship by the ForwardOneToOneDescriptor accessor.
Edit based on comment:
If you're using this function for multiple senders' post_delete, you can dynamically get the related attribute via getattr:
getattr(a, sender.a.field.related_query_name())
This does the same as a.b above, but lets us look the attribute up dynamically by name, so it results in exactly the same query.
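Put together, a generic receiver might look like this (a sketch that follows the answer's logic; it assumes every sender model has a one-to-one field named a, like B above):

from django.db.models.signals import post_delete

def post_delete_related(sender, instance, *args, **kwargs):
    try:
        a = instance.a
    except AttributeError:
        return
    try:
        a._state.fields_cache = {}
    except AttributeError:
        pass
    try:
        # same check as `a.b` above, resolved dynamically per sender
        getattr(a, sender.a.field.related_query_name())
    except AttributeError:
        return  # cascaded delete; A is being deleted anyway
    a.delete()

for model in (B,):  # connect for as many senders as needed
    post_delete.connect(post_delete_related, sender=model)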

Django - How to dynamically create signals inside model Mixin

I'm working on a model Mixin which needs to dynamically set signals based on one attribute.
It's more complicated but for simplicity, let's say the Mixin has this attribute:
models = ['app.model1','app.model2']
This attribute is defined in the model that extends the mixin.
How can I register signals dynamically?
I tried to create a classmethod:
@classmethod
def set_signals(cls):
    def status_sig(sender, instance, created, *args, **kwargs):
        print('SIGNAL')
        # ... do some things
    for m in cls.get_target_models():
        post_save.connect(status_sig, m)
My idea was to call this method automatically somewhere in the class (for example from __call__), but for now I just called it directly and then saved the model to see if it works. It didn't.
from django.db.models.signals import post_save
print(post_save.receivers)
Realestate.set_signals()
print(post_save.receivers)
r = Realestate.objects.first()
r.status = 1
r.save()
output
[]
[((139967044372680, 46800232), <weakref at 0x7f4c9d702408; dead>), ((139967044372680, 46793464), <weakref at 0x7f4c9d702408; dead>)]
So you can see that it registered those models, but no signal was triggered after saving the Realestate.
Do you know how to make it work? Even better, without having to call the method explicitly?
EDIT:
I can't just put the signal creation inside the mixin file, because the models depend on the string defined in the child model.
If you haven't already solved this:
In the connect method, set weak=False. By default it's True, so the reference to a locally defined receiver function is dropped as soon as the function is garbage collected.
This is likely what happened to your status_sig function; as you can see in the printout of the post_save receivers, the weakrefs are dead, so they will always just return None.
In the Django docs:
weak – Django stores signal handlers as weak references by default. Thus, if your receiver is a local function, it may be garbage collected. To prevent this, pass weak=False when you call the signal’s connect() method.
For more info on weakrefs, see Python docs
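Applied to the classmethod from the question, the fix is a single extra argument (get_target_models() and the receiver body are as in the question):

@classmethod
def set_signals(cls):
    def status_sig(sender, instance, created, *args, **kwargs):
        print('SIGNAL')
    for m in cls.get_target_models():
        # weak=False keeps a strong reference, so the local closure
        # survives after set_signals() returns
        post_save.connect(status_sig, sender=m, weak=False)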

Django model, default records

I am a beginner in Django, and I am learning models for now.
I have two tables in the backend, one a child of the other (a 1-to-many relationship).
So far, so good.
What I want to do is set up Django so that when a record is created in the parent table, three records are automatically created in the child table.
How do I program this?
Thanks.
You may be interested in something like this:
# ... other imports ...
from django.db.models.signals import post_save

class Parent(models.Model):
    # this decorator & this method go inside your model
    @staticmethod
    def create_children(sender, instance=None, created=False, **kwargs):
        if not created:
            # only create children on the initial save, not on updates
            return
        for x in range(3):
            # I'm assuming you want the child to be linked to the parent immediately.
            # You can set any other attributes you want here too, of course.
            child = Child(parent=instance)
            child.save()

post_save.connect(Parent.create_children, Parent)
Note that in the post_save.connect() call, you can connect any [SomeClass].[SomeMethodOfThatClass] (the first argument) to the save of some other class (the second argument). Using post_save rather than pre_save matters here: at pre_save time the parent has no pk yet, so the children could not be linked to it, and the created flag keeps the receiver from adding three more children on every subsequent save. In practice, though, I don't think I've actually ever connected one class's method to another class's save, and I'm not sure that you need to do that here.
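A quick shell check of the behaviour (assuming Child has a ForeignKey named parent with the default reverse accessor):

p = Parent.objects.create()
p.child_set.count()  # -> 3, created by the receiver
p.save()
p.child_set.count()  # still 3; the `created` guard skips plain updates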

Custom Storage delete() method is not called in Django

I have a field that inherits from ImageField with a custom storage like this:
image = PicasaImageField(upload_to=get_upload_to, blank=True, storage=pwcs())
When deleting an object with this field (using object.delete(), NOT a bulk delete), the delete() method of the custom storage doesn't get called. Trying to debug, I couldn't find where Django goes through the fields of an object being deleted to remove the file, or whatever is behind it in the actual storage. Should I delete the file manually, in a hook, or write a custom delete() method on my model that calls the behind-the-scenes delete() on the actual object? I failed to find how this is handled with the standard ImageField and the default filesystem storage, but I would assume regular files get deleted. Or am I getting it wrong?
Thanks for any insights.
Igor
From the Django 1.3 release notes:
when a model is deleted the FileField's delete() method won't be called. If you need cleanup of orphaned files, you'll need to handle it yourself
As you suggested you could handle this in a custom delete method:
def delete(self, *args, **kwargs):
    self.image.delete(save=False)
    super(Foo, self).delete(*args, **kwargs)
or use a receiver function, which will run even when Model.delete() is not called (e.g. for bulk queryset deletes):
from django.db.models.signals import post_delete
from django.dispatch import receiver

@receiver(post_delete, sender=Foo, weak=False)
def delete_image_on_file(sender, instance, **kwargs):
    instance.image.delete(save=False)
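FieldFile.delete() is what ends up calling delete() on the storage backend behind the field, so either variant above will reach the custom storage. A quick sanity check (a sketch; Foo and pwcs are the names from the question):

foo = Foo.objects.first()
name = foo.image.name
foo.delete()
# post_delete fires -> instance.image.delete(save=False)
# -> which calls the pwcs storage's delete(name)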
