I'm currently following Hacked Existence's Django tutorial a bit. I'm having trouble understanding the Django signals involved
def create_User_callback(sender, instance, **kwargs):
    a, b = User.objects.get_or_create(user=instance)

post_save.connect(create_User_callback, User)
I'm not quite sure about the logic behind:
post_save.connect(create_User_callback, User)
In order for a signal handler to work, you need to bind it to a signal. That is done using the signal's connect method. In your case, the signal is post_save.
connect is called with the handler function itself (not its name) and the model whose saves you care about (the sender). All models emit post_save, so passing User as the second argument "filters" the dispatch: only the post_save signal emitted by the User model will trigger your function.
Think of it like tuning a radio to listen on a frequency.
Having said all that, this particular code seems a bit pointless: after any User instance is saved, you create (or fetch) an object of the very same class that emitted the signal.
TL;DR:
I need a way to automatically trigger a custom signal after the post_save signal fires. Is there any way of doing this?
I'm currently developing a library for Django that works heavily with the post_save signal, and I was wondering if it's possible to trigger another signal after post_save, so I could implement my own and leave post_save free in case a project that uses the library needs it.
So far I know that signals expect a class as the sender argument, and if I trigger the signal manually from within post_save I would gain nothing (I'd still be occupying it). Is there any workaround for this? Am I missing something in the docs?
Although this may be possible by sending a signal manually from inside another signal handler, like this:
post_save.send(MyModel, instance=a_mymodel_instance)
There are easier ways to do something like that:
Let us assume that you follow the file organization that I use in this answer: Django Create and Save Many instances of model when another object are created
Suggestion 1:
Let us assume that your first post_save handler does something to MyModel1, and a second post_save handler does something to MyModel2 after some processing on the instance that triggered the first signal.
post_save is always sent at the end of a save() method.
Organize your signals as follows:
@receiver(post_save, sender=MyModel1)
def mymodel1_signal(sender, instance, **kwargs):
    ...  # do stuff on the MyModel1 instance,
         # then do stuff on a MyModel2 instance and call its save()

@receiver(post_save, sender=MyModel2)
def mymodel2_signal(sender, instance, **kwargs):
    ...  # do stuff on the MyModel2 instance
This way mymodel2_signal will be triggered after the call to MyModel2.save() made from inside mymodel1_signal.
Suggestion 2:
Use a mix of signals to achieve the correct "timing".
Let us assume that you want to run some process on MyModel2 before a MyModel1 instance gets saved.
Use a pre_save and a post_save signal:
@receiver(pre_save, sender=MyModel1)
def mymodel1_signal(sender, instance, **kwargs):
    ...  # do stuff on the MyModel1 instance,
         # then do stuff on a MyModel2 instance and call its save()

@receiver(post_save, sender=MyModel2)
def mymodel2_signal(sender, instance, **kwargs):
    ...  # do stuff on the MyModel2 instance
Suggestion 3:
Use a MyModel2 method directly inside a MyModel1 post_save signal.
I started learning about Django's signal feature when I wanted to run a method only when a certain event occurs.
So I read about custom signals, and I couldn't find a reason to use signals over regular function calls.
I read in the Django guide about how to send a signal:
The signal
pizza_done = django.dispatch.Signal(providing_args=["toppings", "size"])
The call:
def send_pizza(self, toppings, size):
    pizza_done.send(sender=self.__class__, toppings=toppings, size=size)
and I can't understand how this is different from just calling the receiver function directly.
Can someone enlighten me about it?
You can think of signals as an implementation of the observer design pattern.
Its purpose is to remove the dependency of the observable on the observer, so the observable knows nothing about its observers.
Saving a model (the post_save and pre_save signals) is a good example: you may have cases where YourModel.save should not need to know that some task must be done before or after saving.
Django's built-in signals are a good example, but I don't think you will need to create a custom signal every day (nor in every project!).
I also suggest that you avoid signals (built-in and custom) until you really need them; make them the last choice, as they can make your code hard to read: a reader can't easily see where your logic lives.
Before 1.7, Django had no convention for where to place your signal receivers. In Django >= 1.7 you should use AppConfig.ready to connect them.
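The observer pattern mentioned above can be sketched in a few lines of plain Python. This is a toy illustration of the idea, not Django's actual implementation:

```python
class TinySignal:
    """Toy observer-pattern dispatcher: the observable calls send()
    without knowing who, if anyone, is listening."""

    def __init__(self):
        self._receivers = []

    def connect(self, func, sender=None):
        # sender=None means "listen to every sender"
        self._receivers.append((func, sender))

    def send(self, sender, **kwargs):
        # notify every receiver whose sender filter matches
        return [func(sender=sender, **kwargs)
                for func, wanted in self._receivers
                if wanted is None or wanted is sender]

saved = TinySignal()
saved.connect(lambda sender, **kw: f"log {sender}")
results = saved.send(sender="Pizza")
```

The sender never references its observers directly; that indirection is the whole point, and also why the control flow is harder to follow than a plain function call.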
Django signals are an implementation of the publish-subscribe design pattern. They allow you to decouple the different parts of a project. For example, when you develop the "make pizza" app, you don't have to know anything about the "deliver pizza" app.
Another great example of this pattern is Django's built-in pre_save/post_save signals: you can attach any logic to those events at any time, without modifying the Django core.
A widely used scenario is adding user profile info after a User instance is created: you just add a listener to the post_save signal with sender=User.
So when creating a signal, we make a signals.py file that looks as such:
from django.dispatch import Signal
some_signal = Signal(providing_args=["arg_1","arg_2",...,"arg_n"])
#...
Now my question is: what if the arguments you are going to provide are not always the same? How would you create a signal with that kind of flexibility?
If you read the documentation:
All signals are django.dispatch.Signal instances. The providing_args is a list of the names of arguments the signal will provide to listeners. This is purely documentational, however, as there is nothing that checks that the signal actually provides these arguments to its listeners.
and if you look at the signature for the send method:
Signal.send(sender, **kwargs)
you will notice that it takes a single positional argument (the sender) plus however many keyword arguments you want, so you can send whatever you like to your signal's listeners.
I want to develop an application that monitors the database for new records and allows me to execute a method in the context of my Django application when a new record is inserted.
I am planning to use an approach where a Celery task checks the database for changes since the last check and triggers the above method.
Is there a better way to achieve this?
I'm using SQLite as the backend and tried apsw's setupdatehook API, but it doesn't seem to run my module in Django context.
NOTE: The updates are made by a different application outside Django.
Create a celery task to do whatever it is you need to do with the object:
tasks.py
from celery.decorators import task

@task()
def foo(object):
    object.do_some_calculation()
Then create a Django signal receiver that fires every time an instance of your model is saved, queuing up your task in Celery:
models.py
class MyModel(models.Model):
    ...

from django.db.models.signals import post_save
from django.dispatch import receiver

from mymodel import tasks

@receiver(post_save, sender=MyModel)
def queue_task(sender, instance, created, **kwargs):
    tasks.foo.delay(object=instance)
What's important to note is that Django's signals are synchronous; in other words, queue_task runs within the request cycle. But all queue_task does is tell Celery to handle the actual guts of the work (do_some_calculation) in the background.
A better way would be to have the application that modifies the records call yours, or at least push an entry onto a Celery queue, so that you don't have to poll the database to see whether something changed.
But if that is not an option, having Celery poll the database for changes is probably the next best choice (and surely better than the remaining option of calling a web service from a database trigger, which you should really avoid).
I am making a game. Every time a certain action occurs it sends a signal. The signal is used by many different objects. For example, player model instances will regenerate a little bit of health, lanterns will decrease the oil they have left. I want instances of these model to react and modify their data when the signal is emitted. However, I don't know how to refer to the instance itself in the receiver function, since I can't put "self" in the list of arguments the function uses.
class Lantern(models.Model):
    oil_left = models.IntegerField(default=4)

    @receiver(mySignal)
    def burn(sender, **kwargs):
        self.oil_left -= 1  # <- self is not defined, obviously
        self.save()         # <- self is not defined, obviously
Why a receiver as an instance method? Because the set of instances reacting to the signal is unknown when the signal is sent. The signal is merely a ping notifying interested objects (i.e. those with a receiver method) that an event occurred, so they can trigger their own specific behaviour.
Why do you want the receiver to be an instance method? There doesn't seem to be any reason for it. You just need to ensure that when you write the signal-sending code itself, it passes the relevant instance - e.g. as an instance kwarg - exactly as the built-in pre_save and post_save signals do.
Added after edit to question: But that is exactly how signals are supposed to work. There is a single signal, which is sent along with any associated information, and any number of receivers that listen to it. But the receivers themselves aren't associated with particular instances - they can't be: an instance only exists when you actually instantiate it(!), and otherwise it's just a row in a database.
Perhaps your receiver function could query for the relevant objects itself, and update them there - or even better, do an update query to change them in-place.
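One way to sketch that last suggestion: a single module-level receiver updates every affected row in one query, instead of each instance trying to receive the signal itself. Lantern and mySignal are the names from the question; this fragment assumes a configured Django project and is not runnable on its own:

```python
from django.db.models import F
from django.dispatch import receiver

@receiver(mySignal)
def burn_all_lanterns(sender, **kwargs):
    # one UPDATE statement decrements every lantern that still has oil,
    # instead of loading and saving instances one by one
    Lantern.objects.filter(oil_left__gt=0).update(oil_left=F("oil_left") - 1)
```

The F expression makes the decrement happen in the database, which also avoids race conditions between concurrent signal deliveries.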
A signal is not about changes to your model class; it is about work to be done every time before/after you write to the database table associated with the model:
from django.db.models.signals import post_save
from django.dispatch import receiver
@receiver(post_save, sender=YourModel)
def your_action_call(sender, instance, **kwargs):
    # some logic
    pass