Django custom signals - python

So when creating a signal, we make a signals.py file that looks like this:
from django.dispatch import Signal
some_signal = Signal(providing_args=["arg_1","arg_2",...,"arg_n"])
#...
Now my question is: what if the arguments you are going to provide are not always going to be the same? How would you create a signal with this kind of flexibility?

If you read the documentation:
All signals are django.dispatch.Signal instances. The providing_args is a list of the names of arguments the signal will provide to listeners. This is purely documentational, however, as there is nothing that checks that the signal actually provides these arguments to its listeners.
and if you look at the signature for the send method:
Signal.send(sender, **kwargs)
you will notice that it takes a single positional argument (the sender) plus however many keyword arguments you want, so you can send as many things as you like to your signal's listeners.
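To make this concrete: `providing_args` was always purely documentational, and it was deprecated in Django 3.0 and removed in 4.0, so current code just writes `Signal()`. Below is a minimal pure-Python stand-in for `django.dispatch.Signal` (so the sketch runs without Django installed) showing that `send()` simply forwards whatever keyword arguments you pass; the real class behaves the same way in this respect.

```python
# Minimal stand-in for django.dispatch.Signal, for illustration only.
class Signal:
    def __init__(self):
        self._receivers = []

    def connect(self, receiver):
        self._receivers.append(receiver)

    def send(self, sender, **kwargs):
        # Every connected receiver gets the sender plus all keyword
        # arguments; like Django, return (receiver, response) pairs.
        return [(r, r(sender=sender, **kwargs)) for r in self._receivers]


some_signal = Signal()

def listener(sender, **kwargs):
    # Receivers conventionally accept **kwargs, so the sender is free
    # to add or drop arguments between emissions.
    return sorted(kwargs)

some_signal.connect(listener)

responses = some_signal.send(sender=None, arg_1=1, extra="x")
print([resp for _, resp in responses])  # [['arg_1', 'extra']]
```

The real `Signal.send` also returns a list of `(receiver, response)` pairs, which is why the example unpacks them before printing.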

Related

How can I maintain, at least, some order in which callbacks connect to a signal?

When I simply use 'connect', there is no way I can specify the order in which the callbacks will be called. Having connect_before and connect_after to connect before/after the default doesn't serve my purpose. I want something like connect_first and connect_last, or anything else that will help me specify the order in which callbacks connect to a signal.
Something like this?
something.connect_first('my-signal', callback1)
somethingelse.connect_last('my-signal', callback2)
There is only one explicit ordering guarantee for GObject signals:
- the class closure added when creating a new signal with G_SIGNAL_RUN_FIRST will be called before all callbacks added using g_signal_connect()
- the class closure added when creating a new signal with G_SIGNAL_RUN_LAST will be called after all callbacks added using g_signal_connect() and before all callbacks added using g_signal_connect_after()
This means that you can only control whether a callback is invoked before or after all other callbacks connected manually when you're creating a new signal - for obvious reasons, since you may want to provide an initial state at the start of an emission chain, or you want to ensure a stable state at the end of an emission chain.
As for the order of the callbacks added using g_signal_connect(), there is no explicit guarantee of any ordering. There's an implicit order, the order of connection, though, that won't likely ever be changed. Remember that signal handlers can install new handlers, or disconnect them, or block the signal emission, so relying on a specific order is usually an indication of a design problem in the code.
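The ordering rules above can be sketched in a few lines of Python. Everything here (the `ToySignal` class, `connect_after`, `emit`) is an illustrative toy, not any real API; the actual GObject interface is C (`g_signal_connect`, `g_signal_connect_after`, and so on).

```python
# Toy dispatcher sketching GObject's emission order: the class closure
# (RUN_FIRST or RUN_LAST) brackets the manually connected handlers,
# which themselves run in connection order.
class ToySignal:
    def __init__(self, class_closure=None, run_first=True):
        self.class_closure = class_closure  # set when the signal is created
        self.run_first = run_first          # G_SIGNAL_RUN_FIRST vs RUN_LAST
        self.handlers = []                  # g_signal_connect()
        self.after_handlers = []            # g_signal_connect_after()

    def connect(self, cb):
        self.handlers.append(cb)

    def connect_after(self, cb):
        self.after_handlers.append(cb)

    def emit(self):
        order = []
        if self.class_closure and self.run_first:
            order.append(self.class_closure)   # RUN_FIRST: before everything
        order.extend(self.handlers)            # connection order (implicit!)
        if self.class_closure and not self.run_first:
            order.append(self.class_closure)   # RUN_LAST: between the two groups
        order.extend(self.after_handlers)
        return [cb() for cb in order]


sig = ToySignal(class_closure=lambda: "class", run_first=False)
sig.connect(lambda: "h1")
sig.connect(lambda: "h2")
sig.connect_after(lambda: "after")
print(sig.emit())  # ['h1', 'h2', 'class', 'after']
```

Note that `handlers` runs in list (connection) order only as an implementation detail of the sketch, mirroring the point above: connection order is implicit, not a documented guarantee.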

django custom signals vs regular functions

I started to learn about Django's signals feature when I wanted to run a method only when a certain event occurs.
So I read about custom signals, and I couldn't find a reason to use signals over regular function calls.
I read in the Django guide about how to send a signal:
The signal
pizza_done = django.dispatch.Signal(providing_args=["toppings", "size"])
The call:
def send_pizza(self, toppings, size):
    pizza_done.send(sender=self.__class__, toppings=toppings, size=size)
and I can't understand how they are different. Couldn't I just call the
receiver function directly?
Can someone enlighten me about it?
You can think of signals as an implementation of the observer design pattern.
Its purpose is to remove the dependency of the observable on its observers, so the observable knows nothing about them.
Take saving a model (the post_save and pre_save signals): you may face cases where your YourModel.save method should not know that a task must be done before/after saving.
Django's default signals are a good example. But I don't think you will need to create a custom signal every day (or in every project!).
And I suggest that you shouldn't use signals (default or custom) until you really need them; make them the last choice. They can make your code hard to read, since a reader can't easily figure out where your logic lives.
Django before 1.7 didn't have a convention for where to place your signal receivers. In Django >= 1.7 you should use AppConfig.ready() to connect the receivers.
Django signals are an implementation of the publish-subscribe design pattern. They allow you to decouple the different parts of a project. For example, when you develop the "make pizza" app you don't have to know anything about the "delivery pizza" app.
Another great example of this pattern is Django's built-in pre_save/post_save signals. You can add any logic to these events at any time without modifying the Django core.
A widely used scenario is adding user profile info after creating a User instance: you just add a listener to the post_save event with sender=User.
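The decoupling argument can be shown in a few lines. The sketch below uses a hand-rolled publish-subscribe mechanism instead of `django.dispatch` so it runs standalone; all names (`pizza_done_send`, `make_pizza`, `schedule_delivery`) are made up for the example. The point is that the "pizza" side emits an event without importing or knowing anything about the "delivery" side.

```python
# Hand-rolled publish-subscribe, standing in for django.dispatch.Signal.
pizza_done_receivers = []

def pizza_done_connect(receiver):
    pizza_done_receivers.append(receiver)

def pizza_done_send(**kwargs):
    return [receiver(**kwargs) for receiver in pizza_done_receivers]

# --- "make pizza" app: knows nothing about delivery ---
def make_pizza(toppings, size):
    # ... bake the pizza, then announce it ...
    pizza_done_send(toppings=toppings, size=size)

# --- "delivery" app: registers its own receiver ---
deliveries = []

def schedule_delivery(toppings, size, **kwargs):
    deliveries.append((toppings, size))

pizza_done_connect(schedule_delivery)

make_pizza(["cheese"], "large")
print(deliveries)  # [(['cheese'], 'large')]
```

With a plain function call, `make_pizza` would have to import and call the delivery code directly; with the signal, the dependency points the other way.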

Returning (Passing Around) A Function Call in Python/Tornado?

So I'm creating the back end for a web-based game in python. Currently it works like this...
WebSocket Handler receives message...
WebSocket Handler calls message handler...
Message Handler calls Game class functions...
Game class calls other classes to update information.
This is highly coupled, and probably should be in model-view-controller format. So I'm finally going to change that.
Basically I want it to work like this.
Controller has open a WebSocket Handler.
WebSocket Handler returns to Controller a (1?).
Controller uses (1?) calls Message Handler.
Message Handler returns to Controller a (2?).
Controller uses (2?) to call Model.
Model sends updates to appropriate places.
So there's two problems here.
First of all, when I'm getting a message in the WebSocket Handler, it is an instance of Tornado's WebSocketHandler, and I'm not sure how I can return anything to the Controller. Is it simply the case that this is not possible? Do I have to keep a small amount of coupling between the WebSocket Handler and the Message Handler? I know I could always call a function in the Controller, but that doesn't seem like an actual fix, just more function calls.
Is there a way to pass a function call around in Python, keeping track of its parameters while doing so? That would be the optimal way to go about doing this, but I don't think it's implemented in the Python language. Otherwise, I feel like the best way to do it would be to return a dictionary with a field for the function name to be called, and fields for the parameters. This of course is a lot more code. If it can be avoided, I'd like to, but I'm not sure the direction in which to take this.
Thanks for the tips guys, this is a big refactoring and I'm really nervous about where to start.
For the second part of your question, I believe you want to use partial functions.
Check out: http://docs.python.org/2/library/functools.html
Basically you would go:
from functools import partial
function_call(partial(future_function_call, future_argument))
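The snippet above uses placeholder names. Here is a runnable version with hypothetical names (`update_model`, `deferred`) showing the idea: `functools.partial` binds some arguments now and produces a callable you can hand to the Controller and invoke later.

```python
from functools import partial

def update_model(game_id, payload):
    # Hypothetical model-update function for the example.
    return f"game {game_id}: applied {payload}"

# Bind the first argument now; the result is a plain callable that
# remembers game_id, so whoever receives it needs no knowledge of
# the original parameters.
deferred = partial(update_model, 42)

print(deferred({"hp": 10}))  # game 42: applied {'hp': 10}
```

This is exactly the "function call with its parameters kept track of" the question asks for: a `partial` can be stored, passed between layers, and called with any remaining arguments later.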

Django signals and User Auth

I'm currently following Hacked Existence's Django tutorial a bit, and I'm having trouble understanding the Django signals involved:
def create_User_callback(sender, instance, **kwargs):
    a, b = User.objects.get_or_create(user = instance)

post_save.connect(create_User_callback, User)
I'm not quite sure about the logic behind
post_save.connect(create_User_callback, User)
In order for a signal handler to work, you need to bind it to a signal. That is done using the signal's connect method. In your case, the signal is post_save.
connect is called with the callback function itself and the model for which it should be called (the sender). All models emit post_save, so when you pass User as the second argument to connect, it "filters" the signals so that only post_save signals emitted by the User model will trigger your callback.
Think of it like tuning a radio to listen on a frequency.
Having said all that, this actual code seems a bit pointless: you are creating an object (or fetching one if it already exists) of the same class that is emitting the signal, after any User object has been saved.

Django signal receiver accepts self argument

I am making a game. Every time a certain action occurs it sends a signal. The signal is used by many different objects. For example, player model instances will regenerate a little bit of health, lanterns will decrease the oil they have left. I want instances of these models to react and modify their data when the signal is emitted. However, I don't know how to refer to the instance itself in the receiver function, since I can't put "self" in the list of arguments the function uses.
class Lantern(models.Model):
    oil_left = models.IntegerField(default=4)

    @receiver(mySignal)
    def burn(sender, **kwargs):
        self.oil_left -= 1  # <- self is not defined obviously
        self.save()         # <- self is not defined obviously
Why a receiver as an instance method? Because the list of instances reacting to the signal is unknown when the signal is sent. The signal is merely a ping notifying interested objects (i.e. those with a receiver method) that an event occurred, so they can trigger their own specific behaviour.
Why do you want the receiver to be an instance method? There doesn't seem to be any reason for it. You just need to ensure that the code sending the signal passes the relevant instance - e.g. as an instance kwarg - exactly as the built-in pre_save and post_save signals do.
Added after the question was edited: But that is exactly how signals are supposed to work. There's a single signal, sent along with any associated information, and any number of receivers that listen for it. But the receivers themselves aren't associated with particular instances - they can't be; an instance only exists when you actually instantiate it(!), and otherwise it's just a row in a database.
Perhaps your receiver function could query for the relevant objects itself, and update them there - or even better, do an update query to change them in-place.
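That suggestion looks something like the sketch below. The in-memory "table" stands in for the ORM so the example runs standalone; in real Django you would do the same thing with a bulk query, e.g. `Lantern.objects.update(oil_left=F('oil_left') - 1)`. The receiver is a module-level function that looks up the affected rows itself instead of being an instance method.

```python
# In-memory stand-in for the Lantern database table.
lantern_table = [{"id": 1, "oil_left": 4}, {"id": 2, "oil_left": 2}]

def burn_receiver(sender, **kwargs):
    # Module-level receiver: no instance needed. It queries for the
    # relevant rows and updates them in place, mirroring a bulk
    # ORM update() call.
    for row in lantern_table:
        row["oil_left"] -= 1

# Simulate one emission of the signal.
burn_receiver(sender=None)
print([row["oil_left"] for row in lantern_table])  # [3, 1]
```

Every lantern "hears" the signal without any of them being instantiated beforehand, which is the point the answer is making.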
A signal is not about changes to your model class; it is about work to be done every time before/after you write to the database table associated with the model:
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=YourModel)
def your_action_call(sender, instance, **kwargs):
    # some logic
    pass
