I am making a game. Every time a certain action occurs, it sends a signal. The signal is used by many different objects: for example, player model instances regenerate a little bit of health, and lanterns decrease the oil they have left. I want instances of these models to react and modify their data when the signal is emitted. However, I don't know how to refer to the instance itself in the receiver function, since I can't put "self" in the list of arguments the function uses.
class Lantern(models.Model):
    oil_left = models.IntegerField(default=4)

    @receiver(mySignal)
    def burn(sender, **kwargs):
        self.oil_left -= 1  # <- self is not defined, obviously
        self.save()         # <- self is not defined, obviously
Why a receiver as an instance method? Because the list of instances reacting to the signal is unknown when the signal is sent. The signal is merely a ping notifying interested objects (i.e. those with a receiver method) that an event occurred, so they can trigger their own specific behaviour.
Why do you want the receiver to be an instance method? There doesn't seem to be any reason for it. You just need to ensure that when you write the signal-sending code itself, it passes the relevant instance - e.g. as an instance kwarg - exactly as the built-in pre_save and post_save signals do.
Added after edit to question: But that is exactly how signals are supposed to work. There's a single signal function, which sends the signal along with any associated information, and any number of receivers that listen for it. But the receivers themselves aren't associated with particular instances - they can't be: an instance only exists when you actually instantiate it, and otherwise it's just a row in a database.
Perhaps your receiver function could query for the relevant objects itself, and update them there - or even better, do an update query to change them in-place.
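A sketch of that approach, using a hand-rolled stand-in for Django's dispatcher so the pattern is visible on its own. The Signal class, LANTERNS registry and burn receiver here are illustrative, not Django API; in real Django the loop would be a single in-place bulk update such as Lantern.objects.update(oil_left=F('oil_left') - 1):

```python
# Toy stand-in for Django's signal machinery, only to illustrate the
# pattern: ONE module-level receiver updates every Lantern "row", so
# no instance ever needs to register itself with the signal.

class Signal:
    def __init__(self):
        self._receivers = []

    def connect(self, func):
        self._receivers.append(func)

    def send(self, sender, **kwargs):
        for func in self._receivers:
            func(sender, **kwargs)

my_signal = Signal()

LANTERNS = []  # toy "table"; in real Django this would be Lantern.objects

class Lantern:
    def __init__(self, oil_left=4):
        self.oil_left = oil_left
        LANTERNS.append(self)

def burn(sender, **kwargs):
    # In Django, prefer one bulk query instead of a Python loop:
    #   Lantern.objects.update(oil_left=F('oil_left') - 1)
    for lantern in LANTERNS:
        lantern.oil_left -= 1

my_signal.connect(burn)
```

The point is that the receiver is a plain module-level function that looks the affected objects up itself, rather than each instance trying to be its own receiver.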
A signal is not about changes to your model class; it is about work that should be done every time before/after you write to the database table associated with the model:
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=YourModel)
def your_action_call(sender, instance, **kwargs):
    # some logic
    pass
TL;DR:
I need a way to trigger a custom signal after the post_save signal, automatically, is there any way of doing it?
I'm currently developing a library for Django that involves a lot of back and forth with the post_save signal, and I was wondering if it's possible to trigger another signal after post_save, so I could implement my own and not interfere with post_save in case a project that uses the library needs it.
So far I know that signals expect a class as the sender argument, and if I manually trigger the signal from inside post_save I would gain nothing (I'd still be interfering with it). Is there any workaround for this? Am I missing something in the docs?
Although this may be possible by calling a signal manually from inside another signal like this:
post_save.send(MyModel, instance=a_mymodel_instance)
There are easier ways to do something like that:
Let us assume that you follow the file organization that I use in this answer: Django Create and Save Many instances of model when another object are created
Suggestion 1:
Let us assume that your first post_save does something to MyModel1 and another post_save signal does something to MyModel2 after some processing on the instance that triggered the first signal.
post_save is always sent at the end of a save() method.
Organize your signals as follows:
@receiver(post_save, sender=MyModel1)
def mymodel1_signal(sender, instance, **kwargs):
    # do stuff on the MyModel1 instance...
    # ...also do stuff on a MyModel2 instance, then call its save()
    pass
@receiver(post_save, sender=MyModel2)
def mymodel2_signal(sender, instance, **kwargs):
    # do stuff on the MyModel2 instance...
    pass
This way the mymodel2_signal signal will get triggered after the call to MyModel2.save() from mymodel1_signal.
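The chaining can be sketched with a toy dispatcher that mimics post_save's sender filtering. None of the names below are Django's; the sketch only shows the ordering - saving a MyModel2 inside the first handler fires the second handler:

```python
# Minimal sender-filtered dispatcher, standing in for Django's post_save.
post_save_receivers = []  # list of (sender_class, callback) pairs

def connect(sender, func):
    post_save_receivers.append((sender, func))

def send_post_save(sender, instance):
    # Only callbacks registered for this exact sender class fire.
    for cls, func in post_save_receivers:
        if cls is sender:
            func(sender=sender, instance=instance)

calls = []  # records the order in which handlers ran

class MyModel1:
    def save(self):
        send_post_save(MyModel1, self)

class MyModel2:
    def save(self):
        send_post_save(MyModel2, self)

def mymodel1_signal(sender, instance, **kwargs):
    calls.append("model1 handler")
    MyModel2().save()  # this save triggers the second signal

def mymodel2_signal(sender, instance, **kwargs):
    calls.append("model2 handler")

connect(MyModel1, mymodel1_signal)
connect(MyModel2, mymodel2_signal)
```

Calling MyModel1().save() runs mymodel1_signal first and mymodel2_signal second, which is exactly the "signal after the signal" effect the question asks for.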
Suggestion 2:
Use a mix of signals to achieve the correct "timing".
Let us assume that you want to start a process on MyModel2 before a MyModel1 gets saved.
Use a pre_save and a post_save signal:
@receiver(pre_save, sender=MyModel1)
def mymodel1_signal(sender, instance, **kwargs):
    # do stuff on the MyModel1 instance...
    # ...also do stuff on a MyModel2 instance, then call its save()
    pass
@receiver(post_save, sender=MyModel2)
def mymodel2_signal(sender, instance, **kwargs):
    # do stuff on the MyModel2 instance...
    pass
Suggestion 3:
Use a MyModel2 method directly inside a MyModel1 post_save signal.
So when creating a signal, we make a signals.py file that looks like this:
from django.dispatch import Signal
some_signal = Signal(providing_args=["arg_1","arg_2",...,"arg_n"])
#...
Now my question is: what if the arguments you are going to provide are not always going to be the same? How would you create a signal with this kind of flexibility?
If you read the documentation:
All signals are django.dispatch.Signal instances. The providing_args is a list of the names of arguments the signal will provide to listeners. This is purely documentational, however, as there is nothing that checks that the signal actually provides these arguments to its listeners.
and if you look at the signature for the send method:
Signal.send(sender, **kwargs)
you will notice that it takes a single positional argument (the sender) and however many keyword arguments you want, so you can send as many things as you like to your signal's listeners.
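A minimal stand-in for django.dispatch.Signal makes this concrete. The class below is a sketch, not Django's implementation, but it shows why providing_args enforces nothing and why send can forward any set of keyword arguments:

```python
# Hand-rolled sketch of a Signal class; providing_args is stored but
# never checked, just like the Django docs say.

class Signal:
    def __init__(self, providing_args=None):
        self.providing_args = providing_args or []  # documentation only
        self._receivers = []

    def connect(self, func):
        self._receivers.append(func)

    def send(self, sender, **named):
        # Forward whatever kwargs the caller supplied, returning
        # (receiver, response) pairs like Django's Signal.send does.
        return [(func, func(sender=sender, **named)) for func in self._receivers]

some_signal = Signal(providing_args=["arg_1", "arg_2"])

def listener(sender, **kwargs):
    # A receiver that accepts **kwargs copes with any argument set.
    return sorted(kwargs)

some_signal.connect(listener)
```

Because the receiver takes **kwargs, the same signal can be sent with arg_1 one time and a completely different set of keywords the next, and nothing breaks.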
I'm developing a python program to monitor and control a game-server. The game-server has many game-cores, and those cores handle the clients.
I have a Python class called Server that holds instances of the class Core, and those instances are used to manage the actual game-cores. The Core class needs to connect to the game-core via a TCP socket in order to send commands to that specific game-core. To close those sockets properly, the Core class has a __del__ method which closes the socket.
An example:
class Server(object):
    Cores = []  # list which will be filled with the Core objects

    def __init__(self):
        # detect the game-cores, create the Core objects and append them to self.Cores
        ...

class Core(object):
    CoreSocket = None  # when the socket gets created, the socket object will be bound to this variable

    def __init__(self, coreID):
        # initiate the socket connection between the running game-core and this Python object
        ...

    def __del__(self):
        # properly close the socket connection
        ...
Now, when I use the Core class itself, the destructor always gets called properly. But when I use the Server class, the Core objects inside Server.Cores never get destructed. I have read that the gc has a problem with circular references and classes with destructors, but the Core objects never reference the Server object (only the socket-object, in Core.CoreSocket), so no circular references are created.
I usually prefer using the with-statement for resource cleanup, but in this case I need to send commands from many different methods in the Server class, so using with won't help. I also tried creating and closing the socket on each command, but that really kills performance when I need to send many commands. Weak references created with the weakref module won't help either, because the destructors then get called immediately after I create the Server object.
Why don't the Core objects get destructed properly when the Server object gets cleaned up by the gc? I guess I'm just forgetting something simple, but I just can't find out what it is.
Or maybe there is a better approach for closing those sockets when the object gets cleaned up?
You've mixed up class and instance members. Unlike in some other languages, defining a variable at class scope creates a class variable, not an instance variable. When a Server instance dies, the Server class is still around and still holds references to the cores. Define self.cores in the __init__ method instead:
class Server(object):
    def __init__(self):
        self.cores = []
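A quick self-contained demonstration of the difference (the class and method names here are illustrative, not from the question's code):

```python
# A class-level list is one object shared by ALL instances, and it keeps
# its contents alive as long as the class exists. A list created in
# __init__ belongs to one instance and dies with it.

class Shared:
    cores = []  # class attribute: a single list for every instance

    def add(self, core):
        self.cores.append(core)  # resolves to the shared class list

class PerInstance:
    def __init__(self):
        self.cores = []  # instance attribute: a fresh list per object

    def add(self, core):
        self.cores.append(core)
```

With the class attribute, every Server instance that ever existed appends into the same list, so the Core objects stay referenced by the class and __del__ never runs; with the instance attribute, the list (and its Core objects) become unreferenced when the instance does.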
I'm currently following Hacked Existence's Django tutorial a bit, and I'm having trouble understanding the Django signals involved:
def create_User_callback(sender, instance, **kwargs):
    a, b = User.objects.get_or_create(user=instance)

post_save.connect(create_User_callback, User)
I'm not quite sure about the logic behind
post_save.connect(create_User_callback, User)
In order for a signal handler to work, you need to bind it to a signal. That is done using the signal's connect method. In your case, the signal is post_save.
connect is called with the callback function and the model for which it should fire. All models emit post_save, so when you pass User as the second argument to connect, it "filters" the signals so that only the post_save signal emitted by the User model will trigger your function.
Think of it like tuning a radio to listen on a frequency.
Having said all that, this actual code seems a bit pointless: after any User object has been saved, it creates (or fetches, if one exists) an object of the very same class that emitted the signal.
I'm wanting a paradigm in a Qt4 (PyQt4) program where a component is able to respond to a signal without knowing anything about where it is coming from.
My initial reading suggests that I have to explicitly connect signals to slots. But what I want is for any of a number of components to be able to send a signal, and for it to be processed by another component.
Comparing with other toolkits: in wxWidgets, for example, I would use events. These automatically propagate up from child windows/objects to parents, and at each level they can be handled. This means that if I have a lot of children which may emit the same event, I don't have to explicitly connect all of them to the handler; I can just put the handler in the parent, or at some higher level in the window hierarchy. Only the event generator and consumer need to know about the event at all - the consumer doesn't need to know where the source of the event is, how many such sources there are, or anything else about it.
Is this possible in Qt - is there another approach? Maybe there is an alternative event mechanism to signals and slots?
This isn't easily possible - something has to know about both the signalling object and the receiving object in order to connect the two. Depending on what you need, however, you might be able to set up a mediator class: objects with signals tell it they exist and have such-and-such a signal, objects with slots tell it they exist and have such-and-such a slot to connect to a given signal, and the mediator tracks both, making connections when necessary.
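One possible shape for such a mediator, sketched in plain Python rather than real PyQt4 code (the SignalHub class and its method names are invented for illustration; a Qt version would connect actual QObject signals and slots instead of storing callables):

```python
# A central hub that decouples emitters from receivers: both sides know
# only the hub and a signal name, never each other.

class SignalHub:
    def __init__(self):
        self._slots = {}  # signal name -> list of callables

    def register_slot(self, name, slot):
        self._slots.setdefault(name, []).append(slot)

    def emit(self, name, *args, **kwargs):
        # Dispatch to every slot registered under this name, in
        # registration order.
        for slot in self._slots.get(name, []):
            slot(*args, **kwargs)

hub = SignalHub()
received = []
hub.register_slot("lantern_burned", lambda amount: received.append(amount))
hub.emit("lantern_burned", 1)  # the emitter needs only the hub and the name
```

This gives roughly the wxWidgets-style decoupling the question asks for: any component holding the hub can emit "lantern_burned" without knowing who, if anyone, is listening.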
Don't you just want a good old fashioned method invocation? The response is just the return value of the method.
Signal handlers do NOT know the emitter (only the signal type), and emitters do NOT know which handlers are connected. Many handlers can connect to the same signal, and they are executed in the order of connection. A signal can be emitted from many places.