django custom signals vs regular functions - python

I started learning about Django's signals feature when I wanted to run a method only when a certain event has occurred.
So I read about custom signals, and I couldn't find a reason to use signals over regular function calls.
I read in the Django documentation how to define and send a signal:
The signal
pizza_done = django.dispatch.Signal(providing_args=["toppings", "size"])
The call:
def send_pizza(self, toppings, size):
    pizza_done.send(sender=self.__class__, toppings=toppings, size=size)
and I can't understand how this is different. Why not just call the receiver function directly?
Can someone enlighten me about it?

You can think of signals as an implementation of the observer design pattern.
Their purpose is to remove the dependency of the observable on its observers, so the observable knows nothing about who is listening.
Saving a model (the post_save and pre_save signals) is a good example: you may face cases where the YourModel.save method should not know that some task has to be done before or after saving.
Django's built-in signals are a good example of this, but I don't think you will need to create a custom signal every day (or even in every project!).
I also suggest that you avoid signals (built-in and custom) until you really need them; make them the last choice, because they can make your code hard to read: a reader can't easily figure out where your logic lives.
Before 1.7, Django had no convention for where to place your signal receivers. In Django >= 1.7 you should use AppConfig.ready to connect the receivers.
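As a minimal sketch of that convention, assuming an app named pizza whose receivers live in pizza/receivers.py (both names are just placeholders), connecting them in AppConfig.ready might look like this:
from django.apps import AppConfig

class PizzaConfig(AppConfig):
    name = "pizza"

    def ready(self):
        # Importing the module is enough: the @receiver decorators inside it
        # connect the handlers to their signals once the app registry is ready.
        from . import receivers  # noqa: F401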

Django signals are an implementation of the publish-subscribe design pattern. They allow you to decouple the different parts of a project. For example, when you develop the "make pizza" app you don't have to know anything about the "pizza delivery" app.
Another great example of this pattern is Django's built-in pre_save/post_save signals: you can attach any logic to these events at any time without modifying the Django core.
A widely used scenario is creating a user profile right after a User instance is created. You just add a listener to the post_save signal with sender=User, as in the sketch below.
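A minimal sketch of that pattern, assuming a hypothetical Profile model with a OneToOneField to User (the model name is only an illustration):
from django.contrib.auth.models import User
from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import Profile  # assumed: a profile model with a OneToOneField to User

@receiver(post_save, sender=User)
def create_user_profile(sender, instance, created, **kwargs):
    # Only create the profile on the initial save, not on every later update.
    if created:
        Profile.objects.create(user=instance)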

Related

Django save model every day

I have a model and a signal in models.py, and the model sends a message to a Discord webhook with how many days are left until something. I want to refresh it automatically every day at 12:00 AM, without using django-celery because it doesn't work for me. My plan is to do something like this:
time_set = 12
if time_set == timezone.now().hour:
    ...save model instances...
but I have totally no idea how to do it.
I want to do it this way because the signal runs whenever a model instance is saved.
Django doesn't handle this scenario out of the box, hence the need for Celery and its ilk. The simplest way is to set a scheduled task on the operating system that calls a custom Django management command (which is essentially a Python script that can reference your Django models, methods, etc., and is invoked with python manage.py myNewCommand).
You can find more about custom commands at https://docs.djangoproject.com/en/4.0/howto/custom-management-commands/
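A minimal sketch of such a command, assuming the countdown model is called Countdown in an app named myapp (both names are hypothetical), could simply re-save the instances so the existing signal fires:
# myapp/management/commands/refresh_countdowns.py
from django.core.management.base import BaseCommand

from myapp.models import Countdown  # hypothetical app and model names

class Command(BaseCommand):
    help = "Re-save countdown instances so the post_save signal posts to Discord"

    def handle(self, *args, **options):
        for obj in Countdown.objects.all():
            obj.save()  # saving triggers the signal that sends the webhook message
        self.stdout.write(self.style.SUCCESS("Countdowns refreshed"))
It would then be run with python manage.py refresh_countdowns.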
You can create a custom management command and call it by using a cron entry, set to run every day.
Check Django official documentation for the instructions on creating the custom command.
Instead of calling the save() method each time, I'd create a send_discord_message() method on the model and call it wherever required. If you need to execute it every time an instance is saved, it is preferable to override the save() method on the model. Signals are a great way to plug different apps together and extend them, but they have some caveats, and overriding save() is simpler.
I'm supposing you are using a Unix-like system. You can check how to configure and create cron jobs.
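For example, a crontab entry along these lines (the paths and the refresh_countdowns command name from the sketch above are assumptions) would run the management command every day at 12:00 AM:
0 0 * * * /path/to/venv/bin/python /path/to/project/manage.py refresh_countdowns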

How to implement event handling in Python Flask

Is there a canonical way in Python, and better yet in Flask, to implement an embedded event handling system?
By event handling, we mean something similar to Symfony's EventDispatcher, which is a simple, embedded system to decouple event dispatch and handling.
The goal is, in a Flask SQLAlchemy API App, to react to creation (or modification) of data objects with creation of additional, derived data objects to maintain a certain sense of consistency. There should be various handlers, each making sure that a certain type of consistency is maintained. These should be separated. It is possible that a handler triggers creation of data, which in turn prompts another handler to create data. This chain should be executed within the context of each request.
These are the options I came across, plus the reasons I ruled them out (a rough sketch of the kind of embedded dispatcher we have in mind follows the list):
Flask signals - their documentation discourages using them to manipulate data
Pyee - this seems closest to what we need, but I'm not sure how well it integrates with Flask
Celery - seems more suited to long-running async tasks
Whistle - one of many shots at the topic that don't seem exceptionally well maintained
RabbitMQ, IronMQ or similar - we would like to handle everything within the Flask app
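Purely to illustrate the kind of embedded dispatcher described above, here is a minimal plain-Python sketch; it is not tied to any of the libraries listed, and all names are illustrative:
from collections import defaultdict

class EventDispatcher:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_name, handler):
        self._handlers[event_name].append(handler)

    def dispatch(self, event_name, payload):
        # Handlers run synchronously, within the current request context.
        # A handler may call dispatch() again, so chains of derived data
        # are created inside the same request.
        for handler in self._handlers[event_name]:
            handler(payload)

dispatcher = EventDispatcher()

def create_derived_objects(obj):
    # ...create the derived data objects, then announce them in turn...
    dispatcher.dispatch("derived_object.created", obj)

dispatcher.subscribe("data_object.created", create_derived_objects)
dispatcher.dispatch("data_object.created", {"id": 1})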

Viewflow Signal for New tasks?

I am looking to announce in my slack channels whenever a new task becomes available.
Looking at the source, it seems like there is only a signal for when a task is started.
How can I create a signal when a task becomes available?
Generally, using signals to interact within your own application is a bad design decision.
You can implement the same functionality more explicitly by implementing a custom node that performs a callback on create:
class MyFlow(Flow):
    ...
    approve = (
        MyView(flow_views.UpdateProcessView, fields=['approved'])
        .onCreate(this.send_notification)
        .Next(this.check_approve)
    )
    ...
...
You can handle the create action by overriding the activate method of the node's activation class.
The viewflow custom node sample could be a helpful reference for a custom node implementation: https://github.com/viewflow/viewflow/blob/master/demo/customnode/nodes.py

why there is no response_finished signal in Django?

I am trying to implement some kind of background task queue in Django, because Celery is too huge and complex for my needs. Then it occurred to me that there is already a signal called request_finished:
https://docs.djangoproject.com/en/dev/ref/signals/#django.core.signals.request_finished
But why does Django not have a signal called response_finished?
Django may be synchronous, but I could still do some post-response data processing and saving tasks; it would only take a few more steps.
Is it possible to hack a way to do some post-response work in Django?
TIA
You can write your own middleware (specifically using process_response) if you need to perform tasks after the response has been assembled. There would be no point in having a signal handler after the response is 'finished' as by that stage, you have executed your view and rendered your template.
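A minimal sketch of that approach (the class name and the do_post_response_work helper are placeholders):
from django.utils.deprecation import MiddlewareMixin

def do_post_response_work(request, response):
    # placeholder for whatever post-response processing you need
    pass

class PostResponseTaskMiddleware(MiddlewareMixin):
    def process_response(self, request, response):
        # Runs after the view has produced the response, but before it is
        # handed back to the WSGI server, so keep this work short.
        do_post_response_work(request, response)
        return response
The class would still need to be added to MIDDLEWARE in settings.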
Since no one else has answered this, here are my own conclusions:
https://groups.google.com/d/topic/python-web-sig/OEahWtsPwq4/discussion
It's basically a WSGI design decision: WSGI does not care what happens after the response iterator stops.

EventListener mechanism in twisted

I just wanted to hear ideas on the correct usage of an EventListener/EventSubscription provider in Twisted. Most of the examples and the Twisted source handle events through specific methods with fairly tight coupling: the dispatching target methods of those events are "hardcoded" in a specific Protocol class, and it is then the duty of the inheriting class to override them in order to receive the "event". This is very nice and transparent to use as long as we know all of the potential subscribers when creating the Protocol.
However, in larger projects there is a need (perhaps I am in the wrong mindset) for more dynamic event subscription and subscription removal: think of hundreds of objects with a lifespan of a minute, all interested in the same event.
What would be the correct way to achieve this according to the "way of Twisted"? I have currently created an event subscription/dispatching mechanism, but there is a lingering thought that the lack of this pattern in the Twisted library might suggest there is a better way.
Twisted has a class, twisted.words.xish.utility.EventDispatcher; run pydoc on it to learn the usage, it is simple. However, I think what makes Twisted strong is its Deferred. You can look at a Deferred object as a closure over related events (success or failure), where callbacks and errbacks are the registered observer functions. Deferreds have advanced features, such as being chainable and nestable.
So in my opinion, you can use the default EventDispatcher in Twisted or invent something simple of your own. But if you introduce some complicated mechanism into Twisted, it is doomed to lead to confusion and mess.
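As a minimal sketch of the Deferred-as-observer idea described above (the function names and the string result are purely illustrative):
from twisted.internet import defer

def on_success(result):
    # registered "observer" for the success branch of the event
    print("event fired with:", result)
    return result

def on_failure(failure):
    # registered "observer" for the failure branch of the event
    print("event failed:", failure)
    return failure

d = defer.Deferred()
d.addCallbacks(on_success, on_failure)  # subscribe the observers
d.callback("pizza ready")               # fire the (one-shot) event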
