Viewflow Signal for New tasks? - python

I am looking to announce in my slack channels whenever a new task becomes available.
Looking at the source, it seems there is only a signal for when a task is started.
How can I create a signal when a task becomes available?

Generally, using signals to interact within your own application is a bad design decision.
You can implement the same functionality more explicitly with a custom node that performs a callback on create:
class MyFlow(Flow):
    ...
    approve = (
        MyView(flow_views.UpdateProcessView, fields=['approved'])
        .onCreate(this.send_notification)
        .Next(this.check_approve)
    )
    ...
...
You can handle the create action by overriding the activate method of the node's activation class.
The viewflow custom node sample can be a helpful reference for a custom node implementation: https://github.com/viewflow/viewflow/blob/master/demo/customnode/nodes.py
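Stripped of viewflow specifics, the idea is a chainable node that records an on-create callback and invokes it from its activation step. A minimal self-contained sketch (class and method names mirror the snippet above but are illustrative only, NOT viewflow's real API):

```python
# Toy model of a node that runs a callback when its task is created.
notifications = []

def send_notification(node):
    # In the real flow this is where you would post to Slack.
    notifications.append("task available")

class Node:
    def __init__(self):
        self._on_create = None
        self._next = None

    def onCreate(self, callback):
        self._on_create = callback
        return self  # chainable, like .onCreate(...).Next(...)

    def Next(self, node):
        self._next = node
        return self

    def activate(self):
        # A custom activation's activate() is the hook point: do the
        # normal activation work, then fire the on-create callback.
        if self._on_create is not None:
            self._on_create(self)
        return self._next

approve = Node().onCreate(send_notification).Next(None)
approve.activate()
```

With viewflow itself, the equivalent hook lives in the activation class, as the linked demo shows.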

Related

Celery: custom Task/Request attribute shared with Queue

There is a tracker class that simply counts successful, failed, pending, and started tasks via Redis.
The goal is to extend Celery so its workers can access the group_id and keep statistics for the group. I expect an interface similar to:
def on_important_event(...):
    group_id = uuid4()
    for _ in range(count_of_jobs):
        my_task.apply_async(..., group_id=group_id)
A custom Task class would look like:
class MyTask(Task):
    # declaring group_id somehow

    def apply_async(...):
        get_tracker(self.request.group_id).task_pending()
        ...

    def before_start(...):
        get_tracker(self.request.group_id).task_started()
        ...

    def on_success(...):
        get_tracker(self.request.group_id).task_success()
        ...

    def on_failure(...):
        get_tracker(self.request.group_id).task_failed()
        ...
I could not find a way to implement the class so that it properly saves and receives the custom attribute through AMQP.
UPDATE, to be clear:
The problem is to mark some task calls as participants of a group, so I can track the group rather than the Task in general or a single call.
As it seems to me, there must be a way to add an attribute to a Task that is saved into the queue and then received by a Celery worker, so I can access it at the Task class layer.
I would recommend a different approach: write a custom monitor (see the Monitoring API document in the official Celery docs). A good starting point is the Real-time processing section.
This is basically how Flower and Leek work.
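A sketch of that monitor, following the shape of the Real-time processing example in the Celery docs. The counter keys and the commented-out wiring (broker URL, event-to-counter mapping) are assumptions about your setup, and note that a custom group_id is not part of the standard task events, so it would still have to travel with the task itself (e.g. in its arguments):

```python
from collections import Counter

stats = Counter()  # stand-in for the Redis-backed tracker

def make_handler(key):
    """Build an event handler that tallies events under ``key``."""
    def handler(event):
        # ``event`` is a dict with fields like 'uuid' and 'timestamp'.
        stats[key] += 1
    return handler

# Wiring (needs a running broker, so shown commented out):
# from celery import Celery
# app = Celery(broker='amqp://guest@localhost//')
# with app.connection() as connection:
#     recv = app.events.Receiver(connection, handlers={
#         'task-sent': make_handler('pending'),
#         'task-started': make_handler('started'),
#         'task-succeeded': make_handler('success'),
#         'task-failed': make_handler('failed'),
#     })
#     recv.capture(limit=None, timeout=None, wakeup=True)

make_handler('success')({'uuid': 'abc'})  # simulate one event
```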

Django save model everyday

I have a model and a signal in models.py, and this model sends a message to a Discord webhook saying how many days are left until something. I want to refresh it every day at 12:00 AM automatically, without using django-celery, because it doesn't work for me. My plan is to do something like this:
time_set = 12
if time_set == timezone.now().hour:
    ...save model instances...
but I have totally no idea how to do it.
I want to do it this way because the signal runs when model instances are saved.
Django doesn't handle this scenario out of the box, hence the need for Celery and its ilk. The simplest way is to set up a scheduled task on the operating system that calls a custom Django management command (essentially a Python script that can reference your Django models, methods, etc., invoked via python manage.py myNewCommand).
You can find more about custom commands at https://docs.djangoproject.com/en/4.0/howto/custom-management-commands/
You can create a custom management command and call it from a cron entry set to run every day.
Check the official Django documentation for instructions on creating the custom command.
Instead of calling the save() method each time, I'd create a send_discord_message() method on the model and call it wherever required. If you need to execute it every time an instance is saved, it is preferable to override the save() method on the model. Signals are a great way to plug and extend different apps together, but they have some caveats, and it is simpler to override save().
I'm supposing you are using a Unix-like system. You can check how to configure and create cron jobs.

How can I maintain, at least, some order in which callbacks connect to a signal?

When I simply use connect, there is no way to specify the order in which the callbacks will be called. Having connect_before and connect_after to connect before/after the default doesn't serve my purpose. I want something like connect_first and connect_last, or anything else that will let me specify the order in which callbacks connect to a signal.
Something like this?
something.connect_first('my-signal', callback1)
somethingelse.connect_last('my-signal', callback2)
There is only one explicit ordering guarantee for GObject signals:
the class closure added when creating a new signal with G_SIGNAL_RUN_FIRST will be called before all callbacks added using g_signal_connect();
the class closure added when creating a new signal with G_SIGNAL_RUN_LAST will be called after all callbacks added using g_signal_connect() and before all callbacks added using g_signal_connect_after().
This means that you can only control whether a callback is invoked before or after all other callbacks connected manually when you're creating a new signal - for obvious reasons, since you may want to provide an initial state at the start of an emission chain, or you want to ensure a stable state at the end of an emission chain.
As for the order of the callbacks added using g_signal_connect(), there is no explicit guarantee of any ordering. There is an implicit order, though (the order of connection), which is unlikely ever to change. Remember that signal handlers can install new handlers, disconnect them, or block the signal emission, so relying on a specific order is usually an indication of a design problem in the code.
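The guarantees above can be modeled in a few lines of plain Python (the names are mine, not GObject's), which makes the ordering easy to see:

```python
# Minimal model of GObject signal emission ordering: a RUN_FIRST class
# closure runs before connect() handlers; a RUN_LAST class closure runs
# after connect() handlers but before connect_after() handlers.
RUN_FIRST, RUN_LAST = "run-first", "run-last"

class Signal:
    def __init__(self, class_closure=None, flag=RUN_FIRST):
        self.class_closure = class_closure
        self.flag = flag
        self.handlers = []        # g_signal_connect(): connection order
        self.after_handlers = []  # g_signal_connect_after(): run last

    def connect(self, cb):
        self.handlers.append(cb)

    def connect_after(self, cb):
        self.after_handlers.append(cb)

    def emit(self):
        if self.class_closure and self.flag == RUN_FIRST:
            self.class_closure()
        for cb in self.handlers:
            cb()
        if self.class_closure and self.flag == RUN_LAST:
            self.class_closure()
        for cb in self.after_handlers:
            cb()

order = []
sig = Signal(class_closure=lambda: order.append("class"), flag=RUN_FIRST)
sig.connect(lambda: order.append("first connected"))
sig.connect(lambda: order.append("second connected"))
sig.connect_after(lambda: order.append("after"))
sig.emit()
# order is ["class", "first connected", "second connected", "after"]
```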

django custom signals vs regular functions

I started learning about Django's signal feature when I wanted to run a method only when a certain event occurs.
So I read about custom signals, and I couldn't find a reason to use signals over regular function calls.
I read in the Django guide how to call a signal:
The signal
pizza_done = django.dispatch.Signal(providing_args=["toppings", "size"])
The call:
def send_pizza(self, toppings, size):
    pizza_done.send(sender=self.__class__, toppings=toppings, size=size)
and I can't understand how they are different, since I could just call the receiver function directly.
Can someone enlighten me about it?
You can think of signals as an implementation of the observer design pattern.
Its purpose is to remove the dependency of the observable on the observer, so the observable knows nothing about the observer.
Saving a model (the post_save and pre_save signals) is an example: you could face cases where your YourModel.save method should not know that a task must be done after/before saving.
Django's default signals are a good example. But I don't think you will need to create a custom signal every day (nor in every project!).
I also suggest that you shouldn't use signals (default or custom) until you really need them; make them the last choice, as they can make your code hard to read, since the reader can't easily figure out where your logic lives.
Django before 1.7 doesn't have a convention for where to place your signal receivers. In Django >= 1.7 you should use AppConfig.ready to connect the receivers.
Django signals are an implementation of the publish-subscribe design pattern. They allow you to decouple the different parts of a project. For example, when you develop the "make pizza" app, you don't have to know anything about the "deliver pizza" app.
Another great example of this pattern is Django's built-in pre_save/post_save signals. You can add any logic to these events at any time without modifying Django's core.
A widely used scenario is adding user profile info after creating a User instance: you just add a listener to the post_save(sender=User) event.

Django signals and User Auth

I'm currently following Hacked Existence's Django tutorial a bit, and I'm having trouble understanding the Django signals involved:
def create_User_callback(sender, instance, **kwargs):
    a, b = User.objects.get_or_create(user=instance)

post_save.connect(create_User_callback, User)
I'm not quite sure about the logic behind
post_save.connect(create_User_callback, User)
In order for a signal handler to work, you need to bind it to a signal. That is done using the signal's connect method. In your case, the signal is post_save.
connect is called with the handler function and the model for which it should be called. All models emit post_save, so when you add User as the second argument to connect, it "filters" the signals so that only the post_save signal emitted by the User model triggers your function.
Think of it like tuning a radio to listen on a frequency.
Having said all that, this particular code seems a bit pointless: after any User object has been saved, you create (or fetch, if it exists) an object of the same class that emitted the signal.
