Django Celery: Clocked task is not running - python

In a Django app, I have a form that schedules an email to be sent out. It has four fields: name, email, body, send_date. I want to dynamically create a Celery task (email) to run another Celery task at the designated time.
I have been able to send out the email at regular intervals (every 30 seconds) based on the form using the following code:
schedule, _ = IntervalSchedule.objects.update_or_create(every=30, period=IntervalSchedule.SECONDS)

@shared_task(name="schedule_interval_email")
def schedule_email_interval(name, email, body, send_date):
    PeriodicTask.objects.update_or_create(
        defaults={
            "interval": schedule,
            "task": "email_task"
        },
        name="Send message at interval",
        args=json.dumps(['name', 'test@test.com', 'body']),
    )
However, when I try to schedule a task to run at a specific time (3 minutes after the current time) via ClockedSchedule, Celery beat records the task and saves all the relevant settings, and the task appears active in the Django admin area, but the email never actually gets sent.
clocked = ClockedSchedule.objects.create(clocked_time=datetime.now() + timedelta(minutes=3))

@shared_task(name="schedule_clock_email")
def schedule_email_clocked(name, email, body, send_date):
    PeriodicTask.objects.create(
        clocked=clocked,
        name="Send message at specific date/time",
        task="email_task",
        one_off=True,
        args=json.dumps(['name', 'test@test.com', 'body']),
    )
I eventually want to dynamically set the clocked field based on the datetime the user enters into the form, so the current code is just trying to test out the way Celery works. I think I'm missing something about how this works, though. Any thoughts would be greatly appreciated.

My guess is that it's a time zone issue: django-celery-beat stores times in UTC by default, so if your configuration does not account for that, a clocked time built from a naive local datetime.now() will not fire when you expect. You can verify this by executing the following code in Django's shell environment:
import datetime
import json

from django_celery_beat.models import ClockedSchedule, PeriodicTask
from ProjectName.celerytest import test

execution_time = datetime.datetime.utcnow() + datetime.timedelta(minutes=2)
schedule, created = ClockedSchedule.objects.get_or_create(clocked_time=execution_time)
PeriodicTask.objects.create(
    clocked=schedule,
    one_off=True,
    name="test input",
    task="projectName.celerytest.test",
    args=json.dumps(['112233']),
)
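To avoid the UTC mismatch entirely, build the clocked time as a timezone-aware datetime. A minimal sketch (the helper name is mine, not from the question):

```python
from datetime import datetime, timedelta, timezone

def clocked_time_in(minutes):
    """Return an aware UTC datetime `minutes` from now."""
    return datetime.now(timezone.utc) + timedelta(minutes=minutes)

# With django-celery-beat installed, the question's code would then be:
# clocked = ClockedSchedule.objects.create(clocked_time=clocked_time_in(3))
# PeriodicTask.objects.create(clocked=clocked, one_off=True,
#                             name="Send message at specific date/time",
#                             task="email_task")
```

An aware datetime serializes with its offset, so beat compares it against the correct moment regardless of the server's local zone.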

Related

How to activate a function for sending reminder emails from data collected from a form?

I am still new to Python. I am working on code that collects data from a form, including a datetime value, stores it in a database, and then returns a success page.
My problem, and this is where I'm stuck, is how to send reminder emails to specified addresses based on the datetime value entered in the form, whenever that datetime is in the future. Reminders should be sent automatically at two-day intervals until the datetime is reached.
I have created the send email function and it works fine: emails go out to the specified addresses when data is submitted. But how do I send reminder emails based on the date entered in the form?
I have tried a while loop that checks whether the current date and time equals the date entered in the form, but the loop blocks the form from sending data to the database, since the request has to wait until the loop exits.
I need ideas on how to achieve this.
Here is the code that sends email automatically when a data is entered in the database.
from flask import Flask, render_template, request, redirect
from flask_sqlalchemy import SQLAlchemy
import jinja2
from datetime import datetime, time, date, timedelta
from send_email import send_email

@app.route("/success", methods=['POST'])
def success():
    if request.method == 'POST':
        company = request.form["companyname"]
        title = request.form["ttitle"]
        tnumber = request.form["tnumber"]
        ttype = request.form["ttype"]
        tstatus = request.form["tstatus"]
        tduedate = request.form["tduedate"]
        cperson = request.form["cperson"]
        cpersonemail = request.form["cpersonemail"]
        cpersonno = request.form["cpersonnumber"]
        comments = request.form["comments"]
        send_email(tnumber, company)
        data = Data(company, title, tnumber, ttype, tstatus, tduedate,
                    cperson, cpersonemail, cpersonno, comments)
        db.session.add(data)
        db.session.commit()
    return render_template("success.html")
I need someone to guide me on how I can send out reminders, based on the tduedate entered in the form, when this success() method is executed.
Thanks in advance
The Advanced Python Scheduler might be more appropriate for your use case: https://apscheduler.readthedocs.io/en/latest/
from apscheduler.schedulers.background import BackgroundScheduler
scheduler = BackgroundScheduler()
scheduler.start()
Then in your view
from apscheduler.triggers.date import DateTrigger

@app.route("/success", methods=['POST'])
def success():
    ...
    scheduler.add_job(
        send_email, DateTrigger(tduedate), args=(tnumber, company)
    )
You'll want to consider a persistent jobstore if scheduled jobs need to persist when the application stops running.
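One detail to watch: DateTrigger needs a datetime, while request.form["tduedate"] arrives as a string. A minimal sketch of the conversion (the "%Y-%m-%d" format is an assumption about the form field; adjust it to your actual input):

```python
from datetime import datetime

def parse_due_date(raw):
    # "%Y-%m-%d" is an assumed format for the tduedate <input>.
    return datetime.strptime(raw, "%Y-%m-%d")

# scheduler.add_job(send_email,
#                   DateTrigger(run_date=parse_due_date(tduedate)),
#                   args=(tnumber, company))
```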

Python Celery: Update django model after state change

I managed to find 2 similar topics discussing this issue, but unfortunately I couldn't work out the best solution from them:
Update Django Model Field Based On Celery Task Status
Update Django Model Field Based On Celery Task Status
I use Django and Celery (with Redis as the message broker) and I would like to update a Django model when the Celery task status changes (pending -> success, pending -> failure, etc.).
My code:
import time
from celery import shared_task

@shared_task(name="run_simulation")
def run_simulation(simulation_id: str):
    t1_start = time.perf_counter()
    doSomeWork()  # we may change this to sleep for instance
    t1_end = time.perf_counter()
    return {'process_time': t1_end - t1_start}
and the particular view from which I am calling the task:
def run_simulation(request):
    form = SimulationForm(request.POST)
    if form.is_valid():
        new_simulation = form.save()
        new_simulation.save()
        task_id = tasks.run_simulation.delay(new_simulation.id)
The question is: what is the preferred way to update the Django model state of Simulation when the status of the task changes?
In the docs I found handlers using the methods on_failure, on_success, etc.: http://docs.celeryproject.org/en/latest/userguide/tasks.html#handlers
I don't think there's a preferred method for something like this, since it depends on your project.
You can use a monitoring task like in the link you sent: give the monitoring task the monitored task's id and re-schedule it until the monitored task reaches a finished state.
from celery.result import AsyncResult

@app.task(bind=True)
def monitor_task(self, t_id):
    """Monitor a task"""
    res = AsyncResult(t_id, backend=self.backend, app=self.app)
    if not res.ready():
        raise self.retry(
            countdown=10,
            exc=Exception("Main task not done yet.")
        )
You can also create an event receiver and check the state of the task then save it on the DB.
http://docs.celeryproject.org/en/latest/userguide/monitoring.html#real-time-processing
Now, if you are only interested in success and failure states, you can create success and failure callbacks and take care of saving the success or failure state to the DB there.
tasks.run_simulation.apply_async(
    (sim_id,),
    link=tasks.success_handler.s(),
    link_error=tasks.error_handler.s(),
)
http://docs.celeryproject.org/en/latest/userguide/calling.html#linking-callbacks-errbacks
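A hedged sketch of how those callbacks might keep the model in sync; the success_handler/error_handler names come from the snippet above, but the Simulation.status values are assumptions. The state-transition rule itself is plain Python:

```python
# Plain transition rule kept out of the tasks so it can be tested alone.
# The "PENDING"/"SUCCESS"/"FAILURE" names are assumptions about the
# Simulation model's status field, not Celery's states.
def next_status(current, task_succeeded):
    if current != "PENDING":
        return current  # never overwrite a terminal state
    return "SUCCESS" if task_succeeded else "FAILURE"

# Wiring as Celery callbacks (sketch; needs Celery + the Simulation model,
# and assumes the id was bound in with success_handler.s(new_simulation.id),
# so the callback receives the parent result first):
# @shared_task
# def success_handler(result, simulation_id):
#     Simulation.objects.filter(pk=simulation_id).update(
#         status=next_status("PENDING", True))
#
# @shared_task
# def error_handler(request, exc, traceback):  # errback signature per Celery 4+ docs
#     Simulation.objects.filter(pk=request.args[0]).update(
#         status=next_status("PENDING", False))
```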

How to provide user constant notification about Celery's Task execution status?

I integrated my project with celery in this way, inside views.py after receving request from the user
def upload(request):
    if "POST" == request.method:
        # save the file
        task_parse.delay()
    # continue
and in tasks.py
from __future__ import absolute_import
from celery import shared_task
from uploadapp.main import aunit

@shared_task
def task_parse():
    aunit()
    return True
In short, the shared task runs the function aunit() from a third Python file, uploadapp/main.py.
Let's assume that aunit() is a resource-heavy process which takes time (like file parsing). Since I integrated it with Celery, it now runs completely asynchronously, which is good for me. So the task starts -> Celery processes it -> when it finishes, Celery sets the status to SUCCESS. I can view that using flower.
But what I want is to also notify the user through the Django UI that "Your task is done processing" as soon as Celery has finished processing on the back side and set the status to SUCCESS.
Now, I know this is possible if:
1.) I constantly request the STATUS and check whether it returns SUCCESS or not.
How do I do that via Celery? How can I query Celery task status from views.py and notify the user asynchronously with just Celery's Python module?
You need a real-time mechanism. I would suggest Firebase: update the Firebase Realtime Database field for the user id with a boolean True at the end of the Celery task, then implement a JavaScript listener for changes to that user_id object and update the UI.
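If you'd rather avoid a third-party service, the question's option 1 (constantly requesting the STATUS) works with a small polling endpoint. A sketch, where the view name and URL wiring are assumptions:

```python
import json

def status_payload(state):
    """Map a Celery task state string to the JSON the page polls for."""
    return json.dumps({"state": state, "done": state in ("SUCCESS", "FAILURE")})

# Django side (sketch; needs Celery and a result backend):
# from celery.result import AsyncResult
# def task_status(request, task_id):
#     return HttpResponse(status_payload(AsyncResult(task_id).state),
#                         content_type="application/json")
# The page then polls /task-status/<task_id>/ until "done" is true.
```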

How to push notification to specific logged user. My code is shown below

I am using an event stream on the front end and a yield-based generator function on the back end, storing clients in a Redis queue. I am storing users correctly in the Redis queue, but my problem is that I don't know how to send a push notification to a specific logged-in user.
Front end code:
var source = new EventSource('/stream');
source.onmessage = function (event) {
    console.log(event)
};
Back end code:
import time
from datetime import datetime

from redis import Redis

redis = Redis()
p = redis.pipeline()
app.config['ONLINE_LAST_MINUTES'] = 5

def check_updates():
    yield 'data: %s \n\n' % data

@app.route('/stream')
# @nocache
def stream():
    return Response(check_updates(), mimetype='text/event-stream')
Here is a snippet that shows how to track users with Redis:
http://flask.pocoo.org/snippets/71/
If you use flask-login, it's easy to get current_user and return messages specific to him/her by checking the user's fields in the database.
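A minimal sketch of per-user delivery under that approach (the stream:<user_id> channel naming is an assumption): publish each notification to a Redis channel keyed by the user's id, and have /stream subscribe to the current_user's channel. The framing and naming helpers are plain Python:

```python
import json

def sse_event(data):
    """Format one server-sent-event frame."""
    return "data: %s\n\n" % json.dumps(data)

def user_channel(user_id):
    # Assumed naming scheme: one pub/sub channel per logged-in user.
    return "stream:%s" % user_id

# Flask side (sketch; assumes flask-login's current_user):
# @app.route('/stream')
# def stream():
#     pubsub = redis.pubsub()
#     pubsub.subscribe(user_channel(current_user.id))
#     def gen():
#         for msg in pubsub.listen():
#             if msg['type'] == 'message':
#                 yield "data: %s\n\n" % msg['data'].decode()
#     return Response(gen(), mimetype='text/event-stream')
# Publisher side: redis.publish(user_channel(uid), json.dumps(payload))
```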

Sending django signals from django-admin command?

I have an unusual problem. In my Django application I use signals to send emails.
All of the signals work except for the one fired from a django-admin command, django.core.management.base.NoArgsCommand (which is run through manage.py).
I checked my signal in different places; it works everywhere except this one.
Here's the code where I send the signal:
from django.core.management.base import NoArgsCommand

class Command(NoArgsCommand):
    help = "Send email advertisement expiration reminder to users"

    def handle_noargs(self, **options):
        from app.models import Advertisement, User
        from app.signals import ad_expires
        from datetime import datetime
        start = datetime(datetime.now().year, datetime.now().month, datetime.now().day + 4, 0, 0)
        end = datetime(datetime.now().year, datetime.now().month, datetime.now().day + 4, 23, 59)
        ads = Advertisement.objects.filter(visible_till__gte=start).filter(visible_till__lte=end)
        for ad in ads:
            ad_expires.send(self, ad=ad, user=ad.user)
        print "Expiration reminders sent to %s users" % len(ads)
Am I doing something wrong?
Also, is there any easier way to check date within one day?
The shortcut is:
start = datetime.now() + timedelta(days=4)
end = start + timedelta(days=1)
ads = Advertisement.objects.filter(visible_till__gte=start).filter(visible_till__lt=end)
Unlike building a datetime from day + 4, timedelta also rolls over month and year boundaries correctly.
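A self-contained sketch of that window logic, pulled out so the month-boundary behaviour is easy to check (the helper name is mine):

```python
from datetime import datetime, timedelta

def expiry_window(now):
    """The [start, end) window four days out, one day wide."""
    start = now + timedelta(days=4)
    return start, start + timedelta(days=1)

# ads = Advertisement.objects.filter(visible_till__gte=start,
#                                    visible_till__lt=end)
```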
Can you post your project structure here? Your code looks good to me.
The only thing I can think of, is that the signal handler hasn't been registered at the time the django-admin function executes. You can check this by preceding the listener with a print statement and running your management command.
Try putting the signal listener into the app/__init__.py file. Since you're accessing the app package, everything in __init__.py should execute, registering the listener.
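The failure mode is easy to see with a toy model of signal dispatch (an illustration, not Django's actual implementation): send() on a signal with no connected receivers does nothing, silently, which is exactly what an unimported listener module looks like from a management command.

```python
# Toy signal: receivers must be registered (i.e. their module imported)
# before send(), or the send is a silent no-op.
class Signal:
    def __init__(self):
        self._receivers = []

    def connect(self, receiver):
        self._receivers.append(receiver)

    def send(self, sender, **kwargs):
        return [receiver(sender, **kwargs) for receiver in self._receivers]

expired = Signal()
print(expired.send(None, ad="ad1"))   # [] -- nothing registered yet

expired.connect(lambda sender, **kw: "emailed %s" % kw["ad"])
print(expired.send(None, ad="ad1"))   # ['emailed ad1']
```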
