I have an unusual problem. In my Django application I use signals to send emails. All of the signals work except for the one fired from a django-admin command (django.core.management.base.NoArgsCommand, run through manage.py). I have tested the signal in other places and it works everywhere except here.
Here's the code where I send the signal:
from django.core.management.base import NoArgsCommand

class Command(NoArgsCommand):
    help = "Send email advertisement expiration reminder to users"

    def handle_noargs(self, **options):
        from app.models import Advertisement, User
        from app.signals import ad_expires
        from datetime import datetime

        start = datetime(datetime.now().year, datetime.now().month, datetime.now().day + 4, 0, 0)
        end = datetime(datetime.now().year, datetime.now().month, datetime.now().day + 4, 23, 59)
        ads = Advertisement.objects.filter(visible_till__gte=start).filter(visible_till__lte=end)
        for ad in ads:
            ad_expires.send(self, ad=ad, user=ad.user)
        print "Expiration reminders sent to %s users" % len(ads)
Am I doing something wrong?
Also, is there an easier way to check whether a date falls within a given day?
The shortcut is:

from datetime import datetime, timedelta

start = datetime.now() + timedelta(days=4)
end = start + timedelta(days=1)
ads = Advertisement.objects.filter(visible_till__gte=start).filter(visible_till__lt=end)
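If you want the midnight-to-midnight window the original code was aiming for, a small standard-library variant (only the imports below are added):

from datetime import datetime, date, time, timedelta

start = datetime.combine(date.today() + timedelta(days=4), time.min)
end = start + timedelta(days=1)

This also avoids the day + 4 arithmetic, which would raise a ValueError near the end of a month.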
Can you post your project structure here? Your code looks good to me.
The only thing I can think of is that the signal handler hasn't been registered by the time the django-admin command executes. You can check this by putting a print statement inside the listener and running your management command.
Try putting the signal listener into the app/__init__.py file. Since you're accessing the app package, everything in __init__.py should execute, registering the listener.
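A minimal sketch of that registration, assuming a handler named send_expiry_mail (hypothetical; ad_expires comes from the question):

# app/__init__.py
from app.signals import ad_expires

def send_expiry_mail(sender, ad=None, user=None, **kwargs):
    # the print doubles as the debugging aid suggested above
    print "ad_expires fired for %s" % user
    # send the reminder email here

ad_expires.connect(send_expiry_mail)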
In a Django app, I have a form that schedules an email to be sent out. It has four fields: name, email, body, send_date. I want to dynamically create a Celery task (email) to run another Celery task at the designated time.
I have been able to send out the email at regular intervals (every 30 seconds) based on the form using the following code:
import json

from celery import shared_task
from django_celery_beat.models import IntervalSchedule, PeriodicTask

schedule, _ = IntervalSchedule.objects.update_or_create(every=30, period=IntervalSchedule.SECONDS)

@shared_task(name="schedule_interval_email")
def schedule_email_interval(name, email, body, send_date):
    PeriodicTask.objects.update_or_create(
        defaults={
            "interval": schedule,
            "task": "email_task",
        },
        name="Send message at interval",
        args=json.dumps(['name', 'test@test.com', 'body']),
    )
However, when I tried to schedule a task to run at a specific time (3 minutes after the current time) via ClockedSchedule, Celery beat recorded the task and saved all the relevant settings. The task appears active in the Django admin area, but the email never actually gets sent.
from datetime import datetime, timedelta

from django_celery_beat.models import ClockedSchedule

clocked = ClockedSchedule.objects.create(clocked_time=datetime.now() + timedelta(minutes=3))

@shared_task(name="schedule_clock_email")
def schedule_email_clocked(name, email, body, send_date):
    PeriodicTask.objects.create(
        clocked=clocked,
        name="Send message at specific date/time",
        task="email_task",
        one_off=True,
        args=json.dumps(['name', 'test@test.com', 'body']),
    )
I eventually want to dynamically set the clocked field based on the datetime the user enters into the form, so the current code is just trying to test out the way Celery works. I think I'm missing something about how this works, though. Any thoughts would be greatly appreciated.
I guess it's the time zone. If your configuration doesn't seem to take effect, note that the Celery database stores UTC time by default. You can verify this by executing the following code in Django's shell environment:
import datetime
import json

from django_celery_beat.models import ClockedSchedule, PeriodicTask
from ProjectName.celerytest import test

execution_time = datetime.datetime.utcnow() + datetime.timedelta(minutes=2)
schedule, created = ClockedSchedule.objects.get_or_create(clocked_time=execution_time)
PeriodicTask.objects.create(
    clocked=schedule,
    one_off=True,
    name="test input",
    task="projectName.celerytest.test",
    args=json.dumps(['112233']),
)
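If UTC is indeed the mismatch, a minimal sketch of keeping Django and Celery aligned; the setting values shown are assumptions for illustration:

# settings.py — keep Django and Celery on the same time zone
TIME_ZONE = "UTC"
USE_TZ = True
CELERY_TIMEZONE = "UTC"

# when computing the run time, prefer an aware datetime:
from datetime import timedelta
from django.utils import timezone

execution_time = timezone.now() + timedelta(minutes=3)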
I am still new to Python. I am working on code that collects data from a form (including a datetime value), stores it in a database, and then returns a success page.
My problem, and this is where I'm stuck, is how to send reminder emails to specified addresses at certain intervals, based on the datetime value entered in the form, if that datetime is sometime in the future. Reminders should be sent automatically at two-day intervals until the datetime is reached.
I have created the send-email function and it works fine: I can send emails to the specified addresses when data is submitted. My problem is how to send reminder emails based on the date entered in the form.
I need help please.
I have tried creating a while loop that checks whether the current date and time equal the date entered in the form, but the loop prevents the form from sending data to the database, since the request has to wait until the loop exits.
I need ideas on how to achieve this.
Here is the code that sends an email automatically when data is entered in the database:
from flask import Flask, render_template, request, redirect
from flask_sqlalchemy import SQLAlchemy
from datetime import datetime, date, time, timedelta

from send_email import send_email

# app, db and the Data model are defined elsewhere in the application

@app.route("/success", methods=['POST'])
def success():
    if request.method == 'POST':
        company = request.form["companyname"]
        title = request.form["ttitle"]
        tnumber = request.form["tnumber"]
        ttype = request.form["ttype"]
        tstatus = request.form["tstatus"]
        tduedate = request.form["tduedate"]
        cperson = request.form["cperson"]
        cpersonemail = request.form["cpersonemail"]
        cpersonno = request.form["cpersonnumber"]
        comments = request.form["comments"]
        send_email(tnumber, company)
        data = Data(company, title, tnumber, ttype, tstatus, tduedate,
                    cperson, cpersonemail, cpersonno, comments)
        db.session.add(data)
        db.session.commit()
    return render_template("success.html")
I need someone to guide me on how to send out reminders, based on the tduedate entered in the form, when this success() method is executed. Thanks in advance.
The Advanced Python Scheduler might be more appropriate for your use case: https://apscheduler.readthedocs.io/en/latest/
from apscheduler.schedulers.background import BackgroundScheduler
scheduler = BackgroundScheduler()
scheduler.start()
Then in your view:
from apscheduler.triggers.date import DateTrigger

@app.route("/success", methods=['POST'])
def success():
    ...
    scheduler.add_job(
        send_email, DateTrigger(tduedate), args=(tnumber, company)
    )
You'll want to consider a persistent jobstore if scheduled jobs need to persist when the application stops running.
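For example, a minimal sketch of a persistent jobstore, assuming SQLAlchemy is installed; the sqlite URL is just an illustration:

from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.jobstores.sqlalchemy import SQLAlchemyJobStore

# jobs survive application restarts because they are stored in the database
scheduler = BackgroundScheduler(
    jobstores={"default": SQLAlchemyJobStore(url="sqlite:///jobs.sqlite")}
)
scheduler.start()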
I'm new to Django and I am creating a web application for a uni project. I have to send emails periodically, and to do so I'm using a management command, but I don't know how to make it run automatically when I start the server.
I'm working in PyCharm on Windows 8.1.
from django.core.mail import send_mail
from django.core.management.base import BaseCommand

from ProgettoDinamici.settings import EMAIL_HOST_USER
from products.models import Notification
from users.models import User

class Command(BaseCommand):
    help = 'Sends emails periodically'

    def handle(self, *args, **options):
        users = User.objects.all()
        for u in users:
            try:
                notify = Notification.objects.filter(receiver=u, read=False)
                count = notify.count()
            except:
                print("No notification found")
            try:
                if notify:
                    send_mail(
                        'E-Commerce',
                        'You have ' + str(count) + ' notifications.',
                        EMAIL_HOST_USER,
                        [u.email],
                        fail_silently=False,
                    )
            except:
                print("error")
For now I have tried using schedule and cron to repeat the email sending every n minutes, but nothing worked, and searching online I found out that cron (and cron-based tools) aren't supported on Windows. But that is another problem...
You can use Celery for periodic tasks. Just convert the handle function into a Celery task, and you can schedule cron-style jobs on that task.
You can refer to: https://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html
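A minimal sketch of that conversion, assuming the command body moves into a tasks.py and that your Celery app reads settings with the usual CELERY_ namespace; all names here are hypothetical:

# app/tasks.py
from celery import shared_task

@shared_task
def send_notification_emails():
    # move the body of Command.handle() here
    ...

# settings.py — have celery beat run it every 30 minutes
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "send-notification-emails": {
        "task": "app.tasks.send_notification_emails",
        "schedule": crontab(minute="*/30"),
    },
}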
I have been trying to run a cron job on GAE (code developed in Python), but when I trigger the job it fails without any error message; I can't find anything at all in the logs.
This is happening with a service that uses the flexible environment.
These are my files:
my_service.yaml looks like this:
service: my_service
runtime: custom
env: flex

env_variables:
  a:
  b:
and my_service.py looks like this:
from __future__ import absolute_import

import datetime
import logging
import os
import time

import tweepy
from flask import Flask, request
from google.cloud import datastore

logging.basicConfig(level=logging.INFO)

app = Flask(__name__)

@app.route('/Main')
def hello():
    """A no-op."""
    return 'nothing to see.'

@app.route('/my_service')
def get_service():
    is_cron = request.headers.get('X-Appengine-Cron', False)
    logging.info("is_cron is %s", is_cron)
    # Comment out the following test to allow non cron-initiated requests.
    if not is_cron:
        return 'Blocked.'
    # data scraping and saving in Datastore
    return 'Done.'

@app.errorhandler(500)
def server_error(e):
    logging.exception('An error occurred during a request.')
    return """
    An internal error occurred: <pre>{}</pre>
    See logs for full stacktrace.
    """.format(e), 500

if __name__ == '__main__':
    app.run(host='127.0.0.1', port=8080, debug=True)
Then I have a dispatch.yaml with this structure:
dispatch:
- url: "*/my_service*"
  service: my_service
And a cron.yaml:
cron:
- description: run my service
  url: /my_service
  schedule: 1 of month 10:00
  target: my_service
Not sure what I'm doing wrong here.
EDIT
A bit of context: this is something I'm editing, starting from this repo.
The service called backend that is defined there works perfectly (it also has the same schedule in its cron job as my_service, and when I trigger it on a day different from the one on which it's scheduled, it works just fine). What I did was create an additional service with its own yaml file (which looks exactly the same as backend.yaml) and its own my_service.py, and add it to dispatch.yaml and cron.yaml. In theory this should work, since the structure is exactly the same, but it doesn't.
This service was originally developed in the standard environment, where it was working; the problem appeared when I moved it to the flexible environment.
EDIT 2:
The problem was actually in the Dockerfile, which was calling a service that I was not using.
EDIT:
def get(self): may have some issues.
First, get may be reserved. Second, you can't pass self to that function (it isn't a method). Change it to:
def get_service():
EDIT2:
You also need to import logging at the top of any file that uses it, and you haven't imported Flask and its components:
from flask import Flask, request, render_template # etc...
import logging
Your 1 of month 10:00 cron schedule specification is most likely the culprit: it specifies running the job at 10:00 only on the first day of each month! From Defining the cron job schedule:
Example: For the 1,2,3 of month 07:00 schedule, the job runs one time at 07:00 on the first three days of each month.
So the last execution happened 3 days ago (if this cron config was deployed at the time) and no other attempt will happen until Nov 1st :)
Change the schedule to something easier to test with, like every 5 minutes or every 1 hours, and revert the change once you're happy it works as expected.
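For instance, a test-friendly variant of the cron.yaml above (the schedule value is just an example to revert later):

cron:
- description: run my service
  url: /my_service
  schedule: every 5 minutes
  target: my_service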
I integrated my project with Celery in this way: inside views.py, after receiving a request from the user,
def upload(request):
    if "POST" == request.method:
        # save the file
        task_parse.delay()
    # continue
and in tasks.py:
from __future__ import absolute_import

from celery import shared_task
from uploadapp.main import aunit

@shared_task
def task_parse():
    aunit()
    return True
In short, the shared task runs a function aunit() from a third Python file named main.py, located in the uploadapp/ directory.
Let's assume that aunit() is a resource-heavy process that takes time (like file parsing). Since I integrated it with Celery, it now runs fully asynchronously, which is good for me. So the task starts -> Celery processes it -> when it finishes, Celery sets the status to SUCCESS. I can view that using Flower.
But what I want is to notify the user through the Django UI that their task is done processing as soon as Celery has finished at the back end and set the status to SUCCESS.
Now, I know this is possible if:
1.) I constantly poll the task STATUS and check whether it returns SUCCESS or not.
How do I do that via Celery? How can you query a Celery task's status from your views.py and notify the user asynchronously with just Celery's Python module?
You need a real-time mechanism. I would suggest Firebase: update the Firebase realtime DB field for the user id with a boolean True at the end of the Celery task, then implement a JavaScript function that listens for changes to the user_id object in the Firebase database and updates the UI.
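If you'd rather start with the polling approach from the question, a minimal sketch (the URL wiring and handing the task_id to the client are assumed):

# views.py
from celery.result import AsyncResult
from django.http import JsonResponse

def task_status(request, task_id):
    # AsyncResult looks the task up by id in the configured result backend
    result = AsyncResult(task_id)
    # status is one of PENDING / STARTED / SUCCESS / FAILURE / RETRY / REVOKED
    return JsonResponse({"task_id": task_id, "status": result.status})

The front end can then hit this view on an interval and show the notification once status flips to SUCCESS.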