I am using the following packages:
celery==5.1.2
Django==3.1
I have two periodic Celery tasks: the first should run every 15 minutes and the second every 20 minutes. The problem is that the first task runs on time, while the second runs at seemingly random times.
Even so, I am getting the scheduler message on the console on time for both tasks:
Scheduler: Sending due task <task_name> (<task_name>)
Please find the following files:
celery.py
from celery import Celery, Task
app = Celery('settings')
...
class PeriodicTask(Task):
    @classmethod
    def on_bound(cls, app):
        app.conf.beat_schedule[cls.name] = {
            "schedule": cls.run_every,
            "task": cls.name,
            "args": cls.args if hasattr(cls, "args") else (),
            "kwargs": cls.kwargs if hasattr(cls, "kwargs") else {},
            "options": cls.options if hasattr(cls, "options") else {}
        }
tasks.py
from celery.schedules import crontab
from settings.celery import app, PeriodicTask
...
@app.task(
    base=PeriodicTask,
    run_every=crontab(minute='*/15'),
    name='task1',
    options={'queue': 'queue_name'}
)
def task1():
    logger.info("task1 called")

@app.task(
    base=PeriodicTask,
    run_every=crontab(minute='*/20'),
    name='task2'
)
def task2():
    logger.info("task2 called")
Please help me find the bug here. Thanks!
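For what it's worth, the same two entries can be declared directly in app.conf.beat_schedule, which takes the custom on_bound hook out of the picture while debugging. A minimal sketch, assuming the task names and queue from the question; note that crontab(minute='*/20') fires at minutes 0, 20 and 40 of every hour, not "every 20 minutes after beat starts":

from celery import Celery
from celery.schedules import crontab

app = Celery('settings')

# Explicit equivalents of the two entries the on_bound hook would register.
app.conf.beat_schedule = {
    'task1': {
        'task': 'task1',
        'schedule': crontab(minute='*/15'),
        'options': {'queue': 'queue_name'},
    },
    'task2': {
        'task': 'task2',
        'schedule': crontab(minute='*/20'),
    },
}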
I have written an algorithm, and when I use python manage.py runserver, my website runs on the local server.
Now I want to run my algorithm after python manage.py runserver.
In other words, when I start the Django website, I want the algorithm to run in the background until it completes, and I want to be able to tell whether the algorithm is still running or has finished.
What should I do?
Thanks.
something like this:
import threading

def function_that_downloads(my_args):
    # do something here
    ...

# The class statement was missing from the original snippet;
# the name DownloadThread is assumed.
class DownloadThread(threading.Thread):
    def __init__(self, function_that_downloads):
        threading.Thread.__init__(self)
        self.runnable = function_that_downloads
        self.daemon = True

    def run(self):
        self.runnable()
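To start it in the background and later check whether the algorithm has finished, you can use the standard Thread.start() and Thread.is_alive() methods; a minimal usage sketch, assuming the DownloadThread class above:

t = DownloadThread(function_that_downloads)
t.start()  # returns immediately; the work continues in the background

# later, e.g. from a view:
if t.is_alive():
    print("algorithm is still running")
else:
    print("algorithm is complete")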
Hi, you might want to check out Django Celery Beat, so you can define tasks inside your Django application and execute them periodically:
from celery import Celery
from celery.schedules import crontab

app = Celery()

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 10 seconds.
    sender.add_periodic_task(10.0, test.s('hello'), name='add every 10')

    # Calls test('world') every 30 seconds
    sender.add_periodic_task(30.0, test.s('world'), expires=10)

    # Executes every Monday morning at 7:30 a.m.
    sender.add_periodic_task(
        crontab(hour=7, minute=30, day_of_week=1),
        test.s('Happy Mondays!'),
    )

@app.task
def test(arg):
    print(arg)
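To actually dispatch these, a beat scheduler must run alongside a worker; with a typical layout that is something like the following (replace proj with your own app module):

celery -A proj worker --loglevel=info
celery -A proj beat --loglevel=info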
I have a problem with daily scheduled tasks using crontab.
Here is my celery.py:
app.conf.beat_schedule = {
    'run-cache-updater': {
        'task': 'tasks.run_cache_updater',
        'schedule': crontab(
            minute=0,
            hour='1-4'
        ),
    }
}
Below is my tasks.py. What I am doing there is getting all records from the DB and triggering other jobs to update my caches on Redis:
@app.task
def run_cache_updater():
    batch_size = 1000
    cache_records = models.CacheRecords.objects.all()

    def _chunk_list(all_records, size_of_batch):
        # Yield lists of ids, size_of_batch at a time.
        # (The original sliced with the outer batch_size by mistake.)
        for i in range(0, len(all_records), size_of_batch):
            yield [item.id for item in all_records[i: i + size_of_batch]]

    for items in _chunk_list(cache_records, batch_size):
        update_cache.delay(items)

@app.task
def update_cache(ids_in_chunks):
    for id in ids_in_chunks:
        # Some calls are done here. Then sleep for 200 ms.
        time.sleep(0.2)
My tasks run fine, but instead of once per day they start between 1 and 4 and then start again every 4 hours, e.g. 8-11, 15-18, and so on.
What am I doing wrong here, and how can I fix it?
This sounds like a Celery bug; it's probably worth raising on their GitHub repo.
However, as a workaround, you could try the more explicit notation, hour='1,2,3,4', just in case the issue is in the parsing of that specific crontab interval style.
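A minimal sketch of that workaround applied to the schedule above:

app.conf.beat_schedule = {
    'run-cache-updater': {
        'task': 'tasks.run_cache_updater',
        # explicit hour list instead of the '1-4' range notation
        'schedule': crontab(minute=0, hour='1,2,3,4'),
    }
}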
I need to configure which queue Celery should put the result of a task execution into. I am doing it the way described in the documentation (see "reply_to"):
@app.task(reply_to='export_task')  # <= configured the right way
def test_func():
    return "here is result of task"
Expected behavior
The task result should be in the queue named "export_task" (as configured in the decorator).
Actual behavior
The task result ends up in a queue with a name like:
d5587446-0149-3133-a3ed-d9a297d52a96
celery report:
python -m celery -A my_worker report
software -> celery:3.1.24 (Cipater) kombu:3.0.37 py:3.5.1
billiard:3.3.0.23 py-amqp:1.4.9
platform -> system:Windows arch:64bit, WindowsPE imp:CPython
loader -> celery.loaders.app.AppLoader
settings -> transport:amqp results:rpc:///
CELERY_ACCEPT_CONTENT: ['json']
CELERY_RESULT_BACKEND: 'rpc:///'
CELERY_QUEUES:
(<unbound Queue main_check -> <unbound Exchange main_check(direct)> -> main_check>,)
CELERYD_CONCURRENCY: 10
CELERY_TASK_SERIALIZER: 'json'
CELERY_RESULT_PERSISTENT: True
CELERY_ROUTES: {
'my_worker.test_func': {'queue': 'main_check'}}
BROKER_TRANSPORT: 'amqp'
CELERYD_MAX_TASKS_PER_CHILD: 3
CELERY_RESULT_SERIALIZER: 'json'
Steps to reproduce
Create the project files as follows.
celery_app.py:
from celery import Celery
from kombu import Exchange, Queue

app = Celery('worker')

app.conf.update(
    CELERY_ROUTES={
        'my_worker.test_func': {'queue': 'main_check'},
    },
    BROKER_TRANSPORT='amqp',
    CELERY_RESULT_BACKEND='rpc://',
    CELERY_RESULT_PERSISTENT=True,
    # CELERY_DEFAULT_DELIVERY_MODE='persistent',
    # CELERY_RESULT_EXCHANGE='export_task',
    CELERYD_CONCURRENCY=10,
    CELERYD_MAX_TASKS_PER_CHILD=3,
    CELERY_TASK_SERIALIZER='json',
    CELERY_RESULT_SERIALIZER='json',
    CELERY_ACCEPT_CONTENT=['json'],
    CELERY_QUEUES=(
        Queue('main_check', Exchange('main_check', type='direct'), routing_key='main_check'),
    ),
)
my_worker.py:
from celery_app import app

@app.task(reply_to='export_task')
def test_func():
    return "here is result of task"
Then start Celery:
python -m celery -A my_worker worker --loglevel=info
Then, in a Python debug console, add a new task:
from my_worker import *
result = test_func.delay()
I asked for help on the official issue tracker, but nobody responded.
I don't see in your code where that queue (export_task) has been declared.
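If the intent is for that queue to exist, one option is to declare it next to main_check in CELERY_QUEUES. A minimal sketch, where the exchange and routing-key names are assumptions; whether the rpc:// result backend will actually honor reply_to this way is a separate question:

from kombu import Exchange, Queue

app.conf.update(
    CELERY_QUEUES=(
        Queue('main_check', Exchange('main_check', type='direct'), routing_key='main_check'),
        # declare the reply queue as well; its name matches the reply_to value
        Queue('export_task', Exchange('export_task', type='direct'), routing_key='export_task'),
    ),
)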
Hello friends, I need your help with periodic tasks for a Django application. I am trying to run a periodic task using Celery, but it is not working properly. I have a simple task containing only a print statement. Celery runs it only the first time; I also tried celery beat, but got no result.
my "task.py" is
from __future__ import absolute_import
from myapp.celery import app
from celery.schedules import crontab
from celery.task import periodic_task
from celery.registry import tasks

@periodic_task(run_every=(crontab(minute='*/1')), name="some_task")
def every_minute(a, b):
    print("This is running after one minute", a + b)
    return "task done"

tasks.register(every_minute)
and "view.py"
from django.http import HttpResponse
from django.views.generic import View
from .tasks import *
from .models import *
from datetime import datetime, timedelta

class CeleryTest(View):
    def get(self, request):
        send_date = datetime.now() + timedelta(seconds=200)
        # 'etc' was a typo; the keyword that schedules a task for a future time is 'eta'
        task = every_minute.apply_async([5, 6], eta=send_date)
        while not task.ready():
            print("calling............task is not ready")
        return HttpResponse("hi get ur task")
I just added this schedule to my earlier Celery settings, "settings.py":
CELERYBEAT_SCHEDULE = {
    'every_minute': {
        'task': 'every_minute.add',
        'schedule': crontab(minute='*/1'),
        'args': (5, 6),
    },
}
Thank you friends for your time.
Use your schedule like this:
CELERYBEAT_SCHEDULE = {
    'every_minute': {
        'task': 'every_minute',
    },
}
and run this command for Celery:
python manage.py celeryd -BE -l info
Now my periodic tasks are running fine.
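(For context: -B embeds the beat scheduler inside the worker process and -E enables task events. The embedded scheduler is convenient for development, but the Celery docs recommend a dedicated beat process in production, so you never end up with two schedulers sending duplicate tasks.)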
In the Celery docs, section Instantiation (http://celery.readthedocs.org/en/latest/userguide/tasks.html#custom-task-classes), the following is stated:
A task is not instantiated for every request, but is registered in the task registry as a global instance.
This means that the __init__ constructor will only be called once per process, and that the task class is semantically closer to an Actor.
Nevertheless, when I run the following example I see that the __init__ method is called at least 3 times. What is wrong in the setup? CELERYD_CONCURRENCY = 1 should make sure that there is only one process per worker, right?
$ celery -A proj beat
celery beat v3.1.17 (Cipater) is starting.
init Task1
40878160
x=1.0
init Task1
40878352
x=1.0
init Task1
40879312
x=1.0
Configuration ->
. broker -> amqp://guest:**@localhost:5672//
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]#%INFO
. maxinterval -> now (0s)
[2015-02-05 23:05:21,875: INFO/MainProcess] beat: Starting...
[2015-02-05 23:05:21,971: INFO/MainProcess] Scheduler: Sending due task task1-every-5-seconds (proj.tasks.t1)
[2015-02-05 23:05:26,972: INFO/MainProcess] Scheduler: Sending due task task1-every-5-seconds (proj.tasks.t1)
celery.py:
from __future__ import absolute_import
from datetime import timedelta
from celery import Celery
app = Celery('proj',
             broker='amqp://guest@localhost//',
             backend='amqp://',
             include=['proj.tasks'])

app.conf.update(
    CELERY_REDIRECT_STDOUTS=True,
    CELERY_TASK_RESULT_EXPIRES=60,
    CELERYD_CONCURRENCY=1,
    CELERYBEAT_SCHEDULE={
        'task1-every-5-seconds': {
            'task': 'proj.tasks.t1',
            'schedule': timedelta(seconds=5)
        },
    },
    CELERY_TIMEZONE='GMT',
)

if __name__ == '__main__':
    app.start()
tasks.py:
from __future__ import absolute_import
from proj.celery import app
from celery import Task
import time
class Foo():
    def __init__(self, x):
        self.x = x

class Task1(Task):
    abstract = True

    def __init__(self):
        print "init Task1"
        print id(self)
        self.f = Foo(1.0)
        print "x=1.0"

@app.task(base=Task1)
def t1():
    t1.f.x += 1
    print t1.f.x
So, as per your comment, you need to maintain one connection per thread.
Why not use thread-local storage then? It should be a safe solution in your case.
from threading import local

thread_storage = local()

def get_or_create_connection(*args, **kwargs):
    # Lazily create one connection per thread and cache it in thread-local storage.
    if not hasattr(thread_storage, 'connection'):
        thread_storage.connection = Connection(*args, **kwargs)
    return thread_storage.connection

@app.task()
def do_stuff():
    connection = get_or_create_connection('some', connection='args')
    connection.ping()
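One note on this sketch: threading.local gives each worker thread its own cached connection, which is the point when the worker uses a threaded pool. With the default prefork pool each child process has its own interpreter anyway, so the same code still works; it just never has anything to share. Connection here stands for whatever client class you are actually using.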