I've followed the celery doc https://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html to create an app and periodic tasks as below:
$ tree demo/
demo/
├── config.py
├── __init__.py
└── tasks.py
$ cat demo/__init__.py
# -*- coding: utf-8 -*-
from celery import Celery
app = Celery('demo')
app.config_from_object('demo.config')
$ cat demo/config.py
# -*- coding: utf-8 -*-
BROKER_URL = "redis://127.0.0.1:6379"
CELERY_TIMEZONE='UTC'
CELERY_IMPORTS = [
    "demo.tasks",
]
$ cat demo/tasks.py
from demo import app

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(3.0, say.s(), name='say hello every 3s')

@app.task
def say():
    print("Hello!")
And then run celery beat as below:
$ celery beat -A demo -l info --max-interval 10
celery beat v4.3.0 (rhubarb) is starting.
__ - ... __ - _
LocalTime -> 2019-12-12 16:26:41
Configuration ->
. broker -> redis://127.0.0.1:6379//
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%INFO
. maxinterval -> 10.00 seconds (10.0s)
[2019-12-12 16:26:41,234: INFO/MainProcess] beat: Starting...
After waiting for a while, no task is ever scheduled.
However, if I instead set CELERYBEAT_SCHEDULE in the config, it works fine. I do that by changing demo/config.py and demo/tasks.py as below:
$ cat demo/config.py
# -*- coding: utf-8 -*-
from datetime import timedelta
BROKER_URL = "redis://127.0.0.1:6379"
CELERY_TIMEZONE='UTC'
CELERY_IMPORTS = [
    "demo.tasks",
]
CELERYBEAT_SCHEDULE = {
    'say hello every 3 seconds': {
        'task': 'demo.tasks.say',
        'schedule': timedelta(seconds=3),
    },
}
$ cat demo/tasks.py
from demo import app

@app.task
def say():
    print("Hello!")
Then, running celery beat with the same command as before, the periodic task is scheduled every 3 seconds as expected.
What's wrong with my previous setup?
FYI, I figured out another solution: changing the decorator @app.on_after_configure.connect to @app.on_after_finalize.connect makes it work, though I don't know the exact reason at this moment.
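A plausible explanation: demo.tasks is only imported through CELERY_IMPORTS, which happens after the app has been configured, so by the time the handler tries to connect to on_after_configure, that signal has already fired. on_after_finalize fires once the app is finalized, after the task modules have been imported, so the handler is registered in time. A minimal sketch of the working tasks.py:
from demo import app

@app.on_after_finalize.connect
def setup_periodic_tasks(sender, **kwargs):
    # sender is the app; the schedule entry is registered after finalization
    sender.add_periodic_task(3.0, say.s(), name='say hello every 3s')

@app.task
def say():
    print("Hello!")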
Celery 4.x does not use the upper-case config variables - see the New lowercase settings section of the Celery documentation. Scheduler-specific configuration arguments are listed here. So, try to modify your code to have beat_schedule = { instead of CELERYBEAT_SCHEDULE = {. The link you say you followed also uses lower-case names...
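For example, demo/config.py rewritten with the lowercase names could look like this sketch (broker_url, timezone, imports, and beat_schedule are the 4.x equivalents of the uppercase names above):
from datetime import timedelta

broker_url = "redis://127.0.0.1:6379"
timezone = 'UTC'
imports = ["demo.tasks"]
beat_schedule = {
    'say hello every 3 seconds': {
        'task': 'demo.tasks.say',
        'schedule': timedelta(seconds=3),
    },
}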
You must specify the scheduler with the --scheduler option, according to this user guide. For example, you could try something like the code below:
celery beat -A demo -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler
P.S. I'm not sure that --max-interval doesn't interfere with your schedule period. I think it's better to remove this option. IMHO.
How do you develop when using Celery? It seems to require a restart for every change.
I'm using this command:
watchmedo auto-restart --directory=proj/ -p '*.py' --recursive -- celery -A proj worker --concurrency=1 --loglevel=INFO
celery.py
import os

from celery import Celery
from decouple import AutoConfig

cwd = os.getcwd()
DOTENV_FILE = cwd + '/proj/config/.env'
config = AutoConfig(search_path=DOTENV_FILE)

app = Celery('proj',
             broker=config('CELERY_BROKER_URL'),
             backend=config('CELERY_RESULT_BACKEND'),
             include=['proj.tasks'])

app.conf.update(
    result_expires=3600,
)

if __name__ == '__main__':
    app.start()
tasks.py
from .celery import app
@app.task
def add(x, y):
    return x + y
Even if there is a technical solution for this kind of reloading, I would suggest not involving the Celery machinery while you develop your task function, because, well, it's just a function! So my approach is to get the function working first, and only then add the Celery parts to check that it integrates well with other things, like tasks in a chain, Django, etc. The same technique applies to unit testing.
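In that spirit, a minimal sketch of a plain unit test for the add task above (assuming pytest; no broker or worker is needed because the task body is an ordinary function):
from proj.tasks import add

def test_add():
    # calling the function directly bypasses Celery entirely
    assert add(2, 3) == 5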
I need a minimal example of a periodic task (run some function every 5 minutes, run something at 12:00:00, etc.).
In my myapp/tasks.py, I have,
from celery.task.schedules import crontab
from celery.decorators import periodic_task
from celery import task

@periodic_task(run_every=(crontab(hour="*", minute=1)), name="run_every_1_minutes", ignore_result=True)
def return_5():
    return 5

@task
def test():
    return "test"
When I run the Celery worker, it does show the tasks (given below) but does not return any values (in either the terminal or Flower).
[tasks]
. mathematica.core.tasks.test
. run_every_1_minutes
Please provide a minimum example or hints to achieve the desired results.
Background:
I have a config/celery.py which contains the following:
import os
from celery import Celery
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.local")
app = Celery('config')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
And in my config/__init__.py, I have
from .celery import app as celery_app
__all__ = ['celery_app']
I added a function something like below in myapp/tasks.py
from celery import task
@task
def test():
    return "test"
When I run test.delay() from the shell, it runs successfully and also shows the task information in Flower.
To run periodic tasks, you should also run celery beat. You can run it with this command:
celery -A proj beat
Or if you are using one worker:
celery -A proj worker -B
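As for the minimal example itself, one option is to declare the schedule on the app rather than using the periodic_task decorator. A sketch, assuming the config/celery.py from the question (the task paths are illustrative):
# in config/celery.py, after the app is created
from celery.schedules import crontab

app.conf.beat_schedule = {
    'run-test-every-5-minutes': {
        'task': 'myapp.tasks.test',
        'schedule': crontab(minute='*/5'),  # every 5 minutes
    },
    'run-test-at-noon': {
        'task': 'myapp.tasks.test',
        'schedule': crontab(hour=12, minute=0),  # every day at 12:00:00
    },
}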
I need to configure which queue Celery should put the result of a task execution into. I am doing it the way described in the documentation (the "reply_to" item):
@app.task(reply_to='export_task')  # <= configured right way
def test_func():
    return "here is result of task"
Expected behavior
The task result should be in a queue named "export_task" (as configured in the decorator).
Actual behavior
The task result ends up in a queue with a name like:
d5587446-0149-3133-a3ed-d9a297d52a96
celery report:
python -m celery -A my_worker report
software -> celery:3.1.24 (Cipater) kombu:3.0.37 py:3.5.1
billiard:3.3.0.23 py-amqp:1.4.9
platform -> system:Windows arch:64bit, WindowsPE imp:CPython
loader -> celery.loaders.app.AppLoader
settings -> transport:amqp results:rpc:///
CELERY_ACCEPT_CONTENT: ['json']
CELERY_RESULT_BACKEND: 'rpc:///'
CELERY_QUEUES:
(<unbound Queue main_check -> <unbound Exchange main_check(direct)> -> main_check>,)
CELERYD_CONCURRENCY: 10
CELERY_TASK_SERIALIZER: 'json'
CELERY_RESULT_PERSISTENT: True
CELERY_ROUTES: {
'my_worker.test_func': {'queue': 'main_check'}}
BROKER_TRANSPORT: 'amqp'
CELERYD_MAX_TASKS_PER_CHILD: 3
CELERY_RESULT_SERIALIZER: 'json'
Steps to reproduce
Create the project files.
celery_app.py:
from celery import Celery
from kombu import Exchange, Queue
app = Celery('worker')
app.conf.update(
    CELERY_ROUTES={
        'my_worker.test_func': {'queue': 'main_check'},
    },
    BROKER_TRANSPORT='amqp',
    CELERY_RESULT_BACKEND='rpc://',
    CELERY_RESULT_PERSISTENT=True,
    # CELERY_DEFAULT_DELIVERY_MODE='persistent',
    # CELERY_RESULT_EXCHANGE='export_task',
    CELERYD_CONCURRENCY=10,
    CELERYD_MAX_TASKS_PER_CHILD=3,
    CELERY_TASK_SERIALIZER='json',
    CELERY_RESULT_SERIALIZER='json',
    CELERY_ACCEPT_CONTENT=['json'],
    CELERY_QUEUES=(
        Queue('main_check', Exchange('main_check', type='direct'), routing_key='main_check'),
    ),
)
my_worker.py:
from celery_app import app
@app.task(reply_to='export_task')
def test_func():
    return "here is result of task"
then start celery:
python -m celery -A my_worker worker --loglevel=info
then, in a Python debug console, queue a new task:
from my_worker import *
result = test_func.delay()
I asked for help on the official issue tracker, but nobody responded.
I don't see in your code where that queue (export_task) has been declared.
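A minimal sketch of what declaring it could look like, assuming you want export_task predeclared next to main_check in the configuration above (the exchange and routing key names are illustrative):
from kombu import Exchange, Queue

CELERY_QUEUES = (
    Queue('main_check', Exchange('main_check', type='direct'), routing_key='main_check'),
    Queue('export_task', Exchange('export_task', type='direct'), routing_key='export_task'),
)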
I am using Celery and trying to run a crontab schedule. Below is my celery.py:
from __future__ import absolute_import
from celery.schedules import crontab
from celery import Celery
app = Celery('Celery_1',
             broker='amqp://test:test@localhost//',
             include=['Celery_1.tasks'])

# Optional configuration, see the application user guide.
app.conf.update(
    CELERY_TASK_RESULT_EXPIRES=3600,
    CELERYBEAT_SCHEDULE={
        'T1': {
            'task': 'Celery_1.tasks.add',
            'schedule': crontab(minute='*/1'),
            'args': (4, 5),
        },
    },
    CELERY_IMPORTS=('Celery_1.tasks',),
)

if __name__ == '__main__':
    app.start()
And my tasks.py
from __future__ import absolute_import
from Celery_1.celery import app
@app.task(name='Celery_1.add')
def add(x, y):
    return x + y
When I schedule it with celery beat, it does not run the task every minute. Can anyone please help me?
You should run it this way (while still in the Celery_1 directory):
echo $null >> __init__.py  # to make your directory a Python package
cd ..
celery -A Celery_1 beat
Celery_1 is the name of your app.
I have an issue with Celery queue routing when using current_app.send_task
I have two workers (one for each queue):
python manage.py celery worker -E -Q priority --concurrency=8 --loglevel=DEBUG
python manage.py celery worker -Q low --concurrency=8 -E -B --loglevel=DEBUG
I have two queues defined in my celeryconfig.py file:
# -*- coding: utf-8 -*-
from __future__ import unicode_literals

from django.conf import settings
from django.core.exceptions import ImproperlyConfigured
from celery import Celery
from kombu import Queue

try:
    app = Celery('proj', broker=getattr(settings, 'BROKER_URL', 'redis://'))
except ImproperlyConfigured:
    app = Celery('proj', broker='redis://')

app.conf.update(
    CELERY_TASK_SERIALIZER='json',
    CELERY_ACCEPT_CONTENT=['json'],
    CELERY_RESULT_SERIALIZER='json',
    CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend',
    CELERY_DEFAULT_EXCHANGE='tasks',
    CELERY_DEFAULT_EXCHANGE_TYPE='topic',
    CELERY_DEFAULT_ROUTING_KEY='task.priority',
    CELERY_QUEUES=(
        Queue('priority', routing_key='priority.#'),
        Queue('low', routing_key='low.#'),
    ),
    CELERY_IMPORTS=('mymodule.tasks',),
    CELERY_ENABLE_UTC=True,
    CELERY_TIMEZONE='UTC',
)

if __name__ == '__main__':
    app.start()
In the task definitions, we use the decorator to make the queue explicit:
@task(name='mymodule.mytask', routing_key='low.mytask', queue='low')
def mytask():
    # does something
    pass
This task does indeed run in the low queue when it is invoked with:
from mymodule.tasks import mytask
mytask.delay()
But that's not the case when it's run as follows (it ends up in the default queue, "priority"):
from celery import current_app
current_app.send_task('mymodule.mytask')
I wonder why this latter way doesn't route the task to the "low" queue!
P.S. I use Redis.
send_task is a low-level method. It sends the task signature directly to the broker, without going through your task decorator.
With this method, you can even send a task without loading the task code/module.
To solve your problem, you can fetch the routing_key/queue from configuration directly:
route = celery.amqp.routes[0].route_for_task("mymodule.mytask")
Out[10]: {'queue': 'low', 'routing_key': 'low.mytask'}
celery.send_task("mymodule.mytask", queue=route['queue'], routing_key=route['routing_key'])
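If you need this in more than one place, a small helper can keep the lookup and the dispatch together. A hedged sketch (send_task_routed is a hypothetical name, not part of Celery; it assumes the CELERY_ROUTES map shown above):
def send_task_routed(app, name, *args, **kwargs):
    # look up the statically configured route for the task name,
    # then dispatch by name with an explicit queue and routing key
    route = app.amqp.routes[0].route_for_task(name) or {}
    return app.send_task(name, args=args, kwargs=kwargs,
                         queue=route.get('queue'),
                         routing_key=route.get('routing_key'))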