Celery Beat doesn't follow schedule (FastAPI / Python)

This Celery beat scheduler reports a maximum tick of 5 minutes, even though the schedule below should fire every minute. I am still developing here, so first of all it needs to pick up the right run times automatically. It doesn't seem to be working correctly:
celery_beat_1 | [2022-05-12 23:18:44,224: DEBUG/MainProcess] beat: Synchronizing schedule...
celery_beat_1 | [2022-05-12 23:18:44,225: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.
celery_app.py
from celery import Celery
from celery.schedules import crontab
import os

celery_app = Celery(
    "worker",
    backend=os.getenv("CELERY_RESULT_BACKEND", "redis://:3091@redis:6379/0"),
    broker=os.getenv("CELERY_BROKER_URL", "amqp://guest:guest@rabbitmq:5672//")
)

celery_app.conf.task_routes = {
    "worker.celery_worker.test_celery": "test-queue"
}

celery_app.conf.beat_schedule = {
    'celery_beat_testing': {
        'task': 'celery_app.tasks.test_beat',
        'schedule': crontab(minute='*/1')
    }
}

celery_app.conf.timezone = 'UTC'
celery_app.conf.update(task_track_started=True)
celery_worker.py
from worker.celery_app import celery_app


@celery_app.task(bind=True)
def test_celery(self):
    return 'Celery Pong: {0!r}'.format(self.request)


@celery_app.task(name='test_beat')
def test_beat():
    print('beat test', flush=True)

Related

Django Celery periodic task: how to call a task method that is inside a class

I want to send periodic mail to certain users after they have registered on the platform. I tried to send email using Celery, which works fine, and now I want django-celery to send periodic mail; for now I have set the email period to 15 seconds. The code is below.
celery.py
app.conf.beat_schedule = {
    'send_mail-every-day-at-4': {
        'task': 'apps.users.usecases.BaseUserUseCase().email_send_task()',
        'schedule': 15
    }
}
My class is at apps.users.usecases.BaseUserUseCase.email_send_task. Here is my use case that sends the email:
class BaseUserUseCase:
    # other code is skipped

    @shared_task
    def email_send_task(self):
        print("done")
        return ConfirmationEmail(context=self.context).send(to=[self.recipient])
How do I call this email_send_task method? Am I doing this right? It is not working; any help regarding this would be appreciated.
To enable class-based tasks:
Change your class to inherit from celery.Task.
Change your @shared_task-decorated email_send_task() into a run() method.
Call app.register_task() with an instance of your class. The result is the callable task.
Directly call the callable task from step 3 to manually enqueue tasks.
The full example in the next section walks through these steps.
References:
https://docs.celeryproject.org/en/4.0/whatsnew-4.0.html#the-task-base-class-no-longer-automatically-register-tasks
https://docs.celeryproject.org/en/stable/userguide/application.html#breaking-the-chain
https://docs.celeryproject.org/en/stable/userguide/application.html#abstract-tasks
Register Celery Class-based Task
File structure
.
├── apps
│   └── users
│       └── usecases.py
└── my_proj
    ├── celery.py
    └── settings.py
celery.py
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "my_proj.settings")  # Only if using Django. Otherwise remove this line.

app = Celery("my_app")
app.conf.update(
    imports=['apps.users.usecases'],
    beat_schedule={
        'send_mail-every-day-at-4': {
            'task': 'apps.users.usecases.BaseUserUseCase',
            'schedule': 5,
        },
    },
)
usecases.py
from celery import Task

from my_proj.celery import app


class BaseUserUseCase(Task):
    def __init__(self, context, recipient):
        self.context = context
        self.recipient = recipient

    def run(self, context=None, recipient=None):  # Optional arguments context and recipient, only if you need to explicitly change them for some calls
        target_context = context or self.context
        target_recipient = recipient or self.recipient
        print(f"Send email with {target_context} to {target_recipient}")


BaseUserUseCaseTask = app.register_task(
    BaseUserUseCase(
        context={'setting': 'default'},
        recipient="default@email.com",
    )
)
Logs (Producer)
$ celery --app=my_proj beat --loglevel=INFO
[2021-08-20 09:50:22,193: INFO/MainProcess] Scheduler: Sending due task send_mail-every-day-at-4 (apps.users.usecases.BaseUserUseCase)
[2021-08-20 09:50:27,181: INFO/MainProcess] Scheduler: Sending due task send_mail-every-day-at-4 (apps.users.usecases.BaseUserUseCase)
Logs (Consumer)
$ celery --app=my_proj worker --queues=celery --loglevel=INFO
[tasks]
. apps.users.usecases.BaseUserUseCase
[2021-08-20 09:50:22,206: INFO/MainProcess] Task apps.users.usecases.BaseUserUseCase[3bce46e8-98c0-410e-a156-83e293ba6337] received
[2021-08-20 09:50:22,207: WARNING/ForkPoolWorker-4] Send email with {'setting': 'default'} to default@email.com
[2021-08-20 09:50:22,207: WARNING/ForkPoolWorker-4]
[2021-08-20 09:50:22,207: INFO/ForkPoolWorker-4] Task apps.users.usecases.BaseUserUseCase[3bce46e8-98c0-410e-a156-83e293ba6337] succeeded in 0.0002498120002201176s: None
[2021-08-20 09:50:27,183: INFO/MainProcess] Task apps.users.usecases.BaseUserUseCase[e1d2de2a-3e7d-4253-9641-41fd328a17ce] received
[2021-08-20 09:50:27,183: WARNING/ForkPoolWorker-4] Send email with {'setting': 'default'} to default@email.com
[2021-08-20 09:50:27,184: WARNING/ForkPoolWorker-4]
[2021-08-20 09:50:27,184: INFO/ForkPoolWorker-4] Task apps.users.usecases.BaseUserUseCase[e1d2de2a-3e7d-4253-9641-41fd328a17ce] succeeded in 0.0001804589992389083s: None
If you want to change the context and recipient for the scheduled task:
celery.py
Just change the following lines.
...
beat_schedule={
    'send_mail-every-day-at-4': {
        'task': 'apps.users.usecases.BaseUserUseCase',
        'schedule': 5,
        'kwargs': {
            'context': {'setting': 'custom'},
            'recipient': "custom@email.com",
        },
    },
},
...
Logs (Consumer):
[2021-08-20 09:54:36,530: INFO/MainProcess] Task apps.users.usecases.BaseUserUseCase[6b69b79b-6764-4d83-ad24-9b0723dd8c79] received
[2021-08-20 09:54:36,531: WARNING/ForkPoolWorker-4] Send email with {'setting': 'custom'} to custom@email.com
[2021-08-20 09:54:36,531: WARNING/ForkPoolWorker-4]
[2021-08-20 09:54:36,532: INFO/ForkPoolWorker-4] Task apps.users.usecases.BaseUserUseCase[6b69b79b-6764-4d83-ad24-9b0723dd8c79] succeeded in 0.0012146830003985087s: None
[2021-08-20 09:54:41,498: INFO/MainProcess] Task apps.users.usecases.BaseUserUseCase[a20d34e7-3214-4130-aba2-5544238096d0] received
[2021-08-20 09:54:41,499: WARNING/ForkPoolWorker-4] Send email with {'setting': 'custom'} to custom@email.com
[2021-08-20 09:54:41,499: WARNING/ForkPoolWorker-4]
[2021-08-20 09:54:41,500: INFO/ForkPoolWorker-4] Task apps.users.usecases.BaseUserUseCase[a20d34e7-3214-4130-aba2-5544238096d0] succeeded in 0.00047696000001451466s: None
If you intend to call the task manually e.g. from one of the Django views:
>>> from apps.users.usecases import BaseUserUseCaseTask
>>> BaseUserUseCaseTask.apply_async()
<AsyncResult: fd347270-59b8-4cec-8772-cf82a79e60df>
>>> BaseUserUseCaseTask.apply_async(kwargs={'context': {'setting': 'manual'}, 'recipient': "manual@email.com"})
<AsyncResult: 13d9df7e-e1c4-4f50-847f-72a413404c82>
Logs (Consumer):
[2021-08-20 09:57:19,324: INFO/MainProcess] Task apps.users.usecases.BaseUserUseCase[fd347270-59b8-4cec-8772-cf82a79e60df] received
[2021-08-20 09:57:19,324: WARNING/ForkPoolWorker-4] Send email with {'setting': 'default'} to default@email.com
[2021-08-20 09:57:19,324: WARNING/ForkPoolWorker-4]
[2021-08-20 09:57:19,325: INFO/ForkPoolWorker-4] Task apps.users.usecases.BaseUserUseCase[fd347270-59b8-4cec-8772-cf82a79e60df] succeeded in 0.00026283199986210093s: None
[2021-08-20 12:35:29,238: INFO/MainProcess] Task apps.users.usecases.BaseUserUseCase[13d9df7e-e1c4-4f50-847f-72a413404c82] received
[2021-08-20 12:35:29,240: WARNING/ForkPoolWorker-4] Send email with {'setting': 'manual'} to manual@email.com
[2021-08-20 12:35:29,240: WARNING/ForkPoolWorker-4]
[2021-08-20 12:35:29,240: INFO/ForkPoolWorker-4] Task apps.users.usecases.BaseUserUseCase[13d9df7e-e1c4-4f50-847f-72a413404c82] succeeded in 0.000784056000156852s: None
I am not 100% sure, but the following should work for you.
Function class:
class BaseUserUseCase:
    def __init__(self):
        self.context = {}
        self.recipient = ""
Add a tasks.py file in the app folder:
import datetime
import logging

from path_to.celery import app
from rest_framework import status
from path_to import BaseUserUseCase


@app.task
def email_send_task():
    # Instantiate the use case; the original snippet read the attributes
    # off the class itself, which would raise AttributeError.
    use_case = BaseUserUseCase()
    context = use_case.context
    recipient = use_case.recipient
    print("done")
    return ConfirmationEmail(context=context).send(to=[recipient])
Configure a celery.py file:
from __future__ import absolute_import, unicode_literals
import os

from celery import Celery
from celery.schedules import crontab

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery('project_name')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

# add cron beat
app.conf.beat_schedule = {
    'send-email-celery-task': {
        'task': 'app_name.tasks.email_send_task',
        'schedule': crontab(hour=0, minute=1)
    },
}
Could you please try this and let me know if it works for you?

Django Celery Changes Not Applied

My django-celery code does not pick up changes; I concluded this after seeing an error that had supposedly already been resolved. Can anyone tell me how to properly restart my Celery server, or is the problem still there?
Running on Windows 10, by the way.
file structure
|-- manage.py
|-- nttracker\
    |-- celery.py
    |-- tasks.py
    |-- settings.py
I have not yet added any separate configuration files.
nttracker/celery.py
import os

from celery import Celery
from celery.schedules import crontab

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'nttracker.settings')

postgres_broker = 'sqla+postgresql://user:pass@host/name'

app = Celery('nttracker', broker='amqp://', backend='rpc://', include=['nttracker.tasks'])
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

app.conf.update(
    result_expires=3600,
)

app.conf.beat_schedule = {
    'add-every-10-seconds': {
        'task': 'nttracker.tasks.add',
        'schedule': 10.0,
        'args': (16, 16)
    },
}

if __name__ == '__main__':
    app.start()
nttracker/tasks.py
from __future__ import absolute_import

import django

django.setup()

from celery import Celery
from celery.schedules import crontab

app = Celery()


@app.task
def add(x, y):
    print(x + y)
nttracker/settings.py
# Celery Configuration Options
CELERY_TIMEZONE = "valid/timezone"
CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'redis://127.0.0.1:6379'
# celery setting.
CELERY_CACHE_BACKEND = 'default'
# django setting.
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.db.DatabaseCache',
        'LOCATION': 'my_cache_table',
    }
}
terminal one output (celery -A nttracker worker --pool=solo -l INFO)
[2021-06-04 20:03:54,409: INFO/MainProcess] Received task: nttracker.tasks.add[4f9e0e15-82de-4cdb-be84-d3690ebe142e]
[2021-06-04 20:03:54,411: WARNING/MainProcess] 32
[2021-06-04 20:03:54,494: INFO/MainProcess] Task nttracker.tasks.add[4f9e0e15-82de-4cdb-be84-d3690ebe142e] succeeded in 0.09399999999732245s:
None
[2021-06-04 20:04:04,451: INFO/MainProcess] Received task: nttracker.tasks.add[da9c8999-3937-44fd-8d4b-15ff83977a4b]
[2021-06-04 20:04:04,452: WARNING/MainProcess] 32
[2021-06-04 20:04:04,529: INFO/MainProcess] Task nttracker.tasks.add[da9c8999-3937-44fd-8d4b-15ff83977a4b] succeeded in 0.07800000000861473s:
None
[2021-06-04 20:04:14,497: INFO/MainProcess] Received task: nttracker.tasks.add[c82b5099-e1dd-4f7b-a068-8041268571d1]
[2021-06-04 20:04:14,498: WARNING/MainProcess] 32
[2021-06-04 20:04:14,568: INFO/MainProcess] Task nttracker.tasks.add[c82b5099-e1dd-4f7b-a068-8041268571d1] succeeded in 0.0629999999946449s: None
[2021-06-04 20:04:23,187: ERROR/MainProcess] Received unregistered task of type 'tasks.add'.
The message has been ignored and discarded.
Did you remember to import the module containing this task?
Or maybe you're using relative imports?
Please see
http://docs.celeryq.org/en/latest/internals/protocol.html
for more information.
The full contents of the message body was:
b'[[16, 16], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]' (83b)
Traceback (most recent call last):
File "c:\users\xxxx\onedrive\desktop\github_new\nttracker\venv\lib\site-packages\celery\worker\consumer\consumer.py", line 555, in on_task_received
strategy = strategies[type_]
KeyError: 'tasks.add'
[2021-06-04 20:04:24,544: INFO/MainProcess] Received task: nttracker.tasks.add[66f050ac-b17d-4a7c-9bc6-564cc1d84ae1]
[2021-06-04 20:04:24,545: WARNING/MainProcess] 32
[2021-06-04 20:04:24,620: INFO/MainProcess] Task nttracker.tasks.add[66f050ac-b17d-4a7c-9bc6-564cc1d84ae1] succeeded in 0.07799999999406282s:
None
terminal two output (celery -A nttracker beat -S django)
celery beat v5.0.5 (singularity) is starting.
__ - ... __ - _
LocalTime -> 2021-06-04 19:58:21
Configuration ->
. broker -> redis://127.0.0.1:6379//
. loader -> celery.loaders.app.AppLoader
. scheduler -> django_celery_beat.schedulers.DatabaseScheduler
. logfile -> [stderr]@%WARNING
. maxinterval -> 5.00 seconds (5s)
I'd like to stress the point that, within a 30-second interval, 32 was printed three times (from add(16, 16)), but there was also one tasks.add error.
I've tried to restart my redis server and celery's worker and beat, but the initial import error was still not resolved.
Can anyone please help? Many thanks in advance.
Import your tasks module in your project's settings.py file:
CELERY_IMPORTS = (
    "nttracker.tasks",
)
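Side note on the likely root cause, judging from the question's nttracker/tasks.py: it creates a second, bare Celery() app, so add registers under the name 'tasks.add' on an app the worker never loads, which matches the KeyError: 'tasks.add' in the log. A minimal sketch of an alternative (not from the original answer; it assumes the project app in nttracker/celery.py is the one the worker runs) is to use shared_task and drop the extra app:
from celery import shared_task


# shared_task binds to the current app (here, the one configured in
# nttracker/celery.py), so this registers as 'nttracker.tasks.add'
# instead of the unregistered 'tasks.add' seen in the traceback.
@shared_task
def add(x, y):
    print(x + y)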

Why do Celery periodic tasks fire a function only once

I've built a small web scraper function to get some data from the web and populate it into my db, which works just fine.
Now I would like to fire this function periodically every 20 seconds using Celery periodic tasks.
I walked through the docs and everything seems to be set up for development (using Redis as the broker).
This is my tasks.py file in project/stocksapp where my periodically fired functions are:
# Celery imports
from celery.task.schedules import crontab
from celery.decorators import periodic_task
from celery.utils.log import get_task_logger
from datetime import timedelta
logger = get_task_logger(__name__)
# periodic functions
@periodic_task(
    run_every=(timedelta(seconds=20)),
    name="getStocksDataDax",
    ignore_result=True
)
def getStocksDataDax():
    print("fired")
Now when I start the worker, the function seems to be fired once and only once (the database gets populated). But after that, the function doesn't get fired anymore, although the console output suggests the task is still being sent:
C:\Users\Jonas\Desktop\CFD\CFD>celery -A CFD beat -l info
celery beat v4.4.2 (cliffs) is starting.
__ - ... __ - _
LocalTime -> 2020-05-15 23:06:29
Configuration ->
. broker -> redis://localhost:6379/0
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%INFO
. maxinterval -> 5.00 minutes (300s)
[2020-05-15 23:06:29,990: INFO/MainProcess] beat: Starting...
[2020-05-15 23:06:30,024: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:06:50,015: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:07:10,015: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:07:30,015: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:07:50,015: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:08:10,016: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:08:30,016: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:08:50,016: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
project/project/celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'CFD.settings')
app = Celery('CFD',
             broker='redis://localhost:6379/0',
             backend='amqp://',
             include=['CFD.tasks'])
app.conf.broker_transport_options = {'visibility_timeout': 3600}
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
The function itself takes about 1 second in total.
Where could the issue be in this setup that prevents the worker/Celery from firing the function every 20 seconds as intended?
celery -A CFD beat -l info only starts the Celery beat process. You also need a separate Celery worker process; in a different terminal, run something like celery -A CFD worker -c 8 -O fair -l info.
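As an aside for local development only (a standard Celery option, not something the answer above mentions): the worker can embed the beat scheduler via the -B flag, so a single terminal runs both. The Celery docs advise against -B in production, since only one scheduler must ever run for a given schedule.
# worker plus embedded beat scheduler in one process; convenient while developing
celery -A CFD worker -B -l info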

django celerybeat not invoking tasks.py function

Based on the tutorial: https://www.merixstudio.com/blog/django-celery-beat/
celery.py file code
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from Backend.settings import INSTALLED_APPS
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'app.settings')
app = Celery('proj')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: INSTALLED_APPS)
tasks.py file code
from celery.task import task


@task(name='task1')
def emailt():
    print("email func invoked")
    # code...
settings.py
from __future__ import absolute_import
import os
import djcelery
from celery.schedules import crontab
djcelery.setup_loader()
INSTALLED_APPS = [
    'djcelery',
    'django_celery_beat',
    ....
]
REDIS_HOST = 'localhost'
REDIS_PORT = '6379'
BROKER_URL = 'redis://' + REDIS_HOST + ':' + REDIS_PORT + '/0'
CELERY_RESULT_BACKEND = 'redis://' + REDIS_HOST + ':' + REDIS_PORT + '/0'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERYBEAT_SCHEDULE = {
    'task1': {
        'task': 'proj.tasks.emailt',
        'schedule': crontab(hour='*', minute='1', day_of_week='mon,tue,wed,thu,fri,sat,sun'),
    }
}
In one shell, the Redis server and Django (py manage.py runserver) are running.
In another shell, the Celery command is run as follows: celery -A proj.tasks beat -l INFO --scheduler django_celery_beat.schedulers:DatabaseScheduler
The log output indicates that celerybeat is running:
Configuration ->
. broker -> redis://localhost:6379/0
. loader -> djcelery.loaders.DjangoLoader
. scheduler -> django_celery_beat.schedulers.DatabaseScheduler
. logfile -> [stderr]@%INFO
. maxinterval -> 5.00 seconds (5s)
[*************: INFO/MainProcess] beat: Starting...
[*************: INFO/MainProcess] Writing entries...
[*************: INFO/MainProcess] DatabaseScheduler: Schedule changed.
[*************: INFO/MainProcess] Writing entries...
[*************: INFO/MainProcess] Writing entries...
However, the function emailt() within tasks.py is still not getting invoked.
I am unable to locate the issue with celerybeat.
DatabaseScheduler is the database-backed scheduler implementation; it does not take tasks from the CELERYBEAT_SCHEDULE dictionary.
If you are going to use this type of scheduler, you should create PeriodicTask entries through the Django admin or through data migrations/views.
You can use crontab notation with the default scheduler in tasks, or with DatabaseScheduler by creating a CrontabSchedule and attaching it to a PeriodicTask, as sketched below.
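For illustration, a minimal sketch of creating such an entry with django_celery_beat's models, reusing the task path and crontab from the question (assumes django_celery_beat is installed and migrated; run it from a Django shell or a data migration):
from django_celery_beat.models import CrontabSchedule, PeriodicTask

# Mirror the crontab from the question's CELERYBEAT_SCHEDULE.
schedule, _ = CrontabSchedule.objects.get_or_create(
    minute='1',
    hour='*',
    day_of_week='mon,tue,wed,thu,fri,sat,sun',
)
# DatabaseScheduler picks this row up without any CELERYBEAT_SCHEDULE entry.
PeriodicTask.objects.get_or_create(
    name='task1',
    task='proj.tasks.emailt',
    crontab=schedule,
)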

Celery beat tasks not executing

I'm learning periodic tasks in Django with Celery beat, but my tasks are not executing.
my __init__.py file:
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ('celery_app',)
my celery.py file:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'new_todo_app.settings')
app = Celery('new_todo_app')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
my tasks.py file:
from celery import Celery
from celery import shared_task

app = Celery('tasks', broker='pyamqp://guest@localhost//')


@shared_task
def progress_bar():
    print("Executed every minute")
and my settings.py file
CELERY_BROKER_URL = 'amqp://localhost'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TIMEZONE = 'Asia/Baku'
CELERY_ENABLE_UTC = True
CELERY_BEAT_SCHEDULE = {
    'progress-bar': {
        'task': 'app1.tasks.progress_bar',
        'schedule': 5.0,
    },
}
I run celery beat by writing:
celery -A new_todo_app beat -l info
Celery beat starts, but the tasks don't execute. I tried DEBUG logging mode and I get:
Configuration ->
. broker -> amqp://guest:**@localhost:5672//
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%DEBUG
. maxinterval -> 5.00 minutes (300s)
[2019-12-04 19:35:24,937: DEBUG/MainProcess] Setting default socket timeout to 30
[2019-12-04 19:35:24,938: INFO/MainProcess] beat: Starting...
[2019-12-04 19:35:24,975: DEBUG/MainProcess] Current schedule:
<ScheduleEntry: progress-bar app1.tasks.progress_bar() <freq: 5.00 seconds>
<ScheduleEntry: celery.backend_cleanup celery.backend_cleanup() <crontab: 0 4 * * * (m/h/d/dM/MY)>
[2019-12-04 19:35:24,975: DEBUG/MainProcess] beat: Ticking with max interval->5.00 minutes
[2019-12-04 19:35:24,977: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.
I just started learning Celery, and I feel like maybe something is wrong with my configuration.
Thanks in advance.
