Django Celery Changes Not Applied

My django-celery code is not picking up changes, which I concluded after seeing an error that was supposedly already resolved. Can anyone tell me how to properly restart my Celery server, or is the problem still present?
Running on Windows 10, by the way.
file structure
|-- manage.py
|-- nttracker\
    |-- celery.py
    |-- tasks.py
    |-- settings.py
I have not added any separate configuration files yet.
nttracker/celery.py
import os
from celery import Celery
from celery.schedules import crontab

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'nttracker.settings')

postgres_broker = 'sqla+postgresql://user:pass@host/name'

app = Celery('nttracker', broker='amqp://', backend='rpc://', include=['nttracker.tasks'])
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

app.conf.update(
    result_expires=3600,
)

app.conf.beat_schedule = {
    'add-every-10-seconds': {
        'task': 'nttracker.tasks.add',
        'schedule': 10.0,
        'args': (16, 16)
    },
}

if __name__ == '__main__':
    app.start()
nttracker/tasks.py
from __future__ import absolute_import
import django
django.setup()

from celery import Celery
from celery.schedules import crontab

app = Celery()

@app.task
def add(x, y):
    print(x + y)
nttracker/settings.py
# Celery Configuration Options
CELERY_TIMEZONE = "valid/timezone"
CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'redis://127.0.0.1:6379'

# celery setting.
CELERY_CACHE_BACKEND = 'default'

# django setting.
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.db.DatabaseCache',
        'LOCATION': 'my_cache_table',
    }
}
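(A side note on the cache configuration: Django's DatabaseCache backend requires the cache table to exist before use; it is normally created once with python manage.py createcachetable.)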
terminal one output (celery -A nttracker worker --pool=solo -l INFO)
[2021-06-04 20:03:54,409: INFO/MainProcess] Received task: nttracker.tasks.add[4f9e0e15-82de-4cdb-be84-d3690ebe142e]
[2021-06-04 20:03:54,411: WARNING/MainProcess] 32
[2021-06-04 20:03:54,494: INFO/MainProcess] Task nttracker.tasks.add[4f9e0e15-82de-4cdb-be84-d3690ebe142e] succeeded in 0.09399999999732245s:
None
[2021-06-04 20:04:04,451: INFO/MainProcess] Received task: nttracker.tasks.add[da9c8999-3937-44fd-8d4b-15ff83977a4b]
[2021-06-04 20:04:04,452: WARNING/MainProcess] 32
[2021-06-04 20:04:04,529: INFO/MainProcess] Task nttracker.tasks.add[da9c8999-3937-44fd-8d4b-15ff83977a4b] succeeded in 0.07800000000861473s:
None
[2021-06-04 20:04:14,497: INFO/MainProcess] Received task: nttracker.tasks.add[c82b5099-e1dd-4f7b-a068-8041268571d1]
[2021-06-04 20:04:14,498: WARNING/MainProcess] 32
[2021-06-04 20:04:14,568: INFO/MainProcess] Task nttracker.tasks.add[c82b5099-e1dd-4f7b-a068-8041268571d1] succeeded in 0.0629999999946449s: None
[2021-06-04 20:04:23,187: ERROR/MainProcess] Received unregistered task of type 'tasks.add'.
The message has been ignored and discarded.
Did you remember to import the module containing this task?
Or maybe you're using relative imports?
Please see
http://docs.celeryq.org/en/latest/internals/protocol.html
for more information.
The full contents of the message body was:
b'[[16, 16], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]' (83b)
Traceback (most recent call last):
File "c:\users\xxxx\onedrive\desktop\github_new\nttracker\venv\lib\site-packages\celery\worker\consumer\consumer.py", line 555, in on_task_received
strategy = strategies[type_]
KeyError: 'tasks.add'
[2021-06-04 20:04:24,544: INFO/MainProcess] Received task: nttracker.tasks.add[66f050ac-b17d-4a7c-9bc6-564cc1d84ae1]
[2021-06-04 20:04:24,545: WARNING/MainProcess] 32
[2021-06-04 20:04:24,620: INFO/MainProcess] Task nttracker.tasks.add[66f050ac-b17d-4a7c-9bc6-564cc1d84ae1] succeeded in 0.07799999999406282s:
None
terminal two output (celery -A nttracker beat -S django)
celery beat v5.0.5 (singularity) is starting.
__ - ... __ - _
LocalTime -> 2021-06-04 19:58:21
Configuration ->
. broker -> redis://127.0.0.1:6379//
. loader -> celery.loaders.app.AppLoader
. scheduler -> django_celery_beat.schedulers.DatabaseScheduler
. logfile -> [stderr]@%WARNING
. maxinterval -> 5.00 seconds (5s)
I'd like to stress the point that, in a 30-second interval, 32 was printed three times (from add(16, 16)), but there was also one tasks.add error.
I've tried restarting my Redis server as well as Celery's worker and beat, but the initial import error was still not resolved.
Can anyone please help? Many thanks in advance.

Import your task module in your project's settings.py file:

CELERY_IMPORTS = (
    "<your task module here>",
)
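For the project in this question, that would presumably be the nttracker.tasks module (an assumption based on the file layout above, not something confirmed in the thread):

# nttracker/settings.py -- sketch; module path assumed from the layout above
CELERY_IMPORTS = (
    "nttracker.tasks",
)

With the module imported eagerly, the worker registers the task under its fully qualified name nttracker.tasks.add, which matches the name the beat_schedule entry sends.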

Related

Celery does not discover tasks inside project

I have a project myproject and an app app.
Inside myproject I have tasks.py
from celery import shared_task

@shared_task
def add(x, y):
    return x + y
Inside my app I have following tasks.py
from celery import shared_task
from django.core.mail import send_mail
import time  # needed for time.sleep below

@shared_task
def send_email_task(email):
    "background task to send an email asynchronously"
    subject = 'Hello from Celery'
    message = 'This is a test email sent asynchronously with Celery.'
    time.sleep(1)
    return send_mail(
        subject,
        message,
        'stackoverflow@gmail.com',
        [email],
        fail_silently=False
    )
When running the celery workers I see only shared tasks from app and not from myproject
(myprojectenv) root@ubuntu-s-1vcpu-1gb-blr1-02:/etc/myproject# celery -A myproject worker -l info
/etc/myprojectenv/lib/python3.8/site-packages/celery/platforms.py:840: SecurityWarning: You're running the worker with superuser privileges: this is
absolutely not recommended!
Please specify a different user using the --uid option.
User information: uid=0 euid=0 gid=0 egid=0
warnings.warn(SecurityWarning(ROOT_DISCOURAGED.format(
.> celery exchange=celery(direct) key=celery
[tasks]
. app.tasks.send_email_task
[2022-06-15 09:16:44,314: INFO/MainProcess] Connected to amqp://hpoddar:**@IPADDRESS:5672/vhostcheck
[2022-06-15 09:16:44,322: INFO/MainProcess] mingle: searching for neighbors
Here is my celery.py file
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
app = Celery('myproject')
app.autodiscover_tasks()
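The behaviour matches how autodiscover_tasks() works: with no arguments it scans the tasks.py of each package in Django's INSTALLED_APPS, and the project package myproject is normally not listed there. A minimal sketch of one way to include it (the explicit package list is an assumption, not taken from the thread):

# myproject/celery.py -- sketch: name the packages to scan explicitly
app.autodiscover_tasks(packages=['myproject', 'app'])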

Why do Celery periodic tasks fire a function only once

I've built a small web scraper function to get some data from the web and populate it to my db, which works just fine.
Now I would like to fire this function periodically every 20 seconds using Celery periodic tasks.
I walked through the docs and everything seems to be set up for development (using Redis as the broker).
This is my tasks.py file in project/stocksapp where my periodically fired functions are:
# Celery imports
from celery.task.schedules import crontab
from celery.decorators import periodic_task
from celery.utils.log import get_task_logger
from datetime import timedelta

logger = get_task_logger(__name__)

# periodic functions
@periodic_task(
    run_every=(timedelta(seconds=20)),
    name="getStocksDataDax",
    ignore_result=True
)
def getStocksDataDax():
    print("fired")
Now when I start the worker, the function seems to be fired once and only once (the database gets populated). But after that, the function doesn't get fired anymore, although the console suggests it:
C:\Users\Jonas\Desktop\CFD\CFD>celery -A CFD beat -l info
celery beat v4.4.2 (cliffs) is starting.
__ - ... __ - _
LocalTime -> 2020-05-15 23:06:29
Configuration ->
. broker -> redis://localhost:6379/0
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%INFO
. maxinterval -> 5.00 minutes (300s)
[2020-05-15 23:06:29,990: INFO/MainProcess] beat: Starting...
[2020-05-15 23:06:30,024: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:06:50,015: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:07:10,015: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:07:30,015: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:07:50,015: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:08:10,016: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:08:30,016: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
[2020-05-15 23:08:50,016: INFO/MainProcess] Scheduler: Sending due task getStocksDataDax (getStocksDataDax)
project/project/celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'CFD.settings')

app = Celery('CFD',
             broker='redis://localhost:6379/0',
             backend='amqp://',
             include=['CFD.tasks'])

app.conf.broker_transport_options = {'visibility_timeout': 3600}

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
The function itself takes about 1 second in total.
What could be the issue in this setup that keeps the worker from firing the function every 20 seconds as intended?
celery -A CFD beat -l info only starts the Celery beat process. You should have a separate Celery worker process - in a different terminal run something like celery -A CFD worker -c 8 -O fair -l info.
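As a side note, the celery.decorators.periodic_task decorator used in the question was deprecated and has been removed in Celery 5.x; a rough sketch of the modern equivalent (assuming the task keeps the name getStocksDataDax) is a plain task plus a beat_schedule entry:

# CFD/celery.py -- sketch of the beat_schedule equivalent
from datetime import timedelta

app.conf.beat_schedule = {
    'getStocksDataDax-every-20s': {
        'task': 'getStocksDataDax',          # matches name= given to the task
        'schedule': timedelta(seconds=20),
    },
}

with the task itself declared as @app.task(name='getStocksDataDax', ignore_result=True).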

Celery Task not getting assigned through redis

Using Celery/Redis I tried creating a task, but on checking the Celery worker info with the command below
celery -A intranet_project worker -l info
I am unable to see the task registered there.
Settings.py
BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE
__init__.py
from __future__ import absolute_import
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
celery.py
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'intranet_project.settings')
app = Celery('intranet_project')
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
my_task.py
from celery.decorators import task
from celery import shared_task

@shared_task
def add(a, b):
    d = a + b
    return d
Below is the server log
[tasks]
. intranet_project.celery.debug_task
[2020-02-26 19:38:59,051: INFO/MainProcess] Connected to redis://localhost:6379//
[2020-02-26 19:38:59,160: INFO/MainProcess] mingle: searching for neighbors
[2020-02-26 19:39:00,379: INFO/MainProcess] mingle: all alone
I don't see the code from which you actually execute your task.
view.py (or wherever you want to trigger the task from)
from django.http import HttpResponse
from .my_task import add  # note: the file above is named my_task.py

def action(request):
    add.delay(2, 2)  # queue the task and return immediately
    return HttpResponse("task queued")
You should then see a line like this appear in your celery log:
[2020-02-26 14:11:40,765: INFO/MainProcess] Received task: intranet_project.tasks.add[xxxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx]
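For completeness, the view needs a URL route before it can be hit; a purely hypothetical wiring (the path and names are assumptions, not from the thread):

# urls.py -- hypothetical route for the view above
from django.urls import path
from . import views

urlpatterns = [
    path('run-add/', views.action, name='run-add'),
]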

django celerybeat not invoking tasks.py function

Based on the tutorial: https://www.merixstudio.com/blog/django-celery-beat/
celery.py file code
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from Backend.settings import INSTALLED_APPS
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'app.settings')
app = Celery('proj')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: INSTALLED_APPS)
tasks.py file code
from celery.task import task

@task(name='task1')
def emailt():
    print("email func invoked")
    # code...
settings.py
from __future__ import absolute_import
import os
import djcelery
from celery.schedules import crontab

djcelery.setup_loader()

INSTALLED_APPS = [
    'djcelery',
    'django_celery_beat',
    ....
]

REDIS_HOST = 'localhost'
REDIS_PORT = '6379'
BROKER_URL = 'redis://' + REDIS_HOST + ':' + REDIS_PORT + '/0'
CELERY_RESULT_BACKEND = 'redis://' + REDIS_HOST + ':' + REDIS_PORT + '/0'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'

CELERYBEAT_SCHEDULE = {
    'task1': {
        'task': 'proj.tasks.emailt',
        'schedule': crontab(hour='*', minute='1', day_of_week='mon,tue,wed,thu,fri,sat,sun'),
    }
}
In one shell, the Redis server and Django (python manage.py runserver) are running.
On another shell, the celery command is run as follows: celery -A proj.tasks beat -l INFO --scheduler django_celery_beat.schedulers:DatabaseScheduler
Log files denote that celerybeat is running.
Configuration ->
. broker -> redis://localhost:6379/0
. loader -> djcelery.loaders.DjangoLoader
. scheduler -> django_celery_beat.schedulers.DatabaseScheduler
. logfile -> [stderr]@%INFO
. maxinterval -> 5.00 seconds (5s)
[*************: INFO/MainProcess] beat: Starting...
[*************: INFO/MainProcess] Writing entries...
[*************: INFO/MainProcess] DatabaseScheduler: Schedule changed.
[*************: INFO/MainProcess] Writing entries...
[*************: INFO/MainProcess] Writing entries...
However, the function emailt() within tasks.py is still not getting invoked.
I am unable to locate the issue with celerybeat.
DatabaseScheduler is the database-backed scheduler implementation and doesn't take tasks from the CELERYBEAT_SCHEDULE dictionary.
If you are going to use this type of scheduler, you should create a PeriodicTask through the Django admin or through data migrations/views.
You can use crontab notation either with the default scheduler in tasks, or with DatabaseScheduler by creating a CrontabSchedule and attaching it to a PeriodicTask, as sketched below.
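A minimal sketch of doing that in code (assuming django_celery_beat is installed and migrated; run it from a data migration or a Django shell):

from django_celery_beat.models import CrontabSchedule, PeriodicTask

# Schedule matching the crontab in the question: minute 1 of every hour.
schedule, _ = CrontabSchedule.objects.get_or_create(
    minute='1',
    hour='*',
    day_of_week='*',
)
PeriodicTask.objects.get_or_create(
    name='task1',
    task='proj.tasks.emailt',
    crontab=schedule,
)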

Celery beat tasks not executing

I'm learning periodic tasks in Django with celery beat, but my tasks are not executing.
my __init__.py file:
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ('celery_app',)
my celery.py file:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'new_todo_app.settings')
app = Celery('new_todo_app')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
my tasks.py file:
from celery import Celery
from celery import shared_task

app = Celery('tasks', broker='pyamqp://guest@localhost//')

@shared_task
def progress_bar():
    print("Executed every minute")
and my settings.py file
CELERY_BROKER_URL = 'amqp://localhost'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TIMEZONE = 'Asia/Baku'
CELERY_ENABLE_UTC = True
CELERY_BEAT_SCHEDULE = {
    'progress-bar': {
        'task': 'app1.tasks.progress_bar',
        'schedule': 5.0,
    },
}
I run celery beat by writing:
celery -A new_todo_app beat -l info
Celery beat starts, but tasks don't execute. I tried DEBUG logging mode and I get:
Configuration ->
. broker -> amqp://guest:**@localhost:5672//
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%DEBUG
. maxinterval -> 5.00 minutes (300s)
[2019-12-04 19:35:24,937: DEBUG/MainProcess] Setting default socket timeout to 30
[2019-12-04 19:35:24,938: INFO/MainProcess] beat: Starting...
[2019-12-04 19:35:24,975: DEBUG/MainProcess] Current schedule:
<ScheduleEntry: progress-bar app1.tasks.progress_bar() <freq: 5.00 seconds>
<ScheduleEntry: celery.backend_cleanup celery.backend_cleanup() <crontab: 0 4 * * * (m/h/d/dM/MY)>
[2019-12-04 19:35:24,975: DEBUG/MainProcess] beat: Ticking with max interval->5.00 minutes
[2019-12-04 19:35:24,977: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.
I just started learning Celery and feel like maybe something is wrong with my configuration.
Thanks in advance.
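As with the CFD question above, beat only schedules tasks; a separate worker process (for example celery -A new_todo_app worker -l info) must be running to actually execute them. Note also that tasks.py here builds a second standalone Celery('tasks') app; with the Django integration the task is usually declared against the shared app only, roughly like this:

# app1/tasks.py -- minimal sketch using shared_task, without a second Celery instance
from celery import shared_task

@shared_task
def progress_bar():
    print("Executed every minute")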
