Celery Task not getting assigned through redis - python

Using Celery/Redis I tried creating a task, but when I start the worker with the command below, the task never shows up in its registered task list.
celery -A intranet_project worker -l info
settings.py
BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE
__init__.py
from __future__ import absolute_import
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
celery.py
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'intranet_project.settings')
app = Celery('intranet_project')
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
my_task.py
from celery.decorators import task
from celery import shared_task
@shared_task
def add(a, b):
    d = a + b
    return d
Below is the server log
[tasks]
. intranet_project.celery.debug_task
[2020-02-26 19:38:59,051: INFO/MainProcess] Connected to redis://localhost:6379//
[2020-02-26 19:38:59,160: INFO/MainProcess] mingle: searching for neighbors
[2020-02-26 19:39:00,379: INFO/MainProcess] mingle: all alone

I don't see the code from which you execute your task.
views.py (or wherever you want to trigger the task from)
from .my_task import add

def action(request):
    add.delay(2, 2)
You should then see a line like this appear in your celery log:
[2020-02-26 14:11:40,765: INFO/MainProcess] Received task: intranet_project.tasks.add[xxxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx]
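One more thing worth checking, based on the worker log above rather than on the original answer: only debug_task appears under [tasks]. By default app.autodiscover_tasks() imports a module named tasks from each app in INSTALLED_APPS, so a file called my_task.py is never imported and add never gets registered. A minimal sketch of the two ways around that:
# option 1: rename my_task.py to tasks.py inside an app listed in INSTALLED_APPS
# option 2: keep the file name and point autodiscovery at it in celery.py
app.autodiscover_tasks(related_name='my_task')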

Related

Celery does not discover tasks inside project

I have a project myproject and an app app.
Inside myproject I have tasks.py
from celery import shared_task
@shared_task
def add(x, y):
    return x + y
Inside my app I have the following tasks.py
import time

from celery import shared_task
from django.core.mail import send_mail

@shared_task
def send_email_task(email):
    """Background task to send an email asynchronously."""
    subject = 'Hello from Celery'
    message = 'This is a test email sent asynchronously with Celery.'
    time.sleep(1)
    return send_mail(
        subject,
        message,
        'stackoverflow@gmail.com',
        [email],
        fail_silently=False
    )
When running the celery worker I see only the shared tasks from app and not from myproject
(myprojectenv) root@ubuntu-s-1vcpu-1gb-blr1-02:/etc/myproject# celery -A myproject worker -l info
/etc/myprojectenv/lib/python3.8/site-packages/celery/platforms.py:840: SecurityWarning: You're running the worker with superuser privileges: this is
absolutely not recommended!
Please specify a different user using the --uid option.
User information: uid=0 euid=0 gid=0 egid=0
warnings.warn(SecurityWarning(ROOT_DISCOURAGED.format(
.> celery exchange=celery(direct) key=celery
[tasks]
. app.tasks.send_email_task
[2022-06-15 09:16:44,314: INFO/MainProcess] Connected to amqp://hpoddar:**@IPADDRESS:5672/vhostcheck
[2022-06-15 09:16:44,322: INFO/MainProcess] mingle: searching for neighbors
Here is my celery.py file
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
app = Celery('myproject')
app.autodiscover_tasks()
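Two likely gaps in that celery.py, offered as a sketch rather than a confirmed fix: it never calls config_from_object, and autodiscover_tasks() with no arguments only scans the INSTALLED_APPS packages, so a tasks.py sitting directly inside the myproject package is skipped. A minimal version that would also pick up the project-level module (the explicit package list is my addition):
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')
# read CELERY_* settings from Django's settings module
# (assumes CELERY_-prefixed names in settings.py)
app.config_from_object('django.conf:settings', namespace='CELERY')
# scan each INSTALLED_APPS package for a tasks.py module
app.autodiscover_tasks()
# additionally scan the project package itself for myproject/tasks.py
app.autodiscover_tasks(['myproject'])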

django celerybeat not invoking tasks.py function

Based on the tutorial: https://www.merixstudio.com/blog/django-celery-beat/
celery.py file code
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from Backend.settings import INSTALLED_APPS
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'app.settings')
app = Celery('proj')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: INSTALLED_APPS)
tasks.py file code
from celery.task import task
@task(name='task1')
def emailt():
    print("email func invoked")
    # code...
settings.py
from __future__ import absolute_import
import os
import djcelery
from celery.schedules import crontab
djcelery.setup_loader()
INSTALLED_APPS = [
    'djcelery',
    'django_celery_beat',
    ....
]
REDIS_HOST = 'localhost'
REDIS_PORT = '6379'
BROKER_URL = 'redis://' + REDIS_HOST + ':' + REDIS_PORT + '/0'
CELERY_RESULT_BACKEND = 'redis://' + REDIS_HOST + ':' + REDIS_PORT + '/0'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERYBEAT_SCHEDULE = {
    'task1': {
        'task': 'proj.tasks.emailt',
        'schedule': crontab(hour='*', minute='1', day_of_week='mon,tue,wed,thu,fri,sat,sun'),
    }
}
In one shell, the Redis server and Django (py manage.py runserver) are running.
In another shell, celery beat is run as follows: celery -A proj.tasks beat -l INFO --scheduler django_celery_beat.schedulers:DatabaseScheduler
Log files denote that celerybeat is running.
Configuration ->
. broker -> redis://localhost:6379/0
. loader -> djcelery.loaders.DjangoLoader
. scheduler -> django_celery_beat.schedulers.DatabaseScheduler
. logfile -> [stderr]@%INFO
. maxinterval -> 5.00 seconds (5s)
[*************: INFO/MainProcess] beat: Starting...
[*************: INFO/MainProcess] Writing entries...
[*************: INFO/MainProcess] DatabaseScheduler: Schedule changed.
[*************: INFO/MainProcess] Writing entries...
[*************: INFO/MainProcess] Writing entries...
However, the function emailt() within tasks.py is still not getting invoked.
I am unable to locate the issue with celerybeat.
DatabaseScheduler is the database-backed scheduler implementation; it doesn't take tasks from the CELERYBEAT_SCHEDULE dictionary.
If you are going to use this type of scheduler, you should create PeriodicTask entries through the Django admin or through data migrations/views.
You can use crontab notation either with the default scheduler via CELERYBEAT_SCHEDULE, or with DatabaseScheduler by creating a CrontabSchedule and attaching it to a PeriodicTask.
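A minimal sketch of the migration/view approach described above, assuming django_celery_beat is installed and reusing the proj.tasks.emailt path from the schedule dictionary:
from django_celery_beat.models import CrontabSchedule, PeriodicTask

# recreate the crontab from CELERYBEAT_SCHEDULE in the database
schedule, _ = CrontabSchedule.objects.get_or_create(
    minute='1',
    hour='*',
    day_of_week='*',
)
# DatabaseScheduler reads PeriodicTask rows, not the settings dictionary
PeriodicTask.objects.get_or_create(
    name='task1',
    task='proj.tasks.emailt',
    crontab=schedule,
)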

Celery beat tasks not executing

I'm learning periodic tasks in Django with celery beat. But my tasks are not executing.
my __init__.py file:
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ('celery_app',)
my celery.py file:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'new_todo_app.settings')
app = Celery('new_todo_app')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
my tasks.py file:
from celery import Celery
from celery import shared_task
app = Celery('tasks', broker='pyamqp://guest@localhost//')

@shared_task
def progress_bar():
    print("Executed every minute")
and my settings.py file
CELERY_BROKER_URL = 'amqp://localhost'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TIMEZONE = 'Asia/Baku'
CELERY_ENABLE_UTC = True
CELERY_BEAT_SCHEDULE = {
    'progress-bar': {
        'task': 'app1.tasks.progress_bar',
        'schedule': 5.0,
    },
}
I run celery beat by writing:
celery -A new_todo_app beat -l info
Celery beat starts, but the tasks don't execute. I tried DEBUG logging mode and I get:
Configuration ->
. broker -> amqp://guest:**@localhost:5672//
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%DEBUG
. maxinterval -> 5.00 minutes (300s)
[2019-12-04 19:35:24,937: DEBUG/MainProcess] Setting default socket timeout to 30
[2019-12-04 19:35:24,938: INFO/MainProcess] beat: Starting...
[2019-12-04 19:35:24,975: DEBUG/MainProcess] Current schedule:
<ScheduleEntry: progress-bar app1.tasks.progress_bar() <freq: 5.00 seconds>
<ScheduleEntry: celery.backend_cleanup celery.backend_cleanup() <crontab: 0 4 * * * (m/h/d/dM/MY)>
[2019-12-04 19:35:24,975: DEBUG/MainProcess] beat: Ticking with max interval->5.00 minutes
[2019-12-04 19:35:24,977: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.
I just started learning celery, and feel like maybe something is wrong with my configuration.
Thanks beforehand
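Two things stand out in the snippets above (my reading, not from the original thread): tasks.py builds a second Celery app with its own broker, so the task can end up registered against the wrong app, and celery beat only schedules work; a separate worker has to consume it. A minimal sketch of a tasks.py that relies on the Django-configured app alone, assuming the file lives in the app1 package named in CELERY_BEAT_SCHEDULE:
from celery import shared_task

@shared_task
def progress_bar():
    # shared_task binds to the current app (new_todo_app); no second Celery() is needed
    print("Executed every minute")

With that in place, beat and a worker would run side by side:
celery -A new_todo_app beat -l info
celery -A new_todo_app worker -l info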

Django celery beat task not working

celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'TwitterApiProxy.settings')
app = Celery('TwitterApiProxy')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls hello_test('hello') every 10 seconds.
    sender.add_periodic_task(10.0, hello_test.s('hello'), name='add every 10')

@app.task
def hello_test(arg):
    print(arg)
settings.py
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'America/Los_Angeles'
I want to print "hello" every 10 seconds. So on running celery -A TwitterApiProxy beat in my terminal, I see the output below:
LocalTime -> 2018-04-06 23:27:09
Configuration ->
. broker -> redis://localhost:6379//
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%WARNING
. maxinterval -> 5.00 minutes (300s)
It did not print anything related to the task that I scheduled. Where did I go wrong?
Nothing is wrong with your setup.
Start your worker and celery beat in two separate cmd windows
celery -A TwitterApiProxy worker -l info
celery -A TwitterApiProxy beat -l info
If you are using Celery 4.0+ on Windows, the default prefork pool is not supported, so install eventlet first and start your worker like this:
celery -A TwitterApiProxy worker -l info -P eventlet
Task Admin Backend
If you want a task admin backend, you can install and use django-celery

Celery tasks don't work

The Celery docs say that Celery 3.1 works with Django out of the box, but my tasks are not working. I have tasks.py:
from celery import task
from datetime import timedelta
@task.periodic_task(run_every=timedelta(seconds=20), ignore_result=True)
def disable_not_confirmed_users():
    print "start"
Configs:
from kombu import Exchange, Queue
CELERY_SEND_TASK_ERROR_EMAILS = True
BROKER_URL = 'amqp://guest@localhost//'
CELERY_DEFAULT_QUEUE = 'project-queue'
CELERY_DEFAULT_EXCHANGE = 'project-queue'
CELERY_DEFAULT_ROUTING_KEY = 'project-queue'
CELERY_QUEUES = (
    Queue('project-queue', Exchange('project-queue'), routing_key='project-queue'),
)
project/celery.py
from __future__ import absolute_import
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')
from django.conf import settings
app = Celery('project')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
Run celery: celery -A project worker --loglevel=INFO
But nothing happened.
You should use celery beat to run periodic tasks.
celery -A project worker --loglevel=INFO
starts the worker, which does the actual work.
celery -A proj beat
starts the beat service, which asks the worker to do the job.
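For local development only, the two processes from the answer can also be combined with the worker's -B flag (the Celery docs recommend separate processes in production):
celery -A project worker -B --loglevel=INFO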
