We're using Django 1.10 and Celery 4.1.0, and I'm trying to use apply_async.
This is the task:
from celery import Celery
from rest_framework import status  # assumed import; these status codes are plain ints

app = Celery('my_app', broker='redis://127.0.0.1:6379/2')

@app.task
def add(x, y):
    print(str(x + y))
    # NOTE: status.HTTP_500_INTERNAL_SERVER_ERROR is an int, so this raise
    # itself fails with "TypeError: exceptions must derive from BaseException"
    raise status.HTTP_500_INTERNAL_SERVER_ERROR
When calling it with 'delay' it runs the 'add' function but doesn't retry:
add.delay(4, 4)
I tried to run the task with 'apply_async' and 'retry' and 'retry_policy' but it doesn't seem to even run the task:
add.apply_async((4, 4),
    retry=True,
    retry_policy={
        'max_retries': 3,
        'interval_start': 0,
        'interval_step': 0.2,
        'interval_max': 0.2,
    }
)
Am I missing something?
Check whether you've missed the configuration in your proj/proj/__init__.py, as described in the Celery docs.
The file must have:
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
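Note also that the retry=True and retry_policy options of apply_async control retrying the publish of the message when the broker connection fails; they do not re-run a task that raised an exception. To retry a failing task, the usual pattern is a bound task that calls self.retry — a minimal sketch, assuming the same Redis broker as above and a simulated failure in place of the question's raise:
from celery import Celery

app = Celery('my_app', broker='redis://127.0.0.1:6379/2')

@app.task(bind=True, max_retries=3, default_retry_delay=5)
def add(self, x, y):
    try:
        result = x + y
        if result == 8:  # stand-in for a real failure condition
            raise ValueError('simulated failure')
        return result
    except ValueError as exc:
        # re-queue this task; it runs again after default_retry_delay seconds
        raise self.retry(exc=exc)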
I am very new to Celery and I am trying to use it to schedule a function, but it's not working properly, it seems.
Here is my settings.py: (Along with the default settings given by django)
CELERY_BROKER_URL = 'amqp://guest:guest@localhost'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_RESULT_BACKEND = 'db+sqlite:///results.sqlite'
CELERY_TASK_SERIALIZER = 'json'
celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from celery.schedules import crontab
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mera_project.settings')
app = Celery('mera_project')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
__init__.py:
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ['celery_app']
tasks_notes/tasks.py:(tasks_notes is my app name)
from celery.decorators import periodic_task
from celery.task.schedules import crontab
from tasks_notes.models import BudgetInfo

@periodic_task(run_every=(crontab(minute='*/15')))
def monthly_starting_value():
    print("hi")
    return 0
views.py:
from .tasks import monthly_starting_value

def app_view(request):
    abcd = monthly_starting_value.delay()
    print("new" + str(abcd))
I had expected the value zero and hi in my terminal, but instead I get a random-looking ID such as new 42bf83ef-850f-4b34-af78-da696d2ee0f2, and it changes every 15 minutes.
In my `celery beat` terminal tab, I am getting something like:
WARNING/ForkPoolWorker-9] hi
Task tasks_notes.tasks.monthly_starting_value[42bf83ef-850f-4b34-af78-da696d2ee0f2] succeeded in 0.0009442089994990965s: 0
every 15 minutes.
I have even tried setting `app.conf.beat_schedule` in `celery.py`, and also tried running it via the admin page, but it's not working as expected.
Where can I be wrong?
Any help is highly appreciated.
It is definitely not irregular - it behaves exactly as it should.
If you wanted to grab the result of a task, then you should have something like:
abcd = monthly_starting_value.delay().get()
delay() returns an instance of the AsyncResult class.
Finally, do not call print() inside a task. Use the Celery logger instead.
Example:
from celery.utils.log import get_task_logger
from worker import app

logger = get_task_logger(__name__)

@app.task()
def add(x, y):
    result = x + y
    logger.info(f'Add: {x} + {y} = {result}')
    return result
This issue has been discussed before, but looking over numerous posts I have so far been unable to find a solution. I'm new to Celery, so my learning curve is still fairly steep. Below are my current scripts:
myapp.__init__.py
from __future__ import absolute_import, unicode_literals
from .celery_main import app as celery_app # Ensures app is always imported when Django starts so that shared_task will use this app.
__all__ = ['celery_app']
myapp.celery_main.py
from __future__ import absolute_import
from celery import Celery
from django.apps import apps
# Initialise the app
app = Celery()
app.config_from_object('myapp.celeryconfig') # WORKS WHEN CALLED THROUGH VIEW/DJANGO: Tell Celery instance to use celeryconfig module
#app.config_from_object('celeryconfig') # WORKS WHEN CALLED THROUGH TERMINAL
# Load task modules from all registered Django app configs.
app.autodiscover_tasks(lambda: [n.name for n in apps.get_app_configs()])
myapp.celeryconfig.py
from __future__ import absolute_import, unicode_literals
from datetime import timedelta
## List of modules to import when celery starts.
CELERY_IMPORTS = ('celery_tasks',)
## Message Broker (RabbitMQ) settings.
BROKER_URL = 'amqp://'
BROKER_PORT = 5672
## Result store settings.
CELERY_RESULT_BACKEND = 'rpc://'
## Misc
#CELERY_IGNORE_RESULT = False
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT=['json']
CELERY_TIMEZONE = 'Europe/Berlin'
CELERY_ENABLE_UTC = True
CELERYBEAT_SCHEDULE = {
    'doctor-every-10-seconds': {
        'task': 'celery_tasks.fav_doctor',
        'schedule': timedelta(seconds=3),
    },
}
myapp.celery_tasks.py
from __future__ import absolute_import
from celery import task

suf = lambda n: "%d%s" % (n, {1: "st", 2: "nd", 3: "rd"}.get(n if n < 20 else n % 10, "th"))

@task
def fav_doctor():
    # Stuff happened here
    pass

@task
def reverse(string):
    return string[::-1]

@task
def send_email(user_id):
    # Stuff happened here
    pass

@task
def add(x, y):
    return x + y
anotherapp.settings.py
INSTALLED_APPS = [
    ...
    'kombu.transport.django',
]
myapp.views.admin_scripts.py
from celery.result import AsyncResult
from myapp.celery_tasks import fav_doctor, reverse, send_email, add
from myapp.celery_main import app

@login_required
def admin_script_dashboard(request):
    if request.method == 'POST':
        form = Admin_Script(request.POST)
        if form.is_valid():
            # Results
            async_result = add.delay(2, 5)
            task_id = async_result.task_id
            res = AsyncResult(async_result)
            res_1 = add.AsyncResult(async_result)
            res_2 = add.AsyncResult(async_result.id)
            print("async_result: {0}\ntask_id: {1}\nres: {2}\nres_1: {3}\nres_2: {4}".format(async_result, task_id, res, res_1, res_2))
            # Backend: make sure the client is configured with the right backend
            print("Backend check: {0}".format(async_result.backend))
            # States/statuses
            task_state = res.state
            A = async_result.status
            B = res.status
            print("task_state: {0}\nA: {1}\nB: {2}".format(task_state, A, B))
The results when triggering the Celery workers through my Django application (related to the print statements in myapp.views.admin_scripts.py):
async_result: 00d7ec84-ebdb-4968-9ea6-f20ca2a793b7
task_id: 00d7ec84-ebdb-4968-9ea6-f20ca2a793b7
res: 00d7ec84-ebdb-4968-9ea6-f20ca2a793b7
res_1: 00d7ec84-ebdb-4968-9ea6-f20ca2a793b7
res_2: 00d7ec84-ebdb-4968-9ea6-f20ca2a793b7
Backend check: <celery.backends.rpc.RPCBackend object at 0x106e308d0>
task_state: PENDING
A: PENDING
B: PENDING
Output in the worker terminal:
[2018-07-15 21:41:47,015: ERROR/MainProcess] Received unregistered task of type 'MyApp.celery_tasks.add'.
The message has been ignored and discarded.
Did you remember to import the module containing this task?
Or maybe you are using relative imports?
Please see <link> for more information.
The full contents of the message body was:
{'task': 'MyApp.celery_tasks.add', 'id': 'b21ffa43-d1f1-4767-9ab8-e58afec3ea0f', 'args': [2, 5], 'kwargs': {}, 'retries': 0, 'eta': None, 'expires': None, 'utc': True, 'callbacks': None, 'errbacks': None, 'timelimit': [None, None], 'taskset': None, 'chord': None} (266b)
Traceback (most recent call last):
  File "/Users/My_MBP/anaconda3/lib/python3.6/site-packages/celery/worker/consumer.py", line 465, in on_task_received
    strategies[type_](message, body,
KeyError: 'MyApp.celery_tasks.add'
I have several questions:
1. I can trigger the expected results by using commands in the terminal:
celery -A celery_tasks worker -l info
Then in the Python shell:
from celery_tasks import *
add.delay(2,3)
Which succeeds:
[2018-07-13 10:12:14,943: INFO/MainProcess] Received task: celery_tasks.add[c100ad91-2f94-40b1-bb0e-9bc2990ff3bc]
[2018-07-13 10:12:14,961: INFO/MainProcess] Task celery_tasks.add[c100ad91-2f94-40b1-bb0e-9bc2990ff3bc] succeeded in 0.017578680999577045s: 54
So executing the tasks in the terminal works, but not from my views.py in Django. Why not?
2. Perhaps related to 1.: I have to, annoyingly, change app.config_from_object in myapp.celery_main.py depending on whether I want to test via Django or via the terminal. You can see that I set celeryconfig.py either with the myapp name prefixed or without; otherwise an error is thrown. I suspect some kind of import loop is causing the issue (though I could be wrong), but I don't know why or where. How can I overcome this?
3. In my settings.py file (not celeryconfig.py) I have added 'kombu.transport.django' to INSTALLED_APPS. Is this necessary? I'm using celery 3.1.26.post2 (Cipater).
4. In all my files I have at the top:
from __future__ import absolute_import, unicode_literals
What exactly is this for, and is it required for 3.1.26?
5. I read that you need to ensure the client is configured with the right backend, but I'm not sure exactly what this means. My printout (as per myapp.views.admin_scripts.py) is:
Backend check: <celery.backends.rpc.RPCBackend object at 0x106e308d0>
If there are any abnormalities in my code you recognise, please feel free to let me know.
I'm still trying to figure out the answer to my question 2. Meanwhile, I've figured out how to retrieve the required results: after async_result = add.delay(2, 5) I need to call async_result.get(), and then task_output = async_result.result. The result status/state (async_result.state or async_result.status) is then set to SUCCESS.
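For reference, a minimal sketch of that sequence (assuming the add task from above):
async_result = add.delay(2, 5)
async_result.get(timeout=10)       # block until the worker finishes the task
task_output = async_result.result  # 7
print(async_result.state)          # 'SUCCESS'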
Celery tasks should have proper names. When running from Django, the task name is MyApp.celery_tasks.add, which is why the Celery worker is not able to run it. But from the terminal, when you imported it using from celery_tasks import *, the task name is celery_tasks.add, which is why it works correctly.
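One way to avoid the mismatch (a sketch, using the celery 3.1 task decorator from the question) is to pin the task name explicitly so it no longer depends on the import path:
from celery.task import task

@task(name='celery_tasks.add')  # explicit name, independent of how the module is imported
def add(x, y):
    return x + y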
You can change the config based on an environment variable.
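Celery ships a helper for exactly this: config_from_envvar reads the config module name from an environment variable, so the same code can serve both Django and the terminal. A sketch, with CELERY_CONFIG_MODULE as the (freely chosen) variable name:
import os

os.environ.setdefault('CELERY_CONFIG_MODULE', 'myapp.celeryconfig')
app.config_from_envvar('CELERY_CONFIG_MODULE')
Then something like CELERY_CONFIG_MODULE=celeryconfig celery worker ... overrides it for the terminal case.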
Adding kombu.transport.django is not necessary.
This is related to Python 2/3 compatibility. See the docs for more info.
If you want task results after the task has completed, they have to be stored somewhere, which is what the result backend is for. If you don't need to retrieve results, you don't need it.
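A minimal sketch tying this together, assuming the rpc:// backend from the question's celeryconfig.py:
from celery import Celery

app = Celery('myapp', broker='amqp://', backend='rpc://')

@app.task
def add(x, y):
    return x + y

result = add.delay(2, 5)
print(result.get(timeout=10))  # 7, fetched from the result backend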
Following the documentation and the Demo Django project here https://github.com/celery/celery/tree/3.1/examples/django
Project Structure
piesup2
|__ piesup2
|   |__ __init__.py
|   |__ celery.py
|   |__ settings.py
|   |__ urls.py
|__ reports
    |__ tasks.py
    |__ models.py
    |__ etc....
My Code
piesup2/celery.py
from __future__ import absolute_import
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'piesup2.settings')

from django.conf import settings  # noqa

app = Celery('piesup2')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
piesup2/__init__.py
from __future__ import absolute_import
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app # noqa
piesup2/reports/tasks.py
from __future__ import absolute_import
from celery import shared_task

@shared_task
def add(x, y):
    return x + y
After starting Celery from the command line with:
celery -A piesup2 worker -l info
When I attempt to run a task from within my models.py file like this:
def do_stuff(self):
    from reports.tasks import add
    number = add.delay(2, 2)
    print(number)
I get the following error:
[2016-08-05 16:50:31,625: ERROR/MainProcess] Received unregistered task of type 'reports.tasks.add'.
The message has been ignored and discarded.
Did you remember to import the module containing this task?
Or maybe you are using relative imports?
Please see http://docs.celeryq.org/en/latest/userguide/tasks.html#task-names for more information.
The full contents of the message body was: {'callbacks': None, 'retries': 0, 'chord': None, 'errbacks': None, 'task': 'reports.tasks.add', 'args': [2, 2], 'timelimit': [None, None], 'kwargs': {}, 'id': 'b12eb387-cf8c-483d-b53e-f9ce0ad6b421', 'taskset': None, 'eta': None, 'expires': None, 'utc': True} (258b)
Traceback (most recent call last):
  File "/home/jwe/piesup2/venv/lib/python3.4/site-packages/celery/worker/consumer.py", line 456, in on_task_received
    strategies[name](message, body,
KeyError: 'reports.tasks.add'
In your django settings you need to add each module that has a celery task to CELERY_IMPORTS
CELERY_IMPORTS = (
    'reports.tasks',
    'some_app.some_module',
)
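After adding this, restart the worker; it should list reports.tasks.add in its startup banner, and you can double-check the registration with:
celery -A piesup2 inspect registered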
The Celery worker is giving me an error saying unregistered task. I am using celeryd as a daemon; restarting didn't help.
Here is my tasks.py:
celeryApp > tasks.py
from __future__ import absolute_import
from celery import task

@task
def add(a, b):
    return a + b
celeryProj.py << I have changed the name from celery.py as it was creating problems
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
# Tell Celery to use the default Django settings module
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_proj.settings')
app = Celery('my_app')
app.config_from_object('django.conf:settings')
# This line will tell Celery to autodiscover all your tasks.py that are in your app folders
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
__init__.py
from __future__ import absolute_import
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celeryProj import app as celeryApp
settings.py
from __future__ import absolute_import
import os
import sys
import djcelery
from celery.schedules import crontab
djcelery.setup_loader()
INSTALLED_APPS = (
    ..
    ...
    'my_app',
    'djcelery',
    'celeryApp',
)
BROKER_URL = "amqp://user:password@localhost:5672/v_host"
## celery configurations
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT=['json']
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERY_ENABLE_UTC = True
CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
## These configs can be commented out in production
CELERY_BIN = "/usr/local/bin/celery"
CELERYD_NODES = "worker1 worker2 worker3"
CELERYD_CHDIR="/home/user/proj/"
CELERYD_OPTS="--time-limit=300 --concurrency=8"
CELERYD_USER="celery"
CELERYD_GROUP="celery"
CELERY_CREATE_DIRS=1
CELERY_IMPORTS = ('celeryApp.tasks',)
My directory structure is :
myProj > my_app, __init__.py, settings.py, celeryProj.py
myProj > celeryApp > tasks.py
I am getting this error log:
[2014-07-06 10:17:43,674: ERROR/MainProcess] Received unregistered task of type u'celeryApp.tasks.add'.
The message has been ignored and discarded.
Did you remember to import the module containing this task?
Or maybe you are using relative imports?
Please see http://bit.ly/gLye1c for more information.
The full contents of the message body was:
{u'utc': True, u'chord': None, u'args': [2, 2], u'retries': 0, u'expires': None, u'task': u'celeryApp.tasks.add', u'callbacks': None, u'errbacks': None, u'timelimit': [None, None], u'taskset': None, u'kwargs': {}, u'eta': None, u'id': u'620007e6-4c7d-49cc-9516-4f089c200df9'} (260b)
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/celery/worker/consumer.py", line 455, in on_task_received
    strategies[name](message, body,
KeyError: u'celeryApp.tasks.add'
I have configured celery and the backend:
celeryapp = Celery(
    'tasks_app', broker='amqp://guest@localhost//',
    backend='db+postgresql://guest@localhost:5432'
)
'results' appears disabled when I start the worker, but I read in another question here that that's not the issue.
The database is getting all the data correctly, but
result = AsyncResult(task_id)
raises
AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'
I found a more convenient way to do that.
result = celery.AsyncResult(task_id)
celery is the Celery instance of your application, not the celery module.
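In other words, import the application instance and ask it for the result — a small sketch, where myproject.celery is whatever module defines your app:
from myproject.celery import app  # your Celery() instance, with its backend configured

result = app.AsyncResult(task_id)  # bound to that app's result backend
print(result.state)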
Try using this instead, where task is your task function:
result = task.AsyncResult(task_id)
you can try:
from celery import result, Celery
app = Celery(backend='redis://localhost:6379/0')
res = result.AsyncResult(id='7037247e-f528-43ba-bce5-ee0e30704c58', app=app)
print(res.info)
As the error says, you should specify a backend value, for example:
app = Celery("tasks", broker='mongodb://localhost:27017/test', backend='mongodb://localhost:27017/test1')
Try importing the task in your AsyncResult script as well, to let Celery know the backend setting. I faced a similar issue (AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for') with the backend correctly configured, and this helped a lot.
from <your celery app> import <tasks>  # add this one
from celery.result import AsyncResult

result = AsyncResult(task_id)
print(result.state)  # check if it worked or not, it should
For those of you coming from a Django background, you might be tempted to use from celery.result import AsyncResult in the shell.
However, remember that in Django we use python manage.py shell, and Django does a lot of configuration behind the scenes.
In other applications that might not be the case, especially with a plain Python shell. That's why we have to explicitly specify our own Celery app.
e.g. if your main.py looks like this:
from celery import current_app
from fastapi import FastAPI

# config is the project's own settings module, imported elsewhere
def create_app() -> FastAPI:
    app = FastAPI()
    celery_app = current_app
    celery_app.config_from_object(config.settings, namespace="CELERY")
You can use the below code in a normal python shell.
$ python
>>> from main import celery
>>> from celery.result import AsyncResult
>>> AsyncResult("e3d3ef1c-65a5-4045-87c1-014aa159f52f")