Can't import models in tasks.py with Celery + Django - python

I want to create a background task to update a record on a specific date. I'm using Django and Celery with RabbitMQ.
I've managed to get the task called when the model is saved with this dummy task function:
tasks.py
from __future__ import absolute_import
from celery import Celery
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

app = Celery('tasks', broker='amqp://localhost//')

@app.task(name='news.tasks.update_news_status')
def update_news_status(news_id):
    # (I pass the news id and return it, nothing complicated about it)
    return news_id
This task is called from the save() method in my models.py:
from django.db import models
from celery import current_app

class News(models.Model):
    (...)
    def save(self, *args, **kwargs):
        current_app.send_task('news.tasks.update_news_status', args=(self.id,))
        super(News, self).save(*args, **kwargs)
Thing is, I want to import my News model in tasks.py, but if I try it like this:
from .models import News
I get this error:
django.core.exceptions.ImproperlyConfigured: Requested setting
DEFAULT_INDEX_TABLESPACE, but settings are not configured. You must
either define the environment variable DJANGO_SETTINGS_MODULE or call
settings.configure() before accessing settings.
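The exception means something touched Django's settings before DJANGO_SETTINGS_MODULE was set; importing models.py pulls in django.db, which needs configured settings. A minimal stdlib sketch of the mechanism (the myapp.settings path is this question's, not a general default):

```python
import os

# Django reads DJANGO_SETTINGS_MODULE lazily, and setdefault only fills it in
# when the shell has not already exported a value.
os.environ.pop('DJANGO_SETTINGS_MODULE', None)   # simulate a bare worker environment
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myapp.settings')
print(os.environ['DJANGO_SETTINGS_MODULE'])  # -> myapp.settings
```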
This is what my celery.py looks like:
from __future__ import absolute_import, unicode_literals
from celery import Celery
import os

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myapp.settings')

app = Celery('myapp')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
I have already tried the suggestions in these questions:
- can't import django model into celery task
- making the import inside the task method (Django and Celery, AppRegisteredNotReady exception)
- Celery - importing models in tasks.py
- creating a utils.py and importing from it, which was not possible either
and ran into different errors, but in the end I'm not able to import any module in tasks.py.
There might be something wrong with my config, but I can't see the error; I followed the steps in the Celery docs: First steps with Django.
Also, my project structure looks like this:
├── myapp
│   ├── __init__.py
│   ├── celery.py
│   ├── settings.py
│   ├── urls.py
│   └── wsgi.py
├── news
│   ├── __init__.py
│   ├── admin.py
│   ├── apps.py
│   ├── tasks.py
│   ├── urls.py
│   ├── models.py
│   └── views.py
└── manage.py
I'm executing the worker from the myapp directory like this:
celery -A news.tasks worker --loglevel=info
What am I missing here? Thanks in advance for your help!
EDIT
After making the changes suggested in comments:
Add this to celery.py
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
and doing the import inside the task method in tasks.py:
from __future__ import absolute_import
from celery import Celery
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

app = Celery('tasks', broker='amqp://localhost//')

@app.task(name='news.tasks.update_news_status')
def update_news_status(news_id):
    from .models import News
    return news_id
I get the following error:
[2018-07-20 12:24:29,337: ERROR/ForkPoolWorker-1] Task news.tasks.update_news_status[87f9ec92-c260-4ee9-a3bc-5f684c819f79] raised unexpected: ValueError('Attempted relative import in non-package',)
Traceback (most recent call last):
  File "/Users/carla/Develop/App/backend/myapp-venv/lib/python2.7/site-packages/celery/app/trace.py", line 382, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/Users/carla/Develop/App/backend/myapp-venv/lib/python2.7/site-packages/celery/app/trace.py", line 641, in __protected_call__
    return self.run(*args, **kwargs)
  File "/Users/carla/Develop/App/backend/news/tasks.py", line 12, in update_news_status
    from .models import News
ValueError: Attempted relative import in non-package
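The relative import fails because the worker invocation (celery -A news.tasks) loads tasks.py as a top-level module, so from .models import News has no parent package to resolve against; an absolute import does not depend on that. A stdlib-only sketch with a throwaway package (hypothetical names mirroring the question's layout):

```python
import os
import sys
import tempfile

# Build a throwaway 'news' package on disk (stand-in for the real app).
root = tempfile.mkdtemp()
pkg = os.path.join(root, 'news')
os.makedirs(pkg)
open(os.path.join(pkg, '__init__.py'), 'w').close()
with open(os.path.join(pkg, 'models.py'), 'w') as f:
    f.write("NEWS = 'stand-in for the News model'\n")
with open(os.path.join(pkg, 'tasks.py'), 'w') as f:
    f.write(
        "def update_news_status(news_id):\n"
        "    from news.models import NEWS  # absolute import: resolves however this module is loaded\n"
        "    return news_id\n"
    )

sys.path.insert(0, root)
from news.tasks import update_news_status  # loads tasks.py as part of the 'news' package

print(update_news_status(42))  # -> 42
```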

Ok, so for anyone struggling with this... it turns out my celery.py wasn't reading env variables from the settings.
After a week and lots of research I realised that Celery is not a process of Django but a process running outside of it (duh), so when I tried to load the settings they were loaded, but I wasn't able to access the env variables I had defined in my .env (I use the dotenv library). Celery was trying to look up the env variables in my .bash_profile (of course).
So in the end my solution was to create a helper module called load_env.py, in the same directory where my celery.py is defined, with the following:
from os.path import dirname, join
import dotenv

def load_env():
    """Get the path to the .env file and load it."""
    project_dir = dirname(dirname(__file__))
    dotenv.read_dotenv(join(project_dir, '.env'))
and then in my celery.py (note the last import and the first instruction):
from __future__ import absolute_import, unicode_literals
from celery import Celery
from django.conf import settings
import os

from .load_env import load_env

load_env()

# set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myapp.settings")

app = Celery('myapp')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('myapp.settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
After the call to load_env(), the env variables are loaded and the celery worker has access to them. By doing this I am now able to import other modules from my tasks.py, which was my main problem.
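For anyone curious what the dotenv call is doing under the hood, here is a rough stdlib approximation (read_dotenv itself comes from the django-dotenv package; this sketch only mimics the simple KEY=VALUE case, and the file contents are made up for the demo):

```python
import os
import tempfile

def load_env_file(path):
    # Rough stdlib approximation of what dotenv.read_dotenv does:
    # parse KEY=VALUE lines into os.environ, skipping blanks and comments.
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith('#') or '=' not in line:
                continue
            key, _, value = line.partition('=')
            os.environ.setdefault(key.strip(), value.strip())

# Usage with a throwaway .env file.
with tempfile.NamedTemporaryFile('w', suffix='.env', delete=False) as f:
    f.write("DEMO_SECRET_KEY=abc123\n# a comment\nDEMO_DEBUG=True\n")
load_env_file(f.name)
print(os.environ['DEMO_SECRET_KEY'])  # -> abc123
```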
Credits to these guys (Caktus Consulting Group) and their django-project-template, because if it wasn't for them I wouldn't have found the answer. Thanks.

Try something like this. It works in Celery 3.1; the import should happen inside the save method, after the call to super():
from django.db import models

class News(models.Model):
    (...)
    def save(self, *args, **kwargs):
        (...)
        super(News, self).save(*args, **kwargs)
        from news.tasks import update_news_status
        update_news_status.apply_async((self.id,))  # apply_async or delay

Here is what I would do (Django 1.11 and Celery 4.2). You have a problem in your celery config: you are re-declaring the Celery instance.
tasks.py
from myapp.celery import app  # contains what you need :)
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

@app.task(name='news.tasks.update_news_status')
def update_news_status(news_id):
    # (I pass the news id and return it, nothing complicated about it)
    return news_id
celery.py
from __future__ import absolute_import, unicode_literals
from celery import Celery
from django.conf import settings
import os

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myapp.settings")

app = Celery('myapp', backend='rpc://', broker='amqp://localhost//')  # your config here
app.config_from_object('django.conf:settings', namespace='CELERY')  # change here
app.autodiscover_tasks()
models.py
from django.db import models

class News(models.Model):
    (...)
    def save(self, *args, **kwargs):
        super(News, self).save(*args, **kwargs)
        from news.tasks import update_news_status
        update_news_status.delay(self.id)  # change here
And launch it with celery -A myapp worker --loglevel=info, because your app is defined in myapp.celery, so the -A parameter needs to point at the package where the app is declared.
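The point about a single app instance can be illustrated with a toy registry (this is an analogy, not Celery's real internals): a task registered on one instance is invisible to another, which is why a second Celery() declared in tasks.py breaks routing.

```python
# Toy stand-in for a Celery app: each instance keeps its own task registry.
class Registry:
    def __init__(self):
        self.tasks = {}

    def task(self, name):
        def wrap(fn):
            self.tasks[name] = fn  # register under the dotted name
            return fn
        return wrap

app_a = Registry()  # the "real" app the worker would import
app_b = Registry()  # a second, re-declared instance

@app_b.task('news.tasks.update_news_status')
def update_news_status(news_id):
    return news_id

# The task only exists on the instance it was registered with.
print('news.tasks.update_news_status' in app_a.tasks)  # -> False
print('news.tasks.update_news_status' in app_b.tasks)  # -> True
```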


Can't call Celery in Django

So the problem is that when I try to call the main task from Django, the lpr.views.py page shows the loading icon and that's it; nothing else happens. There is no output in the Django or Celery console. When I run the task from the python shell it runs without a problem and saves the result in the db. I added the add task for test purposes, and when I run it, it returns an error because of the missing 'y' argument, which is normal. But what is up with the main task?
Here is my code, just in case.
Project structure:
Project
├── acpvs
│   ├── celery.py
│   ├── __init__.py
│   ├── settings.py
│   ├── urls.py
│   └── wsgi.py
├── db.sqlite3
├── lpr
│   ├── __init__.py
│   ├── tasks.py
│   ├── urls.py
│   └── views.py
└── manage.py
settings.py
import djcelery
INSTALLED_APPS = [
    ...
    'djcelery',
    'django_celery_results',
]
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'django-db'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_ALWAYS_EAGER = False
djcelery.setup_loader()
__init__.py
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from acpvs.celery import app as celery_app
__all__ = ['celery_app']
acpvs/celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'acpvs.settings')

app = Celery('acpvs')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
lpr/tasks.py
from __future__ import absolute_import, unicode_literals
from celery import shared_task
from djcelery import celery

@shared_task
def add(x, y):
    return x + y

@shared_task
def main():
    ...
    args = {
        'imageName': imageName,
        'flag': True
    }
    return args
lpr/urls.py
from django.conf.urls import url
from . import views

urlpatterns = [
    url(r'^t/$', views.test_add),
    url(r'^t1/$', views.test_main),
]
lpr/views.py
from . import tasks
from django.http import HttpResponse

def test_add(request):
    result = tasks.add.delay()
    return HttpResponse(result.task_id)

def test_main(request):
    result = tasks.main.delay()
    return HttpResponse(result.task_id)
Update
It seems to me that there is still something wrong with how I have integrated Celery. When I remove .delay() from views.py it works, but of course not asynchronously and not using Celery.
delay() actually executes the task asynchronously. Please confirm whether it is updating the values in the db.
I think it will update the value in the db (if the main method does so) but will not return the value, since the client only adds a message to the queue; the broker then delivers that message to a worker, which performs the operation of main.
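The handoff described in the comment above can be modeled with a toy in-process queue (stdlib only, no broker): the caller gets an id back immediately, and the value only ever exists wherever the worker stores it.

```python
import queue
import threading

task_queue = queue.Queue()
results = {}

def worker():
    # Stand-in for a Celery worker: pull messages, compute, store the result.
    while True:
        task_id, func, args = task_queue.get()
        results[task_id] = func(*args)
        task_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

def delay(task_id, func, *args):
    # Stand-in for .delay(): enqueue the message and return an id, not the value.
    task_queue.put((task_id, func, args))
    return task_id

tid = delay('t1', lambda x, y: x + y, 2, 3)
task_queue.join()          # wait for the worker (a real caller would poll the backend)
print(results[tid])        # -> 5
```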
So I got it working by removing all djcelery instances and upgrading Django from 1.11 to 2.0.3.
By the way, I'm using Celery 4.1.0.

Celery 4.1 periodic tasks error

I am trying to set up a task to run every ten seconds, using Celery Beat.
I am using:
Django==1.11.3
celery==4.1.0
django-celery-beat==1.1.1
django-celery-results==1.0.1
It is giving me the following error:
Received unregistered task of type 'operations.tasks.message'
I am new to Celery. I have tried numerous solutions and cannot seem to find one that works; I would appreciate the help.
settings.py
CELERY_BROKER_URL = 'pyamqp://guest@localhost//'
CELERY_RESULT_BACKEND = 'django-db'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Africa/Johannesburg'

CELERY_BEAT_SCHEDULE = {
    'message': {
        'task': 'operations.tasks.message',
        'schedule': 10.0
    }
}
celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'nodiso.settings')

app = Celery('nodiso')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
__init__.py
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ['celery_app']
tasks.py
from __future__ import absolute_import, unicode_literals
from celery import shared_task
from operations import models
from .celery import periodic_task

@shared_task
def message():
    t = models.Celerytest.objects.create(Message='Hello World')
    t.save()
file structure
proj
├── proj
│   ├── __init__.py
│   ├── settings.py
│   └── celery.py
└── app
    └── tasks.py
Within my celery.py file I define app like this:
app = Celery(
    'your_celery_app_name',
    include=[
        'your_celery_app_name.module.task1',
        'your_celery_app_name.module.task2',
    ]
)
app.config_from_object('your_celery_app_name.celeryconfig')
My celeryconfig.py is where I define my beat schedules and other settings (I think this would be the same as your settings.py).
Below is probably not relevant (I'm not an expert in Python and how packages should be put together), but from my limited understanding your tasks should be a submodule of your celery app module. Take this with a pinch of salt though.
My project structure looks more like this:
your_celery_app_name (dir)
├── setup.py (file)
└── your_celery_app_name (dir)
    ├── __init__.py (file)
    ├── celery.py (file)
    ├── celeryconfig.py (file)
    └── module (dir)
        ├── __init__.py (imports task1 and task2 from tasks)
        └── tasks.py (implements task1 and task2)

Celery, Django and Scrapy: error importing from django app

I'm using celery (and django-celery) to allow a user to launch periodic scrapes through the django admin. This is part of a larger project but I've boiled the issue down to a minimal example.
Firstly, celery/celerybeat are running daemonized. If instead I run them with celery -A evofrontend worker -B -l info from my django project dir then, weirdly, I get no issues.
When I run celery/celerybeat as daemons however then I get a strange import error:
[2016-01-06 03:05:12,292: ERROR/MainProcess] Task evosched.tasks.scrapingTask[e18450ad-4dc3-47a0-b03d-4381a0e65c31] raised unexpected: ImportError('No module named myutils',)
Traceback (most recent call last):
  File "/home/lee/Desktop/pyco/evo-scraping-min/venv/local/lib/python2.7/site-packages/celery/app/trace.py", line 240, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/home/lee/Desktop/pyco/evo-scraping-min/venv/local/lib/python2.7/site-packages/celery/app/trace.py", line 438, in __protected_call__
    return self.run(*args, **kwargs)
  File "evosched/tasks.py", line 35, in scrapingTask
    cs = CrawlerScript('TestSpider', scrapy_settings)
  File "evosched/tasks.py", line 13, in __init__
    self.crawler = CrawlerProcess(scrapy_settings)
  File "/home/lee/Desktop/pyco/evo-scraping-min/venv/local/lib/python2.7/site-packages/scrapy/crawler.py", line 209, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/home/lee/Desktop/pyco/evo-scraping-min/venv/local/lib/python2.7/site-packages/scrapy/crawler.py", line 115, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "/home/lee/Desktop/pyco/evo-scraping-min/venv/local/lib/python2.7/site-packages/scrapy/crawler.py", line 296, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/home/lee/Desktop/pyco/evo-scraping-min/venv/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 30, in from_settings
    return cls(settings)
  File "/home/lee/Desktop/pyco/evo-scraping-min/venv/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 21, in __init__
    for module in walk_modules(name):
  File "/home/lee/Desktop/pyco/evo-scraping-min/venv/local/lib/python2.7/site-packages/scrapy/utils/misc.py", line 71, in walk_modules
    submod = import_module(fullpath)
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "retail/spiders/Retail_spider.py", line 16, in <module>
ImportError: No module named myutils
i.e. the spider is having issues importing from the django project app, despite adding the relevant things to sys.path and doing django.setup().
My hunch is that this may be caused by a "circular import" during initialization, but I'm not sure (see here for notes on the same error).
Celery daemon config
For completeness the celeryd and celerybeat configuration scripts are:
# /etc/default/celeryd
CELERYD_NODES="worker1"
CELERY_BIN="/home/lee/Desktop/pyco/evo-scraping-min/venv/bin/celery"
CELERY_APP="evofrontend"
DJANGO_SETTINGS_MODULE="evofrontend.settings"
CELERYD_CHDIR="/home/lee/Desktop/pyco/evo-scraping-min/evofrontend"
CELERYD_OPTS="--concurrency=1"
# Workers should run as an unprivileged user.
CELERYD_USER="lee"
CELERYD_GROUP="lee"
CELERY_CREATE_DIRS=1
and
# /etc/default/celerybeat
CELERY_BIN="/home/lee/Desktop/pyco/evo-scraping-min/venv/bin/celery"
CELERY_APP="evofrontend"
CELERYBEAT_CHDIR="/home/lee/Desktop/pyco/evo-scraping-min/evofrontend/"
# Django settings module
export DJANGO_SETTINGS_MODULE="evofrontend.settings"
They are largely based on the generic ones, with the Django settings thrown in, and use the celery bin in my virtualenv rather than the system one.
I'm also using the generic init.d scripts.
Project structure
As for the project: it lives at /home/lee/Desktop/pyco/evo-scraping-min. All files under it have ownership lee:lee.
The dir contains both a Scrapy (evo-retail) and Django (evofrontend) project that live under it and the complete tree structure looks like
├── evofrontend
│   ├── db.sqlite3
│   ├── evofrontend
│   │   ├── celery.py
│   │   ├── __init__.py
│   │   ├── settings.py
│   │   ├── urls.py
│   │   └── wsgi.py
│   ├── evosched
│   │   ├── __init__.py
│   │   ├── myutils.py
│   │   └── tasks.py
│   └── manage.py
└── evo-retail
    └── retail
        ├── logs
        ├── retail
        │   ├── __init__.py
        │   ├── settings.py
        │   └── spiders
        │       ├── __init__.py
        │       └── Retail_spider.py
        └── scrapy.cfg
Django project relevant files
Now the relevant files: the evofrontend/evofrontend/celery.py looks like
# evofrontend/evofrontend/celery.py
from __future__ import absolute_import
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'evofrontend.settings')
from django.conf import settings
app = Celery('evofrontend')
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
The potentially relevant settings from the Django settings file, evofrontend/evofrontend/settings.py are
import os
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
PROJECT_ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), os.pardir))
INSTALLED_APPS = (
    ...
    'djcelery',
    'evosched',
)

# Celery settings
BROKER_URL = 'amqp://guest:guest@localhost//'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Europe/London'
CELERYD_MAX_TASKS_PER_CHILD = 1  # Each worker is killed after one task; this prevents issues with the reactor not being restartable
# Use django-celery backend database
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
# Set periodic task
CELERYBEAT_SCHEDULER = "djcelery.schedulers.DatabaseScheduler"
The tasks.py in the scheduling app, evosched, looks like this (it just launches the Scrapy spider with the relevant settings after changing directory):
# evofrontend/evosched/tasks.py
from __future__ import absolute_import
from celery import shared_task
from celery.utils.log import get_task_logger
logger = get_task_logger(__name__)

import os
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings
from django.conf import settings as django_settings

class CrawlerScript(object):
    def __init__(self, spider, scrapy_settings):
        self.crawler = CrawlerProcess(scrapy_settings)
        self.spider = spider  # just a string

    def run(self, **kwargs):
        # Pass the kwargs (usually command line args) to the crawler
        self.crawler.crawl(self.spider, **kwargs)
        self.crawler.start()

@shared_task
def scrapingTask(**kwargs):
    logger.info("Start scrape...")
    # scrapy.cfg file here pointing to settings...
    base_dir = django_settings.BASE_DIR
    os.chdir(os.path.join(base_dir, '..', 'evo-retail/retail'))
    scrapy_settings = get_project_settings()
    # Run crawler
    cs = CrawlerScript('TestSpider', scrapy_settings)
    cs.run(**kwargs)
The evofrontend/evosched/myutils.py simply contains (in this min example):
# evofrontend/evosched/myutils.py
SCRAPY_XHR_HEADERS = 'SOMETHING'
Scrapy project relevant files
In the complete Scrapy project the settings file looks like
# evo-retail/retail/retail/settings.py
BOT_NAME = 'retail'
import os
PROJECT_ROOT = os.path.dirname(os.path.abspath(__file__))
SPIDER_MODULES = ['retail.spiders']
NEWSPIDER_MODULE = 'retail.spiders'
and (in this min example) the spider is just
# evo-retail/retail/retail/spiders/Retail_spider.py
from scrapy.conf import settings as scrapy_settings
from scrapy.spiders import Spider
from scrapy.http import Request
import sys
import django
import os
import posixpath

SCRAPY_BASE_DIR = scrapy_settings['PROJECT_ROOT']
DJANGO_DIR = posixpath.normpath(os.path.join(SCRAPY_BASE_DIR, '../../../', 'evofrontend'))
sys.path.insert(0, DJANGO_DIR)

os.environ.setdefault("DJANGO_SETTINGS_MODULE", 'evofrontend.settings')
django.setup()

from evosched.myutils import SCRAPY_XHR_HEADERS

class RetailSpider(Spider):
    name = "TestSpider"

    def start_requests(self):
        print SCRAPY_XHR_HEADERS
        yield Request(url='http://www.google.com', callback=self.parse)

    def parse(self, response):
        print response.url
        return []
EDIT:
I discovered through lots of trial and error that if the app I'm trying to import from is in my INSTALLED_APPS django setting, then it fails with the import error, but if I remove the app from there the import error goes away (e.g. after removing evosched from INSTALLED_APPS, the import in the spider goes through fine...). Obviously not a solution, but it may be a clue.
EDIT 2
I put a print of sys.path immediately before the failing import in the spider; the result was:
/home/lee/Desktop/pyco/evo-scraping-min/evofrontend/../evo-retail/retail
/home/lee/Desktop/pyco/evo-scraping-min/venv/lib/python2.7
/home/lee/Desktop/pyco/evo-scraping-min/venv/lib/python2.7/plat-x86_64-linux-gnu
/home/lee/Desktop/pyco/evo-scraping-min/venv/lib/python2.7/lib-tk
/home/lee/Desktop/pyco/evo-scraping-min/venv/lib/python2.7/lib-old
/home/lee/Desktop/pyco/evo-scraping-min/venv/lib/python2.7/lib-dynload
/usr/lib/python2.7
/usr/lib/python2.7/plat-x86_64-linux-gnu
/usr/lib/python2.7/lib-tk
/home/lee/Desktop/pyco/evo-scraping-min/venv/local/lib/python2.7/site-packages
/home/lee/Desktop/pyco/evo-scraping-min/evofrontend
/home/lee/Desktop/pyco/evo-scraping-min/evo-retail/retail
EDIT 3
If I do import evosched and then print dir(evosched), I see "tasks", and if I choose to include such a file I can also see "models", so importing from models would actually be possible. I don't, however, see "myutils". Even from evosched import myutils fails, and it also fails if the statement is put in a function below rather than at module level (I thought this might rule out a circular import issue...). The direct import evosched works... possibly import evosched.myutils will work. Not yet tried...
It seems the celery daemon is running using the system's Python and not the Python binary inside the virtualenv. You need to use:
# Python interpreter from environment.
ENV_PYTHON="$CELERYD_CHDIR/env/bin/python"
as mentioned here, to tell celeryd to run using the Python inside the virtualenv.
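A quick sanity check for this class of problem is to log sys.executable from inside a task: if it prints the system Python instead of the virtualenv's, the daemon config is at fault. Standalone sketch (run inside a task it would report the worker's interpreter; run directly it reports the current one):

```python
import sys

def report_interpreter():
    # Path of the interpreter this process is actually running under.
    return sys.executable

path = report_interpreter()
print(path)  # e.g. /home/lee/.../venv/bin/python if the venv is in use
```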

Received unregistered task of type xx. The message has been ignored and discarded

I have a django project on an ubuntu EC2 node, that performs a computationally intensive long running process, that typically takes over 60 seconds. I need to cache the results. I've been reading http://www.caktusgroup.com/blog/2014/06/23/scheduling-tasks-celery/ and http://michal.karzynski.pl/blog/2014/05/18/setting-up-an-asynchronous-task-queue-for-django-using-celery-redis/ along with the docs. I've been able to get a basic task working at the command line, but now I'm trying to get it going as a django script.
Right now the structure of my code in my django tp1 view is:
from __future__ import absolute_import
from django.shortcuts import render
from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt
from celery import shared_task

@csrf_exempt
def index(request):
    token = str(request.POST.get('token', False))
    calculator(token)
    return HttpResponse(token)

@shared_task
def calculator(token):
    # do calculation
    # store result in cache
    return
At the command line I ran:
(env1)ubuntu#ip-172-31-22-65:~/projects/tp$ celery --app=tp.celery:app worker --loglevel=INFO
At the end of the message I got:
[2015-03-24 19:49:47,045: ERROR/MainProcess] Received unregistered task of type 'tp1.views.calculator'.
The message has been ignored and discarded. Did you remember to import the module containing this task? Or maybe you are using relative imports?
My tasks.py:
from __future__ import absolute_import
from celery import shared_task

@shared_task
def test(param):
    return 'The test task executed with argument "%s" ' % param
How can I get this working?
Firstly, if you are relying on autodiscovery, Celery may not be able to find your tasks.py file. Did you configure it according to the docs by creating a celery_app file for it to autodiscover your tasks:
# project/celery_app.py
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('project_name')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')

# YOUR APPLICATION MUST BE LISTED IN settings.INSTALLED_APPS for this to work
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
In addition, if you are relying on autodiscovery, it typically can't find tasks located in modules that are not called tasks.py.
Thus, in addition to checking your configuration, I would try moving the task that's in your views.py into tasks.py:
# project/app_name/tasks.py
from celery_app import app

@app.task()
def calculator(token):
    # do calculation
    # store result in cache
    return
And, finally, you can import this function from tasks.py and call it in your view.
Hope that helps.
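The autodiscovery rule described above (only modules named tasks.py inside apps listed in INSTALLED_APPS get scanned) can be sketched with stdlib imports alone; this is a simplification of what app.autodiscover_tasks does, not its real implementation:

```python
import importlib

def autodiscover_tasks(installed_apps, module_name='tasks'):
    # For each installed app, try to import '<app>.tasks' and keep
    # whatever imports cleanly; anything else is silently skipped.
    found = {}
    for app in installed_apps:
        try:
            found[app] = importlib.import_module('{}.{}'.format(app, module_name))
        except ImportError:
            pass  # no tasks module in this app
    return found

# Neither stdlib package has a 'tasks' submodule, so nothing is discovered here.
print(autodiscover_tasks(['json', 'logging']))  # -> {}
```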

Save the Celery task in DB- Django

I'm referring to Django Celery documents.
I created celery.py in my proj/proj just as the document said, and then included it in __init__.py.
celery.py
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.conf.update(
    CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend',
)
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
__init__.py
from __future__ import absolute_import
from .celery import app as celery_app
I installed django-celery (pip install django-celery), then migrated (python manage.py migrate djcelery).
It made some of the tables in my DB.
tasks.py
from __future__ import absolute_import
from celery import shared_task
import requests
import json

@shared_task
def post_notification(data, url):
    headers = {'content-type': 'application/json'}
    requests.post(url, data=json.dumps(data), headers=headers)
After that I called my task in my views as:
task = post_notification.delay(data, url)
print task.id      # it prints an id
print task.status  # prints PENDING
But nothing gets logged into any of my tables.
I've read many threads on SO (Thread1, Thread2, and many more given in those threads), but nothing happens.
It gives me the ID & status of the task, but how do I save the task in the DB? Usually it should get logged into celery_taskmeta, but there's nothing in there.
Though the task gets executed, I want to save it in the DB as well. How can I do it? Is there something I'm missing?
Try this in celery.py:
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
from celery.schedules import crontab

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'app_name.dev_settings')

app = Celery('app_name')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

app.conf.CELERY_TIMEZONE = 'UTC'
app.conf.update(
    CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend',
)
Add the following in your settings.py file:
BROKER_URL = 'amqp://guest:guest@localhost//'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
And start the worker.
