Django import messes up Celery - python

I have a Django project with the following structure:
project/
├── ...
├── some_app/
│   ├── __init__.py
│   └── some_module_where_i_import_some_utils.py
├── server/
│   ├── __init__.py
│   ├── settings/
│   │   ├── __init__.py
│   │   ├── common.py
│   │   ├── dev.py
│   │   └── ...
│   ├── celery.py
│   └── ...
├── utils/
│   ├── __init__.py
│   └── some_utils.py
├── manage.py
└── ...
When using utils I import them the following way:
from project.utils.some_utils import whatever
And it works well. However, when I run a Celery worker using DJANGO_SETTINGS_MODULE=server.settings.dev celery -A server worker --beat -l info, autodiscover_tasks fails with the following error: ModuleNotFoundError: No module named 'project'. Presumably the worker's sys.path does not contain the directory above project/, so absolute imports starting with project. cannot be resolved.
Here are contents of server/celery.py:
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "server.settings.prod")

app = Celery("server")
app.config_from_object("django.conf:settings", namespace="CELERY")

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print("Request: {0!r}".format(self.request))
Here is server/__init__.py:
from .celery import app as celery_app
__all__ = ("celery_app",)

Modifying celery.py the following way did the job:
import os
import sys
from celery import Celery
sys.path.append("..")
...
I'm not sure whether this could cause problems in the future; I will keep looking into it and update the answer if I come up with something better.
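A possibly more robust variant (a sketch, assuming celery.py sits at project/server/celery.py as in the layout above) is to put the parent of project/ on sys.path by absolute path, so the import no longer depends on the working directory:

import sys
from pathlib import Path

# celery.py -> server/ -> project/ -> parent of project/ (layout assumed
# from the question); with this directory on sys.path, imports such as
# "from project.utils.some_utils import whatever" resolve regardless of
# where the worker is started.
sys.path.insert(0, str(Path(__file__).resolve().parents[2]))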

Related

consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 61] Connection refused

This is my project structure
myproj
├── app1
│   ├── __init__.py
│   └── tasks.py
├── gettingstarted
│   ├── __init__.py
│   ├── urls.py
│   └── settings.py
├── manage.py
└── Procfile
In gettingstarted/settings.py:
BROKER_URL = 'redis://'
In Procfile:
web: gunicorn gettingstarted.wsgi --log-file -
worker: celery worker --app=app1.tasks.app
In app1/tasks.py
from __future__ import absolute_import, unicode_literals
import random
import celery
import os

app = celery.Celery('hello')

@app.task
def add(x, y):
    return x + y
When I run "celery worker" it gives me:
consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 61] Connection refused.
You're not configuring Celery from your Django settings. To integrate Celery with Django, it's best to just follow the guide:
from __future__ import absolute_import, unicode_literals
import random
import celery
import os

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'gettingstarted.settings')

app = celery.Celery('hello')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task
def add(x, y):
    return x + y
And in settings.py change BROKER_URL to CELERY_BROKER_URL.
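For example (a minimal sketch; the exact Redis URL is an assumption):

# gettingstarted/settings.py
CELERY_BROKER_URL = 'redis://localhost:6379/0'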

Celery won't discover shared_tasks in django 1.11 and celery 4.0.0 or 4.1.0

I have a layout in my project like this (as the documentation says it has to be):
/zonia
├── /backend
│   ├── __init__.py
│   ├── celery.py
│   └── ...
├── /musics
│   ├── tasks.py
│   └── ...
└── ...
In the __init__.py:
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ['celery_app']
In the celery.py:
from __future__ import absolute_import, unicode_literals
import os
import environ
from celery import Celery

env = environ.Env()
environ.Env.read_env()

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('C_FORCE_ROOT', 'true')
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'backend.settings')

app = Celery('backend', backend='rpc://', broker=env('broker'))

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self))
I have several shared_tasks in the tasks.py module that are like this:
@shared_task
def recalc_ranking_music():
    musics = musics_models.Music.objects.all().order_by("-num_points")
    # enumerate() yields (index, item) pairs, so unpack the rank first
    for rank, music in enumerate(musics):
        music.ranking = rank + 1
        music.save()
When I start Celery with the command celery -A backend worker -l info, the tasks I have in the tasks.py module don't get read by the Celery process, but the one in celery.py does.
The weird thing is that I have the exact same layout in another project and it works fine.
I have spent two days on this, and it is really taking some time away. Any help?
UPDATE (from comment): 'musics' is a Django app, so it has an __init__.py file in it.
I've also tried passing the app names to the autodiscover method on the Celery instance, without any luck.
If I set app.autodiscover_tasks(force=True) it throws an error:
django.core.exceptions.AppRegistryNotReady: Apps aren't loaded yet.
Does your musics directory have an __init__.py?
Also, try to specify the package name explicitly:
app.autodiscover_tasks(['musics'])
See the Celery documentation.
Your problem might be related to the scope in which you are running Celery (e.g. a virtual environment).
For instance, when I run celery like this:
celery -A demoproject worker --loglevel=info
It outputs just one task (the only one in my demo app):
[tasks]
. core.tasks.demo
but when I run it from the virtualenv, it results in this:
[tasks]
. core.tasks.demo
. myapp.tasks.demo_task
See? It has discovered a new app just because of the environment.
I found a solution: I just import the module (the Django app) in the celery.py file, and it reads all the tasks.
But this is not the behavior described in the Celery documentation.
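For reference, the workaround looks roughly like this (a sketch; the musics app name is taken from the question):

# backend/celery.py, after app.autodiscover_tasks()
import musics.tasks  # noqa: F401 -- importing the module registers its tasks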

Celery task not registering in django database

I am working on a Celery beat task. The task is working fine (running properly at the scheduled time), but I am not able to see the task in my admin page, and all Celery-related tables are empty in my PostgreSQL database (e.g. django_celery_beat_periodictask).
What am I missing here?
Requirements
Django==1.9.7
python==3.5
celery==4.1.0
django-celery-beat==1.0.1
Project Tree
advocate
└── drive
    ├── celery.py
    └── tasks.py
celery.py
from __future__ import absolute_import, unicode_literals
from celery import Celery
from celery.schedules import crontab
import os

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'advocate.settings')

# app = Celery('drive', broker='redis://localhost:6379/0')
app = Celery('drive', broker='redis://localhost:6379/0', include=['drive.tasks'])
app.control.purge()
app.config_from_object('django.conf:settings', namespace='CELERY')

# crontab for test
app.conf.beat_schedule = {
    'Emails-Every-Mon_Wed_Fri': {
        'task': 'drive.tasks.weekly_task',
        'schedule': crontab(minute='*/5'),
    },
}
app.conf.timezone = 'UTC'

# Optional configuration, see the application user guide.
app.conf.update(
    result_expires=3600,
)

app.autodiscover_tasks()

if __name__ == '__main__':
    app.start()
Run Command Used
celery -A drive worker -B --loglevel=debug -S django
You need to import the app in the proj/proj/__init__.py module. This ensures that the app is loaded when Django starts.
__init__.py
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ['celery_app']
docs

Why can't Celery daemon see tasks?

I have a Django 1.62 application running on Debian 7.8 with Nginx 1.2.1 as my proxy server and Gunicorn 19.1.1 as my application server. I've installed Celery 3.1.7 and RabbitMQ 2.8.4 to handle asynchronous tasks. I'm able to start a Celery worker as a daemon but whenever I try to run the test "add" task as shown in the Celery docs, I get the following error:
Received unregistered task of type u'apps.photos.tasks.add'.
The message has been ignored and discarded.
Traceback (most recent call last):
File "/home/swing/venv/swing/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 455, in on_task_received
strategies[name](message, body,
KeyError: u'apps.photos.tasks.add'
All of my configuration files are kept in a "conf" directory that sits just below my "myproj" project directory. The "add" task is in apps/photos/tasks.py.
myproj
├── apps
│   └── photos
│       ├── __init__.py
│       └── tasks.py
├── conf
│   ├── celeryconfig.py
│   ├── celeryconfig.pyc
│   ├── celery.py
│   ├── __init__.py
│   ├── middleware.py
│   ├── settings
│   │   ├── base.py
│   │   ├── dev.py
│   │   ├── __init__.py
│   │   └── prod.py
│   ├── urls.py
│   └── wsgi.py
Here is the tasks file:
# apps/photos/tasks.py
from __future__ import absolute_import
from conf.celery import app

@app.task
def add(x, y):
    return x + y
Here are my Celery application and configuration files:
# conf/celery.py
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
from conf import celeryconfig

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'conf.settings')

app = Celery('conf')
app.config_from_object(celeryconfig)
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

# conf/celeryconfig.py
BROKER_URL = 'amqp://guest@localhost:5672//'
CELERY_RESULT_BACKEND = 'amqp'
CELERY_ACCEPT_CONTENT = ['json', ]
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
This is my Celery daemon config file. I commented out CELERY_APP because I've found that the Celery daemon won't even start if I uncomment it. I also found that I need to add the "--config" argument to CELERYD_OPTS in order for the daemon to start. I created a non-privileged "celery" user who can write to the log and pid files.
# /etc/default/celeryd
CELERYD_NODES="worker1"
CELERYD_LOG_LEVEL="DEBUG"
CELERY_BIN="/home/myproj/venv/myproj/bin/celery"
#CELERY_APP="conf"
CELERYD_CHDIR="/www/myproj/"
CELERYD_OPTS="--time-limit=300 --concurrency=8 --config=celeryconfig"
CELERYD_LOG_FILE="/var/log/celery/%N.log"
CELERYD_PID_FILE="/var/run/celery/%N.pid"
CELERYD_USER="celery"
CELERYD_GROUP="celery"
CELERY_CREATE_DIRS=1
I can see from the log file that when I run the command, "sudo service celeryd start", Celery starts without any errors. However, if I open the Python shell and run the following commands, I'll see the error I described at the beginning.
$ python shell
In [] from apps.photos.tasks import add
In [] result = add.delay(2, 2)
What's interesting is that if I examine Celery's registered tasks object, the task is listed:
In [] import celery
In [] celery.registry.tasks
Out [] {'celery.chain': ..., 'apps.photos.tasks.add': <@task: apps.photos.tasks.add of conf:0x16454d0> ...}
Other similar questions here have discussed having a PYTHONPATH environment variable and I don't have such a variable. I've never understood how to set PYTHONPATH and this project has been running just fine for over a year without it.
I should also add that my production settings file is conf/settings/prod.py. It imports all of my base (tier-independent) settings from base.py and adds some extra production-dependent settings.
Can anyone tell me what I'm doing wrong? I've been struggling with this problem for three days now.
Thanks!
Looks like it is happening due to a relative import error.
>>> from project.myapp.tasks import mytask
>>> mytask.name
'project.myapp.tasks.mytask'
>>> from myapp.tasks import mytask
>>> mytask.name
'myapp.tasks.mytask'
If you’re using relative imports you should set the name explicitly.
@task(name='proj.tasks.add')
def add(x, y):
    return x + y
Check out: http://celery.readthedocs.org/en/latest/userguide/tasks.html#automatic-naming-and-relative-imports
I'm using Celery 4.0.2 and Django, and I created a celery user and group for use with celeryd, and had this same problem. The command-line version worked fine, but celeryd was not registering the tasks. It was NOT a relative naming problem.
The solution was to add the celery user to the group that can access the django project. In my case, this group is www-data with read, execute, and no write.
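For instance (a sketch, assuming the project files are group-owned by www-data as described above): sudo usermod -a -G www-data celery, then restart the celeryd service so the worker picks up the new group membership.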

Celery autodiscover_tasks not working for all Django 1.7 apps

I have a Django 1.7 project with Celery 3.1. All the apps in my Django project work with the new AppConfig. The problem is that not all the tasks are found with autodiscover_tasks:
app.autodiscover_tasks(settings.INSTALLED_APPS)
If I use autodiscover_tasks like this it will work:
app.autodiscover_tasks(settings.INSTALLED_APPS + ('apps.core','apps.sales'))
The tasks defined in websites are found but the tasks in core and sales are not. All have the same layout with apps.py and tasks.py.
The project folder structure is:
apps
├── core
│   ├── apps.py
│   └── tasks.py
├── dashboard
│   └── apps.py
├── sales
│   ├── apps.py
│   └── tasks.py
└── websites
    ├── apps.py
    └── tasks.py
The class definitions are as follows:
class WebsitesConfig(AppConfig):
    name = 'apps.websites'
    verbose_name = 'Websites'

class SalesConfig(AppConfig):
    name = 'apps.sales'
    verbose_name = 'Sales'
This is discussed in a number of Celery issues, such as #2596 and #2597.
If you are using Celery 3.x, the fix is to use:
from django.apps import apps
app.autodiscover_tasks(lambda: [n.name for n in apps.get_app_configs()])
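Using apps.get_app_configs() yields each app's actual package name even when INSTALLED_APPS lists an AppConfig class path, and passing a callable defers the lookup until Django's app registry is ready.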
As mentioned in #3341, if you are using Celery 4.x (soon to be released) you can use:
app.autodiscover_tasks()
I just had this problem because of a misconfigured virtual environment.
If an installed app has a dependency missing from the virtual environment in which you're running celery, then the installed app's tasks will not be auto discovered. This hit me as I was moving from running my web server and celery on the same machine to a distributed solution. A bad build resulted in different environment files on different nodes.
I added the dependencies that were missing then restarted the celery service.
I had to add this to the module where my celery app was defined:
from __future__ import absolute_import
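(On Python 2, without absolute_import, a project module named celery.py can shadow the celery library itself, so from celery import Celery would import the wrong thing.)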
My problem was the wrong path to celery.py in the celery command.
The command should be something like this:
celery worker -A <project_name.celery_app_module> -l info
where the celery_app_module is like this:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# Set default Django settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', '<project_name>.settings')

app = Celery('<project_name>', include=['<app_name>.tasks'])
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
