How can I set up Celery on a Flask application? - python

I'm trying to set up Celery on my Flask application. To do so, I followed this example.
So here is what I did:
main.py
from flask import Flask
from celery import Celery

app = Flask(__name__)

# Celery configuration
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
app.config['CELERY_RESULT_BACKEND'] = 'redis://localhost:6379/0'

# Initialize Celery
celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])
celery.conf.update(app.config)
The problem is that, whenever I try to start the celery process, I get the following error:
celery -A app.celery worker
Error:
Unable to load celery application.
The module app was not found.
Can anyone help me out on this?
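A hedged note, since no answer is shown here: the -A argument names the Python module celery should import, and the snippet above lives in main.py, not app.py, so there is no app module to find. Pointing celery at main should resolve the import error:

celery -A main.celery worker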

Related

Celery executing only one task at a time

I'm running celery on Windows through this command:
celery -A project worker --concurrency=1 --loglevel=info -E -P gevent -Ofair
but it only executes one task at a time, even though I tried adding some configuration:
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings")

app = Celery("django_celery")
app.conf.worker_prefetch_multiplier = 1
app.conf.task_acks_late = True
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
I've no idea what I'm doing wrong
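Two hedged observations, going only by the command and snippet above: --concurrency=1 explicitly caps the worker at a single gevent greenlet, so one task at a time is exactly what was requested; and the two app.conf assignments are made before config_from_object(), which resets the configuration and discards them. Moving those assignments after the config_from_object() call and raising the pool size (the value 10 below is arbitrary) should allow parallel execution:

celery -A project worker --concurrency=10 --loglevel=info -E -P gevent -Ofair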

How to start remote celery workers from django

I'm trying to use Django in combination with Celery.
That's how I came across autodiscover_tasks(), and I'm not fully sure how to use it. The celery workers get tasks added by other applications (in this case a Node backend).
So far I used this to start the worker:
celery worker -Q extraction --hostname=extraction_worker
which works fine.
Now I'm not sure what the general idea of the django-celery integration is. Should workers still be started externally (e.g. with the command above), or should they be managed and started from the Django application?
My celery.py looks like:
import os

from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'main.settings')

app = Celery('app')
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
then I have 2 apps containing a tasks.py file with:

from celery import shared_task

@shared_task
def extraction(total):
    return 'Task executed'
How can I now get Django to register the worker for those tasks?
You just start the worker process as documented; you don't need to register anything else.
In a production environment you’ll want to run the worker in the background as a daemon - see Daemonization - but for testing and development it is useful to be able to start a worker instance by using the celery worker command, much as you’d use Django’s manage.py runserver:

celery -A proj worker -l info

For a complete listing of the command-line options available, use the help command:

celery help
The celery worker collects and registers tasks when it starts, and it also consumes the tasks it finds.
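Applied to the snippets above (a sketch, assuming the celery.py shown lives in the main package alongside main/settings.py), the queue-specific worker from the question would be started the same way, just with the new-style -A flag:

celery -A main.celery worker -Q extraction --hostname=extraction_worker -l info

The @shared_task functions in each app's tasks.py are picked up by autodiscover_tasks() when the worker boots; nothing else needs registering.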

Flask Error: Unable to load celery application

Please help me get out of this problem. I am getting this error when I run:
celery -A app.celery worker --loglevel=info
Error:
Unable to load celery application.
The module app.celery was not found.
My code is--
# Celery Configuration
from celery import Celery
from app import app

print("App Name =", app.import_name)

celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])
celery.conf.update(app.config)

@celery.task
def download_content():
    return "hello"
Directory structure: newYoutube/app/auth/routes.py, and this function is present inside routes.py; auth is a blueprint.
When invoking celery via
celery -A app.celery ...
celery will look for the name celery in the app namespace, expecting it to hold an instance of Celery. If you put that elsewhere (say, in app.auth.routes), then celery won't find it.
I have a working example you can crib from at https://github.com/davewsmith/flask-celery-starter
Or, refer to chapter 22 of the Flask Mega Tutorial, which uses rq instead of celery, but the general approach to structuring the code is similar.
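A minimal sketch of a layout that satisfies that lookup, assuming app is a package (the placement here is illustrative, not taken from the question):

# app/__init__.py
from flask import Flask
from celery import Celery

app = Flask(__name__)
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'

# Importable as app.celery, which is exactly what -A app.celery asks for
celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])
celery.conf.update(app.config)

With the instance living there, the original command finds it:

celery -A app.celery worker --loglevel=info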

Celery tries to connect to the wrong broker

I have this in my celery configuration:
BROKER_URL = 'redis://127.0.0.1:6379'
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379'
Yet whenever I run celeryd, I get this error:

consumer: Cannot connect to amqp://guest@127.0.0.1:5672//: [Errno 111] Connection refused. Trying again in 2.00 seconds...

Why is it not connecting to the Redis broker I set it up with, which is running, by the way?
Import your Celery app and pass the broker explicitly, like this:

celery = Celery('task', broker='redis://127.0.0.1:6379')
celery.config_from_object(celeryconfig)

This code belongs in celery.py.
If you followed the First Steps with Celery tutorial, specifically:

app.config_from_object('django.conf:settings', namespace='CELERY')

then you need to prefix your settings with CELERY_, so change your BROKER_URL to:
CELERY_BROKER_URL = 'redis://127.0.0.1:6379'
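To make the pairing concrete, a sketch assuming the tutorial's Django layout (the file names are the conventional ones, not taken from the question):

# celery.py - loads every Django setting that starts with CELERY_
app.config_from_object('django.conf:settings', namespace='CELERY')

# settings.py - note the prefix on each name
CELERY_BROKER_URL = 'redis://127.0.0.1:6379'
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379'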
I got this error because I was starting my celery worker incorrectly on the terminal.
I was running:
celery -A celery worker
But because I defined celery inside of web/server.py, I needed to run:
celery -A web.server.celery worker
web.server indicates that my celery object is in a file server.py inside a directory web. Running the latter command connected to the broker I specified!

Celery Worker ImportError: No module named app-core

From app/app-core, I am trying to start a celery worker using the command:
celery worker --app=app-core -l info
However, I get the error:
ImportError: No module named app-core
I am following along with this tutorial and also tried replicating the project exactly in a separate folder named proj. First time using celery... what am I missing?
Also, here is my celery.py file:
from __future__ import absolute_import

from celery import Celery

celery = Celery('proj.celery',
                broker='amqp://',
                backend='amqp://',
                include=['proj.tasks'])

# Optional configuration, see the application user guide.
celery.conf.update(
    CELERY_TASK_RESULT_EXPIRES=3600,
)

if __name__ == '__main__':
    celery.start()
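A hedged observation, since no answer appears above: Python module names cannot contain hyphens, so --app=app-core can never be imported, wherever the worker is started from. Renaming the directory to something importable (say, app_core), or pointing at the proj package that the celery.py above already references, should get past the ImportError:

celery worker --app=proj -l info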
