My application doesn't call my Celery task.
My settings.py:
CLICKHOUSE_HOST = '127.0.0.1'
CLICKHOUSE_PORT = '6379'
CELERY_BROKER_URL = 'redis://' + CLICKHOUSE_HOST + ':' + CLICKHOUSE_PORT + '/0'
CELERY_BROKER_TRANSPORT_OPTIONS = {'visibility_timeout': 60}
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = "Australia/Tasmania"
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60
CELERY_RESULT_BACKEND = 'django-db'
CELERY_CACHE_BACKEND = 'django-cache'
Here is __init__.py in the referal_app directory:
from __future__ import absolute_import
from .celery import app as celery_app
__all__ = ('celery_app',)
Here is my view, which should call the Celery task:
class DefineView(View):
    def get(self, request):
        create_points.delay(1, 2)
        return render(request, 'main/homepage.html', {})
Here is my task (just a test to check that it works):
from celery import shared_task
from referal_app.celery import app

@shared_task
def create_points(a, b):
    with open('e:\\txt.txt', 'w') as file:
        for _ in range(1):
            file.write(f'{a + b}')
    return 1
Here is celery.py in the referal_app directory:
from __future__ import absolute_import
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'referal_app.settings')

app = Celery('referal_app')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')
When I launch Celery, I can't see the function being executed.
Here is the output of the celery -A referal_app worker -l INFO command:
[tasks]
. main.tasks.create_points
. referal_app.celery.debug_task
[2021-05-28 19:42:52,266: INFO/MainProcess] celery@DESKTOP-LM7RKH1 ready.
[2021-05-28 19:42:52,745: INFO/MainProcess] Received task: referal_app.celery.debug_task[e42e42e2-a3d2-4721-ae37-a35abfe31ca6]
[2021-05-28 19:42:52,747: INFO/MainProcess] Received task: main.tasks.create_points[22d35a11-de3e-4b68-8777-7e4b8756c3bd]
[2021-05-28 19:42:53,870: INFO/MainProcess] Received task: main.tasks.create_points[2211846f-7be7-49c1-9628-d4806fbd4bc0]
[2021-05-28 19:42:53,872: INFO/MainProcess] Received task: referal_app.celery.debug_task[181a599b-5b3b-4e7d-96f7-03f5a7a7d2f0]
[2021-05-28 19:42:54,329: INFO/SpawnPoolWorker-11] child process 11056 calling self.run()
[2021-05-28 19:42:54,337: INFO/SpawnPoolWorker-12] child process 11676 calling self.run()
[2021-05-28 19:42:54,338: INFO/SpawnPoolWorker-9] child process 6560 calling self.run()
[2021-05-28 19:42:54,342: INFO/SpawnPoolWorker-10] child process 16992 calling self.run()
You need to fire up the Celery worker process.
Run this command to test Celery (run it from the Django project directory):
celery -A <django_project> worker -l info
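To double-check that the worker actually picks tasks up, you can also enqueue one from the Django shell (python manage.py shell). This is only a minimal sketch based on the create_points task from the question above; the main.tasks import path is inferred from the task name shown in the worker log.
from main.tasks import create_points

# enqueue the task on the broker; the running worker should log and execute it
result = create_points.delay(1, 2)
print(result.id)               # the task id that appears in the worker log
print(result.get(timeout=10))  # waits for the result backend to return 1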
If everything works, you are ready to run Celery in production.
For that I suggest using Supervisor; let's install it.
Install Supervisor:
sudo apt-get install supervisor
Now create a config file:
sudo nano /etc/supervisor/conf.d/<app>.conf
Here is an example (I use Miniconda here):
[program:celery]
command=/home/<user>/miniconda3/envs/venv/bin/celery -A <django_project> worker -l info
directory=<django_project>
user=<user>
group=www-data
autostart=true
autorestart=true
stdout_logfile=/<log_dir>/log/access.log
redirect_stderr=true
numprocs=3
priority=998
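One detail worth flagging (my addition, not part of the config above): with numprocs=3, Supervisor also expects a process_name pattern that includes the process number, otherwise it refuses to load the program. For example:
process_name=%(program_name)s_%(process_num)02d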
Finally, run these commands to update Supervisor:
sudo supervisorctl reread
sudo supervisorctl update
Run the following commands to stop, start, or check the status of the Celery program (the program name here is celery):
sudo supervisorctl stop celery
sudo supervisorctl start celery
sudo supervisorctl status celery
Related
I'm running Celery on Windows with this command:
celery -A project worker --concurrency=1 --loglevel=info -E -P gevent -Ofair
but it only executes one task at a time, even though I tried adding some configuration:
import os
from celery import Celery
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings")
app = Celery("django_celery")
app.conf.worker_prefetch_multiplier = 1
app.conf.task_acks_late = True
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
I've no idea what I'm doing wrong.
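One thing I'm not sure about is the order of the calls; a variant I'm considering, with the per-app overrides applied after config_from_object() so the Django settings can't reset them (and keeping in mind that --concurrency=1 by itself limits the gevent pool to one task at a time):
import os
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings")

app = Celery("django_celery")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()

# overrides applied only after config_from_object(), so the loaded
# Django settings no longer replace them
app.conf.worker_prefetch_multiplier = 1
app.conf.task_acks_late = True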
I'm running Celery in a Flask microservice that has tasks.py with the tasks, and manage.py contains the call to run the Flask server.
This is part of manage.py:
class CeleryWorker(Command):
    """Starts the celery worker."""
    name = 'celery'
    capture_all_args = True

    def run(self, argv):
        if "down" in argv:
            ret = subprocess.call(
                ['pkill', '-9', '-f', "my_app.celery"])
            sys.exit(ret)
        else:
            ret = subprocess.call(
                ['celery', 'worker', '-A', 'my_app.celery'] + argv)
            sys.exit(ret)

manager.add_command("celery", CeleryWorker())
I can start the service with either python manage.py runserver or celery worker -A my_app.celery, and it runs perfectly and registers all the tasks in tasks.py.
But in production I want to handle multiple requests to this microservice and want to add Gunicorn to serve those requests. How do I do it?
I'm not able to figure out how to run both my Gunicorn command and my Celery command together.
Also, I'm running other API services with Gunicorn in production from their create_app, since I don't need them to run the Celery command.
I recommend using Supervisor, which allows you to control a number of processes.
Step 1: pip install supervisor
Step 2: vi supervisord.conf
[program:flask_wsgi]
command=gunicorn -w 3 --worker-class gevent wsgi:app
directory=$SRC_PATH
autostart=true
[program:celery]
command=celery worker -A app.celery --loglevel=info
directory=$SRC_PATH
autostart=true
Step 3: run supervisord -c supervisord.conf
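For the Gunicorn line above, wsgi:app has to resolve to a Flask application object. A minimal wsgi.py sketch, assuming the project exposes the create_app factory mentioned in the question (the my_app module name is a guess on my part):
# wsgi.py -- minimal sketch; "my_app" and the factory location are assumptions
from my_app import create_app

# "gunicorn -w 3 --worker-class gevent wsgi:app" imports this module
# and serves the "app" object defined here
app = create_app()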
I have a Django application that I've deployed with Heroku. I'm trying to use Celery to create a periodic task every minute. However, when I observe the logs for the worker using the following command:
heroku logs -t -p worker
I don't see my task being executed. Perhaps there is a step I'm missing? This is my configuration below...
Procfile
web: gunicorn activiist.wsgi --log-file -
worker: celery worker --app=trending.tasks.app
Tasks.py
import celery
app = celery.Celery('activiist')
import os
from celery.schedules import crontab
from celery.task import periodic_task
from django.conf import settings
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
app.conf.update(BROKER_URL=os.environ['REDIS_URL'],
                CELERY_RESULT_BACKEND=os.environ['REDIS_URL'])
os.environ['DJANGO_SETTINGS_MODULE'] = 'activiist.settings'
from trending.views import *

@periodic_task(run_every=crontab())
def add():
    getarticles(30)
One thing to add: when I run the task from the Python shell with delay(), the task does indeed run (it shows in the logs) -- but it only runs once, and only when explicitly called.
You need a separate process for beat (which is responsible for scheduling the periodic tasks):
web: gunicorn activiist.wsgi --log-file -
worker: celery worker --app=trending.tasks.app
beat: celery beat --app=trending.tasks.app
The worker isn't necessary just to schedule the periodic tasks, so that line can be omitted if you don't need it. The other possibility is to embed beat inside the worker:
web: gunicorn activiist.wsgi --log-file -
worker: celery worker --app=trending.tasks.app -B
but to quote the Celery documentation:
You can also embed beat inside the worker by enabling the worker's -B option, this is convenient if you'll never run more than one worker node, but it's not commonly used and for that reason is not recommended for production use
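As a side note, on newer Celery versions (4 and later) the celery.task.periodic_task decorator used in the question is deprecated, and roughly the same schedule can be declared on the app instead. A sketch, reusing the task path implied by the question's Procfile:
from celery.schedules import crontab

# crontab() with no arguments fires every minute
app.conf.beat_schedule = {
    'add-every-minute': {
        'task': 'trending.tasks.add',
        'schedule': crontab(),
    },
}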
I have a task:
class BasecrmSync(PeriodicTask):
    run_every = schedules.crontab(minute='*/1')

    def run(self, **kwargs):
        bc = basecrm.Client(access_token=settings.BASECRM_AUTH_TOKEN)
        sync = basecrm.Sync(client=bc, device_uuid=settings.BASECRM_DEVICE_UUID)
        sync.fetch(synchronize)
And a Celery config with a database broker:
CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend'
BROKER_URL = 'django://'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
I run:
celery -A renuval_api worker -B --loglevel=debug
But it doesn't run the task.
I've also tried running it with:
python3 manage.py celery worker --loglevel=DEBUG -E -B -c 1 --settings=renuval_api.settings.local
But it uses the amqp transport, and I can't understand why.
I run a separate process for the beat function itself. I could never get periodic tasks to fire otherwise. Of course, I may have this completely wrong, but it works for me and has for some time.
For example, I have the celery worker with its app running in one process:
celery worker --app=celeryapp:app -l info --logfile="/var/log/celery/worker.log"
And I have the beat pointed to the same app in its own process:
celery --app=celeryapp:app beat
They point at the same app and settings: beat fires off the task, and the worker picks it up and runs it. This app is in the same code tree as my Django apps, but the processes do not run inside Django. Perhaps you could run something like:
python3 manage.py celery beat --loglevel=DEBUG -E -B -c 1 --settings=renuval_api.settings.local
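Or, skipping manage.py entirely, run the worker and beat as two plain Celery commands against the project app; the -S flag selects the DatabaseScheduler from the question's config (a sketch, I haven't verified it against that exact setup):
celery -A renuval_api worker --loglevel=debug
celery -A renuval_api beat --loglevel=debug -S djcelery.schedulers.DatabaseScheduler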
I hope that helps.
I have this in my Celery configuration:
BROKER_URL = 'redis://127.0.0.1:6379'
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379'
Yet whenever I run celeryd, I get this error:
consumer: Cannot connect to amqp://guest@127.0.0.1:5672//: [Errno 111] Connection refused. Trying again in 2.00 seconds...
Why is it not connecting to the Redis broker I set it up with, which is running, by the way?
Import your Celery app and add your broker like this:
celery = Celery('task', broker='redis://127.0.0.1:6379')
celery.config_from_object('celeryconfig')
This code belongs in celery.py.
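celeryconfig here is a separate config module next to celery.py; a minimal sketch of what it could contain, simply moving the settings from the question into it:
# celeryconfig.py -- sketch; these are the two settings from the question
BROKER_URL = 'redis://127.0.0.1:6379'
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379'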
If you followed the First Steps with Celery tutorial, specifically:
app.config_from_object('django.conf:settings', namespace='CELERY')
then you need to prefix your settings with CELERY_, so change your BROKER_URL to:
CELERY_BROKER_URL = 'redis://127.0.0.1:6379'
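With that namespace, every Celery-related option in settings.py needs the CELERY_ prefix; a minimal sketch combining it with the result backend from the question:
# settings.py -- with namespace='CELERY', options are read from CELERY_* names
CELERY_BROKER_URL = 'redis://127.0.0.1:6379'
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379'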
I got this error because I was starting my Celery worker incorrectly in the terminal.
I was running:
celery -A celery worker
But because I defined celery inside web/server.py, I needed to run:
celery -A web.server.celery worker
web.server indicates that my celery object lives in a file called server.py inside the web directory. Running the latter command connected to the broker I specified.
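For context, a minimal sketch of the layout that makes this work; the broker URL and the ping task are assumptions, only the file location and the "celery" name come from the description above:
# web/server.py -- sketch; only the location and the "celery" attribute match the answer
from celery import Celery

celery = Celery('server', broker='redis://127.0.0.1:6379/0')

@celery.task
def ping():
    return 'pong'

# "celery -A web.server.celery worker" resolves the "celery" attribute of the
# web.server module and uses the broker configured on it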