I'm trying to connect a Django Rest Framework backend to RabbitMQ using Celery. I have a few micro-services that use RabbitMQ as a message bus between the backend and those services. When the backend calls a task that pushes a message to the micro-services' message bus, the message is never put into the queue for one of those services to pick up. The queues I declare in my tasks.py are being created in RabbitMQ but are not being used by any of the producers.
testcelery/celery.py
...
from celery import Celery
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "testcelery.settings")
app = Celery("testcelery")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
testcelery/settings.py
...
INSTALLED_APPS = [
...
"backend_processing"
]
CELERY_BROKER_URL = "amqp://testuser@rabbitmq_host:5672"
CELERY_ACCEPT_CONTENT = ["application/json"]
CELERY_TASK_SERIALIZER = "json"
backend_processing/tasks.py
...
from celery import shared_task
from testcelery.celery import app as celery_app
from kombu import Exchange, Queue
test_queue = Queue("test_queue", Exchange("test"), "test", durable=True)
@shared_task
def run_processing(data, producer=None):
    with celery_app.producer_or_acquire(producer) as producer:
        producer.publish(data,
                         declare=[test_queue],
                         retry=True,
                         exchange=test_queue.exchange,
                         routing_key=test_queue.routing_key,
                         delivery_mode="persistent")
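For context, here is a minimal sketch of how the task might be triggered from the Django side (the call site below is hypothetical, not taken from the actual project):
# hypothetical call site, e.g. inside a DRF view
from backend_processing.tasks import run_processing

def notify_microservices(payload):
    # queues run_processing on a Celery worker, which then publishes
    # the payload to test_queue's exchange
    run_processing.delay(payload)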
What do I need to modify in my configuration to allow Celery to publish to queues other than the ones defined in settings.py?
I'm new to Celery and was following tutorials to set it up with Docker.
I'm having an issue with sending and executing Celery tasks.
I have 4 Docker containers: one for the RabbitMQ server, one for the Celery producer, and 2 workers.
Celery tasks file:
"""
CELERY MAIN FILE
"""
from celery import Celery
from time import sleep
celery_obj = Celery()
celery_obj.config_from_object('celery_config')  # config file we created in same folder

@celery_obj.task
def add(num1, num2):
    print("executing add function")
    sleep(5)
    return num1 + num2
My celery config file for Producer:
"""
CELERY CONFIGURATION FILE
"""
from kombu import Exchange, Queue
broker_url = "pyamqp://rabbitmq_user:123@172.17.0.2/res_opt_rabbitmq_vhost"
result_backend = 'rpc://'
#celery_result_backend = ""
celery_imports = ('res_opt_code.tasks')
task_queues = (
    Queue('worker_A_kombu_queue', Exchange('celery', type='direct'), routing_key='worker_A_rabbitmq_queue'),
    Queue('worker_B_kombu_queue', Exchange('celery', type='direct'), routing_key='worker_B_rabbitmq_queue')
)
Config file for worker_A:
"""
CELERY CONFIGURATION FILE
"""
from kombu import Exchange, Queue
broker_url = "pyamqp://rabbitmq_user:123@172.17.0.2/res_opt_rabbitmq_vhost"
result_backend = 'rpc://'
#celery_result_backend = ""
celery_imports = ('worker_code.tasks')
task_queues = (
    Queue('worker_A_kombu_queue', Exchange('celery', type='direct'), routing_key='worker_A_rabbitmq_queue'),
    Queue('worker_B_kombu_queue', Exchange('celery', type='direct'), routing_key='worker_B_rabbitmq_queue')
)
Command for starting Celery on the producer:
celery -A tasks worker --loglevel=DEBUG -f log_file.txt
Command for starting Celery on the worker:
celery -A tasks worker -n celery_worker_A -Q worker_A_kombu_queue --loglevel=DEBUG
Function call from producer:
from tasks import add
add.apply_async([4, 4], routing_key='worker_A_rabbitmq_queue')
# also tried calling the task directly, but there are no logs from it and the result stays PENDING
add.delay(4, 4)
Could you please help me figure out what I'm doing wrong here?
In the logs I can see that worker_A is connected, but there are no logs for the function.
I tried further troubleshooting and changed the argument in apply_async from routing_key to queue, and it works with the queue argument.
I was following this tutorial:
https://www.youtube.com/watch?v=TM1a3m65zaA
old:
add.apply_async([4, 4], routing_key='worker_A_rabbitmq_queue')
new:
add.apply_async([4, 4], queue='worker_A_rabbitmq_queue')
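If you would rather keep plain apply_async/delay calls without passing queue= at every call site, the routing can also be declared once in the config. A sketch based on the queue names above (the task name 'tasks.add' is an assumption derived from the module layout shown):
# celery_config.py (sketch) -- route the task by name to worker A's queue
task_routes = {
    'tasks.add': {'queue': 'worker_A_kombu_queue'},
}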
We want to use Celery to listen to an SQS queue and process its events into tasks.
This is the celeryconfig.py file
from kombu import (
    Exchange,
    Queue
)
broker_transport = 'sqs'
broker_transport_options = {'region': 'us-east-1'}
worker_concurrency = 10
accept_content = ['application/json']
result_serializer = 'json'
content_encoding = 'utf-8'
task_serializer = 'json'
worker_enable_remote_control = False
worker_send_task_events = True
result_backend = None
task_queues = (
    Queue('re.fifo', exchange=Exchange('consume', type='direct'), routing_key='consume'),
)
task_routes = {'consume': {'queue': 're.fifo'}}
And this is the celery.py file:
from celery.utils.log import get_task_logger
from celery import Celery
app = Celery(__name__)
logger = get_task_logger(__name__)
@app.task(routing_key='consume', name="consume", bind=True, acks_late=True, ignore_result=True)
def consume(self, msg):
    print('Message received')
    logger.info('Message received')
    # DO SOMETHING WITH THE RECEIVED MESSAGE
    # print('this is the new message', msg)
    return True
We are pushing an event onto SQS using the AWS CLI:
aws --endpoint-url http://localhost:9324 sqs send-message --queue-url http://localhost:9324/queue/re.fifo --message-group-id owais --message-deduplication-id test18 --message-body {\"test\":\"test\"}
The event is received by the Celery worker, but our consume task is never called, and we want it to be called.
How can we call the consume task for events coming from SQS? Any help would be appreciated.
The message you pushed to SQS with the AWS CLI won't be recognized by the Celery worker, because it is not in Celery's task message format. You need to call consume.delay(msg) to push messages to SQS; then your worker will be able to recognize them.
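For example, any producer process that shares the same Celery configuration can enqueue the task itself (a minimal sketch; the payload is illustrative):
# producer.py (sketch) -- enqueue the task through Celery instead of sending a raw SQS message
from celery import Celery

producer_app = Celery(__name__)
producer_app.config_from_object('celeryconfig')

# 'consume' is the task name registered above; Celery wraps the payload
# in its own message format so the worker can dispatch it to the task
producer_app.send_task('consume', args=[{'test': 'test'}], queue='re.fifo')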
The system is running a Django server (1.11.5) with Celery (4.0.0) and RabbitMQ as the broker.
It is necessary to send some tasks to a remote server to be processed there. This new server will have its own RabbitMQ installation to use as its broker. The problem is that, on the server where Django is running, we need to select which tasks keep running on the local machine and which are sent to the new server.
For architectural reasons it is not possible to solve this using queues; the tasks must be sent to the new broker.
Is it possible to create two different Celery apps in Django, each one pointing to a different broker and with its own tasks? How can it be done?
You can create two Celery apps and rename celery.py to celery_app.py to avoid the automatic import.
from celery import Celery
app1 = Celery('hello', broker='amqp://guest@localhost//')

@app1.task
def hello1():
    return 'hello world from local'
and
from celery import Celery
app2 = Celery('hello', broker='amqp://guest@remote//')

@app2.task
def hello2():
    return 'hello world from remote'
and for shared task:
from celery import shared_task
@shared_task
def add(x, y):
    return x + y
When you run your celery worker node:
celery --app=PACKAGE.celery_app:app worker
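The calling code then picks the broker simply by using the task bound to the corresponding app; a sketch (the module paths below are illustrative, matching the renamed celery_app.py files):
# sketch: each task is bound to its own app, so it is published to that app's broker
from local_project.celery_app import hello1    # app1, local RabbitMQ
from remote_project.celery_app import hello2   # app2, remote RabbitMQ

hello1.delay()   # goes to amqp://guest@localhost//
hello2.delay()   # goes to amqp://guest@remote//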
I'm running a Celery worker in Python with the celery module v3.1.25, and a Celery client in node.js with the node-celery npm package v0.2.7 (not the latest).
The Python Celery worker works fine when sending a job using a Python Celery client.
Problem: When using a node-celery client to send a task to the Celery backend, we get an error in the JS console:
(STDERR) Celery should be configured with json serializer
Python Celery worker is configured with:
app = Celery('tasks',
             broker='amqp://test:test@192.168.1.26:5672//',
             backend='amqp://',
             task_serializer='json',
             include=['proj.tasks'])
node-celery client is configured with:
var celery = require('node-celery')
var client = celery.createClient({
    CELERY_BROKER_URL: 'amqp://test:test@192.168.1.26:5672//',
    CELERY_RESULT_BACKEND: 'amqp',
    CELERY_TASK_SERIALIZER: "json"
});
client.on('connect', function() {
    console.log('connected');
    client.call('proj.tasks.getPriceEstimates', [start_latitude, start_longitude],
        function(result) {
            console.log('result: ', result);
            client.end();
        })
});
Is this a problem with the configuration on the Python Celery worker? Did we miss a configuration parameter that can change the return serialization format to JSON?
Update
Updated with the result_serializer and accept_content parameters as suggested by ChillarAnand:
from __future__ import absolute_import, unicode_literals
from celery import Celery
app = Celery('tasks',
             broker='amqp://test:test@192.168.1.26:5672//',
             backend='amqp://',
             task_serializer='json',
             result_serializer='json',
             accept_content=['application/json'],
             include=['proj.tasks'])
But the node.js Celery client still thinks that it's not in JSON, throwing the same error message.
It gives that error because the results were in the form of 'application/x-python-serialize'.
I checked this to be the case, as the RabbitMQ management console shows the results with content_type: application/x-python-serialize.
This forum post says that it is because the tasks were created before the configs were loaded.
Here is how my files look:
proj/celery.py
from __future__ import absolute_import, unicode_literals
from celery import Celery
app = Celery('tasks',
             broker='amqp://test:test@192.168.1.26:5672//',
             backend='amqp://',
             task_serializer='json',
             result_serializer='json',
             accept_content=['application/json'],
             include=['proj.tasks'])
proj/tasks.py
from __future__ import absolute_import, unicode_literals
from .celery import app
@app.task
def myTask():
    ...
    return ...
Is there a better way to structure the code to ensure that the configs are loaded before the tasks?
When configuring a serializer, you should specify the accepted content types as well as the task and result serializers.
app = Celery(
    broker='amqp://guest@localhost//',
    backend='amqp://',
    include=['proj.tasks'],
    task_serializer='json',
    result_serializer='json',
    accept_content=['application/json'],
)
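Equivalently, the same options can be applied after the app object is created via app.conf.update; a minimal sketch using the broker from the question:
# proj/celery.py (sketch) -- apply serializer settings through app.conf.update
from celery import Celery

app = Celery('tasks',
             broker='amqp://test:test@192.168.1.26:5672//',
             backend='amqp://',
             include=['proj.tasks'])

app.conf.update(
    task_serializer='json',
    result_serializer='json',
    accept_content=['application/json'],
)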
The Celery docs say that Celery 3.1 can work with Django out of the box, but my tasks are not working. I have tasks.py:
from celery import task
from datetime import timedelta
@task.periodic_task(run_every=timedelta(seconds=20), ignore_result=True)
def disable_not_confirmed_users():
    print "start"
Configs:
from kombu import Exchange, Queue
CELERY_SEND_TASK_ERROR_EMAILS = True
BROKER_URL = 'amqp://guest@localhost//'
CELERY_DEFAULT_QUEUE = 'project-queue'
CELERY_DEFAULT_EXCHANGE = 'project-queue'
CELERY_DEFAULT_ROUTING_KEY = 'project-queue'
CELERY_QUEUES = (
    Queue('project-queue', Exchange('project-queue'), routing_key='project-queue'),
)
project/celery.py
from __future__ import absolute_import
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')
from django.conf import settings
app = Celery('project')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
Run celery: celery -A project worker --loglevel=INFO
But nothing happens.
You should use celery beat to run periodic tasks.
celery -A project worker --loglevel=INFO
starts the worker, which does the actual work.
celery -A proj beat
starts the beat service, which asks the worker to do the job.
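For local development, the beat scheduler can also be embedded in the worker process with the -B flag (not recommended for production), so a single command runs both:
celery -A project worker -B --loglevel=INFO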