Run Celery and aiohttp in the same service - Python

My goal is to send live notifications to the user.
The messages arrive from a Celery worker
and are sent to the user by aiohttp through SockJS.
How can I run both in the same app? Or, alternatively, how can I receive the messages on the aiohttp instance, where I keep the data of the authenticated users in memory?
What is the best approach to achieve that?
I have tried running them together using aiohttp's on_startup hook, but the Celery worker blocks the main thread, so that does not work:
async def run_celery(app):
    ...  # start the Celery worker here (this call blocks the event loop)

app = web.Application(loop=asyncio.get_event_loop())
app.on_startup.append(run_celery)
sockjs.add_endpoint(app, msg_handler, name='messeging', prefix='/sockjs/')
Thank you very much.
Shay
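One pattern that fits this setup (a sketch, not taken from the question or an accepted answer) is to keep the Celery worker in its own process and bridge it to the aiohttp instance over a Redis pub/sub channel that a background task subscribes to on startup. The channel name, broker URL, the use of redis-py's asyncio client, and the sockjs.get_manager()/broadcast() calls are assumptions here:

    import asyncio
    import json

    import redis.asyncio as aioredis   # assumes redis-py >= 4.2
    import sockjs
    from aiohttp import web

    NOTIFY_CHANNEL = 'notifications'   # hypothetical channel name

    async def msg_handler(msg, session):
        pass   # your existing SockJS handler goes here

    async def listen_for_notifications(app):
        # Forward messages published by Celery tasks to connected SockJS clients.
        r = aioredis.from_url('redis://localhost')
        pubsub = r.pubsub()
        await pubsub.subscribe(NOTIFY_CHANNEL)
        manager = sockjs.get_manager('messeging', app)
        async for message in pubsub.listen():
            if message['type'] != 'message':
                continue
            payload = json.loads(message['data'])
            # Broadcast for brevity; with the in-memory user data you would
            # look up the target user's session instead.
            manager.broadcast(payload['text'])

    async def start_background_tasks(app):
        app['notifier'] = asyncio.ensure_future(listen_for_notifications(app))

    async def cleanup_background_tasks(app):
        app['notifier'].cancel()

    app = web.Application()
    sockjs.add_endpoint(app, msg_handler, name='messeging', prefix='/sockjs/')
    app.on_startup.append(start_background_tasks)
    app.on_cleanup.append(cleanup_background_tasks)

    # On the Celery side (a separate process), the task would publish to the
    # same channel, e.g.:
    #   redis.Redis().publish('notifications',
    #                         json.dumps({'user_id': uid, 'text': text}))

This keeps the worker's blocking loop out of the aiohttp process entirely; only the Redis connection is shared between the two.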

Related

How do I get a Celery worker to consume an 'outside' RabbitMQ queue?

I have the following scripts:
celery_tasks.py
from celery import Celery

app = Celery(broker='amqp://guest:guest@localhost:5672//')
app.conf.task_default_queue = 'test_queue'

@app.task(acks_late=True)
def test(a):
    return a
publish.py
from celery_tasks import test
test.delay('abc')
When I run publish.py and start the worker (celery -A celery_tasks worker --loglevel=DEBUG), the 'abc' content is published to 'test_queue' and is consumed by the worker.
Is there a way for the worker to consume something from a queue that was not posted by Celery? For example, when I put something into test_queue directly through RabbitMQ, without going through the Celery publisher, and run the Celery worker, it gives me the following warning:
WARNING/MainProcess] Received and deleted unknown message. Wrong destination?!?
The full contents of the message body was: body: 'abc' (3b)
{content_type:None content_encoding:None
delivery_info:{'exchange': '', 'redelivered': False, 'delivery_tag': 1, 'consumer_tag': 'None2', 'routing_key': 'test_queue'} headers={}}
Is there a way to solve this?
Celery messages have a specific format and a set of headers that must be present, so you would have to reverse engineer that format to produce a Celery-compliant message outside of Celery.
Keep in mind that Celery is not really made to send plain messages across the broker, but to send tasks, which are enhanced messages and therefore carry extra information in the header part of the AMQP message.
It's a late answer, but custom consumers might help you. I'm using this to consume messages from RabbitMQ that are published by another app with Pika:
http://docs.celeryproject.org/en/latest/userguide/extending.html#custom-message-consumers
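For reference, the custom message consumer from that page boils down to a bootstep that attaches an extra Kombu consumer to the worker; the queue and exchange names below are placeholders:

    from celery import Celery, bootsteps
    from kombu import Consumer, Exchange, Queue

    my_queue = Queue('custom', Exchange('custom'), 'custom')

    app = Celery(broker='amqp://guest:guest@localhost:5672//')

    class MyConsumerStep(bootsteps.ConsumerStep):

        def get_consumers(self, channel):
            return [Consumer(channel,
                             queues=[my_queue],
                             callbacks=[self.handle_message],
                             accept=['json'])]

        def handle_message(self, body, message):
            # Called for every message on the custom queue, whoever sent it.
            print('Received message: {0!r}'.format(body))
            message.ack()

    app.steps['consumer'].add(MyConsumerStep)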

Django `python manage.py runserver` does not support asyncio&aiohttp

In my Django app I need to proxy a request from the user to other servers, and I use the asyncio/aiohttp client:
#user->request
.....
loop = asyncio.get_event_loop()
future = asyncio.ensure_future(self.run(t1, t2, t3))
loop.run_until_complete(future)
......
# response
When my Django server is started with python manage.py runserver, the following error occurs when the user makes a request:
RuntimeError: There is no current event loop in thread 'Thread-1'.
But when I start it with Gunicorn, everything is OK.
Maybe I should use new_event_loop?
Why is there no problem with Gunicorn?
Try the following:
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
But using aiohttp in synchronous Django will not increase its speed unless you are sending a lot of requests in a single view. If you do, it is better to move that work to a worker (e.g. Celery) or to use aiohttp for the server as well instead of Django.
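Applied inside a synchronous Django view, the suggested fix looks roughly like this; the view name, target URLs, and response shape are placeholders:

    import asyncio

    import aiohttp
    from django.http import JsonResponse

    async def fetch(session, url):
        async with session.get(url) as resp:
            return await resp.text()

    async def fetch_all(urls):
        async with aiohttp.ClientSession() as session:
            return await asyncio.gather(*(fetch(session, u) for u in urls))

    def proxy_view(request):
        # runserver handles each request in a worker thread that has no event
        # loop set, so create one for this thread before running coroutines.
        try:
            loop = asyncio.get_event_loop()
        except RuntimeError:
            loop = asyncio.new_event_loop()
            asyncio.set_event_loop(loop)
        bodies = loop.run_until_complete(fetch_all([
            'https://example.com/a',
            'https://example.com/b',
        ]))
        return JsonResponse({'responses': bodies})

Gunicorn's sync workers run each request in the worker's main thread, which already has a default event loop, which is why the error only shows up under runserver's threaded handling.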

How to provide user constant notification about Celery's Task execution status?

I integrated Celery into my project in the following way. Inside views.py, after receiving a request from the user:
def upload(request):
    if "POST" == request.method:
        # save the file
        task_parse.delay()
    # continue
and in tasks.py
from __future__ import absolute_import
from celery import shared_task
from uploadapp.main import aunit

@shared_task
def task_parse():
    aunit()
    return True
In short, the shared task runs the function aunit() from a third Python file, main.py, located in the uploadapp/ directory.
Let's assume that aunit() is a resource-heavy process that takes time (like file parsing). Since I integrated it with Celery, it now runs completely asynchronously, which is good for me. So the task starts, Celery processes it, and when it finishes Celery sets the status to SUCCESS. I can view that using Flower.
But what I want is to also notify the user through the Django UI that their task is done processing as soon as Celery has finished processing at the back end and has set the status to SUCCESS.
Now, I know this is possible if:
1.) I constantly request the status and check whether it returns SUCCESS or not.
But how do I do that via Celery? How can I query the Celery task status from my views.py and notify the user asynchronously with just Celery's Python module?
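For reference, option 1 (polling) maps onto Celery's AsyncResult API: keep the task id that .delay() returns and expose a small status view for the frontend to poll. A minimal sketch; the view names, session key, and JSON shape are placeholders, and the import path assumes tasks.py lives in uploadapp:

    from celery.result import AsyncResult
    from django.http import JsonResponse

    from uploadapp.tasks import task_parse   # adjust to where tasks.py lives

    def upload(request):
        if request.method == "POST":
            # save the file ...
            result = task_parse.delay()
            request.session['task_id'] = result.id   # remember the task id
        return JsonResponse({'queued': True})

    def task_status(request):
        task_id = request.session.get('task_id')
        state = AsyncResult(task_id).state if task_id else 'UNKNOWN'
        # The frontend polls this endpoint until state == 'SUCCESS'.
        return JsonResponse({'state': state})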
You need a real-time mechanism. I would suggest Firebase: update the user's field in the Firebase Realtime Database with a boolean True at the end of the Celery task, and implement a JavaScript listener on that user_id object so the UI updates when it changes.
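A rough sketch of the Celery side of that suggestion, assuming the firebase_admin SDK and a Realtime Database; the credential path, database URL, and per-user path are placeholders:

    import firebase_admin
    from firebase_admin import credentials, db
    from celery import shared_task

    from uploadapp.main import aunit

    cred = credentials.Certificate('path/to/serviceAccountKey.json')
    firebase_admin.initialize_app(cred, {
        'databaseURL': 'https://your-project.firebaseio.com',
    })

    @shared_task
    def task_parse(user_id):
        aunit()
        # Flip a per-user flag; the frontend's Firebase listener reacts to it.
        db.reference('users/{}/task_done'.format(user_id)).set(True)
        return True

On the frontend, a Firebase listener on that same path (e.g. ref('users/<id>/task_done').on('value', ...)) would then update the UI as soon as the flag flips.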

Celery have task wait for completion of same task called previously with shared argument

I am currently trying to set up Celery to handle responses from a chatbot and forward those responses to a user.
The chatbot hits the /response endpoint of my server, that triggers the following function in my server.py module:
def handle_response(user_id, message):
    """Endpoint to handle the response from the chatbot."""
    tasks.send_message_to_user.apply_async(args=[user_id, message])
    return ('OK', 200,
            {'Content-Type': 'application/json; charset=utf-8'})
In my tasks.py file, I import Celery and create the send_message_to_user task:
from celery import Celery

celery_app = Celery('tasks', broker='redis://')

@celery_app.task(name='send_message_to_user')
def send_message_to_user(user_id, message):
    """Send the message to a user."""
    # Here is the logic to send the message to a specific user
My problem is that my chatbot may answer a user with multiple messages, so the send_message_to_user tasks are properly put in the queue, but then a race condition arises and the messages sometimes arrive at the user in the wrong order.
How could I make each send_message_to_user task wait for the previous task with the same name and the same user_id argument before executing?
I have looked at this thread, Running "unique" tasks with celery, but a lock isn't my solution, as I don't want to implement ugly retries for when the lock is released.
Does anyone have an idea how to solve this issue in a clean(-ish) way?
Also, it's my first post here so I'm open to any suggestions to improve my request.
Thanks!
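One clean-ish option (not from this thread, a sketch under assumptions): route all send_message_to_user tasks to a dedicated queue and consume that queue with a single worker process and a prefetch of 1, so the tasks execute strictly in enqueue order. The queue name is a placeholder:

    from celery import Celery

    celery_app = Celery('tasks', broker='redis://')

    # Route the message-sending task to its own queue.
    celery_app.conf.task_routes = {
        'send_message_to_user': {'queue': 'user_messages'},
    }
    # Fetch one task at a time so ordering is preserved end to end.
    celery_app.conf.worker_prefetch_multiplier = 1

    # Start a dedicated single-process consumer for that queue:
    #   celery -A tasks worker -Q user_messages --concurrency=1

This serialises messages across all users; if you need per-user ordering with more parallelism, you would have to look at per-user queues or chaining tasks instead.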

Have Celery broadcast return results from all workers

Is there a way to get the results from every worker for a Celery broadcast task? I would like to monitor whether everything went OK on all the workers. A list of the workers the task was sent to would also be appreciated.
No, that is not easily possible.
But you don't have to limit yourself to the built-in AMQP result backend; you can send your own results using Kombu (http://kombu.readthedocs.org), the messaging library used by Celery:
from celery import Celery
from kombu import Exchange

results_exchange = Exchange('myres', type='fanout')

app = Celery()

@app.task(ignore_result=True)
def something():
    res = do_something()
    with app.producer_or_acquire(block=True) as producer:
        producer.publish(
            {'result': res},
            exchange=results_exchange,
            serializer='json',
            declare=[results_exchange],
        )
producer_or_acquire will create a new kombu.Producer using the Celery broker connection pool.
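The collecting side is not shown in the answer; a rough sketch with Kombu would bind a queue to the fanout exchange and drain replies for a while. The queue name and timeout below are placeholders:

    import socket

    from kombu import Connection, Exchange, Queue

    results_exchange = Exchange('myres', type='fanout')

    def collect_results(broker_url='amqp://', timeout=5):
        received = []

        def on_message(body, message):
            received.append(body)
            message.ack()

        with Connection(broker_url) as conn:
            queue = Queue('myres.collector', exchange=results_exchange)
            with conn.Consumer(queue, callbacks=[on_message], accept=['json']):
                try:
                    while True:
                        conn.drain_events(timeout=timeout)
                except socket.timeout:
                    pass   # no more replies within the window
        return received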
