Server Push with SocketIO from Celery Task - python

I have a Flask application within which I have many long-running asynchronous tasks (~hours). It's important that the state of these tasks is communicated to the client.
I use Celery to manage the background task queue, and I'm currently trying to broadcast updates to the client from each background task via SocketIO. Is this possible? Is there a better-suited strategy for achieving what I would like?

You did not say, but I assume you plan on using Flask-SocketIO to handle the server-side SocketIO and not the official Node.js server, correct?
What you want to do can be done, but with the current version of Flask-SocketIO there is a problem: the process that hosts the Flask and Flask-SocketIO server owns the socket connections with the clients, so it is the only process that can communicate with them. At this time, Flask-SocketIO does not offer any help in sending data to clients from other processes such as Celery workers; this part you have to implement yourself. Specifically for Celery, you can have your long-running tasks expose progress information that the server process can pick up and send to the clients.
I am currently working on improvements to Flask-SocketIO that will enable any process to send messages to connected clients using a Redis pub/sub backend for communication to the Flask-SocketIO server. Once this work is completed you will be able to write data to any client transparently from your Celery process.
You also ask if there is an alternative. Consider having the client poll the server for status. If the updates do not need to be very frequent, then this option is going to be much easier to implement: the client asks the server for the status of a given task, and the server in turn asks the Celery task. I showed this approach in my Flask+Celery blog article.
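As a minimal sketch of that polling pattern (the broker URL and the do_chunk_of_work helper are assumptions for illustration, not code from the article): the task publishes its progress through Celery's result backend with update_state, and a Flask endpoint reads it back with AsyncResult.

    from celery import Celery
    from flask import Flask, jsonify

    app = Flask(__name__)
    celery = Celery(app.name, broker='redis://localhost:6379/0',
                    backend='redis://localhost:6379/0')

    @celery.task(bind=True)
    def long_task(self):
        total = 100
        for i in range(total):
            do_chunk_of_work(i)  # hypothetical unit of work
            # Expose progress via the result backend so the web
            # process can pick it up.
            self.update_state(state='PROGRESS',
                              meta={'current': i + 1, 'total': total})
        return {'current': total, 'total': total}

    @app.route('/status/<task_id>')
    def task_status(task_id):
        # The client polls this endpoint with the task id it
        # received when the task was launched.
        result = long_task.AsyncResult(task_id)
        meta = result.info if isinstance(result.info, dict) else {}
        return jsonify(state=result.state, **meta)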

I was able to solve this by creating an endpoint on the Flask server. See my answer here for details.

Related

Long polling scalable architecture in tornado/cyclone

I want to implement long polling in Python using Cyclone or Tornado, with the scalability of the service in mind from the beginning. Clients might stay connected to this service for hours. My concept:
Client HTTP requests will be processed by multiple tornado/cyclone handler threads behind an NGINX proxy (serving as a load balancer). There will be multiple data queues for requests: one for all unprocessed requests from all clients, and the rest of the queues containing responses specific to each connected client, previously generated by worker processes. When requests are delivered to the tornado/cyclone handler threads, the request data will be sent to the worker queue for processing and then processed by workers (which connect to the database etc.). Meanwhile, the tornado/cyclone handler thread will look into the client-specific queue and send a response with data back to the client (if one is waiting in the queue). Please see the diagram.
Simple diagram: https://i.stack.imgur.com/9ZxcA.png
I am considering a queue system because some requests might be pretty heavy on the database, and some requests might create notifications and messages for other clients. Is this the way to go towards a scalable server, or is it just overkill?
After doing some research I have decided to go with Tornado websockets connected to ZeroMQ, inspired by this answer: Scaling WebSockets with a Message Queue.
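As a rough sketch of that design (the endpoint address and message framing are assumptions), pyzmq's tornado integration lets one process hold the websocket connections and relay whatever the workers publish on a ZeroMQ SUB socket:

    import zmq
    from zmq.eventloop.zmqstream import ZMQStream
    import tornado.ioloop
    import tornado.web
    import tornado.websocket

    clients = set()

    class PushHandler(tornado.websocket.WebSocketHandler):
        def open(self):
            clients.add(self)

        def on_close(self):
            clients.discard(self)

    def on_worker_message(frames):
        # Broadcast for brevity; a real version would inspect the
        # frames for a client id and route to that client only.
        for client in clients:
            client.write_message(frames[-1], binary=True)

    if __name__ == '__main__':
        sub = zmq.Context.instance().socket(zmq.SUB)
        sub.connect('tcp://127.0.0.1:5556')  # hypothetical worker endpoint
        sub.setsockopt(zmq.SUBSCRIBE, b'')
        ZMQStream(sub).on_recv(on_worker_message)

        tornado.web.Application([(r'/ws', PushHandler)]).listen(8888)
        tornado.ioloop.IOLoop.current().start()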

Sync data with Local Computer Architecture

The scenario is
I have multiple local computers running a Python application. These are on separate networks waiting for data to be sent to them from a web server. These computers are on networks without a static IP and are generally behind a firewall and proxy.
On the other hand I have a web server which gets updates from the user through a form and sends the update to the correct local computer.
Question
What options do I have to enable this? Currently I am sending CSV files over FTP to achieve this, but it is not real time.
The application is built in Python, using Django for the web part.
Appreciate your help
Use a REST API. Then you can post information to your Django app over HTTP, using an authentication key if necessary.
http://www.django-rest-framework.org/ should help you get started quickly
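A hedged sketch of that idea (the Update model and its fields are made up for illustration): since the local machines sit behind firewalls, they would poll the API over outbound HTTP rather than accept inbound connections, replacing the CSV-over-FTP step.

    from rest_framework import serializers, viewsets
    from myapp.models import Update  # hypothetical model of pending updates

    class UpdateSerializer(serializers.ModelSerializer):
        class Meta:
            model = Update
            fields = ('id', 'target_machine', 'payload', 'created')

    class UpdateViewSet(viewsets.ModelViewSet):
        # The web form POSTs new updates here; each local machine
        # polls GET /updates/?machine=<name> for updates addressed to it.
        serializer_class = UpdateSerializer

        def get_queryset(self):
            qs = Update.objects.all()
            machine = self.request.query_params.get('machine')
            if machine:
                qs = qs.filter(target_machine=machine)
            return qs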
Sounds like you need a message queue.
You would run a separate broker server which is sent tasks by your web app. This could be on the same machine. On your local machines you would run queue workers which connect to the broker to receive tasks (so no inbound connection is required), then notify the broker in real time when they are complete.
Examples are RabbitMQ and Oracle Tuxedo. What you choose will depend on your platform & software.
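For example, the worker side with RabbitMQ and pika 1.x might look roughly like this (the broker host, queue name, and apply_update handler are assumptions); the local machine dials out to the broker, so no static IP or inbound firewall rule is needed:

    import json
    import pika

    connection = pika.BlockingConnection(
        pika.ConnectionParameters(host='broker.example.com'))
    channel = connection.channel()
    channel.queue_declare(queue='site-42', durable=True)

    def handle_update(ch, method, properties, body):
        apply_update(json.loads(body))  # hypothetical local handler
        # Acknowledge so the broker knows, in real time, that the
        # task was completed.
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue='site-42', on_message_callback=handle_update)
    channel.start_consuming()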

Sending Message to user/group of users with uwsgi websockets

Recently I've been doing a lot of testing around different ways of serving our Django application. I've settled on uwsgi as it seems to fit our needs pretty well.
I've recently discovered that uwsgi also supports WebSockets and started looking into it and found some examples: https://github.com/unbit/uwsgi/blob/master/tests/
After running the example (websockets_chat.py) and taking a look through uwsgi's documentation for their websockets implementation, it appears as though you can only send broadcast, or global, messages.
Has anyone managed to find a way to transmit a message to a particular user or does uwsgi not support that level of communication yet?
Cheers
There is nothing like broadcast or global messages in the WebSockets spec. It only "upgrades" an HTTP connection to a lower-level one; what you do with that connection is up to you. The examples show integration with Redis as a message exchanger, but you are free to make other uses of it.
For your specific case you will need to build a shared list of connected users and implement routing. Remember, you cannot rely on the node.js way of doing things, as it is based on a single-threaded setup, so everything there is way simpler. In uWSGI a websocket connection can happen in a thread, a process or a coroutine, so exchanging data between them is the key.
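One possible sketch of such routing, assuming uwsgi's gevent loop (run with --gevent and --gevent-monkey-patch) and Redis pub/sub with one channel per user; taking the user name from the query string is an assumption for illustration:

    import gevent
    import redis
    import uwsgi

    def application(env, start_response):
        uwsgi.websocket_handshake(env['HTTP_SEC_WEBSOCKET_KEY'],
                                  env.get('HTTP_ORIGIN', ''))
        # Hypothetical: the client connects as /?user=alice
        user = (env.get('QUERY_STRING') or 'user=anonymous').split('=', 1)[-1]

        pubsub = redis.StrictRedis().pubsub()
        pubsub.subscribe('user:%s' % user)  # one channel per connected user

        def keepalive():
            # websocket_recv also services ping/pong, keeping the
            # connection alive while we block on Redis below.
            while True:
                uwsgi.websocket_recv()

        gevent.spawn(keepalive)

        # Relay anything published on this user's channel down the socket.
        # To message alice from any process:
        #   redis.StrictRedis().publish('user:alice', '...')
        for message in pubsub.listen():
            if message['type'] == 'message':
                uwsgi.websocket_send(message['data'])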

Using celery to implement a continuously running daemon

I'm developing a system that has two components - a Django web application and a monitor process.
The monitor process connects to a TCP/IP server somewhere and receives event indications using a proprietary protocol. It can also send commands to the server, again, with a proprietary protocol.
I want to run the monitor as a daemon. It will connect to the server and continuously monitor the incoming events. Whenever an event arrives, the monitor will update the database. The web application will get the current state from the database.
I'm a bit stuck with sending commands, though. I will need the Django web-app to somehow communicate with the monitor service. I can use Pyro, as recommended here, but I'd much rather use Celery, as I'm already using it for other parts of the system.
How can I get a Celery worker to both manage the monitor connection and serve Celery requests?
Note: Due to limitations of the proprietary protocol, I can't open two connections to the server, and so can't have one process monitor the event and another sending commands.
If you really want to use Celery for this use case, I suggest defining a separate queue, i.e. server_monitor, and routing all server-monitor tasks to it. Then, to avoid having multiple connections to the server, run the worker with -c 1 (a single worker process).
Also, since you want to be able to process both server monitor events and regular Celery requests, tell the new worker to serve both queues with -Q celery,server_monitor. This way the worker will serve both types of requests, but beware: if your celery queue is under heavy traffic, it might take a long time to process a request from the server_monitor queue.
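A sketch of that setup, assuming Celery 4-style configuration and a hypothetical server_connection() helper that lazily opens and caches the single allowed connection to the proprietary server:

    from celery import Celery

    app = Celery('monitor', broker='redis://localhost:6379/0')

    # Route all monitor tasks to their own queue.
    app.conf.task_routes = {
        'monitor.send_command': {'queue': 'server_monitor'},
    }

    @app.task(name='monitor.send_command')
    def send_command(command):
        conn = server_connection()  # hypothetical cached connection
        return conn.send(command)

The worker would then be started with celery -A monitor worker -Q celery,server_monitor -c 1, so that one process serves both queues over the single connection.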

Websockets behind nginx triggered by zeromq?

I'm trying to design a system that will process large amounts of data and send updates to the client about its progress. I'd like to use nginx (which, thankfully, just started supporting websockets) and uwsgi for the web server, and I'm passing messages through the system with zeromq. Ideally the solution could be written in Python, but I'm also open to a Nodejs or even a Go solution.
Here is the flow that I'd like to achieve:
Client visits a website and requests that a large amount of data be processed.
The server farms out the processing to another process/server [the worker] via zeromq, and replies to the client request explaining that processing has begun, including information about how to set up a websocket with the server.
The client sets up the websocket connection and waits for updates.
When the processing is done, the worker sends a "processing done!" message to the websocket process via zeromq, and the websocket process pushes the message down to the client.
Is what I describe possible? I was thinking that I could run uwsgi in emperor mode so that it can handle one process (port) for the web server and another for the websocket process. I'm just not sure if I can find a way to both receive zeromq messages and manage websocket connections from the same process. Maybe I have to initiate the final websocket push from the worker?
Any help/correct-direction-pointing/potential-solutions would be much appreciated. Any sample or snippet of an nginx config file with websockets properly routed would be appreciated as well.
Thanks!
Sure, that should be possible. You might want to look at zerogw.
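For the nginx side of the question, a minimal proxy block for websocket upgrades (nginx 1.3.13 or later; the backend address is an assumption) looks roughly like this:

    location /ws/ {
        proxy_pass http://127.0.0.1:8081;   # the websocket process
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_read_timeout 86400;           # keep long-lived connections open
    }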
