Realtime options (Websockets, flash, polling) for Django? - python

What are some realtime "push" options for Django that can be installed as a Python package? I want to avoid having to do things like installing independent web servers for realtime.
Essentially I am looking for something like pusher.com (a cloud service) or this socket.io build for Django (which has a build status of failing) for chat and various other push operations.
Ape was suggested here, but it seems it requires you to set up Ape as a server. If it's not too much to ask for, are there any solutions that build right into Django?

Since the time the answer was written (2012), a lot has changed.
The preferred method now for realtime updates is websockets, which have been standardized as RFC 6455. This page on MDN has a great overview of the technology.
The other emerging technology is Server-Sent Events, which is a W3C draft proposal.
Projects such as swampdragon and django-socketio make integrating realtime functionality easier in your project.
There are two main components to any realtime system:
- A connection that remains open from the browser to a server.
- A server that listens on this connection and then responds.
- A system / standard to store and be notified of messages.
Okay so maybe three components.
Since Django doesn't work in realtime, any solution that offers realtime push/updates will require another server/service to accept messages and then notify listeners of pending messages.
Django would be the application that pushes (writes) messages to this server on a channel (a queue/bucket). Listeners then subscribe to a channel to be notified of messages. Since the connection remains open, messages are retrieved in "realtime".
Django really has a minimal role in all of this. There are various implementations that provide the three components necessary for realtime notifications to work.
I really like juggernaut because it is super simple to set up and uses node.js, so it doesn't require a lot in terms of server-side components. The other reason I prefer it is because it supports Adobe Flash Socket in addition to WebSocket (and others, see the link).
The API to access it is also very simple - in fact, if you are already using redis (which you really should, since it's so easy to use), you don't need another API, as you can drop messages into redis and juggernaut will read them, or you can use its Python API. A simple example from this flask snippet:
Send (write) a message to a channel:
>>> from juggernaut import Juggernaut
>>> jug = Juggernaut()
>>> jug.publish('channel', 'The message')
Listen to it:
<script type="text/javascript"
        src="http://localhost:8080/application.js"></script>
<script type="text/javascript">
  var jug = new Juggernaut();
  jug.subscribe('channel', function(data) {
    alert('Got message: ' + data);
  });
</script>

Django is built to serve web pages and there is nothing out of the box to support websockets in Django. The quickest/easiest option is pusher.com (I use it and really like it). You can start with something like pusher.com and, if you write a quick wrapper around it, you can later replace it with your own server using socket.io or any other websocket server by just changing the wrapper / interface that connects to the new server. Just make sure you write it so you can switch out the backend at any time.
If you really want to run your own socket server, there are projects out there that will make it easy to use sockets in Django:
django-websocket
django-socketio

You can actually serve up Django from Tornadio2, a working implementation of socket.io in Tornado. If you want to build any degree of sophistication into your realtime app, you will likely need a redis pub/sub backend that maps sessions to channels and handles multicasting. For this you might like to take a look at Brukva. Also read Yuval Adam's blog post on this subject. Finally, Tony Abou Assaleh's sample package and post provide a useful base reference when setting up tornadio2 for Django.
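As a rough illustration (not taken from any of the linked projects; the project name, port, and exact wiring are assumptions based on the TornadIO2 examples), serving a Django WSGI app from Tornado next to a TornadIO2 socket.io endpoint can look something like this:

import os
from tornado import web, wsgi
from tornadio2 import SocketConnection, TornadioRouter, SocketServer
from django.core.wsgi import get_wsgi_application

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')  # hypothetical project name

class ChatConnection(SocketConnection):
    participants = set()

    def on_open(self, info):
        self.participants.add(self)

    def on_message(self, message):
        # Naive broadcast; a real app would map sessions to redis channels here.
        for p in self.participants:
            p.send(message)

    def on_close(self):
        self.participants.discard(self)

router = TornadioRouter(ChatConnection)
django_app = wsgi.WSGIContainer(get_wsgi_application())

# socket.io routes first, then a catch-all that hands everything else to Django.
application = web.Application(
    router.urls + [(r'.*', web.FallbackHandler, dict(fallback=django_app))],
    socket_io_port=8001,
)

if __name__ == '__main__':
    SocketServer(application)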

Related

Client-Server framework for python

I'm currently working on a University project that needs to be implemented with a Client - Server model.
I have had experiences in the past where I managed the communication at the socket level, and that really sucked.
I was wondering if someone could suggest an easy to use python framework that I can use for that purpose.
I don't know what kind of details you may need to answer so I'm just going to describe the project briefly.
- Communication should happen over HTTP, possibly HTTPS.
- The server does not need to send data back or invoke methods on the clients; it just collects data.
- Many clients send data concurrently to the server, which needs to distinguish the sender, process the data accordingly, and put the result in a database.
You can use something like Flask or Django. Both frameworks are fairly easy to work with; Flask is much easier than Django IMO, although Django has a built-in authentication layer that you can use, albeit one that is more difficult to apply in a client/server scenario like yours.
I would personally use Flask and JWT (JSON Web Tokens), which lets you give a token to each client for authentication with the server and also lets you differentiate between clients, and you can use HTTPS for your SSL/TLS requirement. It is tons easier to implement this, and although I like Django better for what it brings to the table, it is probably overkill to have you learn it for a single assignment.
For Flask with SSL, here is a quick rundown of that.
For JWT with Flask, here is that.
You can use any database system you would like.
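To make that concrete, here is a minimal sketch of the Flask + JWT idea using the PyJWT package directly (the secret, the /data endpoint, and the client_id claim are all illustrative, not from a specific tutorial):

import jwt
from flask import Flask, request, jsonify

app = Flask(__name__)
SECRET_KEY = 'change-me'  # shared secret used to sign client tokens
# Tokens would be issued to each client out of band, e.g.
#     jwt.encode({'client_id': 'client-1'}, SECRET_KEY, algorithm='HS256')

@app.route('/data', methods=['POST'])
def collect_data():
    # Each client sends its issued token; decoding it tells us who sent the data.
    token = request.headers.get('Authorization', '')
    try:
        claims = jwt.decode(token, SECRET_KEY, algorithms=['HS256'])
    except jwt.InvalidTokenError:
        return jsonify({'error': 'invalid token'}), 401
    payload = request.get_json()
    # ... process payload and store it in the database, keyed by claims['client_id'] ...
    return jsonify({'status': 'ok', 'client': claims['client_id']})

if __name__ == '__main__':
    # ssl_context='adhoc' gives a throwaway self-signed cert for local HTTPS testing
    # (requires pyOpenSSL); use a real certificate in production.
    app.run(ssl_context='adhoc')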
If I understood you correctly, you can use any web framework in Python. For instance, you can use Flask (I use it and I like it). Django is also a popular choice among the Python web frameworks. However, you shouldn't be limited to only these two. There are plenty of them out there. Just google for them.
The implementation of the client depends on what kind of communication there will be between the clients and the server - I don't have enough details here. I only know it's unidirectional.
The client can be a browser accessing your web application written in Flask, where users send only POST requests to the server. However, even here the communication will be bidirectional (the clients need to open the page, which means the server sends the page back to the client), and this violates your initial requirement.
Alternatively, it can be a specific client written in Python sending particular requests to your server over HTTP/HTTPS. For instance, your client can use the requests package to send HTTP requests, as sketched below.
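For example, a bare-bones client could look like this (the URL, payload, and token header are placeholders):

import requests

resp = requests.post(
    'https://example.com/data',                 # your server's endpoint
    json={'sensor': 'temp', 'value': 21.5},     # whatever data this client collects
    headers={'Authorization': 'token-issued-to-this-client'},
    timeout=10,
)
resp.raise_for_status()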

Storage Backend based on Websockets

I have spent quite some time now researching server backends/APIs/frameworks. I need a solution where I can store user content (JSON & binary data).
The obvious choice would be a REST API. The only missing element is a push feature for when data on the server changes and clients should be notified instantly. With more research into this matter I discovered the classic approaches (Comet, push, Server-Sent Events, Bayeux, BOSH, ...) as well as the "new" league, websockets. I would definitely prefer the websocket approach, or using TCP sockets directly. But this post is not about the pros/cons of these two technologies, so please refrain from getting sidetracked in the comments.
At the moment the following projects exist which are very similar to my needs:
- Simperium (simperium.com): this looks very promising, but the core/server is sadly not open source, and god knows when, if ever, that will change
- Realtime.co (framework.realtime.co/storage): a hosted service, but the same principle
- Some frameworks for building servers, such as Atmosphere (Java, no WAMP), Cometd (Java, whose project page looks like it's stuck in the 90s), and Autobahn (Python, WAMP)
My current favorite is the Autobahn framework (autobahn.ws), especially using the WAMP protocol (a WebSocket subprotocol), as it offers exactly what I need. So the idea would be to build a Python backend/server with Autobahn Python (based on the Twisted framework) which manages all socket (WAMP) connections and includes a PostgreSQL database for data storage. WAMP client libraries already exist for all the clients I need. The server would need to be able to do the typical REST API things:
- Send, update, and delete requested data (JSON/binary) from/to server/clients
- Synchronization & automatic conflict management
- Offline handling when the connection breaks, with automatic reconnection once it is available again
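To make the idea a bit more concrete, here is a rough sketch of how I imagine the Autobahn side could look (the procedure and topic names are placeholders, the database write is omitted, and it assumes a WAMP router to connect to):

from autobahn.twisted.wamp import ApplicationSession, ApplicationRunner

class StorageBackend(ApplicationSession):

    def onJoin(self, details):
        def save(key, document):
            # ... write the JSON document to PostgreSQL here ...
            # then notify every subscribed client that this key changed
            self.publish('com.example.storage.changed', key)
            return 'ok'

        # Expose the save procedure so clients can call it over WAMP.
        return self.register(save, 'com.example.storage.save')

if __name__ == '__main__':
    runner = ApplicationRunner(url='ws://localhost:8080/ws', realm='realm1')
    runner.run(StorageBackend)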
So finally the questions:
- Have I missed an open source project which covers exactly my needs?
- If I wanted to develop my own server with Autobahn and a database, could you point me in the right direction? I have a lot of concerns and not enough in-depth understanding. I know Autobahn already gives you a server, but it is not very close to my final needs. How do I build the server efficiently so that it can handle all the connected sockets? How do I handle it when a client needs a server push? Are there schemas, models, or concepts for how such a server should look?
- Twisted is a very powerful Python framework but is not regarded as the most convenient for writing apps. But I guess a socket-based storage server with database access should be possible? If I run Twisted as a web resource and develop server components with another Python framework, would this compromise the latency/performance much?
- Is such a server backend with a lot of data storage (JSON fields and also binary data such as documents and images) reasonable to build with sockets by a single developer/small team, or is this something which only bigger companies like Dropbox can do at the moment?
Thank you very much for your help & time!
So finally the questions:
Have I missed an open source project which covers exactly my needs?
No, you've covered the open source projects. Open source only gets you about halfway there, though. Implementing a global realtime network requires equal parts implementation and operations. You have to think about dropped messages, retries, what happens if a particular geography gets hot, how you scale your servers, etc. I would argue that an open source solution won't achieve what you want unless you're willing to invest significant resources into operations. I would recommend a service like PubNub: http://pubnub.com
If I wanted to develop my own server with Autobahn and a database, could you point me in the right direction? I have a lot of concerns and not enough in-depth understanding. I know Autobahn already gives you a server, but it is not very close to my final needs. How do I build the server efficiently so that it can handle all the connected sockets? How do I handle it when a client needs a server push? Are there schemas, models, or concepts for how such a server should look?
A good database to back a realtime framework would be Cassandra because it supports high write volumes and handles time series data well: http://cassandra.apache.org/.
Twisted is a very powerful Python framework but is not regarded as the most convenient for writing apps. But I guess a socket-based storage server with database access should be possible? If I run Twisted as a web resource and develop server components with another Python framework, would this compromise the latency/performance much?
I would not use Twisted. I would use gevent: http://www.gevent.org/. It's coroutine-based, so you don't get into callback hell. To support more connections you just increase the size of the greenlet pool listening on the socket.
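A minimal sketch of that model with gevent's StreamServer (a plain echo handler; the port and pool size are arbitrary):

from gevent.pool import Pool
from gevent.server import StreamServer

def handle(sock, address):
    # Each connection gets its own greenlet; blocking reads only block this greenlet.
    f = sock.makefile(mode='rb')
    while True:
        line = f.readline()
        if not line:
            break
        sock.sendall(line)  # echo the line back

pool = Pool(10000)  # bounds the number of concurrent connections
server = StreamServer(('0.0.0.0', 9000), handle, spawn=pool)
server.serve_forever()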
Is such a server backend with a lot of data storage (JSON fields and also binary data such as documents and images) reasonable to build with sockets by a single developer/small team, or is this something which only bigger companies like Dropbox can do at the moment?
Again, I would not build this on your own. A service like PubNub (http://pubnub.com), which takes care of all the operational issues for you and has a clean API, would serve your needs with minimal cost. PubNub takes care of the protocol for you, so if you're on a mobile device that doesn't support WebSockets it will fall back to TCP, HTTP, or whatever the best transport is for the device.

Can push notifications be done with an AngularJS+Flask stack?

I have a Python/Flask backend and an Angular frontend for my website. On the backend there is a process that occasionally checks SQS for messages, and I want it to push a notification to the client, which can then in turn update an Angular controller. What is the best way to do this with my existing technologies?
To be able to push to the client, you'll have to implement web socket support in some fashion. If you want to keep it in python/flask, there is this tutorial on how to do that with gevent:
http://www.socketubs.org/2012/10/28/Websocket_with_flask_and_gevent.html
In that article, Geoffrey also mentions a SocketIO-compatible library for Python/gevent called "gevent-socketio", which may allow you to leverage the SocketIO client-side JS library.
That may reduce how much work you have to do in terms of cross-browser compatibility since SocketIO has done a lot of that already.
Here is a pretty good tutorial on how to use SocketIO in AngularJS so that you can notify the AngularJS model when an event comes in from SocketIO:
http://www.html5rocks.com/en/tutorials/frameworks/angular-websockets/
If you don't want to host the websocket backend yourself, you could look at a hosted service like PubNub or Pusher and then integrate it into AngularJS as a service. You can communicate with these services from your Python app (when the SQS notification happens) and they'll notify all connected clients for you.
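For example, a thin wrapper around the Pusher HTTP library could look roughly like this (constructor arguments vary between library versions, and the channel/event names are illustrative); the SQS-checking process calls notify_clients(), and the Angular side subscribes to the same channel:

import pusher

pusher_client = pusher.Pusher(app_id='...', key='...', secret='...', cluster='...')

def notify_clients(payload):
    # Called from the Flask process when an SQS message arrives.
    pusher_client.trigger('notifications', 'sqs-message', payload)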
I know this is a bit late, but I have done pretty much exactly what you ask for (though without Angular).
I ended up having a separate process running a websocket server called Autobahn, which listens to a redis pub/sub socket; the code is here on github.
This allows you to send push notifications to your clients from pretty much anything that can access redis.
So when I want to publish a message to all my connected clients I just use redis like this:
import redis

r = redis.Redis()
r.publish('broadcasts', "SOME MESSAGE")
This has worked fairly well so far. What I can't currently do is send a push notification to a specific client. But if you have an authentication system or something to identify a specific user, you could tie that to the open websockets and then be able to send messages directly to a specific client :-)
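For example (the channel naming and user_id here are purely hypothetical), you could publish to a per-user channel and have the websocket server subscribe each authenticated connection to its own channel:

# user_id would come from your authentication system
r.publish('user:%s' % user_id, 'A message just for this user')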
You could of course use any websocket server or client (like socket.io or sock.js), but this has worked great for me :-)

Need help understanding Comet in Python (with Django)

After spending two entire days on this I'm still finding it impossible to understand all the choices and configurations for Comet in Python. I've read all the answers here as well as every blog post I could find. It feels like I'm about to hemorrhage at this point, so my utmost apologies for anything wrong with this question.
I'm entirely new to all of this, all I've done before were simple non-real-time sites with a PHP/Django backend on Apache.
My goal is to create a real-time chat application; hopefully tied to Django for users, auth, templates, etc.
Every time I read about a tool it says I need another tool on top of it, it feels like a never-ending chain.
First of all, can anybody categorize all the tools needed for this job?
I've read about different servers, networking libraries, engines, JavaScripts for the client side, and I don't know what else. I never imagined it would be this complex.
Twisted / Twisted Web seems to be popular, but I have no idea how to integrate it or what else I need (guessing I need client-side JS at least).
If I understand correctly, Orbited is built on Twisted, do I need anything else with it?
Are Gevent and Eventlet in the same category as Twisted? How much else do I need with them?
Where do things like Celery, RabbitMQ, or KV stores like Redis come into this? I don't really understand the concept of a message queue. Are they essential and what service do they provide?
Are there any complete chat app tutorials I should look at?
I'll be entirely indebted to anybody who helps me past this mental roadblock, and if I left anything out please don't hesitate to ask. I know it's a pretty loaded question.
You could use Socket.IO. There are gevent and tornado handlers for it. See my blog post on gevent-socketio with Django here: http://codysoyland.com/2011/feb/6/evented-django-part-one-socketio-and-gevent/
I feel your pain, having had to go through the same research over the past few months. I haven't had time to put together proper documentation yet, but I have a working example of using Django with socket.io and tornadio at http://bitbucket.org/virtualcommons/vcweb. I was hoping to set up direct communication from the Django server side to the tornadio server process using queues (i.e., logic in a Django view pushes a message onto a queue that then gets handled by tornadio, which pushes a JSON-encoded version of that message out to all interested subscribers), but I haven't implemented that part fully yet. The way I've currently got it set up involves:
- An external tornado (tornadio) server, running on another port, accepting socket.io requests and working with Django models. The only writes this server process makes to the database are the chat messages that need to be stored. It has full access to all Django models, etc., and all real-time interactions need to go directly through this server process.
- Django template pages that require real-time access include the socket.io JavaScript and establish direct connections to the tornadio server.
I looked into orbited, hookbox, and gevent but decided to go with socket.io + tornado as it seemed to allow me the cleanest javascript + python code. I could be wrong about that though, having just started to learn Python/Django over the past year.
Redis is relevant as a persistence layer that also supports native publish/subscribe. So instead of a situation where you are polling the db looking for new messages, you can subscribe to a channel, and have messages pushed out to you.
I found a working example of the type of system you describe. The magic happens in the socketio view:
from gevent import Greenlet
from django.http import HttpResponse

# redis_client(), room_channel() and username() are small helpers defined
# elsewhere in the example project.

def socketio(request):
    """The socket.io view."""
    io = request.environ['socketio']
    redis_sub = redis_client().pubsub()
    user = username(request.user)

    # Subscribe to incoming pubsub messages from redis.
    def subscriber(io):
        redis_sub.subscribe(room_channel())
        redis_client().publish(room_channel(), user + ' connected.')
        while io.connected():
            for message in redis_sub.listen():
                if message['type'] == 'message':
                    io.send(message['data'])
    greenlet = Greenlet.spawn(subscriber, io)

    # Listen to incoming messages from client.
    while io.connected():
        message = io.recv()
        if message:
            redis_client().publish(room_channel(), user + ': ' + message[0])

    # Disconnected. Publish disconnect message and kill subscriber greenlet.
    redis_client().publish(room_channel(), user + ' disconnected')
    greenlet.throw(Greenlet.GreenletExit)

    return HttpResponse()
Take the view step-by-step:
- Set up socket.io, get a redis client and the current user.
- Use Gevent to register a "subscriber" - this takes incoming messages from Redis and forwards them on to the client browser.
- Run a "publisher" which takes messages from socket.io (from the user's browser) and pushes them into Redis.
- Repeat until the socket disconnects.
The Redis Cookbook gives a little more detail on the Redis side, as well as discussing how you can persist messages.
Regarding the rest of your question: Twisted is an event-based networking library; it could be considered an alternative to Gevent in this application. It's powerful, but difficult to debug, in my experience.
Celery is a "distributed task queue" - basically, it lets you spread units of work out across multiple machines. The "distributed" angle means some sort of transport is required between the machines. Celery supports several types of transport, including RabbitMQ (and Redis too).
In the context of your example, Celery would only be appropriate if you had to do some sort of costly processing on each message like scanning for profanity or something. Even still, something would have to initiate the Celery task, so there would need to be some code listening for the socket.io callback.
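As a rough sketch (the broker URL and task name are made up), offloading that work to Celery would look something like this, with the code handling the socket.io callback just enqueueing the task:

from celery import Celery

app = Celery('chat', broker='redis://localhost:6379/0')

@app.task
def scan_for_profanity(message):
    # Costly processing runs in a Celery worker, not in the socket handler.
    return message

# In the code listening for the socket.io callback:
#     scan_for_profanity.delay(incoming_message)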
(Just in case you weren't totally confused, Celery itself can be made to use Gevent as its underlying concurrency library.)
Hope that helps!

Django signals as IPC

So I have written a websocket application in Twisted. The application is a basic game between a number of users, but trying to use the websocket for setup and record saving is painful, so I was looking into using Django-based rendering for the supplementary information (standings, game setup, lobby list, etc.) and leaving the websockets for the real action. I know I can use some basic IPC functionality to have the Django requests signal the Twisted application, but I was curious whether the Django signal system would also work across applications as a simple form of IPC...
No. Django signals are restricted to a single Python interpreter. You'll need to put together something else (sockets, JSON-RPC, XMPP, etc.) in order to perform IPC.
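One simple option, sketched here with Redis pub/sub (the channel name and view are illustrative, not a prescribed design): the Django view publishes an event, and the Twisted process subscribes to that channel and reacts.

import json
import redis
from django.http import HttpResponse

r = redis.Redis()

def create_game(request):
    # ... normal Django view logic: validate input, save the game setup ...
    # Notify the Twisted process over Redis pub/sub instead of a Django signal.
    r.publish('game-events', json.dumps({'event': 'game_created', 'game_id': 42}))
    return HttpResponse('ok')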
