I'm trying to use a Tornado web socket server to notify my user when changes are made to a database in realtime. I was hoping to use HTML5 web sockets for this, even though most browsers don't support them. None of the demos that come with the Tornado package use web sockets and they are not mentioned in the documentation, so I have no idea how to get started. The few examples I could find on google either don't work or are poorly documented.
Does anyone have any examples of how I can use Tornado to push data to a client when a MySQL database has been updated or something similar I can use to learn from?
A Lee's answer is a good one; you'll probably want socket.io if you need to support older browsers.
WebSockets are very easy in Tornado, though:
import tornado.websocket

class EchoWebSocket(tornado.websocket.WebSocketHandler):
    def open(self):
        print("WebSocket opened")

    def on_message(self, message):
        self.write_message(u"You said: " + message)

    def on_close(self):
        print("WebSocket closed")
Then route it like any other handler, and include the WebSocket JavaScript in your views:
var ws = new WebSocket("ws://localhost:8888/websocket");
ws.onopen = function() {
    ws.send("Hello, world");
};
ws.onmessage = function(evt) {
    alert(evt.data);
};
For more info, see the source: https://github.com/facebook/tornado/blob/master/tornado/websocket.py
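Hooking the handler into an application works like any other Tornado handler; here is a minimal, self-contained sketch (the /websocket path and port 8888 match the JavaScript snippet above; the application boilerplate is my addition):

```python
import tornado.ioloop
import tornado.web
import tornado.websocket

class EchoWebSocket(tornado.websocket.WebSocketHandler):
    def on_message(self, message):
        self.write_message(u"You said: " + message)

def make_app():
    # The URL pattern matches the ws://localhost:8888/websocket used above.
    return tornado.web.Application([(r"/websocket", EchoWebSocket)])

if __name__ == "__main__":
    make_app().listen(8888)
    tornado.ioloop.IOLoop.current().start()
```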
I've had success using the socket.io client and tornadio on the server end. Socket.IO provides an abstraction over websockets and provides fallbacks if websockets aren't supported by the browser (long polling, flash socket, etc.).
To use it, you write a tornadio script (per the tornadio documentation) that monitors your database, then include the socket.io JavaScript in your web pages and have it establish a connection to wherever your tornadio server resides, at the URL route you specified in your tornadio script.
This post using websockets and redis covers the basic idea pretty well.
Related
I built a web application using the Python Bottle framework.
I used bottle-websocket plugin for WebSocket communication with clients.
Here is a part of my code.
from bottle import Bottle, request, run
from bottle.ext.websocket import GeventWebSocketServer, websocket

class MyHandler():
    ...

class MyServer(Bottle):
    ...

    def _serve_websocket(self, ws):
        handler = MyHandler()
        some_data = request.cookies.get('some_key')  # READ SOME DATA FROM HTTP REQUEST
        while True:
            msg = ws.receive()
            handler.do_sth_on(msg, some_data)  # USE THE DATA FROM HTTP REQUEST
            ws.send(msg)
        del handler

if __name__ == '__main__':
    run(app=MyServer(), server=GeventWebSocketServer, host=HOST, port=PORT)
As the code shows, I need to read some data from the browser (cookies or anything in the HTTP request headers) and use it for WebSocket message processing.
How can I ensure the request is from the same browser session as the one where WebSocket connection comes?
NOTE
As I do not have much knowledge of HTTP and WebSocket, I'd love as detailed an answer as possible.
How can I ensure the request is from the same browser session as the one where WebSocket connection comes?
A "browser session" is a bit abstract, since HTTP has no concept of sessions. HTTP and RESTful APIs are designed to be stateless, but there are options.
Usually, what you want to know is which user the request comes from. This is solved by authentication, e.g. using OpenID Connect and having the user send a JWT token in the Authorization: header. This works for all HTTP requests, including the one that sets up a WebSocket connection.
bottle-oauthlib seems to be a library for authenticating end-users using OAuth2 / OpenID Connect.
Another option is to identify the "browser session" using cookies, but this depends on state somewhere on the server side, and is harder to implement on cloud-native platforms like Kubernetes that prefer stateless workloads.
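One way to implement the cookie option with only the standard library is to sign the session id with HMAC when you set the cookie, then verify the signature again when the WebSocket handshake arrives; whichever socket presents a validly signed cookie belongs to that session. A sketch (the secret, cookie format, and function names are my assumptions, not part of the original question):

```python
import hashlib
import hmac

SECRET = b"server-side-secret"  # assumption: a key known only to the server

def sign_session(session_id):
    """Build the cookie value: '<session_id>|<hex HMAC-SHA256 signature>'."""
    sig = hmac.new(SECRET, session_id.encode(), hashlib.sha256).hexdigest()
    return session_id + "|" + sig

def verify_session(cookie_value):
    """Return the session id if the signature is valid, else None."""
    try:
        session_id, sig = cookie_value.rsplit("|", 1)
    except ValueError:
        return None
    expected = hmac.new(SECRET, session_id.encode(), hashlib.sha256).hexdigest()
    return session_id if hmac.compare_digest(sig, expected) else None
```

In the bottle code above, `request.cookies.get('some_key')` would then return the signed value during the WebSocket handshake, and `verify_session` tells you which session, if any, the socket belongs to.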
I am planning to build a home automation system where IoT devices communicate with an MQTT broker. The system also involves a Django web server that serves an API for iOS/Android devices. I will describe an example of what I want to implement.
An API call is made from the mobile app to the Django server to turn ON a device. When such an API request is made to Django, it should push 'Turn ON' data to the IoT device via the MQTT protocol.
Also, the IoT devices send some real-time data to the MQTT broker. Upon receiving such data, I want to send push notifications to the mobile app via APNs/FCM.
How can I implement this? Will Django Channels serve the purpose, or should I code my Django server to act as an MQTT client and communicate with the MQTT broker? Or is there any other method to implement this?
Well, I did a little project with paho-MQTT, and had a nice experience with the Google Chrome extension MQTTLens (you should try it if you aren't using it already).
In your case, I think you can use Django REST framework for building the API, and on the front end you can use crispy-forms to send ON/OFF signals; these communicate directly with the Django views, where you can write the client and subscriber details.
Let's focus on "An API call is made from mobile app to Django server to turn ON a device. When such API request is made to Django, it should push 'Turn ON' data to IoT device via MQTT protocol."
You can make views which respond to the API calls from any device; for that, check django-rest-framework, which is the best option we have.
And now "IoT devices send some real time data to the MQTT broker."
For this, you can check Google's article on the subject. The MQTT broker can be handled from the Django views easily, and this process is not very complex if you use a modular structure with Django's DRY concept.
On the other hand, you can also make separate views just for the client or for the broker; it's up to you, but I think this approach will take a long time to develop such an application. I don't know about mobile development, so I can't help you with that :(.
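To make the first direction concrete (the API view publishing a command to the broker), here is a sketch using the paho-mqtt package; the topic layout, payload strings, and broker hostname are my assumptions, and the paho import is deferred so the module loads even without paho-mqtt installed:

```python
BROKER_HOST = "broker.example.com"  # assumption: your broker's hostname

def build_command(device_id, on):
    """Map an API request to an MQTT topic and payload (layout is an assumption)."""
    topic = "devices/%s/cmd" % device_id
    payload = "Turn ON" if on else "Turn OFF"
    return topic, payload

def send_command(device_id, on):
    """Publish one command; call this from your Django/DRF view."""
    import paho.mqtt.publish as publish  # deferred: requires the paho-mqtt package
    topic, payload = build_command(device_id, on)
    # publish.single connects, publishes a single message, and disconnects.
    publish.single(topic, payload=payload, hostname=BROKER_HOST)
```

The IoT device then subscribes to its own `devices/<id>/cmd` topic and reacts to the payload.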
You can handle the task using JavaScript. I have experience implementing the MQTT protocol in Django and Django-REST projects using JavaScript. You should embed a JavaScript code block in your front-end file (in my case, HTML). First, you should include the Paho MQTT JavaScript client in your file.
<script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/paho-mqtt/1.0.2/mqttws31.min.js"></script>
Then add this block of code.
// Parameters
var hostname = "mqtt.eclipse.org"; // There are different brokers; enter your broker's hostname.
var port = 80; // The port number can differ for TLS vs. non-TLS connections.
var ClientID = "ClientID_" + parseInt(Math.random() * 100);

// Create a client instance
var client = new Paho.MQTT.Client(hostname, Number(port), ClientID);

// Set callback handlers
client.onConnectionLost = onConnectionLost;
client.onMessageArrived = onMessageArrived;

// Connect the client
client.connect({onSuccess: onConnect});

// Called when the client connects
function onConnect() {
    // Once a connection has been established, make a subscription and send a message
    console.log("onConnect");
    client.subscribe("subTopic");
    alert("Connected.");
}

// Called when the client loses its connection
function onConnectionLost(responseObject) {
    if (responseObject.errorCode != 0) {
        console.log("onConnectionLost: " + responseObject.errorMessage);
    }
}

// Called when a message arrives
function onMessageArrived(message) {
    console.log("Message arrived: topic=" + message.destinationName + ", message=" + message.payloadString);
    if (message.destinationName == "subTopic") {
        // Do something
    }
}
Using this code, your application connects to the broker and listens on one or more topics, which means you can get sensor data in real time. This requires publishing the sensor data from your hardware device, say an ESP module or a Raspberry Pi.
It is very likely that you also want to send commands from your application to actuators to turn them on or off. To do that, you publish messages from your application that your hardware listens for.
Assume you have a toggle switch and you want to publish a message by toggling it.
<label id="switch{{ device.unique_id }}" class="switch">
<input id="state{{ device.unique_id }}" type="checkbox" onclick="publish('{{ device.unique_id }}')">
<span class="slider round"></span>
</label>
The above HTML block should reside inside a Django {% for %} block. Then you should write the publish onclick function that is called on toggling the switch. You can see an example of such a function below.
function publish(x) {
    if (!client) {
        return;
    }
    var status = document.getElementById(x);
    if (status.innerHTML == 'ON') {
        status.innerHTML = 'OFF';
        var message = new Paho.MQTT.Message("TurnOFF");
        message.destinationName = "pubTopic";
        client.send(message);
    } else {
        status.innerHTML = 'ON';
        var message = new Paho.MQTT.Message("TurnON");
        message.destinationName = "pubTopic";
        client.send(message);
    }
}
x in the publish function is the id that is embedded in the HTML file. To receive your published messages you should listen to the specific topic(s) on your hardware device.
I want to implement an instant messaging server using Flask + Flask-SocketIO,
with the client side on a mobile phone (front end in Ionic 2).
I have already tried different chat room examples with SocketIO, but I wonder how to manage multiple users chatting two by two.
I'm not yet familiar with instant messaging architectures, and I have several questions on the subject:
First of all, is Flask a good framework for implementing instant messaging for a mobile phone application?
I started with Flask because it seems powerful and not as heavy as Django can be.
In an instant messaging app with SocketIO, how can I connect users two by two?
I tried this code, but it works for multiple users in the same chat room:
On the client side :
<script type="text/javascript">
    $(document).ready(function() {
        var socket = io.connect("http://127.0.0.1:5000");
        socket.on('connect', function() {
            console.log('connected');
        });
        socket.on('message', function(msg) {
            $("#messages").append('<li>' + msg + '</li>');
        });
        $("#sendButton").on('click', function() {
            console.log($('#myMessage').val());
            socket.send({'author': 'Kidz55',
                         'message': $('#myMessage').val()});
            $('#myMessage').val('');
        });
    });
</script>
On the server side :
@socketio.on('message')
def handle_json(json):
    print('received json: ' + str(json))
    # broadcasting to everyone who's connected
    send(json, broadcast=True)
Is it scalable, and does it support heavy traffic?
In an instant messaging app with SocketIO, how can I connect users two by two?
If it is always going to be two users chatting, then they can send direct messages to each other. When a client connects, it gets assigned a session id, or sid. If you keep track of these ids and map them to your users, you can then send a message to specific users. For example, if you store the sid value for a user in your user database, you can then send a direct message to that user as follows:
emit('private_message', {'msg': 'hello!'}, room=user.sid)
Is it scalable, and does it support heavy traffic?
There are many factors that influence how much traffic your server can handle. The Flask-SocketIO server is scalable, in the sense that if a single process cannot handle the traffic, you can add more processes, basically giving you a lot of room to grow.
I have a website (Java + Spring) that relies on Websockets (Stomp over Websockets for Spring + RabbitMQ + SockJS) for some functionality.
We are creating a command line interface based in Python and we would like to add some of the functionality which is already available using websockets.
Does anyone know how to use a Python client so I can connect using the SockJS protocol?
PS: I am aware of a simple library, which I have not tested, but it does not have the capability to subscribe to a topic.
PS2: I can connect directly to STOMP at RabbitMQ from Python and subscribe to a topic, but exposing RabbitMQ directly does not feel right. Any comments on this second option?
The solution I used was to skip the SockJS protocol and do "plain ol' WebSockets" instead, using the websocket-client package in Python and sending STOMP messages over it with the stomper package. The stomper package just generates strings that are "messages", and you send those messages over the WebSocket using ws.send(message).
Spring Websockets config on the server:
@Configuration
@EnableWebSocketMessageBroker
public class WebSocketConfig implements WebSocketMessageBrokerConfigurer {

    @Override
    public void registerStompEndpoints(StompEndpointRegistry registry) {
        registry.addEndpoint("/my-ws-app"); // Note we aren't doing .withSockJS() here
    }
}
And on the Python client side of code:
import random

import stomper
from websocket import create_connection

ws = create_connection("ws://theservername/my-ws-app")
v = str(random.randint(0, 1000))
sub = stomper.subscribe("/something-to-subscribe-to", v, ack='auto')
ws.send(sub)

while True:
    d = ws.recv()
    m = MSG(d)
Now d will be a Stomp formatted message, which has a pretty simple format. MSG is a quick and dirty class I wrote to parse it.
class MSG(object):
    def __init__(self, msg):
        self.msg = msg
        sp = self.msg.split("\n")
        self.destination = sp[1].split(":")[1]
        self.content = sp[2].split(":")[1]
        self.subs = sp[3].split(":")[1]
        self.id = sp[4].split(":")[1]
        self.len = sp[5].split(":")[1]
        # sp[6] is just a \n
        self.message = ''.join(sp[7:])[0:-1]  # the last part of the message, minus the trailing \00 character
This isn't the most complete solution. There isn't an unsubscribe and the id for the Stomp subscription is randomly generated and not "remembered." But, the stomper library provides you the ability to create unsubscribe messages.
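Since a STOMP frame is just a command line, header lines, a blank line, and a body terminated by a NUL byte, a parser can also split on those delimiters instead of the fixed line positions MSG assumes; a stdlib-only sketch:

```python
def parse_stomp_frame(raw):
    """Split a STOMP frame into (command, headers dict, body).

    Frames look like: COMMAND\nheader:value\n...\n\nbody\x00
    """
    head, _, body = raw.partition("\n\n")
    lines = head.split("\n")
    command = lines[0]
    headers = {}
    for line in lines[1:]:
        key, _, value = line.partition(":")
        headers[key] = value
    return command, headers, body.rstrip("\x00")
```

This tolerates headers arriving in any order, which the positional MSG class does not.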
Anything on the server side that is sent to /something-to-subscribe-to will be received by all the Python clients subscribed to it.
@Controller
public class SomeController {

    @Autowired
    private SimpMessagingTemplate template;

    @Scheduled(fixedDelayString = "1000")
    public void blastToClientsHostReport() {
        template.convertAndSend("/something-to-subscribe-to", "hello world");
    }
}
I have answered the particular question of sending a STOMP message from a Spring Boot server with SockJS fallback to a Python client over WebSockets here: Websocket Client not receiving any messages. It also addresses the above comments of:
Sending to a particular user.
Why the client does not receive any messages.
My Setup:
I have an existing Python script that uses Tweepy to access the Twitter Streaming API. I also have a website that shows aggregate real-time information from other sources via various back ends.
My Ideal Scenario:
I want to publish real-time tweets as well as real-time updates of my other information to my connected users using Socket.IO.
It would be really nice if I could do something as simple as an HTTP POST (from any back-end) to broadcast information to all the connected clients.
My Problem:
The Socket.IO client implementation is super straightforward... I can handle that. But I can't figure out if the functionality I'm asking for already exists... and if not, what would be the best way to make it happen?
[UPDATE]
My Solution: I created a project called Pega.IO that does what I was looking for. Basically, it lets you use Socket.IO (0.8+) as usual, but you can use HTTP POST to send messages to connected users.
It uses the Express web server with a Redis back-end. Theoretically this should be pretty simple to scale -- I will continue contributing to this project going forward.
Pega.IO - github
To install on Ubuntu, you just run this command:
curl http://cloud.github.com/downloads/Gootch/pega.io/install.sh | sh
This will create a Pega.IO server that is listening on port 8888.
Once you are up and running, just:
HTTP POST http://your-server:8888/send
with data that looks like this:
channel=whatever&secretkey=mysecret&message=hello+everyone
That's all there is to it. HTTP POST from any back-end to your Pega.IO server.
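From a Python back end, the same POST can be issued with the standard library; a sketch where the server URL is a placeholder and the form fields mirror the example above:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

def send_broadcast(server, channel, secretkey, message):
    """POST a broadcast to a Pega.IO-style /send endpoint."""
    # urlencode produces the same form-encoded body shown above,
    # e.g. channel=whatever&secretkey=mysecret&message=hello+everyone
    body = urlencode({"channel": channel,
                      "secretkey": secretkey,
                      "message": message}).encode()
    # Passing `data` makes urlopen issue an HTTP POST.
    return urlopen(server + "/send", data=body)
```

Calling send_broadcast("http://your-server:8888", ...) performs the network request, so it needs a running Pega.IO server.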
The best way I've found for this sort of thing is using a message broker. Personally, I've used RabbitMQ for this, which seems to meet the requirements mentioned in your comment on the other answer (socket.io 0.7 and scalable). If you use RabbitMQ, I'd recommend the amqp module for node, available through npm, and the Pika module for Python.
An example connector for Python using pika. This example accepts a single json-serialized argument:
def amqp_transmit(message):
    connection = pika.AsyncoreConnection(pika.ConnectionParameters(
        host=settings.AMQP_SETTINGS['host'],
        port=settings.AMQP_SETTINGS['port'],
        credentials=pika.PlainCredentials(settings.AMQP_SETTINGS['username'],
                                          settings.AMQP_SETTINGS['pass'])))
    channel = connection.channel()
    channel.exchange_declare(exchange=exchange_name, type='fanout')
    channel.queue_declare(queue=NODE_CHANNEL, auto_delete=True, durable=False, exclusive=False)
    channel.basic_publish(exchange=exchange_name,
                          routing_key='',
                          body=message,
                          properties=pika.BasicProperties(content_type='application/json'))
    print(' [%s] Sent %r' % (exchange_name, message))
    connection.close()
Very basic connection code on the node end might look like this:
var connection = amqp.createConnection({
    host: amqpHost,
    port: amqpPort,
    password: amqpPass
});

function setupAmqpListeners() {
    connection.addListener('ready', amqpReady);
    connection.addListener('close', function() {
        console.log('Uh oh! AMQP connection failed!');
    });
    connection.addListener('error', function(e) { throw e; });
}

function amqpReady() {
    console.log('Amqp Connection Ready');
    var q, exc;
    q = connection.queue(queueName,
        {autoDelete: true, durable: false, exclusive: false},
        function() {
            console.log('Amqp Connection Established.');
            console.log('Attempting to get an exchange named: ' + exchangeName);
            exc = connection.exchange(exchangeName,
                {type: 'fanout', autoDelete: false},
                function(exchange) {
                    console.log('Amqp Exchange Found. [' + exchange.name + ']');
                    q.bind(exc, '#');
                    console.log('Amqp now totally ready.');
                    q.subscribe(routeAmqp);
                }
            );
        }
    );
}

routeAmqp = function(msg) {
    console.log(msg);
    doStuff(msg);
};
Edit: The example above uses a fan-out exchange that does not persist messages. A fan-out exchange is likely your best option, since scalability is a concern (i.e., you are running more than one Node box that clients can connect to).
Why not write your Node app so that there are two parts:
The Socket.IO portion, which communicates directly with the clients, and
An HTTP API of some sort, which receives POST requests and then broadcasts appropriate messages with Socket.IO.
In this way, your application becomes a "bridge" between your non-Node apps and your users' browsers. The key here is to use Socket.IO for what it was made for--real time communication to browsers--and rely on other Node technologies for other parts of your application.
[Update]
I'm not on a development environment at the moment, so I can't get you a working example, but some pseudocode would look something like this:
var http = require('http');

var server = http.createServer(function(request, response) {
    // Parse the HTTP request to get the data you want
    io.sockets.emit("data", whatever); // broadcast the data to Socket.IO clients
});
server.listen(8080);

var io = require('socket.io').listen(server);
With this, you'd have a web server on port 8080 that you can use to listen to web requests (you could use a framework such as Express or one of many others to parse the body of the POST request and extract the data you need).