Asynchronous Django using django-websocket-redis - python

I'm trying to use django-websocket-redis and I don't understand how it works, even after reading the docs.
The client part (JavaScript/template) was easy to understand, but I want to send data messages from one client to another, and I'm stuck here.
Connecting each client:
var ws = new WebSocket('ws://localhost:8000/ws/foobar?subscribe-group');
ws.onopen = function(e) {
    console.log("websocket connected");
};
ws.onclose = function(e) {
    console.log("connection closed");
};
How do I set up my views.py to create a link between them?
With NodeJS I was using this code to link the clients together:
io.sockets.on('connection', function (socket) {
    var data = {"action": "connexion", "session_id": socket.id};
    socket.emit('message', data);
    socket.on('message', function(socket) {
        if (socket.action == "test")
        {
            io.sockets.socket(socket.code).emit('message', {"action": "move"});
            // socket.code is the session_id of client one, transmitted by a form
        }
    });
});
Thank you.

The link between your Django views.py and the websocket loop is the Redis message queue. Imagine having two separate main loops on the server: one handles HTTP requests using the normal Django request handler, while the other handles the websockets with their long-lived connections. Since you can't mix both loops within the normal Django request handler, you need message queuing so that they can communicate with each other.
Therefore, in your Django views.py, send the data to the websocket using something like:
import json
from django.views.generic import View
from ws4redis.publisher import RedisPublisher
from ws4redis.redis_store import RedisMessage

class FooView(View):
    def __init__(self):
        self.redis_publisher = RedisPublisher(facility='foo', broadcast=True)
    def get(self, request):
        data_for_websocket = json.dumps({'some': 'data'})
        self.redis_publisher.publish_message(RedisMessage(data_for_websocket))
This will publish data_for_websocket to all websockets subscribed (i.e. listening) via the URL:
ws://example.com/ws/foo?subscribe-broadcast
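Since the original question was about sending data from one client to another, note that ws4redis can also target specific users instead of broadcasting. A rough sketch, assuming the target username is known in the view (the view name, username, and payload below are made up for illustration):

import json
from django.http import HttpResponse
from ws4redis.publisher import RedisPublisher
from ws4redis.redis_store import RedisMessage

def notify_peer(request):
    # publish only to the websockets opened by this user (hypothetical username)
    publisher = RedisPublisher(facility='foo', users=['john'])
    publisher.publish_message(RedisMessage(json.dumps({'action': 'move'})))
    return HttpResponse('sent')

The receiving client would then open its websocket with ws://example.com/ws/foo?subscribe-user instead of subscribe-broadcast.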

Related

Python publish to RabbitMQ exchange/queue consumed by ASP.NET Core Service

I'm running RabbitMQ, in a Docker container (rabbitmq:3-management image) as part of a Docker Compose application. The application contains some ASP.NET Core WebApi microservices, which exchange messages via this broker. That works fine and didn't give me any problems so far.
Now I need to publish messages from a Python application to an exchange/queue which was created by one of the ASP.NET Core microservices. The microservice contains a consumer for this queue. For publishing from Python, I'm using pika. The problem is, I can't seem to get the publishing right. Whenever I execute my Python script, I can see in the RabbitMQ management UI that a new exchange and queue with the suffix "_skipped" were created; it seems as if my message was sent there instead of to the actual queue. Also, when trying to publish directly from the management UI, the message actually makes it to the microservice, but there I get an exception that the message could not be deserialized to a MassTransit envelope object, and a new exchange and queue with the "_error" suffix appear.
I have no idea where the problem is. I think the exchange/queue themselves are fine, since other queues/consumers/publishers for microservice to microservice communication in this project work. So then it's probably either how I'm trying to address the exchange/queue from Python, or something with my message body which is not right.
This page gives some info about how messages need to be structured, but not too detailed, and here I got most of the info about how to publish with Python.
Below you see the relevant code regarding the host/queue configuration in the microservice, as well as the Python script. Any help/tips on how I can get this to work would be greatly appreciated.
ASP.NET Core:
// Declaring the host, queue "mappingQueue", consumer in Startup.ConfigureServices of microservice
...
services.AddMassTransit(x =>
{
    x.AddConsumer<MappingUpdateConsumer>();
    x.AddBus(provider => Bus.Factory.CreateUsingRabbitMq(config =>
    {
        config.Host(new Uri(RabbitMqConst.RabbitMqRootUri), h =>
        {
            h.Username(RabbitMqConst.RabbitMqUsername);
            h.Password(RabbitMqConst.RabbitMqPassword);
        });
        config.ReceiveEndpoint("mappingQueue", e =>
        {
            e.ConfigureConsumer<MappingUpdateConsumer>(provider);
        });
    }));
});
services.AddMassTransitHostedService();
...
// Consumer
public class MappingUpdateConsumer : IConsumer<MappingUpdateMessage>
{
    ...
    public async Task Consume(ConsumeContext<MappingUpdateMessage> context)
    {
        await Task.Run(async () =>
        {
            if (context.Message == null)
            {
                return;
            }
            ...
        });
    }
}
// Message class (will have more properties in the future, thus not just using a string consumer)
public class MappingUpdateMessage
{
    public string Message { get; set; }
}
Python:
import pika
import json

connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()
channel.exchange_declare(exchange='mappingQueue', exchange_type='fanout', durable=True)
message = {
    "message": {
        "message": "Hello World"
    },
    "messageType": [
        "urn:message:MassTransit.Tests:ValueMessage"
    ]
}
channel.basic_publish(exchange='mappingQueue',
                      routing_key='mappingQueue',
                      body=json.dumps(message))
connection.close()
print("sent")
For those with the same problem, I figured it out eventually:
...
config.ReceiveEndpoint("mappingQueue", e =>
{
    e.ClearMessageDeserializers();
    e.UseRawJsonSerializer();
    e.ConfigureConsumer<MappingUpdateConsumer>(provider);
});
...
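With the receive endpoint switched to the raw JSON serializer as above, the Python publisher no longer has to build a MassTransit envelope; it can send the plain payload that maps onto MappingUpdateMessage. A hedged sketch under that assumption (exchange name taken from the question; the routing key is ignored for a fanout exchange):

import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()
# publish the bare JSON payload; the property name should line up with the C# contract
channel.basic_publish(exchange='mappingQueue',
                      routing_key='',
                      body=json.dumps({"message": "Hello World"}))
connection.close()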

How does send work in flask-socketio?

I cannot understand how the send function works in flask-socketio.
For example, I am using flask-socketio as the server and socket.io as the client:
server:
@socketio.on("test")
def handle_test():
    send("Wow")
client:
socket.emit('test', "", (data) => {
    console.log(data);
});
I thought I could get data on the client side, but I was wrong; I just get nothing.
I understand I can build a structure based on events, but I cannot understand how send works. Will it send a response to the client? If it will, how can I get that response? If it won't, what does it do?
First, I suggest you use emit() instead of send().
To send from client to server, use this:
socket.emit('test', {data: 'my data'});
On the server, you can receive the event and then emit back to the client:
@socketio.on('test')
def handle_test():
    emit('wow')
To receive this second emit on the client, do this:
socket.on('wow', function(data) {
    console.log(data);
});
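For completeness on the original question: in Flask-SocketIO, send() emits an unnamed 'message' event, so a client only sees it in a socket.on('message', ...) listener, while the return value of a handler is what gets delivered to an acknowledgment callback passed to emit(). A minimal server-side sketch illustrating both:

from flask import Flask
from flask_socketio import SocketIO, send

app = Flask(__name__)
socketio = SocketIO(app)

@socketio.on("test")
def handle_test():
    send("Wow")    # arrives at the client's socket.on('message', ...) listener
    return "Wow"   # arrives in the callback passed as the last argument of socket.emit('test', ...)

With the return value in place, the client code from the question (socket.emit('test', "", (data) => {...})) would log "Wow".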

Django Channels - connected isn't executed

Hi, I am copying parts of the GitHub project multichat from the creator of Django Channels.
I am making slight changes to the code, like not using jQuery, renaming some consumers, and such.
I have literally no errors when running the code. However, when I join the page and the JS creates a websocket, it simply says
[2017/08/03 13:13:48] WebSocket HANDSHAKING /chat/stream [127.0.0.1:37070]
[2017/08/03 13:13:48] WebSocket CONNECT /chat/stream [127.0.0.1:37070]
which one would think is fine, of course... However, in my connect function I have a print("********CONNECTED**********"), which is nowhere to be seen in the console. It simply doesn't run the function I have told it to when someone connects, but it still says the person connected and throws no errors.
This is the main routing:
channel_routing = [
    include("crypto_chat.routing.websocket_routing", path=r"^/chat-stream/$"),
    include("crypto_chat.routing.chat_routing"),
]
Routing from app:
websocket_routing = [
    route("websocket.connect", ws_connect),
    route("websocket.receive", ws_receive),
    route("websocket.disconnect", ws_disconnect),
]
chat_routing = [
    route("chat.receive", chat_send, command="^send$"),
    route("chat.receive", user_online, command="^online$"),
]
Connect Consumer:
@channel_session_user_from_http
def ws_connect(message):
    # only accept connection if you have any rooms to join
    print("******************CONNECT*************************")
    message.reply_channel.send({"accept": True})
    # init rooms - add user to the groups and pk num to the session
    message.channel_session['rooms'] = []
    for room in Room.objects.filter(users=message.user):
        room.websocket_group.add(message.reply_channel)
        message.channel_session['rooms'].append(room.pk)
    print(message.channel_session['rooms'])
Heres JS (note: i am using the JS extension that is available on the project website also):
function send_msg() {
    var msg = document.getElementById('msg_input').value;
    console.log("sending msg" + msg);
    webSocketBridge.send({
        "command": "send",
        "room": "1",
        "message": msg
    });
}
// logging
var ws_path = "/chat/stream";
console.log("connecting to " + ws_path);
// connect
var webSocketBridge = new channels.WebSocketBridge();
webSocketBridge.connect(ws_path);
// listen loop
webSocketBridge.listen(function(data) {
    // read json file and act accordingly
    if (data.error) {
        // post error message in chat
        console.log("Error - " + data.error);
        return;
    }
    // handle if the user comes back online
    if (data.online) {
        console.log("User is online");
    }
    else if (data.offline) {
        console.log("User offline");
    }
    else if (data.message) {
        console.log("Got message");
    }
    else {
        console.log("Unknown message type");
    }
});
// Helpful debugging
webSocketBridge.socket.onopen = function () {
    console.log("Connected to chat socket");
};
webSocketBridge.socket.onclose = function () {
    console.log("Disconnected from chat socket");
};
Websocket paths should match on the server and the client side. On the server side you have /chat-stream/, and on the client side /chat/stream; these should match. Also, make sure you don't forget the trailing slash, as Django explicitly requires it.
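A minimal sketch of one way to line them up, assuming you keep the client path and adjust the server-side route (changing ws_path in the JS to "/chat-stream/" instead would work just as well):

channel_routing = [
    include("crypto_chat.routing.websocket_routing", path=r"^/chat/stream/$"),
    include("crypto_chat.routing.chat_routing"),
]

The JS should then use var ws_path = "/chat/stream/"; so that the trailing slash matches too.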

Send data from Python to a Node server with sockets (NodeJS, Socket.io)

I'm trying to send sensor data (in Python) from my Raspberry Pi 3 to my local Node server.
I found a Python module called requests for sending data to a server.
Here I'm trying to send the value 22 (later there will be sensor data) from my Raspberry Pi 3 to my local Node server with Socket.IO. The requests.get() works, but the put command doesn't send the data.
Can you tell me where the mistake is?
#!/usr/bin/env python
import requests

r = requests.get('http://XXX.XXX.XXX.XXX:8080')
print(r)
r = requests.put('http://XXX.XXX.XXX.XXX:8080', data={'rasp_param': '22'})
In my server.js I try to get the data, but somehow nothing is received.
server.js
var express = require('express')
  , app = express()
  , server = require('http').createServer(app)
  , io = require('socket.io').listen(server)
  , conf = require('./config.json');

// Webserver
server.listen(conf.port);

app.configure(function() {
    app.use(express.static(__dirname + '/public'));
});

app.get('/', function (req, res) {
    res.sendfile(__dirname + '/public/index.html');
});

// Websocket
io.sockets.on('connection', function (socket) {
    // Here I want to get the data
    io.sockets.on('rasp_param', function (data) {
        console.log(data);
    });
});

// Server Details
console.log('The server runs on http://127.0.0.1:' + conf.port + '/');
You are using HTTP PUT from Python, but you are listening with a websocket server on the Node.js side.
Either have Node listen for HTTP POST (I'd use POST rather than PUT):
app.post('/data', function (req, res) {
    // do stuff with the data here
});
Or have a websocket client on the Python side:
# inside an asyncio coroutine; connect() and send() from the websockets library are both coroutines
ws = yield from websockets.connect("ws://10.1.10.10")
yield from ws.send(json.dumps({'param': 'value'}))
A persistent websocket connection is probably the best choice.
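If you go the HTTP route instead, the Raspberry Pi only needs a matching POST; a small sketch assuming the Node app adds the app.post('/data', ...) handler shown above (the host and field name are placeholders from the question):

import requests

r = requests.post('http://XXX.XXX.XXX.XXX:8080/data', data={'rasp_param': '22'})
print(r.status_code)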

Code in Python, communicate in Node.js and Socket.IO, present in HTML

You have a Python script diagnosis.py that generates realtime, event-based data. Using Node.js, you can launch it as a child process, capture its output, and then use Socket.IO to emit it to the client and present it with HTML.
Server
var util = require('util'),
    spawn = require('child_process').spawn,
    ls = spawn('python', ['diagnosis.py']);

var app = require('http').createServer(handler)
  , io = require('socket.io').listen(app)
  , fs = require('fs');

app.listen(80);

function handler (req, res) {
    fs.readFile(__dirname + '/index.html',
        function (err, data) {
            if (err) {
                res.writeHead(500);
                return res.end('Error loading index.html');
            }
            res.writeHead(200);
            res.end(data);
        });
}

io.sockets.on('connection', function (socket) {
    ls.stdout.on('data', function (gdata) {
        socket.emit('news', gdata.toString());
    });
});
Client
<html>
<head>
    <script src="/socket.io/socket.io.js"></script>
    <script>
        var d = "";
        var socket = io.connect('http://localhost');
        socket.on('news', function (data) {
            d += data;
            document.getElementById('data').innerHTML = d;
            console.log(data);
        });
    </script>
</head>
<body>
    <div id="data"></div>
</body>
</html>
Question
This is great and all, but what if you're looking for the same HTML-Node.js communicative power that Socket.IO provides, only between Node.js and Python? How would you do that? There's no web server there, so Socket.IO does not make a lot of sense, and communicating over bare TCP does not provide the same power/elegance. How do I achieve full-duplex communication between Node.js and Python?
Update: I answered my own question, but I'm open to an alternative approach. RPC doesn't quite do what I want, though.
Apache Thrift is a pretty awesome way to write RPC code between all of the major languages. You write a generic Thrift spec declaring the types and services, and then the code generator creates bindings for your desired languages. You can have APIs for calling methods between your Node and Python code bases.
On a more generic low level approach, ZeroMQ is a message queue library that also has support for all of the major languages. With this, you can design your own solution for how you want to communicate (not just purely RPC). It has patterns for request/reply, push/pull, pub/sub, and pair. It gives you enough tools to put together a scalable system of any type.
I have used both and they are great solutions.
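As a rough illustration of the ZeroMQ option, a request/reply client in Python with pyzmq could look like this (the endpoint and payload are made up; the Node.js side would bind a matching REP socket through one of its zmq bindings):

import zmq

context = zmq.Context()
socket = context.socket(zmq.REQ)          # request/reply pattern
socket.connect("tcp://localhost:5555")    # hypothetical endpoint
socket.send_json({"message": "Someone Clicked!"})
reply = socket.recv_json()                # blocks until the other side replies
print("Got reply:", reply)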
Just as a very rough example, the Thrift spec may look something like this. Let's say you want to communicate events from Python to Node.js, have them processed, and get back some response:
myspec.thrift
struct Event {
    1: string message;
}

service RpcService {
    string eventOccurred(1:Event e)
}
This defines a data structure called Event with a single string member to hold the message. Then a service called RpcService defines one function called eventOccurred, which expects an Event as an argument and returns a string.
When you generate this code for Python and Node.js, you can then use the client-side code for Python and the server-side code for Node.js.
python
from myrpc import RpcService, ttypes

# create a connection somewhere in your code
CONN = connect_to_node(port=9000)

def someoneClickedSomething():
    event = ttypes.Event("Someone Clicked!")
    resp = CONN.eventOccurred(event)
    print "Got reply:", resp
node.js
// I don't know node.js, so this is just pseudo-code
var thrift = require('thrift');
var myrpc = require('myrpc.js');

var server = thrift.createServer(myrpc.RpcService, {
    eventOccurred: function(event) {
        console.log("event occurred:", event.message);
        success("Event Processed.");
    },
});
server.listen(9000);
You can look at some messaging systems like 0mq http://www.zeromq.org/
I used the library inspired by this question to turn diagnosis.py into a Socket.IO client. This way I can emit the realtime data to the Node.js Socket.IO server:
socketIO.emit('gaze', ...)
And then have it do a socket.broadcast.emit to send the data to all the Socket.IO clients (browser and diagnosis.py).
RPC is probably the more standard approach for cross-language development, but I find it a bit of an overkill when the goal is just to exchange data. It also does not support evented IO out of the box.
Update (Jan 2013): Since socket.broadcast.emit generates a lot of unnecessary traffic, I tried to find a better way of doing this. The solution I came up with is to use namespaces, which are supported by the basic Python Socket.IO client library I mentioned.
Python
self.mainSocket = SocketIO('localhost', 80)
self.gazeSocket = self.mainSocket.connect('/gaze')
self.gazeSocket.emit('gaze', ...)
To connect to the gaze namespace.
Node.js
var gaze = io.of('/gaze').on('connection', function (socket) {
    socket.on('gaze', function (gdata) {
        gaze.emit('gaze', gdata.toString());
    });
});
This emits the data received only to clients connected to the gaze namespace.
Browser
var socket = io.connect('http://localhost/gaze');
socket.on('gaze', function (data) {
    console.log(data);
});
If you're just looking for something that gives you a simple protocol on top of sockets, so you don't have to deal with buffering and delimiting messages and all that, the two obvious choices (depending on what kind of data you're trying to send) are netstrings and JSON-RPC. There are multiple library choices for both, in both languages, and you can end up with code as simple as this:
class MyService(Service):
    # … code to set up the newsdb
    def getnews(self, category, item):
        return self.newsdb.get(category, item)

myService = MyService(6000)

myService = Service(6000)
// … code to set up the newsdb
myService.getnews = function(category, item, callback) {
    callback(this.newsdb.get(category, item));
}
