Python/Node ZeroRPC heartbeat error - python

I'm trying to run the Python server / Node.js client HelloWorld example from the ZeroRPC website. All the relevant libraries seem to have been installed correctly, but when running the example I get the error:
{ name: 'HeartbeatError',
message: 'Lost remote after 10000ms',
traceback: '' }
Has anyone seen this?

I'm using "zerorpc": "^0.9.3"
I came across the same issue when I was running time-consuming Python code. One way to work around it is to modify the zerorpc library code:
node_modules -> zerorpc -> lib -> channel.js
Change the corresponding method to:
//Runs the heartbeat on this channel
Channel.prototype._runHeartbeat = function() {
    var self = this;
    return setInterval(function() {
        if(util.curTime() > self._heartbeatExpirationTime) {
            //If we haven't received a response in 2 * heartbeat rate, send an
            //error
            // self.emit("heartbeat-error", "Lost remote after " + (HEARTBEAT * 2) + "ms");
            // self.close();
        }
        //Heartbeat on the channel
        try {
            var event = events.create(self._envelope, self._createHeader(), "_zpc_hb", [0]);
            self._socket.send(event);
        } catch(e) {
            console.error("Error occurred while sending heartbeat:", e);
        }
    }, HEARTBEAT);
};
In the latest code on GitHub (https://github.com/dotcloud/zerorpc-node) this issue has been fixed.

If you can, call gevent.sleep from time to time to give zerorpc enough time to process waiting messages, including heartbeats.
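For example, here is a minimal sketch of a long-running zerorpc server method that yields to gevent on each iteration; the class, method name, and loop are hypothetical and only illustrate the pattern:

import gevent
import zerorpc

class HelloRPC(object):
    def long_task(self, n):
        """Hypothetical long-running method; yields to gevent on each
        iteration so zerorpc can keep answering heartbeats."""
        result = 0
        for i in range(n):
            result += i          # stand-in for the real work
            gevent.sleep(0)      # give the event loop a chance to run
        return result

s = zerorpc.Server(HelloRPC())
s.bind("tcp://0.0.0.0:4242")
s.run()

Note that gevent.sleep(0) only yields control to the event loop; if a single call blocks for longer than the client's heartbeat window without yielding, the heartbeat can still be missed.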

Related

websocket.send method not doing anything in JavaScript even though the connection is established successfully

I had to implement WebSockets using Channels in Django, but even after the connection is established successfully, the WebSocket.send method does not do anything in JavaScript; it only works inside websocket.onopen.
Here's my JavaScript code for the socket:
let noteSocket = null

function connect(){
    noteSocket = new WebSocket('ws://' + window.location.host + `/ws/mycourses/{{ course.productId }}/chapters/{{ last_followed.chapter.chapterId }}/lessons/{{ last_followed.lessonId }}/notes/`)

    noteSocket.onopen = function(e) {
        console.log("Successfully connected to the WebSocket.");
    }

    noteSocket.onclose = function(e) {
        console.log("WebSocket connection closed unexpectedly. Trying to reconnect in 2s...");
        setTimeout(function() {
            console.log("Reconnecting...");
            connect();
        }, 2000);
    };

    noteSocket.onmessage = function(e){
        const data = JSON.stringify(e.data)
        renderNote(data.note, data.timestamp)
    }
}
After executing the connect function I get this in the Django development server terminal:
WebSocket CONNECT /ws/mycourses/879521fa-75f9-4c95-8b61-039f7503ecae/chapters/91a12b16-35f7-4702-a002-35b228010a94/lessons/90a125ad-570e-437e-ac67-46cdeb55a068/notes/ [127.0.0.1:53424]
and I get this in my browser console:
Successfully connected to the WebSocket.
I am using Google Chrome, and my OS is Arch Linux (note: I even tried Firefox but it didn't work).

Python publish to RabbitMQ exchange/queue consumed by ASP.NET Core Service

I'm running RabbitMQ in a Docker container (the rabbitmq:3-management image) as part of a Docker Compose application. The application contains some ASP.NET Core WebApi microservices, which exchange messages via this broker. That works fine and hasn't given me any problems so far.
Now I need to publish messages from a Python application to an exchange/queue that was created by one of the ASP.NET Core microservices. The microservice contains a consumer for this queue. For publishing from Python, I'm using pika. The problem is, I can't seem to get the publishing right. Whenever I execute my Python script, I can see in the RabbitMQ management UI that a new exchange and queue with the suffix "_skipped" are created, and it seems as if my message was sent there instead of to the actual queue. Also, when I try to publish directly from the management UI, the message actually makes it to the microservice, but there I get an exception that the message could not be deserialized to a MassTransit envelope object, and a new exchange and queue with the "_error" suffix are created.
I have no idea where the problem is. I think the exchange/queue themselves are fine, since other queues/consumers/publishers for microservice-to-microservice communication in this project work. So it's probably either how I'm addressing the exchange/queue from Python, or something about my message body that isn't right.
This page gives some info about how messages need to be structured, but it's not very detailed, and here I got most of the info about how to publish with Python.
Below you see the relevant code regarding the host/queue configuration in the microservice, as well as the Python script. Any help/tips on how I can get this to work would be greatly appreciated.
ASP.NET Core:
// Declaring the host, queue "mappingQueue", consumer in Startup.ConfigureServices of microservice
...
services.AddMassTransit(x =>
{
    x.AddConsumer<MappingUpdateConsumer>();
    x.AddBus(provider => Bus.Factory.CreateUsingRabbitMq(config =>
    {
        config.Host(new Uri(RabbitMqConst.RabbitMqRootUri), h =>
        {
            h.Username(RabbitMqConst.RabbitMqUsername);
            h.Password(RabbitMqConst.RabbitMqPassword);
        });
        config.ReceiveEndpoint("mappingQueue", e =>
        {
            e.ConfigureConsumer<MappingUpdateConsumer>(provider);
        });
    }));
});
services.AddMassTransitHostedService();
...

// Consumer
public class MappingUpdateConsumer : IConsumer<MappingUpdateMessage>
{
    ...
    public async Task Consume(ConsumeContext<MappingUpdateMessage> context)
    {
        await Task.Run(async () =>
        {
            if (context.Message == null)
            {
                return;
            }
            ...
        });
    }
}

// Message class (will have more properties in the future, thus not just using a string consumer)
public class MappingUpdateMessage
{
    public string Message { get; set; }
}
Python:
import pika
import json

connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()
channel.exchange_declare(exchange='mappingQueue', exchange_type='fanout', durable=True)

message = {
    "message": {
        "message": "Hello World"
    },
    "messageType": [
        "urn:message:MassTransit.Tests:ValueMessage"
    ]
}

channel.basic_publish(exchange='mappingQueue',
                      routing_key='mappingQueue',
                      body=json.dumps(message))
connection.close()
print("sent")
For those with the same problem, I figured it out eventually:
...
config.ReceiveEndpoint("mappingQueue", e =>
{
    e.ClearMessageDeserializers();
    e.UseRawJsonSerializer();
    e.ConfigureConsumer<MappingUpdateConsumer>(provider);
});
...
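With the endpoint switched to the raw JSON deserializer, the Python side can then publish a plain JSON body that matches the MappingUpdateMessage shape, instead of hand-building a MassTransit envelope. A minimal pika sketch; setting content_type to application/json is my assumption about what the raw JSON deserializer expects, not something stated in the original answer:

import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()

# The 'mappingQueue' fanout exchange already exists (MassTransit created it),
# so there is no need to redeclare it here.
channel.basic_publish(
    exchange='mappingQueue',
    routing_key='',  # fanout exchanges ignore the routing key
    body=json.dumps({"message": "Hello World"}),
    properties=pika.BasicProperties(content_type='application/json'),
)
connection.close()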

Django Channels - connected isn't executed

Hi, I am copying parts of the GitHub project multichat from the creator of Django Channels.
I am making slight changes to the code, like not using jQuery, renaming some consumers, and such.
I get literally no errors when running the code; however, when I open the page and the JS creates a websocket, it simply says:
[2017/08/03 13:13:48] WebSocket HANDSHAKING /chat/stream [127.0.0.1:37070]
[2017/08/03 13:13:48] WebSocket CONNECT /chat/stream [127.0.0.1:37070]
Which one would think is fine, of course... However, in my connect function I have a print("********CONNECTED**********"), which is nowhere to be seen in the console. It simply doesn't run the function I have told it to when someone connects, but it still says the person connected and it throws no errors.
This is the main routing:
channel_routing = [
    include("crypto_chat.routing.websocket_routing", path=r"^/chat-stream/$"),
    include("crypto_chat.routing.chat_routing"),
]
Routing from app:
websocket_routing = [
    route("websocket.connect", ws_connect),
    route("websocket.receive", ws_receive),
    route("websocket.disconnect", ws_disconnect),
]

chat_routing = [
    route("chat.receive", chat_send, command="^send$"),
    route("chat.receive", user_online, command="^online$"),
]
Connect Consumer:
@channel_session_user_from_http
def ws_connect(message):
    # only accept connection if you have any rooms to join
    print("******************CONNECT*************************")
    message.reply_channel.send({"accept": True})
    # init rooms - add user to the groups and pk num to the session
    message.channel_session['rooms'] = []
    for room in Room.objects.get(users=message.user):
        room.websocket_group.add(message.reply_channel)
        message.channel_session['rooms'].append(room.pk)
    print(message.channel_session['rooms'])
Here's the JS (note: I am using the JS extension that is available on the project website as well):
function send_msg(){
    var msg = document.getElementById('msg_input').value;
    console.log("sending msg" + msg);
    webSocketBridge.send({
        "command": "send",
        "room": "1",
        "message": msg
    });
}

// logging
var ws_path = "/chat/stream";
console.log("connecting to " + ws_path);

// connect
var webSocketBridge = new channels.WebSocketBridge();
webSocketBridge.connect(ws_path);

// listen loop
webSocketBridge.listen(function(data)
{
    // read json file and act accordingly
    if(data.error){
        // post error message in chat
        console.log("Error - " + data.error);
        return;
    }
    // handle if the user comes back online
    if(data.online){
        console.log("User is online");
    }
    else if(data.offline){
        console.log("User offline");
    }
    else if(data.message){
        console.log("Got message");
    }
    else{
        console.log("Unknown message type");
    }
});

// Helpful debugging
webSocketBridge.socket.onopen = function () {
    console.log("Connected to chat socket");
};
webSocketBridge.socket.onclose = function () {
    console.log("Disconnected from chat socket");
}
WebSocket paths should match on the server and the client side. On the server side you have /chat-stream/ and on the client side /chat/stream; these should match. Also, make sure you don't forget the trailing slash, as Django explicitly requires it.
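Concretely, that just means keeping the two strings identical; a sketch using the server-side path from the question (the client line is shown as a comment because it lives in the JS file):

from channels.routing import include

# routing.py - the path the server accepts
channel_routing = [
    include("crypto_chat.routing.websocket_routing", path=r"^/chat-stream/$"),
    include("crypto_chat.routing.chat_routing"),
]

# In the JS, connect to exactly the same path, trailing slash included:
#   var ws_path = "/chat-stream/";
#   webSocketBridge.connect(ws_path);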

Raspberry Pi - Node.js run Python script to continuously sample ADC

I am trying to host a local server (using Node.js) on a Raspberry Pi. The Pi has an ADC (MCP3008) connected to it, and I have a Python script that continuously samples the ADC and prints the current value. I want the Node server to run the Python script and, whenever it sees a print statement, just do a console.log(current value) for the time being. I am new to Node and web development in general, so I may be missing something simple that would let Node continuously receive data from the Python script. I'm trying to use Socket.io at the moment, as that seems to make sense as the way for Node to see changes from the Python script, but maybe this isn't the best approach. The basic webpage is from a tutorial I found (http://www.jaredwolff.com/blog/raspberry-pi-getting-interactive-with-your-server-using-websockets/). The code I am currently using is here:
var app = require('http').createServer(handler)
  , io = require('socket.io').listen(app)
  , url = require('url')
  , fs = require('fs')
  , gpio = require('onoff').Gpio
  , PythonShell = require('python-shell');

app.listen(5000);

function handler (req, res) {
    var path = url.parse(req.url).pathname;
    if (path == '/') {
        index = fs.readFile(__dirname + '/public/index.html',
            function(error, data) {
                if (error) {
                    res.writeHead(500);
                    return res.end("Error: unable to load index.html");
                }
                res.writeHead(200, {'Content-Type': 'text/html'});
                res.end(data);
            });
    } else if( /\.(js)$/.test(path) ) {
        index = fs.readFile(__dirname + '/public' + path,
            function(error, data) {
                if (error) {
                    res.writeHead(500);
                    return res.end("Error: unable to load " + path);
                }
                res.writeHead(200, {'Content-Type': 'text/plain'});
                res.end(data);
            });
    } else {
        res.writeHead(404);
        res.end("Error: 404 - File not found.");
    }
}

// Python
var pyshell = new PythonShell('mcp3008.py');
pyshell.run('mcp3008.py', function (err, results) {
    if (err) throw err;
    console.log('Results: %j', results);
});

io.sockets.on('connection', function (socket) {
    pyshell.on('message', function (message) {
        console.log(message);
    });
});
Thank you for any hints or help that you can provide!
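For what it's worth, the Python side usually needs to flush stdout when it isn't attached to a terminal, or the prints arrive in large batches (or only at exit) instead of continuously. A minimal sketch of the sampling loop, assuming the MCP3008 is read through the spidev library; the read_adc helper and the channel number are only illustrative:

import time
import spidev  # assumes SPI is enabled on the Pi and the spidev package is installed

spi = spidev.SpiDev()
spi.open(0, 0)             # SPI bus 0, chip-select 0
spi.max_speed_hz = 1350000

def read_adc(channel):
    # Standard 3-byte MCP3008 transaction: start bit, single-ended channel, padding.
    r = spi.xfer2([1, (8 + channel) << 4, 0])
    return ((r[1] & 3) << 8) + r[2]

while True:
    value = read_adc(0)    # channel 0 is just an example
    # Flushing is the important part: when stdout is a pipe (as it is under
    # python-shell), prints are block-buffered and Node sees nothing for a long time.
    print(value, flush=True)
    time.sleep(0.1)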
As jfriend00 recommended, I looked into node.js solutions. I had previously tried this, using several mcp3008 packages available on npm, but none of them successfully installed on my Raspberry Pi (model B). However, I ended up rewriting the one located here (https://github.com/fiskeben/mcp3008.js) as a separate .js file, included it with my code (along with some work from the npm spi library), and put it into a loop to read the ADC pin. That's working for now, and should be good enough for my current needs, but it still seems like a more processor-intensive solution than it should be. Thanks for your feedback!

Communicating Raspberry Pi with cloudfoundry

I have written a Node.js socket.io routine which will be called by a Python socketIO routine from my Raspberry Pi. It communicates both ways. At the moment, when I run the two routines on localhost everything works fine. However, when I deploy the server application to Cloud Foundry and change the SocketIO connection link to Cloud Foundry, it does not work. Below is the Python client:
from socketIO_client import SocketIO

def on_updatepi_response(*args):
    print 'updatepi'

def on_receivepi_response(*args):
    print 'receiveepi'

with SocketIO('raspinode-server.cloudfoundry.com', 8080) as socketIO:
    socketIO.on('receivepi', on_receivepi_response)
    socketIO.on('updatepi', on_updatepi_response)
    socketIO.emit('sendrabbit', 'testdata')
    socketIO.wait(seconds=1)
I know Cloud Foundry can be a bit strange; my first idea was to use RabbitMQ, but that is tied to the VCAP_SERVICES idea. However, I did not think such a restriction would apply to a Node.js app.
Let me know if there is anything wrong with the above code, and if not, how I can get my external Pi to send readings to my cloud app.
The server code is listed below, though it is not really relevant. It responds on localhost... I know the RabbitMQ code is not hooked up yet.
var express = require('express');
var app = express();
var server = require('http').createServer(app);
var amqp = require('amqp');
var io = require('socket.io').listen(server)

function rabbitUrl() {
    if (process.env.VCAP_SERVICES) {
        conf = JSON.parse(process.env.VCAP_SERVICES);
        return conf['rabbitmq-2.4'][0].credentials.url;
    }
    else {
        return "amqp://localhost";
    }
}

var port = process.env.VCAP_APP_PORT || 3000;
var messages = [];

function setup() {
    var exchange = conn.exchange('cf-demo', {'type': 'fanout', durable: false}, function(){
        var queue = conn.queue('', {durable: false, exclusive: true},
            function() {
                queue.subscribe(function(msg) {
                    messages.push(htmlEscape(msg.body));
                    if (messages.length > 10) {
                        messages.shift();
                    }
                });
                queue.bind(exchange.name, '');
            });
        queue.on('queueBindOk', function() { httpServer(exchange); });
    });
}

server.listen(8080);

io.sockets.on('connection', function(socket){
    // when the client emits sendrabbit, this listens
    socket.on('sendrabbit', function(data)
    {
        // we tell the client to execute updatepi with 2 parameters
        io.sockets.emit('updatepi', socket.username, data)
    });

    socket.on('disconnect', function()
    {
        socket.broadcast.emit('updatepi', 'SERVER', socket.username + ' has disconnected');
    });
});
It's my understanding that your server should listen on the port Cloud Foundry assigns it (available in an env var). You can't assume it will be 8080. Then the client talks to raspinode-server.cloudfoundry.com (no port) and Cloud Foundry routes it to the correct place.
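On the client side that would mean connecting without the custom port; assuming Cloud Foundry routes standard HTTP traffic, port 80 here is a guess about the routed endpoint rather than something the app chooses. On the server side, the corresponding change is to listen on the port from process.env.VCAP_APP_PORT (already computed in the server code above) instead of the hard-coded 8080. A sketch of the Python client:

from socketIO_client import SocketIO

def on_updatepi_response(*args):
    print 'updatepi'

# Connect to the routed hostname on the default HTTP port; Cloud Foundry
# forwards this to whatever internal port it assigned the Node app.
with SocketIO('raspinode-server.cloudfoundry.com', 80) as socketIO:
    socketIO.on('updatepi', on_updatepi_response)
    socketIO.emit('sendrabbit', 'testdata')
    socketIO.wait(seconds=1)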
