I am planning to build a home automation system where IoT devices communicate with an MQTT broker. The system also involves a Django web server that serves an API for iOS/Android devices. Here is an example of what I want to implement.
An API call is made from the mobile app to the Django server to turn ON a device. When such an API request reaches Django, it should push 'Turn ON' data to the IoT device via the MQTT protocol.
Also, IoT devices send some real-time data to the MQTT broker. Upon receiving such data, I want to send push notifications to the mobile app via APNs/FCM.
How can I implement this? Will Django Channels serve the purpose, or should I code my Django server to act as an MQTT client and communicate with the MQTT broker? Or is there another method to implement this?
Well, I did a little project with paho-mqtt, and it was a nice experience together with the Google Chrome extension MQTTLens (you should try it if you aren't using it already).
In your case, I think you can use Django REST framework for building the API, and on the front end you can use django-crispy-forms to build the ON/OFF controls. These talk directly to the Django views, in which you can write the MQTT client and subscriber details.
Let's focus on: "An API call is made from the mobile app to the Django server to turn ON a device. When such an API request reaches Django, it should push 'Turn ON' data to the IoT device via the MQTT protocol."
You can write views that respond to API calls from any device; for that, check django-rest-framework, which is the best option we have.
Now for: "IoT devices send some real-time data to the MQTT broker."
For this, you can check Google's article on the topic. An MQTT broker can be handled from Django views fairly easily, and the process is not very complex if you use a modular structure with Django's DRY concept.
On the other hand, you can also make separate views just for the client or for the broker; it's up to you, but I think that approach will take a long time to develop. I don't know about mobile development, so I can't help you with that :(.
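If you do decide to let Django itself act as the MQTT client, a minimal sketch with the paho-mqtt Python package might look like the following. The topic layout, broker hostname, and function names are assumptions for illustration, not something from the question:

```python
def build_command(device_id, action):
    # Map an API action onto an MQTT topic/payload pair.
    # The "devices/<id>/cmd" topic scheme is an assumption.
    payload = "TurnON" if action == "on" else "TurnOFF"
    topic = "devices/%s/cmd" % device_id
    return topic, payload

# Inside a DRF view you could then publish the command
# (requires `pip install paho-mqtt`):
#
#   import paho.mqtt.publish as publish
#   topic, payload = build_command(device_id, "on")
#   publish.single(topic, payload, hostname="broker.example.com")

print(build_command("42", "on"))  # -> ('devices/42/cmd', 'TurnON')
```

Keeping the topic/payload mapping in one helper makes it easy to reuse from both the "turn ON" view and any other command endpoints you add later.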
You can handle the task using JavaScript. I have experience implementing the MQTT protocol in Django and Django REST framework projects using JavaScript. You embed a JavaScript code block in your frontend file (in my case, HTML). First, include the Paho MQTT JavaScript client in your file:
<script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/paho-mqtt/1.0.2/mqttws31.min.js"></script>
Then add this block of code.
// Parameters
var hostname = "mqtt.eclipse.org"; // There are different brokers. Enter your broker's hostname.
var port = 80; // The port number can differ based on a TLS or non-TLS connection.
var ClientID = "ClientID_" + parseInt(Math.random() * 100);

// Create a client instance
var client = new Paho.MQTT.Client(hostname, Number(port), ClientID);

// Set callback handlers
client.onConnectionLost = onConnectionLost;
client.onMessageArrived = onMessageArrived;

// Connect the client
client.connect(
    {onSuccess: onConnect}
);

// Called when the client connects
function onConnect() {
    // Once a connection has been established, make a subscription
    console.log("onConnect");
    client.subscribe("subTopic");
    alert("Connected.");
}

// Called when the client loses its connection
function onConnectionLost(responseObject) {
    if (responseObject.errorCode !== 0) {
        console.log("onConnectionLost: " + responseObject.errorMessage);
    }
}

// Called when a message arrives
function onMessageArrived(message) {
    console.log("Message arrived: topic=" + message.destinationName + ", message=" + message.payloadString);
    if (message.destinationName == "subTopic") {
        // Do something
    }
}
By using this code your application will be connected to the broker and listen to one or multiple topics, which means you can get the sensors' data in real time. This requires publishing the sensors' data from your hardware device, say an ESP module or a Raspberry Pi.
It is very likely that you also want to send commands from your application to the actuators to turn them on or off. To do that, you need to publish messages from your application that your hardware will listen for.
Assume you have a toggle switch and want to publish a message by toggling it:
<label id="switch{{ device.unique_id }}" class="switch">
<input id="state{{ device.unique_id }}" type="checkbox" onclick="publish('{{ device.unique_id }}')">
<span class="slider round"></span>
</label>
The HTML block above should reside inside a Django {% for %} block. Then write the publish onclick function that is called on toggling the switch. You can see an example of such a function below.
function publish(x) {
    if (!client) {
        return;
    }
    // The checkbox id is "state" plus the device's unique id (see the HTML above).
    var checkbox = document.getElementById("state" + x);
    var payload = checkbox.checked ? "TurnON" : "TurnOFF";
    var message = new Paho.MQTT.Message(payload);
    message.destinationName = "pubTopic";
    client.send(message);
}
x in the publish function is the unique id that is embedded in the HTML file. To receive your published messages, listen to the specific topic(s) on your hardware device.
I'm trying to get a variable into my consumers.py to send data to the client in real time, as a function makes API calls and returns the result to the browser.
I know Channels needs Redis to function, but why? Why can't we just pass a list, as it's built, to the consumers class, or any other variable for that matter? From another answer: to store the necessary information required for different instances of consumers to communicate with one another. But what if I will only ever use one WebSocket connection, and only one user is allowed to be logged in at a time? This will be locally hosted only, and the function that returns the data lives outside consumers.py, so subscribing to groups may be where I need Redis.
Am I missing something, or is Redis/Memurai a must here? I can't help but feel there's an easier way.
I ended up using server-sent events (SSE), specifically django-eventstream, and so far it has worked great since I didn't need the client to interact with the server; for a chat application this would not work.
django-eventstream creates an endpoint at /events/ that the client can connect to and receive a streaming HTTP response from.
Sending data from externalFunc.py:
send_event('test', 'message', {'text': 'Hello World'})
Event listener in HTML page:
var es = new ReconnectingEventSource('/events/');
es.addEventListener('message', function (e) {
    console.log(e.data);
    var para = document.createElement("p");
    const obj = JSON.parse(e.data);
    para.innerText = obj.text;
    document.body.appendChild(para);
}, false);
es.addEventListener('stream-reset', function (e) {
}, false);
From a web application, I am making a request to the backend application (Python with Flask and flask-socketio). From a route on the backend, an emit is done to a standalone Socket.IO client application. This works fine, but when the client app sends back a message directly afterwards, I want to retrieve that message and return it from my route to the web application. The message I get back from the client via a callback arrives asynchronously, so how, in the simplest manner, can this be achieved? Each time I fetch the message from the client, the route has already sent back a reply to the web app without the message.
I fully understand that this flow is not the usual one, but can it be achieved without saving the message to a database, instead storing it somewhere on the backend and sending it back to the web app?
You can use an Event object from the Python standard library.
from threading import Event
my_event = Event()
In your Flask route:
my_event.wait() # block until the event is signaled
return socketio_response
In your Socket.IO callback function:
socketio_response = data
my_event.set() # alert the route that a result is now available
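Putting the pieces together, here is a self-contained sketch of the pattern. The spawned thread stands in for the Socket.IO callback, and in a real app you would want one Event per request plus a timeout so a lost client cannot hang the route forever; all names here are made up for illustration:

```python
from threading import Event, Thread
import time

result = {}
my_event = Event()

def socketio_callback(data):
    # Runs asynchronously when the client replies.
    result['value'] = data
    my_event.set()  # alert the waiting route that a result is available

def flask_route():
    # Simulated route body: after emitting to the client, block until
    # the callback fires, but give up after 5 seconds.
    if my_event.wait(timeout=5):
        return result['value']
    return None  # client never answered

# Simulate the client replying 0.1 s later.
Thread(target=lambda: (time.sleep(0.1), socketio_callback("pong"))).start()
print(flask_route())  # -> pong
```

The timeout is the important detail: without it, a client that disconnects before answering would leave the request blocked indefinitely.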
I want to implement an instant messaging server using Flask + Flask-SocketIO, with the client side on a mobile phone (front end in Ionic 2).
I have already tried different chat room examples with Socket.IO, but I wonder how to manage multiple users chatting two by two.
I'm not yet familiar with instant messaging architectures, so I have several questions on the subject:
First of all, is Flask a good framework for implementing instant messaging for a mobile phone application?
I started with Flask because it seems powerful and not as heavy as Django can be.
In an instant messaging app with Socket.IO, how can I connect users two by two?
I tried this code, but it works for multiple users in the same chat room:
On the client side:
<script type="text/javascript">
$(document).ready(function() {
var socket = io.connect("http://127.0.0.1:5000");
socket.on('connect', function() {
console.log('connected')
});
socket.on('message',function(msg){
$("#messages").append('<li>' + msg + '</li>');
});
$("#sendButton").on('click', function() {
console.log($('#myMessage').val());
socket.send({ 'author': 'Kidz55',
'message': $('#myMessage').val()});
$('#myMessage').val('');
});
});
</script>
On the server side:
@socketio.on('message')
def handle_json(json):
    print('received json: ' + str(json))
    # broadcast to everyone who is connected
    send(json, broadcast=True)
Is it scalable, and does it support heavy traffic?
In an instant messaging app with Socket.IO, how can I connect users two by two?
If it is always going to be two users chatting, then they can send direct messages to each other. When a client connects, it gets assigned a session id, or sid. If you keep track of these ids and map them to your users, you can then send a message to specific users. For example, if you store the sid value for a user in your user database, you can then send a direct message to that user as follows:
emit('private_message', {'msg': 'hello!'}, room=user.sid)
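The sid bookkeeping described above can be sketched without any Socket.IO machinery; in a real Flask-SocketIO handler you would read `request.sid` inside the `connect` event, and the dict here stands in for whatever user store you have (all names are hypothetical):

```python
# Hypothetical sid registry mapping usernames to Socket.IO session ids.
user_sids = {}

def on_connect(username, sid):
    # In a real Flask-SocketIO handler, sid would come from request.sid.
    user_sids[username] = sid

def private_message_room(username):
    # The value to pass as emit(..., room=...) for a direct message.
    return user_sids.get(username)

on_connect("alice", "abc123")
print(private_message_room("alice"))  # -> abc123
```

Storing the sid in the user database instead of a dict has the advantage of surviving a server restart and working across multiple worker processes.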
Is it scalable, and does it support heavy traffic?
There are many factors that influence how much traffic your server can handle. The Flask-SocketIO server is scalable, in the sense that if a single process cannot handle the traffic, you can add more processes, basically giving you a lot of room to grow.
I am a newbie to Django and Python. What I need is to connect more than one Django server with a socket. One of these servers (the main server) gets a request from a mobile client through a Django REST API and should then transmit it to one of the other Django servers, selected by a server ID. (E.g., when the main server gets data with ID 1, it should transmit the data to server #1; if it gets data with ID 2, it should transmit it to server #2.)
I am looking forward to your advice.
P.S. HTTP requests cannot be sent to the Django servers except the main one. Each of them is an intranet application, and their locations are different. The only way to send data to these servers via HTTP is to send the request to the main server with the ID of the target server.
If you are not able to send an (internal) HTTP request, not even to localhost, you can try to speak to the WSGI API of the different Django apps. The main app can import the other app's WSGI application object and call it with pseudo request data.
# views.py of the main server
def myview(request):
    # do some stuff
    if server_id == 1:
        from server_1_app.wsgi import application
        # environ and start_response have to be constructed by hand
        response = application(environ, start_response)
    # ...
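As a sketch of what "pseudo request data" means here: the standard library's wsgiref can fill in a minimal environ, and start_response is just a callable you supply. The demo app below is a stand-in for server_1_app.wsgi.application:

```python
from wsgiref.util import setup_testing_defaults

def call_wsgi_app(application, path="/internal/"):
    # Build a pseudo request environ and invoke the WSGI app directly,
    # capturing the status line and body it produces.
    environ = {"PATH_INFO": path, "REQUEST_METHOD": "GET"}
    setup_testing_defaults(environ)  # fills in the remaining required keys
    captured = {}

    def start_response(status, headers):
        captured["status"] = status
        captured["headers"] = headers

    body = b"".join(application(environ, start_response))
    return captured["status"], body

# Stand-in for server_1_app.wsgi.application:
def demo_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello from server 1"]

print(call_wsgi_app(demo_app))  # -> ('200 OK', b'hello from server 1')
```

Note that this only works when both apps run in the same Python process; for apps on different hosts you would still need some transport between them.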
My Setup:
I have an existing python script that is using Tweepy to access the Twitter Streaming API. I also have a website that shows aggregate real-time information from other sources from various back-ends.
My Ideal Scenario:
I want to publish real-time tweets as well as real-time updates of my other information to my connected users using Socket.IO.
It would be really nice if I could do something as simple as an HTTP POST (from any back-end) to broadcast information to all the connected clients.
My Problem:
The Socket.IO client implementation is super straightforward... I can handle that. But I can't figure out whether the functionality I'm asking for already exists... and if not, what would be the best way to make it happen?
[UPDATE]
My Solution: I created a project called Pega.IO that does what I was looking for. Basically, it lets you use Socket.IO (0.8+) as usual, but you can use HTTP POST to send messages to connected users.
It uses the Express web server with a Redis back-end. Theoretically this should be pretty simple to scale -- I will continue contributing to this project going forward.
Pega.IO - github
To install on Ubuntu, you just run this command:
curl http://cloud.github.com/downloads/Gootch/pega.io/install.sh | sh
This will create a Pega.IO server that is listening on port 8888.
Once you are up and running, just:
HTTP POST http://your-server:8888/send
with data that looks like this:
channel=whatever&secretkey=mysecret&message=hello+everyone
That's all there is to it. HTTP POST from any back-end to your Pega.IO server.
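For reference, building that POST from a Python backend needs nothing beyond the standard library; the hostname and field values below are just the placeholders from the example above:

```python
from urllib.parse import urlencode
from urllib.request import Request

def build_pega_post(server, channel, secretkey, message):
    # urlencode produces exactly the form body shown above
    # (spaces become '+').
    data = urlencode({"channel": channel,
                      "secretkey": secretkey,
                      "message": message}).encode()
    return Request("http://%s:8888/send" % server, data=data)

req = build_pega_post("your-server", "whatever", "mysecret", "hello everyone")
print(req.data)  # -> b'channel=whatever&secretkey=mysecret&message=hello+everyone'
# Sending it would then be: urllib.request.urlopen(req)
```

Because a Request with a data payload defaults to POST, no method needs to be set explicitly.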
The best way I've found for this sort of thing is using a message broker. Personally, I've used RabbitMQ for this, which seems to meet the requirements mentioned in your comment on the other answer (Socket.IO 0.7 and scalable). If you use RabbitMQ, I'd recommend the amqp module for Node, available through npm, and the pika module for Python.
An example connector for Python using pika. This example accepts a single JSON-serialized argument:
def amqp_transmit(message):
    connection = pika.AsyncoreConnection(pika.ConnectionParameters(host=settings.AMQP_SETTINGS['host'],
                                                                   port=settings.AMQP_SETTINGS['port'],
                                                                   credentials=pika.PlainCredentials(settings.AMQP_SETTINGS['username'],
                                                                                                     settings.AMQP_SETTINGS['pass'])))
    channel = connection.channel()
    channel.exchange_declare(exchange=exchange_name, type='fanout')
    channel.queue_declare(queue=NODE_CHANNEL, auto_delete=True, durable=False, exclusive=False)
    channel.basic_publish(exchange=exchange_name,
                          routing_key='',
                          body=message,
                          properties=pika.BasicProperties(content_type='application/json'))
    print ' [%s] Sent %r' % (exchange_name, message)
    connection.close()
Very basic connection code on the node end might look like this:
var connection = amqp.createConnection(
{host: amqpHost,
port: amqpPort,
password: amqpPass});
function setupAmqpListeners() {
connection.addListener('ready', amqpReady)
connection.addListener('close', function() {
console.log('Uh oh! AMQP connection failed!');
});
connection.addListener('error', function(e) {throw e});
}
function amqpReady(){
console.log('Amqp Connection Ready');
var q, exc;
q = connection.queue(queueName,
{autoDelete: true, durable: false, exclusive: false},
function(){
console.log('Amqp Connection Established.');
console.log('Attempting to get an exchange named: '+exchangeName);
exc = connection.exchange(exchangeName,
{type: 'fanout', autoDelete: false},
function(exchange) {
console.log('Amqp Exchange Found. ['+exchange.name+']');
q.bind(exc, '#');
console.log('Amqp now totally ready.');
q.subscribe(routeAmqp);
}
);
}
);
}
routeAmqp = function(msg) {
console.log(msg);
doStuff(msg);
}
Edit: The example above uses a fan-out exchange that does not persist messages. A fan-out exchange is likely your best option, since scalability is a concern (i.e., you are running more than one box running Node that clients can connect to).
Why not write your Node app so that there are two parts:
The Socket.IO portion, which communicates directly with the clients, and
An HTTP API of some sort, which receives POST requests and then broadcasts appropriate messages with Socket.IO.
In this way, your application becomes a "bridge" between your non-Node apps and your users' browsers. The key here is to use Socket.IO for what it was made for--real time communication to browsers--and rely on other Node technologies for other parts of your application.
[Update]
I'm not on a development environment at the moment, so I can't get you a working example, but some pseudocode would look something like this:
http = require('http');
io = require('socket.io');

server = http.createServer(function(request, response) {
    // Parse the HTTP request to get the data you want
    socket_server.sockets.emit("data", whatever); // broadcast the data to Socket.IO clients
});
server.listen(8080);
socket_server = io.listen(server);
With this, you'd have a web server on port 8080 that you can use to listen to web requests (you could use a framework such as Express or one of many others to parse the body of the POST request and extract the data you need).