I've made a Flask application and I'm using it to push out data on certain events, through Pusher's Python client.
My challenge is now to implement a method that can listen for events on a presence channel. From what I could gather by reading the source of the Python client, that package doesn't offer much for subscribing. Are there any other libraries I could use? Alternatively, are there any implementations I could look at to figure out how it's done? Thanks
You can receive events via WebHooks:
http://pusher.com/docs/webhooks#presence
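Since you're already on Flask, a rough sketch of a webhook receiver could look like the following. The route name is an assumption; Pusher POSTs a JSON body of the form {"time_ms": ..., "events": [...]}, and in production you should also verify the X-Pusher-Key / X-Pusher-Signature headers before trusting the payload:
# Rough sketch of a presence webhook receiver (route name is an assumption).
from flask import Flask, request

app = Flask(__name__)

@app.route("/pusher/webhook", methods=["POST"])
def pusher_webhook():
    payload = request.get_json()
    for event in payload.get("events", []):
        # Presence channels fire member_added / member_removed events.
        if event.get("name") == "member_added":
            print("joined:", event.get("user_id"), event.get("channel"))
        elif event.get("name") == "member_removed":
            print("left:", event.get("user_id"), event.get("channel"))
    return "", 200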
Or you could use the Pusher Python Client (which handles subscription functionality rather than publishing):
https://github.com/ekulyk/PythonPusherClient
The officially supported option would be to use WebHooks.
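If you do go the client route, a rough sketch based on the pusherclient README might look like this (presence channels additionally require authentication, which is omitted here, and the channel/event names are assumptions):
# Rough sketch using ekulyk's pusherclient; presence-channel auth omitted.
import time
import pusherclient

def on_member_added(data):
    print("member added:", data)

def on_connect(data):
    channel = pusher.subscribe("presence-demo")  # channel name is an assumption
    channel.bind("pusher:member_added", on_member_added)

pusher = pusherclient.Pusher("your-app-key")
pusher.connection.bind("pusher:connection_established", on_connect)
pusher.connect()

while True:
    time.sleep(1)  # keep the main thread alive; the client runs in its own thread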
Maybe this is simple, but I haven't found a satisfactory answer yet.
I have a Python application that collects data over a CAN bus (temperature, weight, ...) and I want to visualize it with Angular.
On the one side I wrote the Python application that cyclically reads the CAN bus data and writes it to the console, and on the other side I wrote a small Angular application that, as a first step, contains a simple table.
Now I want to fill the table every 10 seconds with data from the Python application instead of printing it to the console.
How can I connect the two?
My first thought was a simple file where I save the values from Python and read them with Angular.
The second solution is a database, but I think that is too much for only a few values.
So is there a direct way to access the Python data from Angular?
The basic idea is to create an API in Python and let Angular consume it.
Then there is the question of whether you want to keep backup data in Python; if so, save it to a database or file and use that as the response for Angular.
If you want to do some fancy real-time stuff, maybe look into long polling or an HTTP event stream.
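A minimal sketch of the API idea, assuming Flask; read_can_values() is a placeholder for your CAN-reading code, and the Angular table would simply poll GET /api/values every 10 seconds with HttpClient:
# Minimal sketch: expose the latest CAN readings over HTTP for Angular to poll.
from flask import Flask, jsonify

app = Flask(__name__)

def read_can_values():
    # Placeholder: return the most recent readings from your CAN loop.
    return {"temperature": 21.5, "weight": 42.0}

@app.route("/api/values")
def values():
    return jsonify(read_can_values())

if __name__ == "__main__":
    app.run(port=5000)
On the Angular side, interval(10000) piped into an HttpClient.get call (or a plain setInterval) is enough to refresh the table.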
There are several ways you can access Python data from an Angular application:
One way is to use a REST API. You can create a REST API in Python using a web framework like Flask or Django, and then use Angular's HTTP client to make requests to the API and retrieve the data.
Another option is to use WebSockets. You can use a Python library like websockets (built on asyncio) to set up a WebSocket server, and then use Angular's WebSocket client to connect to the server and receive updates in real time.
You can also use a message queue like RabbitMQ or ZeroMQ to allow your Python and Angular applications to communicate with each other asynchronously.
Overall, the best approach will depend on your specific requirements and how you want to structure your application. A REST API is a good choice if you need to retrieve data from the Python application on demand, while WebSockets or a message queue can be used for real-time communication and updates.
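For the WebSocket option, a minimal sketch with the websockets package could look like this; the hard-coded reading is a placeholder for real CAN data, and the single-argument handler assumes websockets >= 10:
# Minimal sketch: push a reading to each connected client every 10 seconds.
import asyncio
import json
import websockets

async def handler(websocket):  # websockets >= 10; older versions pass (websocket, path)
    while True:
        reading = {"temperature": 21.5, "weight": 42.0}  # placeholder for CAN data
        await websocket.send(json.dumps(reading))
        await asyncio.sleep(10)

async def main():
    async with websockets.serve(handler, "localhost", 8765):
        await asyncio.Future()  # run forever

asyncio.run(main())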
I am designing a microservice-based system that will be accessed from an application via a Python SDK.
The SDK will be used to access machine learning models hosted as microservices in a backend system.
I understand that gRPC relies on protobuf, which is lightweight and supports streams. We do need to send data vectors to the hosted models, so protobuf is appealing.
The question is more about gRPC itself, as I understand it uses HTTP/2.
gRPC seems useful for accessing an API hosted on the backend; however, there is also a need to keep an open, live connection to receive general update events from the server.
For example, I would like to have a "context" where incoming events can arrive, mainly for asynchronous communication: after several model invocations are performed, the system might use aggregated data on the backend and send an event when a prediction crosses some threshold.
Hypothetical usage example:
ctx = BackendSystemSDK.connect(APIGatewayHost, port)
ctx.registerCallback(SomeAlertCallbackFunc)
...
ctx.applyModel('model1', SomeVector1)
ctx.applyModel('model2', SomeVector2)
...
# when the program terminates
ctx.close()
I played with gRPC and successfully tested a client and server in Python.
However, I am not sure how to implement the "context" open-connection idea with gRPC.
It is more like a pub/sub concept, but I could not find any examples of locally testing such an architecture. I did see examples for Google Cloud, but they are not relevant, as I want to host everything on-premise using Kubernetes for scale.
The system should support a heavy load of requests, for example processing multiple incoming video streams per frame, so 24 requests per minute for 20-50 cameras (or more) could easily be the case.
Is gRPC a good fit for such a scenario, not only for inter-microservice communication but also for the main API access gateway?
And for the "context", maybe it should be implemented as a live-connection part with WebSockets or another protocol? I really want to simplify the development and use a single technology.
I could not fully understand from the documentation whether HTTP/2 and gRPC support a long-running, bidirectional open connection.
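For what it's worth, gRPC does support this: a server-streaming (or bidirectional-streaming) RPC keeps a single HTTP/2 stream open for the lifetime of the call. A rough sketch of the client side, assuming a hypothetical proto with rpc Subscribe(SubscribeRequest) returns (stream Event) and rpc ApplyModel(ModelRequest) returns (ModelReply), compiled into backend_pb2 / backend_pb2_grpc:
# Rough sketch; backend_pb2 / backend_pb2_grpc are hypothetical modules
# generated from a proto with a server-streaming Subscribe RPC.
import threading
import grpc
import backend_pb2
import backend_pb2_grpc

def listen_for_events(stub, callback):
    # The streaming call stays open; iterating blocks until the server
    # pushes the next event, giving you the long-lived "context".
    for event in stub.Subscribe(backend_pb2.SubscribeRequest()):
        callback(event)

channel = grpc.insecure_channel("localhost:50051")
stub = backend_pb2_grpc.BackendStub(channel)
threading.Thread(target=listen_for_events, args=(stub, print), daemon=True).start()

# Unary calls multiplex over the same HTTP/2 connection while the stream is open.
reply = stub.ApplyModel(backend_pb2.ModelRequest(model="model1"))
The thread iterating the stream plays the role of the "context"; registering a callback is just passing it into that loop.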
I am having a little trouble with a server/client design and wonder if anyone has any advice.
I have a Thrift server that abstracts a data store. The idea is that there will be a number of clients that are essentially out-of-process plugins, which use the interface provided by the server to receive and manipulate the underlying data store and also provide their own data.
There will be a number of other clients which simply access the data provided by the server and its "plugins".
The problem case is when one of these "plugins" wishes to provide its own data and provide an interface to that data. The server should have no knowledge of the plugin's data or interface.
I would ideally like all clients to access functionality through the main Thrift server so it acts as a facade for the plugins. If a client requested some data provided by a plugin, the main server could delegate to the plugin to provide that data. I guess this would mean having each plugin be both a Thrift client and a server. I have written the server in Python, so I could probably handle Thrift calls that are not yet defined, but would it be possible to forward these calls to another Thrift server, i.e. act as a proxy?
An alternative is maybe to have the plugins be clients only and push data to the server. But the format of these messages would be unknown to the server and would have to be generic enough to accommodate different types of data. I'm not sure how I would provide a useful interface to this data for other clients. As far as I can see, only the plugin knows how to store and manipulate the data it owns, so this idea probably would not work.
Thanks for any advice. Any suggestions welcomed.
Sounds like you need some sort of mechanism to correlate requests with the different plugins available. Ideally, there would be a different URL path per set of operations published by each plugin.
I would consider implementing a map/dictionary of URL paths to plugins. Then, for each request received, do a lookup in the map, get the associated plugin, and forward it the request accordingly. If there is no entry in the map, a redirect/proxy response could be sent. For example, if the URL is http://yourThriftServer/path/operation, the operation (or the path and operation together) would map to a plugin.
An extra step would be to implement a sort of meta request, whereby a client could query which URL paths/operations are available on the server.
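A minimal sketch of that lookup idea in Python (the names and the call-forwarding are hypothetical; in a real system each registered value would be a Thrift client connected to that plugin's own server):
# Hypothetical sketch: the facade keeps a registry of operation paths and
# forwards each request to the plugin that registered it.
class PluginRegistry:
    def __init__(self):
        self._routes = {}  # e.g. "weights/get" -> plugin client

    def register(self, path, plugin_client):
        self._routes[path] = plugin_client

    def dispatch(self, path, *args, **kwargs):
        plugin = self._routes.get(path)
        if plugin is None:
            raise LookupError(f"no plugin registered for {path!r}")
        return plugin.call(path, *args, **kwargs)  # call() is a placeholder

registry = PluginRegistry()
# registry.register("weights/get", weights_plugin_client)  # at plugin startup
# result = registry.dispatch("weights/get", item_id=7)     # per client request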
What are some realtime "push" options for Django that install as a Python package? I want to avoid things like having to install independent web servers for realtime.
Essentially I am looking for something like pusher.com (a cloud system) or this socket.io build for Django (which has a build status of failing) for chat and various other push operations.
Ape was suggested here, but it seems it requires you to set up Ape as a server. If it's not too much to ask, are there any solutions that build right into Django?
Since the time the original answer was written (2012), a lot has changed.
The preferred method now for doing realtime updates of a system is WebSockets, standardized as RFC 6455. This page on MDN has a great overview of the technology.
The other emerging technology is Server-Sent Events, which is a W3C draft proposal.
Projects such as swampdragon and django-socketio make it easier to integrate realtime functionality into your project.
There are two main components to any realtime system:
A connection that remains open from the browser to a server.
A server that listens on this connection and then responds.
A system / standard to store and be notified of messages.
Okay so maybe three components.
Since Django doesn't work in realtime, any solution that offers realtime push/updates will require another server/service to accept messages and then notify listeners of pending messages.
Django would be the application that pushes (writes) messages to this server on a channel (a queue/bucket). Listeners then subscribe to a channel to be notified of messages. Since the connection remains open, messages are retrieved in "realtime".
Django really has a minimal role in all of this. There are various implementations that provide the three components necessary for realtime notifications to work.
I really like juggernaut because it is super simple to set up, and it uses node.js, which doesn't require a lot in terms of server-side components. The other reason I prefer it is because it supports Adobe Flash Socket in addition to WebSocket (and others, see the link).
The API to access it is also very simple. In fact, if you are already using redis (which you really should, since it's so easy to use), you don't need another API, as you can drop messages into redis and juggernaut will read them; or you can use its Python API. A simple example from this flask snippet:
Send (write) a message to a channel:
>>> from juggernaut import Juggernaut
>>> jug = Juggernaut()
>>> jug.publish('channel', 'The message')
Listen to it:
<script type="text/javascript"
  src="http://localhost:8080/application.js"></script>
<script type="text/javascript">
  var jug = new Juggernaut();
  jug.subscribe('channel', function(data) {
    alert('Got message: ' + data);
  });
</script>
Django is built to serve web pages, and there is nothing out of the box to support WebSockets in Django. The quickest/easiest option is pusher.com (I use it and really like it). You can start with something like pusher.com and, if you write a quick wrapper around it, you can later replace it with your own server using socket.io or any other WebSocket server by just changing the wrapper/interface that connects to the new server. Just make sure you write it so you can switch out the backend at any time, as in the sketch below.
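A rough sketch of that wrapper idea (the class names are made up; pusher's Python client really does expose a trigger() call):
# Hypothetical wrapper so the push backend can be swapped without touching
# the rest of the Django code.
class PushBackend(object):
    def publish(self, channel, event, payload):
        raise NotImplementedError

class PusherBackend(PushBackend):
    def __init__(self, client):
        self._client = client  # a configured pusher.Pusher instance

    def publish(self, channel, event, payload):
        self._client.trigger(channel, event, payload)

# Later, a SocketIOBackend (or anything else) can implement the same
# publish() signature and be dropped in via settings/configuration.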
If you really want to run your own socket server, there are projects out there that make it easy to use sockets in Django:
django-websocket
django-socketio
You can actually serve up Django from Tornadio2, a working implementation of socketio in Tornado. If you want to build any degree of sophistication into your realtime app, you will likely need a redis pub/sub backend that maps sessions to channels and handles multicasting. For this you might like to take a look at Brukva. Also read Yuval Adam's blog post on this subject. Finally, Tony Abou Assaleh's sample package and post provide a useful base reference when setting up tornadio2 for Django.
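For reference, the redis pub/sub piece on its own is tiny. A sketch with redis-py, assuming a local redis instance, where Django publishes and the socket-server process listens:
# Sketch: redis pub/sub as the messaging backbone (assumes redis on localhost).
import redis

r = redis.Redis()

# Publisher side (e.g. inside a Django view): write a message to a channel.
r.publish("notifications", "new message for user 42")

# Listener side (e.g. inside the socket server process):
p = r.pubsub()
p.subscribe("notifications")
for message in p.listen():
    if message["type"] == "message":
        print(message["data"])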
I'm interested in something based on Jabber, but I didn't find a free/open-source option, so I'm thinking of writing one.
I've installed a Jabber server and am now thinking about the ways in which I can write the client. I'm thinking of one of these two methods:
1) An Ajax call made to a Jabber script running on the webserver that takes care of connecting to the server. But then I thought that, because of the dependencies involved in the Jabber client, it might end up consuming too much memory once a few clients connect.
2) The other method is to run a client as a daemon that takes care of all the heavy lifting. This way I need only one instance of the client, which sends a spoofed message (the sender's name being whatever the user entered on the site). A simple script running on the webserver talks to this daemon over some sort of API (XML-RPC or MessagePack, maybe?)
I think #2 is better, but I'm not sure. Are there other ways I can implement this? I'm considering using Perl or Python for this.
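A rough sketch of option 2 in Python (the XMPP send is stubbed out; a real client such as xmpppy or slixmpp would go there, and the name-spoofing approach is taken from the question):
# Rough sketch of option 2: one daemon owns the XMPP connection and exposes
# a tiny XML-RPC API to the web tier. xmpp_send is a stub.
from xmlrpc.server import SimpleXMLRPCServer

def xmpp_send(recipient_jid, body):
    # Stub: replace with a real XMPP client call (xmpppy, slixmpp, ...).
    print(f"-> {recipient_jid}: {body}")

def send_message(sender_name, recipient_jid, body):
    # The daemon logs in with a single account, so the visible sender
    # name is spoofed by prefixing it (as described in the question).
    xmpp_send(recipient_jid, f"{sender_name}: {body}")
    return True

server = SimpleXMLRPCServer(("localhost", 9000))
server.register_function(send_message)
server.serve_forever()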
Jabber is usually called XMPP nowadays, and there are dozens of clients and servers, something for every language. If you are using JavaScript (you mention Ajax), you probably want Strophe. Most servers are modular, so you only load the features you need (consider Tigase or ejabberd; on the Python side, xmpppy is a client library). Writing your own is even worse an idea than it sounds.
BOSH
Install Prosody, because it is really easily installed and has BOSH support built in. You could skip this, but then you need to find out how to use BOSH with ejabberd.
Use strophe.js to implement this (using BOSH). New browsers support cross-domain requests (CORS; read the Proxy-less BOSH part). For old browsers you could use a proxy, or use Flash in the middle as a proxy.
Read Professional XMPP Programming with JavaScript and jQuery to learn Strophe. It even has chapters explaining how to create a chat.
Node.js
Or you could consider installing node.js to create your chat system using socket.io.