Recently I've been doing a lot of testing around different ways of serving our Django application. I've settled on uwsgi as it seems to fit our needs pretty well.
I've recently discovered that uwsgi also supports WebSockets and started looking into it and found some examples: https://github.com/unbit/uwsgi/blob/master/tests/
After running the example (websockets_chat.py) and taking a look through uwsgi's documentation for its websockets implementation, it appears as though you can only send broadcast (global) messages.
Has anyone managed to find a way to transmit a message to a particular user or does uwsgi not support that level of communication yet?
Cheers
There is nothing like broadcast or global messages in the WebSocket spec. It only "upgrades" an HTTP connection to a lower-level one; what you do with that connection is up to you. The examples show integration with Redis as a message exchanger, but you are free to use other approaches.
For your specific case you will need to build a shared list of connected users and implement routing. Remember, you cannot rely on the node.js way of doing things: node is based on a single-threaded setup, so everything there is much simpler. In uWSGI a websocket connection can happen in a thread, a process or a coroutine, so exchanging data between them is the key.
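For example, a minimal per-user routing sketch along those lines, assuming a Redis backend and uWSGI running with an async engine such as gevent; the "user:<id>" channel naming and the query-string user id are purely illustrative, not part of uWSGI itself:

import uwsgi
import redis

def application(env, start_response):
    # complete the websocket handshake for this request
    uwsgi.websocket_handshake(env['HTTP_SEC_WEBSOCKET_KEY'],
                              env.get('HTTP_ORIGIN', ''))
    user_id = env['QUERY_STRING']            # hypothetical: ws://host/?alice
    r = redis.StrictRedis()
    channel = r.pubsub()
    channel.subscribe('user:%s' % user_id)   # one Redis channel per connected user

    while True:
        # anything published to this user's channel goes down their socket
        msg = channel.get_message(timeout=0.1)
        if msg and msg['type'] == 'message':
            uwsgi.websocket_send(msg['data'])
        # non-blocking read keeps the connection alive and picks up client messages
        incoming = uwsgi.websocket_recv_nb()
        if incoming:
            # illustrative routing: treat the payload as "<target_user> <text>"
            target, _, text = incoming.decode().partition(' ')
            r.publish('user:%s' % target, text)

Any other worker (or a plain script) can then reach one particular user simply by publishing to that user's channel.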
Related
I am facing a situation where I have an Express server and a Flask server, each responsible for various tasks. We are piping a request from Express through to the Flask server, and would like to use sockets to provide heartbeat-style updates from the Flask server to the Express server.
Is it possible to use sockets like this? I admit to having never really used sockets for backend stuff before. I've used Socket.io to connect React-based sites with an Express backend, but I'm not sure how to connect two servers like this.
Any help would be greatly appreciated.
There is a Flask extension for Socket.io.
Even though the blurb text says "Flask-SocketIO gives Flask applications access to low latency bi-directional communications between the clients and the server", "clients" doesn't have to mean "frontend".
Since you've already used Socket.io and websockets, you might see if that package meets your needs. It's certainly easier than reaching immediately for Unix sockets if it turns out you don't have to. :) (A rough sketch follows after the note and links below.)
NOTE: Flask is not concurrent. It can't handle more than one request at a time by default, because it runs on a single thread and doesn't do async/await stuff. This is more broadly a Python/WSGI problem than specifically a Flask problem. Depending on what you do, this may become a bottleneck in your app.
Automatic threading with Flask
WSGI is synchronous
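To make the Flask-SocketIO suggestion concrete, here is a rough sketch of the Flask side; the 'heartbeat' event name, the 2-second interval and the port are assumptions, and the Express side would connect with a standard socket.io client:

from flask import Flask
from flask_socketio import SocketIO

app = Flask(__name__)
socketio = SocketIO(app)

def heartbeat_task():
    # push a periodic status update to every connected client
    while True:
        socketio.emit('heartbeat', {'status': 'alive'})
        socketio.sleep(2)

if __name__ == '__main__':
    socketio.start_background_task(heartbeat_task)
    socketio.run(app, port=5000)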
Status Quo:
I have two Python apps (frontend-server and data-collector; a database sits 'between' them).
Currently I'm using Redis as the database, and its publish/subscribe mechanism to notify the frontend when new data is available.
But I may want to use a different database (and I don't want to keep Redis on the system just for the pub/sub).
Are there any simple alternatives for notifying my frontend when the data-collector has committed new data to the database (without using an external message queue like beanstalkd or Redis)?
ZeroMQ is a good option. It has good Python bindings, and it makes communicating between processes on the same machine and processes on different machines look almost identical.
Start by reading the guide: http://zguide.zeromq.org/page:all
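For example, a rough sketch of that notification with pyzmq's PUSH/PULL sockets; the port and the payload are assumptions:

import zmq

def frontend_wait():
    # frontend side: bind and block until the collector says something changed
    ctx = zmq.Context()
    updates = ctx.socket(zmq.PULL)
    updates.bind('tcp://127.0.0.1:5557')
    while True:
        print(updates.recv_string())      # wakes up on every notification

def collector_notify():
    # data-collector side: push a notification after each write to the database
    ctx = zmq.Context()
    notify = ctx.socket(zmq.PUSH)
    notify.connect('tcp://127.0.0.1:5557')
    notify.send_string('new-data')

Pointing both sockets at a remote address instead of 127.0.0.1 is all it takes to run the two processes on different machines.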
As I mentioned in my comment, if you want something that goes across a network, then other than setting up a web service (a Flask app?) or writing your own INET socket server, there is nothing built into the operating system for communicating between machines. Beanstalk has a very simple Python API and I've used it for this kind of thing very successfully.
import beanstalkc

try:
    beanstalk = beanstalkc.Connection(host="my.host.com")
    beanstalk.watch("update_queue")   # only take jobs from this tube
except beanstalkc.SocketError:
    print("Error connecting to beanstalk")

while True:
    job = beanstalk.reserve()         # blocks until a job is available
    do_something_with_job(job)
If you are only going to be working on the same machine, then read up on Linux IPC. A socket connection between processes is very fast and has practically zero overhead. It can also be part of an asynchronous program when you take advantage of epoll callbacks.
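As a rough illustration of that same-machine case, a Unix domain socket notification between the two processes; the socket path and the message are assumptions:

import os
import socket

SOCK_PATH = '/tmp/collector.sock'     # hypothetical rendezvous point

def frontend_listen():
    # frontend: accept connections and read one-line notifications
    if os.path.exists(SOCK_PATH):
        os.unlink(SOCK_PATH)
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(SOCK_PATH)
    srv.listen(1)
    while True:
        conn, _ = srv.accept()
        print('notified:', conn.recv(1024).decode())
        conn.close()

def collector_notify():
    # data-collector: connect and send a short message after writing new data
    c = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    c.connect(SOCK_PATH)
    c.sendall(b'new-data')
    c.close()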
I am creating a web app which needs to continuously poll my Django web server to get updates. Is there a way to avoid this polling? For example, the server could send push messages on update, or the client could register a callback for an event and the server would trigger the callback whenever something changes.
I know there are signaling frameworks for ASP.NET etc., but I want something that works with Django.
Thanks
Fundamentally, WebSockets (part of HTML5) were designed for exactly this purpose, i.e. bi-directional communication between clients and servers over the HTTP protocol. While they are highly talked about, few application servers have implemented them and even fewer HTTP servers have actually begun supporting them.
While there are some packages:
django-websocket
django-socketio
that have enabled it in Django, they don't do anything about your HTTP server. You very rarely, if ever, run Django standalone: Django isn't very efficient at serving static content such as images or other static files, or at distributing the workload, so we rely on things like nginx and Apache for that. Unfortunately they don't support WebSockets yet, and as such they tend to break the communication between the client and the application server even if it is established in the first place, depending on the implementation.
From my own personal experience, nginx would break the communication after 60 seconds, since that was the default time allotted for any open connection.
As far as I know, node.js may currently be the best server for working with WebSockets.
Depending on what you are trying to achieve, if regular polling seems inefficient you can try long-polling: the connection is held open until there is new data to push back to the client, whereas regular polling happens at some fixed interval. Note that you may have to configure your HTTP server not to terminate long-lived open connections, and run Django multithreaded, since each connection will tie up an instance.
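A rough sketch of such a long-polling Django view; check_for_update() is a hypothetical helper, and the 30-second cap is an assumption:

import time
from django.http import JsonResponse

def updates(request):
    deadline = time.time() + 30                  # hold the request open for up to 30s
    while time.time() < deadline:
        data = check_for_update(request.user)    # hypothetical: returns new data or None
        if data is not None:
            return JsonResponse({'update': data})
        time.sleep(1)                            # each waiting client occupies a worker
    return JsonResponse({'update': None})        # client simply re-issues the request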
I am developing a testbed for a cloud computing environment. I want to establish multiple client connections to a server. What I want is this: the server first sends data to all the clients specifying a sending_interval, and then all the clients keep sending their data with a time gap of that interval (as specified by the server). Please help me out: how can I do this with a Python socket program? (i.e. I want multiple-client to single-server connectivity, with each client sending data at the interval specified by the server.) I will be grateful if anyone can help me. Thanks in advance.
This problem is easily solved by the ZeroMQ socket library. It is production stable. It allows you to define publisher-subscriber relationships, where a publishing process will publish data on a port regardless of how many (0 to infinite) listening processes there are. They call this the PUB-SUB model; it's in their docs (link below).
It sounds like you want to set up a bunch of clients that are all publishers. They can subscribe to a controlling channel, which will send updates to their configuration (how often to write). They also act as publishers, pushing out their own data at the interval specified over the default/config channel.
Then, you have one or more listening processes that listen to all the clients' published messages. Perhaps you could even have two listening processes, one for backup or DR, or whatever.
We're using ZeroMQ and loving the simplicity it gives; there are no connection errors because the publisher doesn't care if anyone is listening, and the subscriber can start before the publisher: if there's nothing to listen to, it just loops around and waits until there is.
Bindings are available in ALL languages (it's freaky). The Python binding isn't pure Python (it requires a C compiler), but it is frighteningly fast, and the pub/sub example is a cut-and-paste, 'golly, it works!' experience.
Link: http://zeromq.org
There are MANY other methods available with this library, including message queues, etc. They have relatively complete documentation, too.
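A rough sketch of that arrangement with pyzmq: a control PUB socket broadcasts the interval, the clients subscribe to it and publish their readings back. The ports, the 'interval 5' message format and the slow-joiner sleep are assumptions:

import time
import zmq

def controller():
    ctx = zmq.Context()
    config = ctx.socket(zmq.PUB)          # broadcast the sending interval to all clients
    config.bind('tcp://*:5555')
    collect = ctx.socket(zmq.SUB)         # listen to everything the clients publish
    collect.bind('tcp://*:5556')
    collect.setsockopt_string(zmq.SUBSCRIBE, '')
    time.sleep(1)                         # give subscribers time to connect (slow joiner)
    config.send_string('interval 5')      # tell every client to report every 5 seconds
    while True:
        print(collect.recv_string())

def client(name):
    ctx = zmq.Context()
    config = ctx.socket(zmq.SUB)
    config.connect('tcp://localhost:5555')
    config.setsockopt_string(zmq.SUBSCRIBE, '')
    out = ctx.socket(zmq.PUB)             # this client is itself a publisher
    out.connect('tcp://localhost:5556')
    interval = int(config.recv_string().split()[1])
    while True:
        out.send_string('%s: reading' % name)
        time.sleep(interval)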
Multi-client, single-server socket programming can be achieved with multithreading. I have implemented both methods:
Single Client and Single Server
Multiclient and Single Server
You can find both in my GitHub repo: https://github.com/shauryauppal/Socket-Programming-Python
What is multithreaded socket programming?
Multithreading is the practice of executing multiple threads concurrently within a single process.
To understand it better, you can visit this link: https://www.geeksforgeeks.org/socket-programming-multi-threading-python/, written by me.
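For instance, a minimal sketch of the one-thread-per-client approach, which also matches the question above; the port and the interval value are assumptions:

import socket
import threading

INTERVAL = b'5\n'                         # seconds between reports, sent to each client

def handle_client(conn, addr):
    conn.sendall(INTERVAL)                # first message: tell the client how often to send
    while True:
        data = conn.recv(1024)            # the client then reports every INTERVAL seconds
        if not data:
            break                         # empty read means the client disconnected
        print(addr, data.decode().strip())
    conn.close()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(('0.0.0.0', 9000))
server.listen()
while True:
    conn, addr = server.accept()
    threading.Thread(target=handle_client, args=(conn, addr), daemon=True).start()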
We have two Python programs running on two Linux servers. Now we want to send messages between these Python programs. The best idea so far is to create a TCP/IP server-and-client architecture, but this seems like a very complicated way to do it. Is this really best practice for such a thing?
I like ZeroMQ for simple messaging; it's really lightweight and fast, and very flexible as well. Using AMQP messaging isn't a bad idea either, depending on the specifics of your situation; I've found kombu to be a very nice Pythonic library for that. You could also use xmlrpclib, or set up a simple REST API with bottle or flask. Every option has its place, so I'd investigate all of them.
This really depends on the kind of messaging you want and the roles of the two processes. If it's a proper "client/server" setup, I would probably create a SimpleHTTPServer and then use HTTP to communicate between the two. You can also use xmlrpclib and its client to talk between them. Manually creating a TCP server with your own custom protocol sounds like a bad idea to me. You might also consider using a message queue system to communicate between them.
You could use multiprocessing.managers. As the docs say: "A manager object controls a server process which manages shared objects. Other processes can access the shared objects by using proxies."
In your case, you could create a master process that controls your other processes; each of those processes would call the master to grab the data.
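A rough sketch of that with multiprocessing.managers, which also works across the two machines in the question; the address, port and authkey are assumptions:

import queue
from multiprocessing.managers import BaseManager

class QueueManager(BaseManager):
    pass

def master():
    # runs on server A: expose a shared queue over the network
    shared = queue.Queue()
    QueueManager.register('get_queue', callable=lambda: shared)
    manager = QueueManager(address=('0.0.0.0', 50000), authkey=b'secret')
    manager.get_server().serve_forever()

def worker():
    # runs on server B: connect to the master and exchange messages via the proxy
    QueueManager.register('get_queue')
    manager = QueueManager(address=('server-a.example.com', 50000), authkey=b'secret')
    manager.connect()
    manager.get_queue().put('hello from B')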