Python message to other applications

Status Quo:
I have two Python apps (a frontend server and a data collector, with a database "between" them).
Currently I am using Redis as the database and its publish/subscribe mechanism to notify the frontend when new data is available.
But I may want to switch to a different database (and I don't want to keep Redis on the system just for the pub/sub).
Are there any simple alternatives to notify my frontend when the data collector has committed new data to the database (without using an external message queue like beanstalkd or Redis)?

ZeroMQ is a good option. It has good Python bindings, and it makes communicating between processes on the same machine and processes on different machines look almost identical.
Start by reading the guide: http://zguide.zeromq.org/page:all
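To give a feel for how it would fit your setup, here is a minimal PUB/SUB sketch with pyzmq; the endpoint, topic, and handler names are assumptions for illustration, not something from your code:

# notifier side -- runs inside the data collector (endpoint and topic are placeholders)
import zmq

ctx = zmq.Context()
pub = ctx.socket(zmq.PUB)
pub.bind("tcp://127.0.0.1:5556")          # or "ipc:///tmp/updates" on the same machine

def notify_new_data(record_id):
    # tell any listening frontend that a new row was written
    pub.send_string("new_data %s" % record_id)

# listener side -- runs inside the frontend server
import zmq

ctx = zmq.Context()
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:5556")
sub.setsockopt_string(zmq.SUBSCRIBE, "new_data")

while True:
    msg = sub.recv_string()               # blocks until the collector publishes
    refresh_frontend(msg)                 # hypothetical handler in the frontend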

As I mentioned in my comment, if you want something that will go across a network, then other than setting up a web service (a Flask app?) or writing your own INET socket server, there is nothing built into the operating system for communicating between machines. Beanstalkd has a very simple API in Python and I've used it for this kind of thing very successfully.
import beanstalkc

try:
    beanstalk = beanstalkc.Connection(host="my.host.com")
    beanstalk.watch("update_queue")          # the tube the data collector writes to
except beanstalkc.SocketError:
    print("Error connecting to beanstalk")
    raise

while True:
    job = beanstalk.reserve()                # blocks until a new job is available
    do_something_with_job(job)
If you are only going to be working on the same machine, then read up on Linux IPC. A socket connection between processes is very fast and has practically zero overhead. Sockets can also be part of an asynchronous program when you take advantage of epoll callbacks.
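For the same-machine case, a Unix domain socket is enough for a "new data" ping. This is only a rough sketch (the socket path and message are made up, and the frontend side is shown as a simple blocking loop rather than an epoll-based one):

import os
import socket

SOCK_PATH = "/tmp/collector.sock"        # assumed path

# frontend side: listen for pings from the collector
def serve():
    if os.path.exists(SOCK_PATH):
        os.unlink(SOCK_PATH)
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(SOCK_PATH)
    srv.listen(1)
    while True:
        conn, _ = srv.accept()
        data = conn.recv(1024)           # e.g. b"new_data"
        conn.close()
        handle_notification(data)        # hypothetical handler

# collector side: ping the frontend after committing to the database
def notify():
    cli = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    cli.connect(SOCK_PATH)
    cli.sendall(b"new_data")
    cli.close()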

Related

How could I build a plugin that provides a socketio endpoint

I have an existing software project that has a relatively primitive plugin system, and I want to extend it by providing a web interface.
Since my application processes real-time data, WebSockets are the only option besides WebRTC.
My previous attempt used ZeroMQ over domain sockets on the Python side and a Node.js server that connected to the domain socket.
This solution works great and has some benefits over the plugin server, but I want to offer a simpler option for people who don't need those benefits and don't want the extra complexity.
How would you go about implementing this, and is it even possible to do so?
Otherwise, I'll still run a separate process, but use FastAPI to build everything around the socket endpoint and start it via subprocess, spawning a second process that also connects to the domain socket.
I hope my question is not stupid, or an RTFM case.
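For what it's worth, the FastAPI fallback described above could look roughly like the sketch below. The IPC path, the /ws route, and the use of a ZeroMQ SUB socket are assumptions based on the previous attempt, not a known-good recipe:

# ws_relay.py -- sketch: relay messages from a ZeroMQ IPC socket to a WebSocket client
import zmq
import zmq.asyncio
from fastapi import FastAPI, WebSocket

app = FastAPI()
ctx = zmq.asyncio.Context()

@app.websocket("/ws")
async def ws_endpoint(websocket: WebSocket):
    await websocket.accept()
    sub = ctx.socket(zmq.SUB)
    sub.connect("ipc:///tmp/plugin.sock")      # assumed domain socket path
    sub.setsockopt_string(zmq.SUBSCRIBE, "")   # forward everything the plugin host publishes
    try:
        while True:
            msg = await sub.recv_string()      # data published by the main application
            await websocket.send_text(msg)
    finally:
        sub.close()

The relay could then be spawned from the main application with something like subprocess.Popen(["uvicorn", "ws_relay:app", "--port", "8001"]).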

using multiple twisted socket servers together

So I have a single twisted socket server that serves clients and eventually I'll need to add more servers. The problem is that connections to the server are unique and unable to be shared among multiple server instances.
This is a problem if the servers are behind a load balancer, or if multiple users from a single chat are spread across multiple server instances, because a message to that chat won't reach everyone.
How would I resolve this?
It may be a difficult task, since how you balance load depends on the underlying protocol (like HTTP for web servers).
Are you trying to design a load-balancing system for basically any socket-based application? It is one thing to dispatch messages between multiple servers while ensuring correct synchronization; it is another to build a dynamic, self-balancing system for any communication protocol.
To build your load balancer, you can use a "TCP proxy" like HAProxy (http://www.haproxy.org/).
To handle the communication between your application server instances (behind the load balancer), you can use messaging like ZeroMQ (http://zeromq.org/) or RabbitMQ (http://www.rabbitmq.com/). You'll find some common architecture patterns there.
There are Python libraries for both ZeroMQ and RabbitMQ, so the implementation within your Twisted-based server is not too hard.
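As a rough illustration of the ZeroMQ option (the endpoints, peer list, and message format are invented, and wiring the polling into Twisted's reactor, e.g. with a LoopingCall, is left out): each instance publishes the chat messages it receives and subscribes to every other instance, so a message reaches users connected anywhere.

import zmq

ctx = zmq.Context()

# every instance binds one PUB socket for its outgoing chat messages
pub = ctx.socket(zmq.PUB)
pub.bind("tcp://0.0.0.0:7001")            # port/address differ per instance

# ...and subscribes to the PUB sockets of all the other instances
sub = ctx.socket(zmq.SUB)
for peer in ["tcp://10.0.0.2:7001", "tcp://10.0.0.3:7001"]:   # assumed peer list
    sub.connect(peer)
sub.setsockopt_string(zmq.SUBSCRIBE, "")

def relay_to_peers(chat_id, text):
    # called when a locally connected user sends a message
    pub.send_string("%s %s" % (chat_id, text))

def poll_peers():
    # called periodically (e.g. from a Twisted LoopingCall) to pick up remote messages
    while True:
        try:
            msg = sub.recv_string(flags=zmq.NOBLOCK)
        except zmq.Again:
            break
        deliver_to_local_users(msg)       # hypothetical: forward to local chat connections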

Sync data with Local Computer Architecture

The scenario is:
I have multiple local computers running a Python application. These are on separate networks, waiting for data to be sent to them from a web server. The computers are on networks without a static IP and are generally behind a firewall and proxy.
On the other hand, I have a web server which gets updates from the user through a form and sends the update to the correct local computer.
Question
What options do I have to enable this? Currently I am sending CSV files over FTP to achieve it, but this is not real time.
The application is built in Python and uses Django for the web part.
I'd appreciate your help.
Use a REST API. Then you can post information to your Django app over HTTP, using an authentication key if necessary.
http://www.django-rest-framework.org/ should help you get started quickly
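As a purely illustrative sketch (the URL, token, and handler below are invented), one way to apply this is to have the firewalled local machines poll the DRF endpoint over outbound HTTP, which also gets through the proxy:

import time
import requests

API_URL = "https://example.com/api/updates/"     # hypothetical DRF endpoint
HEADERS = {"Authorization": "Token abc123"}      # hypothetical auth key

while True:
    resp = requests.get(API_URL, headers=HEADERS, timeout=10)
    resp.raise_for_status()
    for update in resp.json():                   # assumes the endpoint returns a JSON list
        apply_update(update)                     # hypothetical handler on the local machine
    time.sleep(30)                               # near-real-time polling interval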
Sounds like you need a message queue.
You would run a separate broker server which is sent tasks by your web app. This could be on the same machine. On your two local machines you would run queue workers which connect out to the broker to receive tasks (so no inbound connection is required), then notify the broker in real time when they are complete.
Examples are RabbitMQ and Oracle Tuxedo. What you choose will depend on your platform and software.
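With RabbitMQ, a minimal worker on one of the local machines might look like this using the pika client (the broker host, queue name, and handler are assumptions):

import pika

# outbound connection from the local machine to the broker -- no inbound port needed
connection = pika.BlockingConnection(pika.ConnectionParameters(host="broker.example.com"))
channel = connection.channel()
channel.queue_declare(queue="updates_machine_1", durable=True)

def on_message(ch, method, properties, body):
    apply_update(body)                                # hypothetical handler for the task payload
    ch.basic_ack(delivery_tag=method.delivery_tag)    # tell the broker the task is done

channel.basic_consume(queue="updates_machine_1", on_message_callback=on_message)
channel.start_consuming()                             # blocks, delivering tasks in real time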

Sending Message to user/group of users with uwsgi websockets

Recently I've been doing a lot of testing around different ways of serving our Django application. I've settled on uwsgi as it seems to fit our needs pretty well.
I've recently discovered that uwsgi also supports WebSockets and started looking into it and found some examples: https://github.com/unbit/uwsgi/blob/master/tests/
After running the example (websockets_chat.py) and taking a look through uwsgi's documentation for their websockets implementation, it appears as though you can only send broadcast, or global, messages.
Has anyone managed to find a way to transmit a message to a particular user or does uwsgi not support that level of communication yet?
Cheers
There is nothing like broadcast or global messages in the WebSocket spec. It only "upgrades" an HTTP connection to a lower-level one; what you do with that connection is up to you. The examples show integration with Redis as a message exchanger, but you are free to use it in other ways.
For your specific case you will need to build a shared list of connected users and implement routing. Remember, you cannot rely on the Node.js approach, as it is based on a single-threaded setup where everything is much simpler. In uWSGI a websocket connection can happen in a thread, a process or a coroutine, so exchanging data between them is the key.
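A rough sketch of that idea, loosely modelled on websockets_chat.py: each connection subscribes to a per-user Redis channel, so any other process, thread or coroutine can reach a single user by publishing to that channel (the channel naming and the authenticate() lookup are assumptions):

import uwsgi
import redis

def application(env, start_response):
    # upgrade the HTTP connection to a websocket
    uwsgi.websocket_handshake(env['HTTP_SEC_WEBSOCKET_KEY'], env.get('HTTP_ORIGIN', ''))

    user_id = authenticate(env)                 # hypothetical: map this connection to a user
    r = redis.StrictRedis()
    pubsub = r.pubsub()
    pubsub.subscribe('user:%s' % user_id)       # one channel per user instead of a broadcast

    while True:
        # push anything published for this user down their websocket
        message = pubsub.get_message(timeout=0.1)
        if message and message['type'] == 'message':
            uwsgi.websocket_send(message['data'])
        # keep reading from the client so a closed connection raises and ends the loop
        uwsgi.websocket_recv_nb()

# anywhere else in the application, target a single user:
#   redis.StrictRedis().publish('user:42', 'hello just to you')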

Python Socket Programming

I am developing a testbed for a cloud computing environment. I want to establish multiple client connections to a server. What I want is that the server first sends data to all the clients specifying a sending_interval, and then all the clients keep sending their data with a time gap of that interval (as specified by the server). Please help me out: how can I do this with a Python socket program? (I.e. I want multiple-client to single-server connectivity, with clients sending data at the interval specified by the server.) I would be grateful if anyone can help. Thanks in advance.
This problem is easily solved by the ZeroMQ socket library. It is production stable. It allows you to define publisher-subscriber relationships, where a publishing process will publish data on a port regardless of how many (0 to infinite) listening processes there are. They call this the PUB-SUB model; it's in their docs (link below).
It sounds like you want to set up a bunch of clients that are all publishers. They can subscribe to a control channel, which will send updates to their configuration (how often to write). They also act as publishers, pushing out their own data at the interval specified via the default/config channel.
Then, you have one or more listening processes that listen to all the clients' published messages. Perhaps you could even have two listening processes, one for backup or DR, or whatever.
We're using ZeroMQ and loving the simplicity it gives; there are no connection errors because the publisher doesn't care if anyone is listening, and the subscriber can start before the publisher and, if there's nothing to listen to, it can just loop around and wait until there is.
Bindings are available in ALL languages (it's freaky). The Python binding isn't pure-python, it does require a C compiler, but is frighteningly fast, and the pub/sub example is a cut/paste, 'golly, it works!' experience.
Link: http://zeromq.org
There are MANY other methods available with this library, including message queues, etc. They have relatively complete documentation, too.
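To make the PUB-SUB idea concrete for this testbed, here is a bare-bones sketch (the ports, hostname, topic name, and payload format are invented): the server publishes the sending_interval on a control channel and collects client data on a separate PULL socket.

# server.py -- publish the interval, then collect client data
import zmq

ctx = zmq.Context()
control = ctx.socket(zmq.PUB)
control.bind("tcp://*:5551")          # control channel: server -> all clients
sink = ctx.socket(zmq.PULL)
sink.bind("tcp://*:5552")             # data channel: clients -> server

control.send_string("interval 5")     # tell every client to send every 5 seconds
                                      # (in practice, re-publish this periodically so
                                      #  late-joining subscribers still receive it)
while True:
    print(sink.recv_string())         # data arriving from any client

# client.py -- read the interval, then push data at that rate
import time
import zmq

ctx = zmq.Context()
control = ctx.socket(zmq.SUB)
control.connect("tcp://server:5551")  # assumed server hostname
control.setsockopt_string(zmq.SUBSCRIBE, "interval")
push = ctx.socket(zmq.PUSH)
push.connect("tcp://server:5552")

interval = float(control.recv_string().split()[1])
while True:
    push.send_string("client data")   # whatever the testbed measures
    time.sleep(interval)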
Multi-client, single-server socket programming can be achieved with multithreading. I have implemented both methods:
Single client and single server
Multi-client and single server
in my GitHub repo: https://github.com/shauryauppal/Socket-Programming-Python
What is multi-threaded socket programming?
Multithreading means executing multiple threads concurrently within a single process.
To understand it well you can visit https://www.geeksforgeeks.org/socket-programming-multi-threading-python/, written by me.
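The basic shape of such a multi-threaded, multi-client server with just the standard library is sketched below (the host, port, and echo behaviour are placeholders, not the repo's code):

import socket
import threading

def handle_client(conn, addr):
    # one thread per connected client
    with conn:
        while True:
            data = conn.recv(1024)
            if not data:
                break                 # client disconnected
            conn.sendall(data)        # echo back as a placeholder for real work

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("0.0.0.0", 9000))
server.listen()

while True:
    conn, addr = server.accept()
    threading.Thread(target=handle_client, args=(conn, addr), daemon=True).start()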
