Communication between Python and WCF

I have multiple servers (200-300) running Python 2.4 that call a WCF service using REST. I would like to be able to notify those servers of changes that occurred on the WCF service, instead of having every Python server poll to detect such changes.
How can I do such a broadcast/notification to avoid the polling?
Thanks
Sylvain

A message bus seems to be the right choice. You might want to have a look at ZeroMQ. It's cross-platform and has bindings in a variety of languages.
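For illustration, a minimal PUB/SUB sketch with pyzmq. Note that this assumes a newer Python than the 2.4 mentioned in the question, and the host, port, and topic are made up:

import zmq

ctx = zmq.Context()

# Notifier side (e.g. a small bridge next to the WCF service): publish change events.
pub = ctx.socket(zmq.PUB)
pub.bind("tcp://*:5556")
pub.send_string("changes order-cache invalidated")

# Each Python server: subscribe and block until a notification arrives.
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://wcf-bridge-host:5556")
sub.setsockopt(zmq.SUBSCRIBE, b"changes")   # filter messages by topic prefix
message = sub.recv_string()

The subscribers block in recv_string() instead of polling, which is exactly what the question asks for.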

Related

Simple way for message passing in distributed system

I am implementing a small distributed system (in Python) with nodes behind firewalls. What is the easiest way to pass messages between the nodes under the following restrictions:
I don't want to open any ports or punch holes in the firewall
Also, I don't want to export/forward any internal ports outside my network
A time delay of less than, say, 5 minutes is acceptable, but closer to real time would be nice, if possible.
Restrictions 1 and 2 mean I need to use a third party that is accessible by all my nodes. From this it follows that I probably also want to use encryption.
Solutions considered:
Email - setting up separate or shared free email accounts (e.g. Gmail) that each client connects to using IMAP/SMTP
Google Docs - using a shared online spreadsheet (e.g. Google Docs) and some Python library for accessing/changing cells, with a polling mechanism
XMPP using connections to a third party server
IRC
Renting a cheap $5 VPS and setting up a ZeroMQ publish-subscribe node (or any other protocol), forwarded over SSH, with all nodes connecting to it
Are there any other publicly (freely) accessible message queues available (or platforms that can be misused as a message queue)?
I am aware of the option of setting up my own message broker (RabbitMQ, Mosquitto, etc.) and making it accessible to my nodes somehow (SSH forwarding to a third host, etc.). But my question is primarily about solutions that don't require me to do that, i.e. solutions that utilize already available/accessible third-party infrastructure. (Are there any public message brokers I can use?)
How about Mosquitto: a message broker that implements the MQ Telemetry Transport (MQTT) protocol versions 3.1 and 3.1.1. MQTT provides a lightweight method of carrying out messaging using a publish/subscribe model, which makes it suitable for "machine to machine" messaging. It supports encryption. Setup time: you should be up and running in approximately 15 minutes. Since it is a message broker, you can write your own code to make it talk to third-party solutions. It also achieves soft real time, and depending on your setup you may achieve hard real time. Once you have looked at Mosquitto, have a look at Paho, the Eclipse Foundation's MQTT client project (Mosquitto itself has also moved to Eclipse).
Paho also provides a Python client, which offers support for both MQTT v3.1 and v3.1.1 on Python 2.7 or 3.x. It also provides some helper functions to make publishing one-off messages to an MQTT server very straightforward. There is plenty of documentation and examples to get you up and running.
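For illustration, a minimal subscriber plus a one-off publish with the Paho Python client, assuming paho-mqtt 1.x and a made-up broker host and topic:

import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    # Subscribe once connected (re-runs automatically on reconnect).
    client.subscribe("nodes/updates")

def on_message(client, userdata, msg):
    print(msg.topic, msg.payload)

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.example.com", 1883, 60)
client.loop_forever()

And the one-off publish helper mentioned above:

import paho.mqtt.publish as publish
publish.single("nodes/updates", "new data", hostname="broker.example.com")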
I would recommend RabbitMQ or Redis (RabbitMQ preferred, because it is a very mature technology and insanely reliable). ZeroMQ is an option if you want a single-hop messaging system instead of a brokered one such as RabbitMQ, but ZeroMQ is harder to use than RabbitMQ. It also depends on how you want to use the message passing: if it is task dispatch, you can use Celery; if you need slightly lower-level access, use Kombu with the librabbitmq transport.
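As an illustration of the Kombu route, a sketch using its SimpleQueue helper, assuming a RabbitMQ broker on localhost with default credentials (producer and consumer would normally live in separate processes):

from kombu import Connection

with Connection("amqp://guest:guest@localhost//") as conn:
    queue = conn.SimpleQueue("updates")           # declares the queue if needed
    queue.put({"event": "data_changed"})          # producer side
    message = queue.get(block=True, timeout=5)    # consumer side
    print(message.payload)
    message.ack()                                 # tell RabbitMQ the job is done
    queue.close()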
Found https://www.cloudamqp.com/ which offers a free plan with a cloud-based installation of RabbitMQ. I will try that and see if it fulfills my needs.

How to communicate between client device and webserver?

I've built a little device based on the Raspberry Pi. Now I want to configure it using my web server. The idea is that I enter all the details on my Django web page and the device just pulls them off the server.
But there are two problems I'm not sure how to solve:
I have multiple devices for multiple users, so some kind of login must be provided.
The device also sends pictures from time to time. Right now it uses FTP with a shared login, but I want to personalize that per device too. The uploads need a resume function, so plain HTTP is out!
So the basic question is: Should I get started with sockets or is there a better and safer way to do it? Maybe there is some kind of open source library that's been tested a lot?
Instead of hand-coding sockets, I would suggest using HTTP with BASIC authentication to communicate between the device and the web server. You can assign a unique id/password to each device, and BASIC authentication is well supported by all web servers and client-side libraries.
There are some security concerns with BASIC authentication even if you use HTTPS, but it may be acceptable in your particular case.
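For illustration, a minimal sketch using the requests library; the URL and per-device credentials are hypothetical:

import requests

DEVICE_ID = "device-42"   # hypothetical per-device credentials
DEVICE_PW = "s3cret"

# Each device pulls its own configuration over HTTPS with BASIC auth.
resp = requests.get(
    "https://example.com/api/device-config/",
    auth=(DEVICE_ID, DEVICE_PW),
    timeout=10,
)
resp.raise_for_status()
config = resp.json()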
Maybe you could use SSH, with Fabric for instance. Here's an example.
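A minimal sketch of that idea, assuming Fabric 2.x; the host name, command, and file paths are placeholders:

from fabric import Connection

# Open an SSH connection to the device and run a command on it.
conn = Connection("pi@raspberrypi.local")
result = conn.run("uptime", hide=True)
print(result.stdout.strip())

# Files can be transferred over the same SSH connection.
conn.put("config.json", remote="/home/pi/config.json")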

Sending Message to user/group of users with uwsgi websockets

Recently I've been doing a lot of testing around different ways of serving our Django application. I've settled on uwsgi as it seems to fit our needs pretty well.
I've recently discovered that uwsgi also supports WebSockets and started looking into it and found some examples: https://github.com/unbit/uwsgi/blob/master/tests/
After running the example (websockets_chat.py) and taking a look through uWSGI's documentation for its WebSocket implementation, it appears that you can only send broadcast, or global, messages.
Has anyone managed to find a way to transmit a message to a particular user, or does uwsgi not support that level of communication yet?
Cheers
There is nothing like broadcast or global messages in the WebSocket spec. A WebSocket only "upgrades" an HTTP connection to a lower-level one; what you do with that connection is up to you. The examples show integration with Redis as a message exchanger, but you are free to make other uses of it.
For your specific case you will need to build a shared list of connected users and implement routing. Remember, you cannot rely on the node.js way of doing things: node.js is based on a single-threaded setup, so everything there is way simpler. In uWSGI a WebSocket connection can happen in a thread, a process, or a coroutine, so exchanging data between them is the key.
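One common routing pattern, sketched with made-up channel names and user ids: give every user their own Redis channel. Whichever worker holds a user's WebSocket subscribes to that user's channel, so publishing to "user:<id>" reaches that user no matter which thread, process, or coroutine owns the connection.

import redis

r = redis.Redis()

# Sender side: route a message to one user by publishing on a per-user channel.
def send_to_user(user_id, payload):
    r.publish("user:%s" % user_id, payload)

# Receiver side: the worker holding user 42's websocket listens on that channel
# and forwards anything that arrives down the socket.
p = r.pubsub()
p.subscribe("user:42")
for message in p.listen():
    if message["type"] == "message":
        # inside a uWSGI worker this would be uwsgi.websocket_send(message["data"])
        print(message["data"])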

Python message to other applications

Status Quo:
I have two Python apps (frontend-server and data-collector; a database sits 'between' them).
Currently I am using redis as the db, with its publish/subscribe protocol to notify the frontend when new data is available.
But I may want to use a different database (and I don't want to keep redis on the system just for the pub/sub).
Are there any simple alternatives to notify my frontend if the data-collector has transacted new data to the database (without using an external message queue like beanstalkd or redis)?
ZeroMQ is a good option. It has good Python bindings, and it makes communicating between processes on the same machine and processes on different machines look almost identical.
Start by reading the guide: http://zguide.zeromq.org/page:all
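For instance, a notification channel between the two apps could look like this PUSH/PULL sketch; the endpoint and message contents are made up, and the two halves would live in the two different programs:

import zmq

ctx = zmq.Context()

# Frontend: bind a PULL socket and block until a notification arrives.
pull = ctx.socket(zmq.PULL)
pull.bind("tcp://127.0.0.1:5557")

# Data collector (a separate process): push after each database transaction.
push = ctx.socket(zmq.PUSH)
push.connect("tcp://127.0.0.1:5557")
push.send_json({"event": "new_data", "table": "measurements"})

event = pull.recv_json()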
As I mentioned in my comment, if you want something that goes across a network then, other than setting up a web service (a Flask app?) or writing your own INET socket server, there is nothing built into the operating system for communicating between machines. Beanstalk has a very simple API in Python and I've used it for this kind of thing very successfully.
import beanstalkc

try:
    beanstalk = beanstalkc.Connection(host="my.host.com")
    beanstalk.watch("update_queue")   # only consume jobs from this tube
except beanstalkc.SocketError:
    print("Error connecting to beanstalk")
    raise

while True:
    job = beanstalk.reserve()         # blocks until a job is available
    do_something_with_job(job)
    job.delete()                      # remove the job once it has been handled
If you are only going to be working on the same machine, then read up on Linux IPC. A socket connection between processes is very fast and has practically zero overhead. Sockets can also be part of an asynchronous program when you take advantage of epoll callbacks.
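As a sketch of the same-machine case, a Unix domain socket exchange could look like this; the socket path is arbitrary, and the two halves belong to two separate processes:

import os
import socket

SOCK_PATH = "/tmp/notify.sock"   # hypothetical path

# Server process: listen on a Unix domain socket.
if os.path.exists(SOCK_PATH):
    os.unlink(SOCK_PATH)
server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
server.bind(SOCK_PATH)
server.listen(1)
conn, _ = server.accept()        # blocks until the client connects
print(conn.recv(1024))

# Client process: connect and send a notification.
client = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
client.connect(SOCK_PATH)
client.sendall(b"new data available")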

Interprocess messaging between two Python programs

We have two Python programs running on two Linux servers, and we want to send messages between them. The best idea so far is to create a TCP/IP server/client architecture, but this seems like a very complicated way to do it. Is this really best practice for such a thing?
I like ZeroMQ for simple messaging; it's really lightweight and fast, and very flexible as well. Using AMQP messaging isn't a bad idea either, depending on the specifics of your situation; I've found Kombu to be a very nice Pythonic library for that. You could also use xmlrpclib, or set up a simple REST API with Bottle or Flask. Every option has its place, so I'd investigate all of them.
This really depends on the kind of messaging you want and the roles of the two processes. If it's a proper client/server setup, I would probably create a SimpleHTTPServer and then use HTTP to communicate between the two. You can also use xmlrpclib on both ends to talk between them. Manually creating a TCP server with your own custom protocol sounds like a bad idea to me. You might also consider using a message queue system to communicate between them.
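A minimal sketch of the XML-RPC route, shown with the Python 3 module names (xmlrpclib and SimpleXMLRPCServer are the Python 2 equivalents); host and port are placeholders:

# Server process (run on one of the two Linux servers)
from xmlrpc.server import SimpleXMLRPCServer

def notify(message):
    print("got:", message)
    return True

server = SimpleXMLRPCServer(("0.0.0.0", 8000))
server.register_function(notify)
server.serve_forever()

# Client process (run on the other server)
import xmlrpc.client
proxy = xmlrpc.client.ServerProxy("http://server-a:8000/")
proxy.notify("hello from the other side")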
You could use multiprocessing.managers. As the docs say: "A manager object controls a server process which manages shared objects. Other processes can access the shared objects by using proxies."
In your case, you could create a master process that controls your other processes; each of those processes calls the master to grab the data.
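A sketch of that idea, sharing a queue over the network with multiprocessing.managers.BaseManager; the address, port, and authkey are made up, and the two parts are two separate scripts:

# --- master script: expose a shared queue on the network ---
from multiprocessing.managers import BaseManager
from queue import Queue

shared_queue = Queue()

class QueueManager(BaseManager):
    pass

QueueManager.register("get_queue", callable=lambda: shared_queue)
manager = QueueManager(address=("", 50000), authkey=b"secret")
manager.get_server().serve_forever()

# --- worker script, possibly on another host: use the queue via a proxy ---
from multiprocessing.managers import BaseManager

class QueueManager(BaseManager):
    pass

QueueManager.register("get_queue")
worker = QueueManager(address=("master-host", 50000), authkey=b"secret")
worker.connect()
worker.get_queue().put("hello from a worker")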
