Python XMLRPC with concurrent requests - python

I'm looking for a way to prevent multiple hosts from issuing simultaneous commands to a Python XMLRPC listener. The listener is responsible for running scripts to perform tasks on that system that would fail if multiple users tried to issue these commands at the same time. Is there a way I can block all incoming requests until the single instance has completed?

I think the Python SimpleXMLRPCServer module is what you want. I believe the default behavior of that module is to block new requests while the current request is being processed. That default behavior gave me lots of trouble, and I changed it by mixing in the ThreadingMixIn class so that my XML-RPC server could respond to multiple requests at the same time.
import SocketServer, SimpleXMLRPCServer

class RPCThreading(SocketServer.ThreadingMixIn, SimpleXMLRPCServer.SimpleXMLRPCServer):
    pass
If I understand your question correctly, SimpleXMLRPCServer is the solution. Just use it directly.
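For the blocking behavior the question asks about, here is a minimal sketch of a plain, single-threaded SimpleXMLRPCServer (Python 2 module names; run_task and the address are placeholders, not from the original answer):

import SimpleXMLRPCServer

def run_task(name):
    # Placeholder for the script that must not run concurrently.
    # The default server handles one request at a time, so a second
    # caller simply waits until this call returns.
    return "finished %s" % name

server = SimpleXMLRPCServer.SimpleXMLRPCServer(("0.0.0.0", 8000))
server.register_function(run_task)
server.serve_forever()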

Can you have another communication channel? If yes, then have a "call me back when it is my turn" protocol running between the server and the clients.
In other words, each client would register its intention to issue requests to the server and the said server would "callback" the next-up client when it is ready.
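One possible sketch of that protocol, using XML-RPC for both directions (all function names, ports and URLs here are illustrative, not from the answer): each client runs a tiny listener of its own and registers a callback URL, and the server's notifier thread calls clients back strictly one at a time, so each client does its work inside the callback while the others wait.

import Queue
import threading
import xmlrpclib
import SimpleXMLRPCServer

waiting = Queue.Queue()     # callback URLs of clients waiting for a turn

def register(callback_url):
    # Client announces that it wants a turn and where to reach it back.
    waiting.put(callback_url)
    return True

def notifier():
    while True:
        url = waiting.get()
        # The server blocks here until the client's your_turn() handler
        # finishes its work, so clients are served strictly one at a time.
        xmlrpclib.ServerProxy(url).your_turn()

t = threading.Thread(target=notifier)
t.daemon = True
t.start()

server = SimpleXMLRPCServer.SimpleXMLRPCServer(("0.0.0.0", 8000))
server.register_function(register)
server.serve_forever()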

There are several choices:
Use a single-process, single-threaded server like SimpleXMLRPCServer to process requests sequentially.
Use threading.Lock() in a threaded server (a minimal sketch follows below).
Use some external locking mechanism (like the lockfile module or the GET_LOCK() function in MySQL) in a multi-process server.
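A minimal sketch of the threading.Lock() option, reusing the ThreadingMixIn subclass from the first answer (run_task and the address are placeholders):

import threading
import SocketServer
import SimpleXMLRPCServer

class RPCThreading(SocketServer.ThreadingMixIn, SimpleXMLRPCServer.SimpleXMLRPCServer):
    pass

task_lock = threading.Lock()

def run_task(name):
    # Other requests are still accepted and answered concurrently,
    # but anything that touches the critical work queues up on the lock.
    with task_lock:
        return "finished %s" % name

server = RPCThreading(("0.0.0.0", 8000))
server.register_function(run_task)
server.serve_forever()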

Related

Suggestion about the way to build architecture for web-socket API

I am building a web-socket API in Python using the python websockets library. The services that I use in my API need to be wrapped in threads, so I have to make my asyncio web-socket handlers await while a thread produces the data for the response.
I found that there are a lot of ways to build such a service: threads + the native Python socket module, multiplexing (the selector module) + the socket module, multithreading + the socket module, or threads + the async Python websockets library.
I want the web-socket Python service to work the following way. The client sends data to the server. The server starts thread_1, which modifies the data in some way and then passes it to thread_2, which modifies it one more time and returns the twice-modified data as the response to the client. The client should not have to wait for the response before sending the next batch of data, but if the server returns a result, the client will handle it. In other words, the client and the server should work asynchronously. It would also be great if you could suggest some materials that would help me achieve this goal.
One way to realize such an architecture is to combine multithreading with asyncio web-sockets. That is achieved by using asyncio executors.
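A minimal sketch of that approach, assuming the websockets library with its older (websocket, path) handler signature; step_one and step_two are placeholders for the real thread work, and the host/port are illustrative:

import asyncio
import websockets

def step_one(data):
    # Blocking work normally done in thread_1 (placeholder).
    return data.upper()

def step_two(data):
    # Blocking work normally done in thread_2 (placeholder).
    return data[::-1]

async def handler(websocket, path):
    loop = asyncio.get_event_loop()
    async for message in websocket:
        # run_in_executor pushes the blocking functions onto the default
        # thread pool, so the event loop stays free for other clients.
        modified = await loop.run_in_executor(None, step_one, message)
        modified = await loop.run_in_executor(None, step_two, modified)
        await websocket.send(modified)

start_server = websockets.serve(handler, "localhost", 8765)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()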

Client to Server Remote Function Calls in Python. How to implement?

I'm trying to set up a simple client-to-server interface for calling functions/programs on the server. A client will send a simple command to the server, which is listening for such commands. Once the server receives a command from the client, it will execute the corresponding function or program on the server. I have looked into a simple TCP server receiving a text string, parsing that string, and then executing a function or external program. I have also read about using XML-RPC implemented with a Twisted server.
What I'm asking is which would be the easiest to set up or are there any other ways to easily do this task?
Thanks.
There is a great tutorial for Twisted that will do just fine as a teaching tool (and guide you by hand through writing basic server/client services). Have a go at it: http://twistedmatrix.com/documents/current/core/howto/tutorial/ What you will probably want to do is parse the received info and act accordingly.
If it is applicable in your case, maybe you can use a full-featured system for async/remote job execution like Celery?
There is more than one way to achieve your requirement, each with some pros and cons:
Python low-level sockets
Using the standard Python socket library and a client-server architecture.
Connecting to the server via protocols like Telnet/SSH and then triggering some code
Using Python Telnet/SSH libraries or the subprocess module.
XML-RPC
Sending an XML-RPC request as described here: http://docs.python.org/2/library/xmlrpclib.html (a minimal sketch follows below).
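A minimal sketch of the XML-RPC option using the standard library modules from that link (run_program, the host name and the port are illustrative, not from the answer):

# server.py -- run on the machine that executes the commands
import subprocess
import SimpleXMLRPCServer

def run_program(name):
    # Illustrative: run a program on the server and return its exit code.
    return subprocess.call([name])

server = SimpleXMLRPCServer.SimpleXMLRPCServer(("0.0.0.0", 8000))
server.register_function(run_program)
server.serve_forever()

# client.py -- run on the machine that issues the commands
import xmlrpclib
proxy = xmlrpclib.ServerProxy("http://server-host:8000/")
print proxy.run_program("uptime")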
In my opinion the easiest method to achieve remote method triggering is via the Python subprocess module. I generally use the following kind of syntax for my purposes:
import subprocess

# Run a program on the remote host over ssh and collect its exit code.
ret = subprocess.call(["ssh", "user@host", "program"])

# or, with stderr:
prog = subprocess.Popen(["ssh", "user@host", "program"], stderr=subprocess.PIPE)
errdata = prog.communicate()[1]
Hope it helps

pymongo connection pooling and client requests

I know pymongo is thread safe and has an inbuilt connection pool.
In a web app that I am working on, I am creating a new connection instance on every request.
My understanding is that since pymongo manages the connection pool, it isn't the wrong approach to create a new connection on each request, since at the end of the request the connection instance will be reclaimed and available for subsequent requests.
Am I correct here, or should I just create a single instance to use across multiple requests?
The "wrong approach" depends upon the architecture of your application. With pymongo being thread-safe and automatic connection pooling, the actual use of a single shared connection, or multiple connections, is going to "work". But the results will depend on what you expect the behavior to be. The documentation comments on both cases.
If your application is threaded, from the docs, each thread accessing a connection will get its own socket. So whether you create a single shared connection, or request a new one, it comes down to whether your requests are threaded or not.
When using gevent, you can have a socket per greenlet. This means you don't have to have a true thread per request. The requests can be async, and still get their own socket.
In a nutshell:
If your webapp requests are threaded, then it doesn't matter which way you access a new connection. The result will be the same (socket per thread)
If your webapp is async via gevent, then it doesn't matter which way you access a new connection. The result will be the same (socket per greenlet).
If your webapp is async, but NOT via gevent, then you have to take into consideration the notes on the best suggested workflow.
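For illustration, the common pattern is to create one client at import time and share it across requests, letting the driver's pool hand out sockets per thread. A minimal sketch, assuming MongoClient (older pymongo versions used Connection instead) and hypothetical database/collection names:

from pymongo import MongoClient

# Created once at import time; pymongo maintains its own connection pool,
# so individual requests/threads check sockets out of this shared client.
client = MongoClient("mongodb://localhost:27017")
db = client["mydb"]          # hypothetical database name

def handle_request(user_id):
    # Each threaded request can safely use the shared client;
    # the pool gives each thread its own socket under the hood.
    return db.users.find_one({"_id": user_id})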

Callback Functions across computers?

I'm writing a python server/client app. If I serialize a function on the client and pass it to the server, can the server use it as a callback? I'm assuming there must be something extra I'd have to do as the client and server are communicating via packets, I just don't know what.
What I actually need is for the server to change one of the client's attributes (when the server is ready to accept another command), and I want an alternative to having the client continuously poll the server. Thanks for any help.
Take a look at Twisted JSON RPC.
A recent SO post: Python Twisted JSON RPC

python streaming TCP server with RPC

I have written a little streaming mp3 server in Python. So far all it does is accept a ServerSocket connection and begin streaming all the mp3 data in its queue to the requester using socket.send(). I have implemented this to chunk in-stream ICY metadata, so the name of the playing song shows up in the client.
I would like to add playlist management to the server, so that I can manipulate the playlist of the running server. I have a vague idea that xmlrpclib would be suited to doing this, but I'm confused about two things:
Whether it's possible/advisable to integrate ICY and XMLRPC on a single server and a single port.
How to share state between the stream thread and the playlist, and manipulation thereof via xmlrpc.
Your initial attempt might be easier if you use two separate ports, each with its own server running in a separate thread. However, managing synchronization between the threads might be an annoying task in the long run.
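A rough sketch of that two-port variant: the XML-RPC server runs in a background thread and manipulates a playlist shared with the streaming thread, guarded by a lock (all names, the port, and the tracks here are illustrative):

import threading
import SimpleXMLRPCServer

playlist = ["a.mp3", "b.mp3"]      # shared with the streaming thread
playlist_lock = threading.Lock()

def add_track(path):
    # Called remotely via xmlrpclib; the stream thread takes the same
    # lock whenever it pops the next track, so the list stays consistent.
    with playlist_lock:
        playlist.append(path)
    return list(playlist)

def rpc_thread():
    server = SimpleXMLRPCServer.SimpleXMLRPCServer(("0.0.0.0", 9000))
    server.register_function(add_track)
    server.serve_forever()

t = threading.Thread(target=rpc_thread)
t.daemon = True
t.start()
# ... the existing ICY streaming loop keeps running in the main thread,
# taking playlist_lock before reading the next entry from playlist.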
ICY and HTTP are very similar, and if you've already implemented ICY on top of SocketServer, you could probably extend BaseHTTPServer.BaseHTTPRequestHandler to respond to both ICY and HTTP requests on the same port. Take a look at the standard library code for the BaseHTTPRequestHandler.parse_request() method, and think about how to override it in a subclass for a split personality.
Also, when you want to handle multiple concurrent requests using these classes, take a look at the SocketServer mixin classes.
