Client to Server Remote Function Calls in Python. How to implement?

I'm trying to set up a simple client-to-server interface for calling functions/programs on the server. A client will send a simple command to the server listening for such commands. Once the server receives a command from the client, it will execute the corresponding function or program on the server. I have looked into a simple TCP server that receives a text string, parses that string, and then executes a function or external program. I have also read about using XML-RPC implemented with a Twisted server.
What I'm asking is: which would be the easiest to set up, or are there other ways to do this task easily?
Thanks.

There is a great tutorial for Twisted that will do just fine as a teaching tool (and guide you by hand through writing basic server/client services). Have a go at it: http://twistedmatrix.com/documents/current/core/howto/tutorial/ What you will probably want to do is parse the received info and act accordingly.
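As a rough illustration of that parse-and-dispatch idea (not taken from the tutorial), a minimal line-based Twisted server might look something like the sketch below; the port, command names, and Python 2 string handling are all assumptions:
from twisted.internet import reactor, protocol
from twisted.protocols.basic import LineReceiver

def do_status():
    # placeholder for a real function or external program call
    return "server is up"

COMMANDS = {"status": do_status}  # map incoming command strings to functions

class CommandProtocol(LineReceiver):
    def lineReceived(self, line):
        handler = COMMANDS.get(line.strip())
        if handler is None:
            self.sendLine("unknown command")
        else:
            self.sendLine(handler())

class CommandFactory(protocol.Factory):
    def buildProtocol(self, addr):
        return CommandProtocol()

reactor.listenTCP(8000, CommandFactory())
reactor.run()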

If it is applicable in your case, maybe you can use a full-featured system for async/remote job execution like Celery?
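If you go that route, a Celery task is just a decorated Python function that a worker on the server executes when a client calls it. A minimal sketch, assuming a Redis broker (the broker URL and task body are placeholders):
# tasks.py -- run `celery -A tasks worker` on the server
from celery import Celery
import subprocess

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def run_program(name):
    # execute a program on the machine running the worker
    return subprocess.call([name])

# On the client:
# from tasks import run_program
# run_program.delay("some_program")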

There is more than one way to achieve your requirement, each with some pros and cons:
Python low-level sockets
Using the standard Python socket library and a client-server architecture.
Connecting to the server via protocols like Telnet/SSH and then triggering some code
Using Python libraries for Telnet/SSH, or the subprocess module.
XML-RPC
Sending an XML-RPC request as described here: http://docs.python.org/2/library/xmlrpclib.html (a minimal sketch follows this list).
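For the XML-RPC option, a minimal sketch using only the Python 2 standard library (the port and function name are arbitrary):
# Server side
from SimpleXMLRPCServer import SimpleXMLRPCServer

def run_task(name):
    # execute whatever the command maps to and return a result
    return "ran %s" % name

server = SimpleXMLRPCServer(("0.0.0.0", 8000))
server.register_function(run_task)
server.serve_forever()

# Client side (separate machine):
# import xmlrpclib
# proxy = xmlrpclib.ServerProxy("http://server-host:8000/")
# print proxy.run_task("backup")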
In my opinion, the easiest way to achieve remote method triggering is via the Python subprocess module. I generally use the following kind of syntax:
import subprocess

# run a program on the remote host over SSH and wait for it to finish
ret = subprocess.call(["ssh", "user@host", "program"])

# or, capturing stderr as well:
prog = subprocess.Popen(["ssh", "user@host", "program"], stderr=subprocess.PIPE)
errdata = prog.communicate()[1]
Hope it helps.

Related

How to implement Python code to work as an Azure Function?

I have built a TCP/IP socket server that listens on a specific port, makes a REST query to another server with the received data, and returns that response through the same port it received it on.
The whole socket server is written in Python 3.8 and works great.
I need to know how to move this code from my socket server to an Azure Function so that it provides a permanent service.
I appreciate the goodwill of anyone who can offer an answer.
Thanks.
Simple answer: you cannot do that. Azure Functions are event-based (triggered by something such as an HTTP call). If you need to expose a TCP socket, hosting your Python code in a container, e.g. Azure Container Instances, might be a good way to go.
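If the REST-forwarding logic can sit behind an HTTP trigger instead of a raw socket, the body of such a function might look roughly like the sketch below (it also needs the usual function.json binding file; forward_to_backend is a hypothetical stand-in for the existing REST call):
# __init__.py of an HTTP-triggered function
import azure.functions as func

def forward_to_backend(payload):
    # placeholder for the REST query the socket server currently makes
    return payload

def main(req: func.HttpRequest) -> func.HttpResponse:
    data = req.get_body().decode("utf-8")
    result = forward_to_backend(data)
    return func.HttpResponse(result, status_code=200)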

How would I handle multiple sockets and send data between them in Python 2.7.3?

I am trying to create a server in Python 2.7.3 which sends data to all client connections whenever one client connection sends data to the server. For instance, if client c3 sent "Hello, world!" to my server, I would like to then have my server send "Hello, world!" to client connections c1 and c2. By client connections, I mean the communications sockets returned by socket.accept(). Note that I have tried using the asyncore and twisted modules, but AFAIK they do not support this. Does anybody know any way to accomplish this?
EDIT: I have seen Twisted, but I would much rather use the socket module. Is there a way (possibly multithreading, possibly using select) that I can do this using the socket module?
You can absolutely do this using Twisted Python. You just accept the connections and set up your own handling logic (of course the library does not include built-in support for your particular communication pattern exactly, but you can't expect that).
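For reference, the pattern the EDIT asks about (plain socket module plus select) looks roughly like the sketch below; the port is arbitrary and error handling is omitted:
import select
import socket

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("0.0.0.0", 9000))
server.listen(5)

sockets = [server]
while True:
    readable, _, _ = select.select(sockets, [], [])
    for s in readable:
        if s is server:
            conn, addr = s.accept()      # new client connection
            sockets.append(conn)
        else:
            data = s.recv(4096)
            if not data:                 # client disconnected
                sockets.remove(s)
                s.close()
                continue
            for other in sockets:        # relay to every other client
                if other is not server and other is not s:
                    other.sendall(data)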

Python message to other applications

Status Quo:
I have two python apps (frontend-server and data-collector, a database is 'between' them).
Currently I am using Redis as the db, and its publish/subscribe mechanism to notify the frontend when new data is available.
But I may want to use a different database (and I don't want to keep Redis on the system just for the pub/sub).
Are there any simple alternatives to notify my frontend if the data-collector has transacted new data to the database (without using an external message queue like beanstalkd or redis)?
ZeroMQ is a good option. It has good Python bindings, and it makes communicating between processes on the same machine and processes on different machines look almost identical.
Start by reading the guide: http://zguide.zeromq.org/page:all
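A rough pyzmq sketch of the PUB/SUB pattern for this notify-the-frontend case; the port and topic string are assumptions, and the subscriber has to be connected before anything is published or the message is lost:
import zmq

# data-collector side: publish a notification after writing to the database
ctx = zmq.Context()
pub = ctx.socket(zmq.PUB)
pub.bind("tcp://127.0.0.1:5556")
pub.send("new_data ready")

# frontend side (separate process):
# sub = zmq.Context().socket(zmq.SUB)
# sub.connect("tcp://127.0.0.1:5556")
# sub.setsockopt(zmq.SUBSCRIBE, "new_data")
# message = sub.recv()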
As I mentioned in my comment, if you want something that works across a network, then other than setting up a web service (a Flask app?) or writing your own INET socket server, there is nothing built into the operating system for communicating between machines. Beanstalk has a very simple Python API and I've used it for this kind of thing very successfully.
import beanstalkc

try:
    beanstalk = beanstalkc.Connection(host="my.host.com")
    beanstalk.watch("update_queue")
except:
    print "Error connecting to beanstalk"

while True:
    job = beanstalk.reserve()
    do_something_with_job(job)
    job.delete()  # remove the job from the queue once it has been handled
If you are only going to be working on the same machine, then read up on Linux IPC. A socket connection between processes is very fast and has practically zero overhead. It can also be part of an asynchronous program when you take advantage of epoll callbacks.
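For the same-machine case, a bare-bones Unix domain socket sketch (the socket path and message are placeholders):
import os
import socket

SOCK_PATH = "/tmp/collector.sock"

# frontend (listener) side
if os.path.exists(SOCK_PATH):
    os.remove(SOCK_PATH)
server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
server.bind(SOCK_PATH)
server.listen(1)
conn, _ = server.accept()
print conn.recv(1024)  # e.g. "new_data"

# data-collector side (separate process):
# client = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
# client.connect("/tmp/collector.sock")
# client.sendall("new_data")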

Python JSON-RPC_2.0 TCP Server Client Explained

I'm having a difficult time fully understanding the nature of a TCP server/client relationship when a JSON string is sent to the server. The information I need may be out there, but I'm perhaps not using the correct search parameters as I look.
I've built a Python TCP, JSON-RPC Server from the following examples:
https://github.com/joshmarshall/jsonrpclib
http://code.activestate.com/recipes/552751-json-rpc-server-and-client/
In both cases, I can communicate with the Python server from a Python console on a different computer, sending commands from one (the client) to the other (server). In all of the examples, I've had to install the libraries mentioned above on both the client and the server machines in order to facilitate the TCP communication.
So the background to my situation and question is, when does JSON enter the mix? This is what I want to do:
Setup a Python TCP server that accepts a JSON string from a remote client inside (or outside) the network. The server parses the JSON string, fetches the method and parameters from the objectified string, and executes the method. The server then sends a JSON string result to the calling client. In this case, the client is a mobile application (iPad, Android, etc) with a JavaScript library that I'll use to send the requests to the server.
Why would I need a Python client? From what I can gather, the client just needs to open a connection to the server and then send the JSON string, right? Why do all the code samples include Python client examples? Are they assuming a server machine is going to talk to a server machine, so they have included client code to help generate the JSON string that will be sent to the server?
If I assume that a Python client isn't really needed for anything, I've been sending JSON strings to the Python server from the iPad, but in each case the server is reporting a "Bad request syntax" error. I'll pen a new question on that issue if I'm understanding the current question correctly.
Insight is appreciated.
The JSON encoding is the lingua franca of your RPC protocol, so you can indeed use any client you like. The implementations you found for JSON-RPC use the HTTP protocol, a very specific communication protocol built on top of TCP/IP, but you can implement the same protocol over raw TCP/IP sockets if required.
The examples include both the Python client and the server because they illustrate how to implement the JSON-RPC standard in Python, not in JavaScript or C or Lisp. They focus on the implementation in one language. The JSON-RPC standard however, is language agnostic. It doesn't matter what language you write either the server or the client in, as long as they use the same standard.
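To make that concrete: whatever language the client is written in, the request that ends up on the wire is just a JSON document POSTed over HTTP (or written to a raw socket). A rough Python 2 illustration of the equivalent HTTP call, with the URL and method name as placeholders:
import json
import urllib2

payload = json.dumps({
    "jsonrpc": "2.0",
    "method": "add",       # whatever method the server registers
    "params": [3, 4],
    "id": 1,
})
request = urllib2.Request("http://server-host:8080/", data=payload,
                          headers={"Content-Type": "application/json"})
response = json.loads(urllib2.urlopen(request).read())
print response["result"]   # 7, assuming the server exposes an add() method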

Callback Functions across computers?

I'm writing a python server/client app. If I serialize a function on the client and pass it to the server, can the server use it as a callback? I'm assuming there must be something extra I'd have to do as the client and server are communicating via packets, I just don't know what.
What I actually need is for the server to change one of the client's attributes (when the server is ready to accept another command), and I want an alternative to having the client continuously poll the server. Thanks for any help.
Take a look at Twisted JSON RPC.
A recent SO post: Python Twisted JSON RPC
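As a bare illustration (plain sockets, not Twisted JSON-RPC) of why you don't need to serialize a function or poll: the client keeps one long-lived connection open, and the server simply writes to it when it is ready for the next command; the host, port, and message strings are placeholders:
import socket

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("server-host", 9100))
client.sendall("run my_command\n")

# recv() blocks until the server writes something, so the server effectively
# "calls back" the client by sending a message when it is ready again.
notification = client.recv(1024)
if notification.strip() == "ready":
    client.sendall("run next_command\n")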
