I am building a WebSocket API in Python using the python websockets library. The services my API depends on need to be wrapped in threads, so I have to make my asyncio WebSocket handlers await while a thread produces the data for the response.
I found there are many ways to build such a service: threads plus the native Python socket module, multiplexing (the selectors module) plus the socket module, or threads combined with async Python WebSockets.
I want the WebSocket service to work as follows. The client sends data to the server. The server starts thread_1, which modifies the data and passes it to thread_2, which modifies it once more; the twice-modified data is then returned to the client as the response. The client should not have to wait for the server's response before sending the next batch of data, but it should handle each result as it arrives. In other words, the client and server should work asynchronously. It would also be great if you could suggest some material that would help me achieve this.
One way to realize such an architecture is to combine multithreading with asyncio WebSockets. This is achieved using asyncio executors.
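The executor pattern itself, independent of the WebSocket layer, can be sketched like this (the two stage functions are hypothetical stand-ins for the real services; in a websockets handler you would await the same calls between receiving and sending):

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

def stage_one(data):
    # hypothetical first transformation, running in a worker thread
    return data.upper()

def stage_two(data):
    # hypothetical second transformation
    return data[::-1]

async def handle_message(data, pool):
    loop = asyncio.get_running_loop()
    # Each blocking stage runs in the thread pool; the event loop stays
    # free to service other clients while we await the results.
    once = await loop.run_in_executor(pool, stage_one, data)
    twice = await loop.run_in_executor(pool, stage_two, once)
    return twice

async def main():
    with ThreadPoolExecutor(max_workers=2) as pool:
        return await handle_message('hello', pool)

result = asyncio.run(main())
print(result)  # OLLEH
```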
I am doing a master-worker architecture assignment with XMLRPC in Python, and now I am supposed to implement a dynamic membership protocol so that the worker list on the client is updated every time the master modifies it (a worker is added or deleted), and the client does not need to request it manually before running a command.
The teacher mentioned it can be done through events or group communication (I will implement a manager node to ping workers, which also needs the same dynamic protocol), so I thought about using sockets, which are an event-based mechanism, but the teacher said I am better off using indirect communication with Redis or RabbitMQ.
The thing is, I do not really know how to implement a Redis message listener in Python, since most examples I find are blocking ones with while True followed by get_message() (using redis-py). Could you help me with that?
Thanks a lot in advance.
Thanks to Mephalich on Redis' Discord server, I discovered an interesting way to achieve this using run_in_thread.
import redis

def hnd(msg):
    # the real work on the message arriving through the pub/sub channel goes here
    print(msg)

r = redis.Redis(...)
p = r.pubsub()
p.psubscribe(**{'cmdchannels*': hnd})
thread = p.run_in_thread(sleep_time=0.001)
I have a Flask/SocketIO application which currently pairs two clients together to play a game. Right now the clients interact with the server through some compiled client-side Javascript, and I am using SocketIO to define the events which the clients emit, e.g., movedForward when that client moved forward. The client-side JS similarly defines handlers for the events the server emits, e.g., partnerTurnedRight when the server is passing the partner's movement to the other player.
I would like to create 'dummy' clients on the server side which can interact with a normal, remote client -- basically, a python implementation of the Javascript which is spawned every time a remote client connects. The idea is to have a server-side "player" play the game with a remote, human client.
I'm not sure how to go about implementing something like this. My intuition is that I should create a separate Flask/SocketIO app (somehow) which has handlers for the messages the server sends (e.g., partnerTurnedRight) and emits the messages the server expects (e.g., movedForward). Then, when a remote client connects, spawn a stateful subprocess which has its own unique sid and is able to interact with the server through exactly the same interface as the remote client. However, I'm really not sure how to put everything together or how to actually spawn a server-side client like that.
An example project which does something like this, some pseudocode, or a general structure of how to set something like this up would be greatly appreciated!
(Part of the problem is that I don't know what search terms to use, so it's been hard finding examples.)
You can use the python-socketio package to run a Python client on the server side. Here's an example of the client usage:
import socketio

sio = socketio.Client()

@sio.on('connect')
def on_connect():
    print('connected')
    sio.emit('Hello')

@sio.on('event')
def on_message(data):
    print('Received ', data)

@sio.on('disconnect')
def on_disconnect():
    print('disconnected')

sio.connect('http://localhost:5000')
sio.wait()
I'm writing a Python server/client app. If I serialize a function on the client and pass it to the server, can the server use it as a callback? I'm assuming there must be something extra I'd have to do, since the client and server communicate via packets; I just don't know what.
What I actually need is for the server to change one of the client's attributes (when the server is ready to accept another command), and I want an alternative to having the client continuously poll the server. Thanks for any help.
Take a look at Twisted JSON RPC.
A recent SO post: Python Twisted JSON RPC
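As an alternative to passing a serialized function, a common pattern is for the client to expose its own small RPC endpoint that the server calls back when it is ready for another command, so no polling is needed. A minimal stdlib sketch (the function name is illustrative, and the server's callback is simulated in-process):

```python
import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

ready = threading.Event()

def notify_ready():
    # The server calls this when it can accept another command,
    # instead of the client continuously polling.
    ready.set()
    return True

# Client side: a tiny XML-RPC endpoint for callbacks (port chosen by the OS).
cb_server = SimpleXMLRPCServer(('127.0.0.1', 0), logRequests=False)
port = cb_server.server_address[1]
cb_server.register_function(notify_ready, 'notify_ready')
threading.Thread(target=cb_server.serve_forever, daemon=True).start()

# Server side (simulated here): call back to the client's endpoint.
proxy = xmlrpc.client.ServerProxy(f'http://127.0.0.1:{port}')
proxy.notify_ready()
cb_server.shutdown()

print(ready.is_set())  # True
```

The client would send its callback address to the server along with its first request; the server stores it and connects back whenever its state changes.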
I have written a little streaming mp3 server in Python. So far all it does is accept a ServerSocket connection and stream all the mp3 data in its queue to the requester using socket.send(). I have implemented this to interleave ICY metadata into the stream, so the name of the playing song shows up in the client.
I would like to add playlist management to the server, so that I can manipulate the playlist of the running server. I have a vague idea that xmlrpclib would be suited to doing this, but I'm confused about two things:
Whether it's possible/advisable to integrate ICY and XMLRPC on a single server and a single port.
How to share state between the stream thread and the playlist, and manipulation thereof via xmlrpc.
Your initial attempt might be easier if you use two separate ports, each with its own server running in a separate thread. However, managing synchronization between the threads might be an annoying task in the long run.
ICY and HTTP are very similar, and if you've already implemented ICY on top of SocketServer, you could probably extend BaseHTTPServer.BaseHTTPRequestHandler to respond to both ICY and HTTP requests on the same port. Take a look at the standard library code for the BaseHTTPRequestHandler.parse_request() method, and think about how to override it in a subclass for a split personality.
Also, when you want to handle multiple concurrent requests using these classes, take a look at the SocketServer mixin classes.
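The answer above names the Python 2 modules (BaseHTTPServer, SocketServer); with Python 3's http.server the same split-personality idea can be sketched more simply by dispatching on the Icy-MetaData request header instead of overriding parse_request. A rough sketch, with the station name and response bodies made up:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

class SplitHandler(BaseHTTPRequestHandler):
    """Answers ICY streaming clients and plain HTTP clients on one port."""

    def do_GET(self):
        # Streaming clients ask for interleaved metadata with this header;
        # everything else is treated as an ordinary HTTP request (an XML-RPC
        # control interface would get a do_POST in the same spirit).
        if self.headers.get('Icy-MetaData') == '1':
            self.send_response(200)
            self.send_header('icy-name', 'demo stream')  # hypothetical station name
            self.end_headers()
            self.wfile.write(b'mp3 frames would stream here')
        else:
            self.send_response(200)
            self.send_header('Content-Type', 'text/plain')
            self.end_headers()
            self.wfile.write(b'plain http response')

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

server = HTTPServer(('127.0.0.1', 0), SplitHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

stream = urlopen(Request(f'http://127.0.0.1:{port}/', headers={'Icy-MetaData': '1'})).read()
plain = urlopen(f'http://127.0.0.1:{port}/').read()
server.shutdown()
```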
I'm looking for a way to prevent multiple hosts from issuing simultaneous commands to a Python XMLRPC listener. The listener is responsible for running scripts to perform tasks on that system that would fail if multiple users tried to issue these commands at the same time. Is there a way I can block all incoming requests until the single instance has completed?
I think Python's SimpleXMLRPCServer module is what you want. Its default behavior is to block new requests while the current request is being processed. That default gave me lots of trouble, and I changed it by mixing in the ThreadingMixIn class so that my XML-RPC server could respond to multiple requests at the same time.
import SocketServer
import SimpleXMLRPCServer

class RPCThreading(SocketServer.ThreadingMixIn, SimpleXMLRPCServer.SimpleXMLRPCServer):
    pass
If I understand your question correctly, SimpleXMLRPCServer is the solution. Just use it directly.
Can you have another communication channel? If yes, then have a "call me back when it is my turn" protocol running between the server and the clients.
In other words, each client would register its intention to issue requests to the server and the said server would "callback" the next-up client when it is ready.
There are several choices:
Use a single-process, single-threaded server like SimpleXMLRPCServer to process requests sequentially.
Use threading.Lock() in a threaded server.
Use an external locking mechanism (like the lockfile module or the GET_LOCK() function in MySQL) in a multiprocess server.
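A minimal sketch of the threading.Lock() option, combining the ThreadingMixIn server shown above with a lock that serializes the actual work (the task function and its return value are illustrative, using the Python 3 module names):

```python
import threading
import xmlrpc.client
from socketserver import ThreadingMixIn
from xmlrpc.server import SimpleXMLRPCServer

class ThreadedXMLRPCServer(ThreadingMixIn, SimpleXMLRPCServer):
    pass

task_lock = threading.Lock()

def run_task(name):
    # Only one caller at a time gets past this lock; concurrent
    # requests block here until the current task finishes.
    with task_lock:
        return f'ran {name}'

server = ThreadedXMLRPCServer(('127.0.0.1', 0), logRequests=False)
port = server.server_address[1]
server.register_function(run_task, 'run_task')
threading.Thread(target=server.serve_forever, daemon=True).start()

proxy = xmlrpc.client.ServerProxy(f'http://127.0.0.1:{port}')
result = proxy.run_task('backup')
server.shutdown()
print(result)  # ran backup
```

Requests are still accepted concurrently, so the server stays responsive, but the scripts themselves never run simultaneously.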