python-socketio client to client messaging - python

Can I send messages from one client directly to another using python-socketio?
It can be done using socketio as follows:
socket.join('room')
io.sockets.in('room').emit('event_name', data)
Source: socket.io client to client messaging
I can't find any documentation on how this works with python-socketio. Could someone please provide an example?

The following in Socket.IO (JavaScript):
io.sockets.in('room').emit('event_name', data)
is the same as this in python-socketio:
io.emit("HelloWorld", some_dict, room="room")
Here is the link to the official doc -> link
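For the room-targeted emit above to reach a client, the server also has to place that client in the room first. A minimal sketch with the python-socketio server API (the event and room names are illustrative):

import socketio

sio = socketio.Server()

@sio.event
def connect(sid, environ):
    # Put every client that connects into the room (illustrative policy).
    sio.enter_room(sid, 'room')

# Later, anywhere on the server:
sio.emit('HelloWorld', {'msg': 'hi'}, room='room')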

The Socket.IO protocol allows bidirectional client-to-server communication, but clients are not connected to each other, so they cannot communicate directly. To implement client-to-client messaging you have to go through the server, and it works as follows:
Client A emits an event to the server, indicating who Client B, the recipient of the message, is and what data it wishes to send to that client.
On the server, the handler for the event from Client A emits an event addressed to Client B, with the data passed by Client A in the first event.
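A minimal sketch of such a relay handler with a python-socketio server (the 'private_message' event name and the payload shape are assumptions; Client B is addressed by its session id, which python-socketio also uses as a personal room):

import socketio

sio = socketio.Server()
app = socketio.WSGIApp(sio)

@sio.event
def private_message(sid, data):
    # Client A sends {'to': <Client B's sid>, 'msg': ...}; the server
    # relays the payload to Client B only, using the sid as the room.
    sio.emit('private_message', {'from': sid, 'msg': data['msg']},
             room=data['to'])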

Related

python websocket pub-sub with re-publish/broadcast

I would like to set up a server that can subscribe to an external stream over a websocket (ws_ext) and then republish that data (after curating it) to internal clients connecting to this server over websockets (ws_int).
My approach so far is to set up a FastAPI server that can open websockets (ws_int) with internal clients.
However, I don't understand how to embed a listener in this server that listens to the external stream and then publishes to these internal clients in a non-blocking way.
Can someone point me to a working example that can help?
Here is what I would like to achieve:
P.S.: I have been able to make it work by decoupling the broadcaster from the subscriber using Redis pub/sub. So what I have now is a client that listens to the external stream, curates the data and pushes it to Redis pub/sub; then a separate broadcaster listens to Redis pub/sub and pushes the data out to clients on its websockets after curating. I would still love to combine these two without using Redis or some such backend.
If you have all clients connected to an async WebSocket endpoint on the broadcaster, you can push whatever arrives asynchronously from the external site to those clients as it comes in, so the process should be non-blocking.
The update step can be an async stream pipeline that filters the results coming from the external site for each client in the broadcaster.
For example, an async WebSocket client can be written with "async with"; a fuller sketch that combines the listener and the broadcaster follows the snippet below.
import asyncio
import ssl
import websockets

# Use an ssl.SSLContext (and a wss:// URI) for encrypted connections; None for plain ws://.
ssl_context = None

async def hello():
    async with websockets.connect(
            'ws://localhost:8765', ssl=ssl_context) as websocket:
        name = input("What's your name? ")
        await websocket.send(name)
        print(f"> {name}")
        greeting = await websocket.recv()
        print(f"< {greeting}")

asyncio.get_event_loop().run_until_complete(hello())
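Building on that, here is a rough sketch of a single FastAPI app that both listens to the external stream and broadcasts to the internal clients, without Redis. The external URL, the /ws_int path and curate() are placeholders:

import asyncio

import websockets
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()
clients = set()
EXTERNAL_URL = "wss://example.com/stream"  # placeholder

def curate(raw):
    return raw  # placeholder for the curation logic

async def relay_external_stream():
    # Listen to the external stream and fan messages out to every
    # connected internal client without blocking the event loop.
    async with websockets.connect(EXTERNAL_URL) as external_ws:
        async for raw in external_ws:
            message = curate(raw)
            for ws in list(clients):
                try:
                    await ws.send_text(message)
                except Exception:
                    clients.discard(ws)

@app.on_event("startup")
async def start_relay():
    asyncio.create_task(relay_external_stream())

@app.websocket("/ws_int")
async def ws_int(websocket: WebSocket):
    await websocket.accept()
    clients.add(websocket)
    try:
        while True:
            await websocket.receive_text()  # keep the connection open
    except WebSocketDisconnect:
        clients.discard(websocket)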

Sending message from Kafka to socket.io in python

I have an end-to-end pipeline for a web application in Python 3.6, like below:
Socket (connection from client to server) -> Flask server -> Kafka producer -> Kafka consumer -> NLPService
Now when I get some result back from the NLPService, I need to send it back to the client. I am thinking of the steps below:
1. The NLP service writes the result to a different topic via the Kafka producer (done)
2. The Kafka consumer retrieves the result from the Kafka broker (done)
3. The Kafka consumer needs to write the result to the Flask server
4. The Flask server sends the result back over the socket
5. The socket writes to the client
I have already done steps 1-2, but I am stuck at steps 3-4. How do I write from Kafka to the Flask server? If I just call a function in my server.py, then logically it seems like I have to create a socket within that function in server.py, which will do the job of sending to the client through the socket. But syntax-wise it looks weird. What am I missing?
In consumer.py:
import json
from kafka import KafkaConsumer

# receiving reply
topicReply = 'Reply'
consumerReply = KafkaConsumer(topicReply,
                              value_deserializer=lambda m: json.loads(m.decode('ascii')))

for message in consumerReply:
    # send reply back to the server
    fromConsumer(message.value)
In server.py:
socketio = SocketIO(app)

def fromConsumer(msg):
    @socketio.on('reply')
    def replyMessage(msg):
        send(msg)

The above construct in server.py doesn't make sense to me. Please suggest.
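One common pattern for steps 3-5, sketched here on the assumption that the server uses Flask-SocketIO: run the Kafka consumer as a background task owned by the SocketIO server process and emit results from it directly. The 'Reply' topic and 'reply' event names simply mirror the snippets above:

import json

from flask import Flask
from flask_socketio import SocketIO
from kafka import KafkaConsumer

app = Flask(__name__)
socketio = SocketIO(app)

def consume_replies():
    consumer = KafkaConsumer(
        'Reply',
        value_deserializer=lambda m: json.loads(m.decode('ascii')))
    for message in consumer:
        # Push each NLP result to the browser; this broadcasts to all
        # connected clients, a room or session id could target just one.
        socketio.emit('reply', message.value)

if __name__ == '__main__':
    socketio.start_background_task(consume_replies)
    socketio.run(app)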

How to handle socket.io broken connection in Flask?

I have a very simple Python (Flask-SocketIO) application which works as a server, and another app written in AngularJS which is a client.
To handle connected and disconnected clients I use, respectively:
@socketio.on('connect')
def on_connect():
    print("Client connected")

@socketio.on('disconnect')
def on_disconnect():
    print("Client disconnected")
When a client connects to my app I get information about it, but if the client disconnects (for example because of network problems) I don't get any information.
What is the proper way to handle a situation in which the client disconnects unexpectedly?
There are two types of connections: long-polling and WebSocket.
When you use WebSocket, clients know almost instantly that the server has disconnected.
In the case of long-polling, you need to set the ping_interval and ping_timeout parameters (I also found information about heartbeat_interval and heartbeat_timeout, but I don't know how they relate to ping_*).
From the server's perspective: it doesn't know that the client has disconnected, and the only way to get that information is to set ping_interval and ping_timeout.
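For example, with Flask-SocketIO the ping settings can be passed when the server object is created; the values below are illustrative:

from flask import Flask
from flask_socketio import SocketIO

app = Flask(__name__)
# Ping every 25 seconds; if no response arrives within 10 seconds the
# connection is considered broken and the disconnect handler fires.
socketio = SocketIO(app, ping_interval=25, ping_timeout=10)

@socketio.on('disconnect')
def on_disconnect():
    print("Client disconnected")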

get current connection in flask socket.io

I want to emit multiple times to an individual socket connection using Flask's Socket.IO extension.
from flask.ext.socketio import SocketIO, emit, join_room, leave_room
# creating flask app and io...

@io.on("/saySomething")
def saying():
    emit("said", "hello")
    saying2()

def saying2():
    # ...
    # doing something long and important
    # ...
    emit("said", "and how are you?")
I do not know which connection saying2 is emitting to. Should I pass the current connection to saying2? How can I achieve my goal?
In your example, saying2() is emitting to the client that sent the /saySomething event. This is based on a concept similar to request contexts in standard Flask.
The emit function has two optional arguments to send to other clients:
broadcast=True will send to all connected clients, including the client that sent the /saySomething event.
room=<room-name> will send to all clients attached to the given room. If you want to address individual clients, then put each client in a different room and target the desired room.
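A short sketch of the addressing modes, reusing the io and event names from the question (request.sid is Flask-SocketIO's session id for the current client, which also names its personal room):

from flask import Flask, request
from flask_socketio import SocketIO, emit

app = Flask(__name__)
io = SocketIO(app)

@io.on("/saySomething")
def saying():
    emit("said", "hello")                            # only the requesting client
    emit("said", "hello everyone", broadcast=True)   # every connected client
    emit("said", "just for you", room=request.sid)   # one specific client's room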

Easiest way to push RabbitMQ events to browser using WebSockets in Python?

I have an existing Python system that receives messages using Rabbit MQ. What is the absolute easiest way to get these events pushed to a browser using WebSockets using Python? Bonus if the solution works in all major browsers too.
Thanks,
Virgil
Here https://github.com/Gsantomaggio/rabbitmqexample is a complete example I wrote that uses Tornado and RabbitMQ.
You can find all the instructions on the site; anyway, you need:
pip install pika
pip install tornado
First you register your RabbitMQ consumer:
def threaded_rmq():
    channel.queue_declare(queue="my_queue")
    logging.info('consumer ready, on my_queue')
    channel.basic_consume(consumer_callback, queue="my_queue", no_ack=True)
    channel.start_consuming()
Then you register your WebSocket clients:
class SocketHandler(tornado.websocket.WebSocketHandler):
    def open(self):
        logging.info('WebSocket opened')
        clients.append(self)

    def on_close(self):
        logging.info('WebSocket closed')
        clients.remove(self)
When you get a message, you can redirect it to the connected WebSocket clients:
def consumer_callback(ch, method, properties, body):
    logging.info("[x] Received %r" % (body,))
    # The message is broadcast to the connected clients
    for itm in clients:
        itm.write_message(body)
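For completeness, a rough sketch of the glue the snippets above assume (a module-level clients list, a pika channel, and the blocking consumer running on its own thread so Tornado's IOLoop stays free; the linked repository has the full working version):

import logging
import threading

import pika
import tornado.ioloop
import tornado.web
import tornado.websocket

clients = []

connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()

application = tornado.web.Application([(r'/ws', SocketHandler)])

if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)
    # Run the blocking RabbitMQ consumer on a separate thread.
    threading.Thread(target=threaded_rmq, daemon=True).start()
    application.listen(8888)
    tornado.ioloop.IOLoop.instance().start()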
You could use Twisted, txAMQP and Autobahn|Python on the server to write a bridge in probably 50 lines of code, and Autobahn|JS on the browser side. Autobahn implements WebSocket, and WAMP on top of it, which gives you Publish & Subscribe (as well as Remote Procedure Calls) over WebSocket.
When using raw WebSocket you would have to invent your own publish & subscribe scheme over WebSocket, since I guess that is what you are after: extending AMQP pub/sub to the Web. Or you could check out STOMP.
Disclaimer: I am the original author of WAMP and Autobahn.
