I am currently working on a server platform based on an event-driven architecture. An event enters the system via a websocket connection, and after some processing the response should leave the system via the same websocket connection. The idea behind the implementation is that when a connection is made to the server, I put it in a while loop and await data from it until it disconnects. The incoming data is put into a queue, from which a worker thread pulls it out and processes it. On the other side, I have created a task which polls an outgoing event queue and, if there is an event in the queue, sends it to the corresponding recipient. Unfortunately my current asyncio logic is flawed: polling the outgoing event queue blocks the receiving task, and I cannot wrap my head around a way to fix it. Here are some code snippets which should illustrate the problem described above:
Starting the websocket server
def run(self, address: str, port: int, ssl_context: ssl.SSLContext = None):
    start_server = websockets.serve(
        self.websocket_connection_handler, address, port, ssl=ssl_context)
    event_loop = asyncio.get_event_loop()
    event_loop.create_task(self.send_heartbeat())
    event_loop.create_task(self.dispatch_outgoing_events())
    print(f'Running on {"wss" if ssl_context else "ws"}://{address}:{port}')
    event_loop.run_until_complete(start_server)
    event_loop.run_forever()
The dispatcher function which infinitely polls data from the outgoing queue
async def dispatch_outgoing_events(self):
    while not self.exit_state.should_exit:
        if len(self.outgoing_event_queue) == 0:
            await asyncio.sleep(0)
        else:
            event = self.outgoing_event_queue.get_event()
            destination = event.destination
            client_id = re.findall(
                r'[a-f0-9]{8}-[a-f0-9]{4}-[a-f0-9]{4}-[a-f0-9]{4}-[a-f0-9]{12}',
                destination)[0]
            client = self.client_store.get(client_id)
            await client.websocket.send(serializer.serialize(event))
The connection handler function for the websocket
async def websocket_connection_handler(self, websocket, path):
    client_id = await self.register(websocket)
    try:
        while not self.exit_state.should_exit:
            correlation_id = str(uuid4())
            message = await websocket.recv()
            try:
                event = serializer.deserialize(
                    message, correlation_id, client_id)
                event.return_address = f'remote://websocket/{client_id}'
                self.incoming_event_queue.add_event(event)
            except Exception as e:
                event = type('evt', (object,), dict(
                    system_entry=str(datetime.datetime.utcnow()),
                    destination=f'remote://websocket/{client_id}'))()
                self.exception_handler.handle_exception(e, event)
    except Exception as exception:
        print(
            f'client {client_id} suddenly disconnected. '
            f'Reason: {type(exception).__name__} -> {exception}')
        self.client_store.remove(client_id)
        self.topic_factory.remove_client(client_id)
        self.topic_factory.get_topic('server_notifications').publish(
            ClientDisconnectedNotification(client_id), str(uuid4()))
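For comparison, a minimal sketch (not part of the original code) of what the dispatcher could look like if the outgoing event queue were, or wrapped, an asyncio.Queue: awaiting get() suspends this coroutine until an event arrives instead of spinning, so the receiving task keeps running in the meantime. It reuses the attribute names from the snippets above.
# Sketch only: assumes self.outgoing_event_queue is an asyncio.Queue rather
# than the custom queue class used in the snippet above.
async def dispatch_outgoing_events(self):
    while not self.exit_state.should_exit:
        # Suspends this task until an event is available instead of busy-polling.
        event = await self.outgoing_event_queue.get()
        client_id = re.findall(
            r'[a-f0-9]{8}-[a-f0-9]{4}-[a-f0-9]{4}-[a-f0-9]{4}-[a-f0-9]{12}',
            event.destination)[0]
        client = self.client_store.get(client_id)
        await client.websocket.send(serializer.serialize(event))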
Related
Hello, I want to create a client socket in Python, and I found this example (https://stackoverflow.com/a/49918082/12354066). The only problem is that I have a whole other program I want to integrate this with, and it seems loop.run_until_complete(asyncio.wait(tasks)) blocks the whole thread, not letting me execute any further functions, e.g. print(1) after loop.run_until_complete(asyncio.wait(tasks)). I want to be able to listen and send messages, but I also want to be able to execute other code after I begin listening; maybe this is better suited to threads rather than async (I don't know much about async...).
import websockets
import asyncio


class WebSocketClient():
    def __init__(self):
        pass

    async def connect(self):
        '''
        Connecting to the webSocket server
        websockets.client.connect returns a WebSocketClientProtocol,
        which is used to send and receive messages
        '''
        self.connection = await websockets.client.connect('ws://127.0.0.1:8765')
        if self.connection.open:
            print('Connection established. Client correctly connected')
            # Send greeting
            await self.sendMessage('Hey server, this is webSocket client')
            return self.connection

    async def sendMessage(self, message):
        '''
        Sending a message to the webSocket server
        '''
        await self.connection.send(message)

    async def receiveMessage(self, connection):
        '''
        Receiving all server messages and handling them
        '''
        while True:
            try:
                message = await connection.recv()
                print('Received message from server: ' + str(message))
            except websockets.exceptions.ConnectionClosed:
                print('Connection with server closed')
                break

    async def heartbeat(self, connection):
        '''
        Sending a heartbeat to the server every 5 seconds
        Ping - pong messages to verify the connection is alive
        '''
        while True:
            try:
                await connection.send('ping')
                await asyncio.sleep(5)
            except websockets.exceptions.ConnectionClosed:
                print('Connection with server closed')
                break
main:
import asyncio
from webSocketClient import WebSocketClient

if __name__ == '__main__':
    # Creating client object
    client = WebSocketClient()
    loop = asyncio.get_event_loop()
    # Start connection and get client connection protocol
    connection = loop.run_until_complete(client.connect())
    # Start listener and heartbeat
    tasks = [
        asyncio.ensure_future(client.heartbeat(connection)),
        asyncio.ensure_future(client.receiveMessage(connection)),
    ]
    loop.run_until_complete(asyncio.wait(tasks))
    print(1)  # never gets executed
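One possible direction (a sketch, not part of the original post): make the "other code" a coroutine of its own and gather it with the listener and heartbeat, so nothing has to run after run_until_complete returns. The other_work coroutine is a hypothetical stand-in.
import asyncio

from webSocketClient import WebSocketClient


async def other_work():
    # Stand-in for the extra code that should run while listening.
    while True:
        print(1)
        await asyncio.sleep(1)


async def main():
    client = WebSocketClient()
    connection = await client.connect()
    # All three coroutines share the loop; none of them blocks the others.
    await asyncio.gather(
        client.heartbeat(connection),
        client.receiveMessage(connection),
        other_work(),
    )


if __name__ == '__main__':
    asyncio.run(main())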
I am trying to create a websocket server and a client that exchanges data with it. My data comes from another process, so I need a queue to accept that data, but reading from the queue blocks my main websocket function. The end result is that the client cannot reconnect after its connection breaks; I think the handler is stuck in the queue code.
Here is part of my code:
import asyncio
import json

import websockets


class RecorderEventHook(object):
    def __init__(self, high_event_mq):
        self.high_event_mq = high_event_mq
        self.msg = None
        self.loop = None

    # @wrap_keep_alive
    async def on_msg_event(self, websocket):
        try:
            # async for message in websocket:
            while True:
                # blocking queue read: this is where the handler gets stuck
                msg = self.high_event_mq.get()
                await websocket.send(json.dumps(msg))
        except Exception as error:
            print(error)

    async def event_controller(self):
        await websockets.serve(self.on_msg_event, 'localhost', 8888)

    def start(self):
        loop = asyncio.new_event_loop()
        loop.create_task(self.event_controller())
        loop.run_forever()
I tried to save the connected websocket object and use it in another thread (in the same process), but it failed with a warning along the lines of
'"xxxx" function was never awaited'.
I want to be able to receive data from other processes without affecting the normal reconnection of the client.
Any help would be greatly appreciated.
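A rough sketch of one possible direction, assuming high_event_mq is a multiprocessing-style queue with a blocking get(): hand the blocking read to a worker thread with run_in_executor so the event loop stays free to accept new connections. This is a drop-in variant of the on_msg_event method above, not the original code.
    async def on_msg_event(self, websocket):
        loop = asyncio.get_event_loop()
        try:
            while True:
                # The blocking get() runs in the default thread pool; only this
                # coroutine is suspended while waiting, not the whole event loop.
                msg = await loop.run_in_executor(None, self.high_event_mq.get)
                await websocket.send(json.dumps(msg))
        except Exception as error:
            print(error)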
I have a streams server that handles multiple independent clients. When I shut it down I want to notify all clients that the server has shut down.
I figured out how to close the server to new connections, but not how to cancel the specific handlers waiting for client data.
So far the only solution I found is to cancel all tasks in the loop, but this doesn't work for me as I have other tasks that must finish their jobs first.
Does asyncio provide some interface for this, or do I have to keep track of all connections myself and cancel them once the server shuts down? I would prefer it if the connection handler caught an exception when it calls await reader.readuntil(), rather than being cancelled in the middle of execution, but this is not required.
Right now the client loses the connection without warning, so it cannot tell whether it was a network issue or the server shutting down.
import asyncio
import signal

server = None
shutdown = False


async def important_task():
    while not shutdown:
        await asyncio.sleep(10)
        print("I am important")


async def handle_conn(reader, writer):
    print("Got connection")
    try:
        while True:
            text = await reader.readuntil(b'\n')
            # Do stuff
            writer.write(text)  # Echo example
            await writer.drain()
    except serverShutdownException:  # How do I cause something like this?
        writer.write(b"Goodbye")
        await writer.drain()
    finally:
        writer.close()
        await writer.wait_closed()


def handle_sig(num, frame):
    global shutdown
    print(f"Caught {num}")
    server.close()
    shutdown = True


async def serve():
    global server
    server = await asyncio.start_server(handle_conn, "127.0.0.1", 8080)
    try:
        await server.serve_forever()
    except asyncio.CancelledError:
        pass
    await server.wait_closed()
    # wait for all handlers to be done?


def main():
    signal.signal(signal.SIGINT, handle_sig)
    loop = asyncio.get_event_loop()
    t1 = loop.create_task(serve())
    t2 = loop.create_task(important_task())
    loop.run_until_complete(asyncio.gather(t1, t2))


main()
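A rough sketch of one possible direction (the names here are assumptions, not from the original post): track the writers yourself and let each handler race its read against a shared asyncio.Event, so it can send a goodbye before closing.
import asyncio

shutdown_event = asyncio.Event()  # set from the signal handler, e.g. via loop.call_soon_threadsafe(shutdown_event.set)
connections = set()               # writers of currently connected clients


async def handle_conn(reader, writer):
    connections.add(writer)
    try:
        while not shutdown_event.is_set():
            read_task = asyncio.ensure_future(reader.readuntil(b'\n'))
            shut_task = asyncio.ensure_future(shutdown_event.wait())
            done, pending = await asyncio.wait(
                {read_task, shut_task}, return_when=asyncio.FIRST_COMPLETED)
            for task in pending:
                task.cancel()
            if shut_task in done:
                writer.write(b"Goodbye")
                await writer.drain()
                break
            writer.write(read_task.result())  # Echo example
            await writer.drain()
    except asyncio.IncompleteReadError:
        pass  # client closed the connection
    finally:
        connections.discard(writer)
        writer.close()
        await writer.wait_closed()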
The following is a reduced version of a server that periodically serves any connected clients with telemetry in the form of JSON strings. This was my initial attempt, in which the main loop pushes data to all connected clients. However, I cannot simply let the handler terminate after "registering" the client, because the connection would then be closed. So I need to block the handler until the main loop determines that the client has disconnected. Signalling the handler through an Event simply does nothing.
@routes.get('/telemetry/json')
async def handler(request: Request):
    global CLIENT
    CLIENT = await StreamResponse().prepare(request)
    log.debug(f"Wait for {EVENT}")
    await EVENT.wait()  # This never wakes up!
    log.debug(f"Client {request.remote} disconnected")


async def main():
    global EVENT
    EVENT = Event()
    app = web.Application()
    app.add_routes(routes)
    runner = web.AppRunner(app)
    await runner.setup()
    await web.TCPSite(runner, port=8080).start()
    while True:
        await sleep(1)
        if CLIENT is None:
            continue
        try:
            await CLIENT.write('FLUSH\n'.encode('utf-8'))
            await CLIENT.drain()
        except ConnectionResetError:
            log.debug(f"Notify {EVENT}")
            EVENT.set()


log.addHandler(logging.StreamHandler())
log.setLevel(10)
asyncio.run(main())
To clarify: the use of global CLIENT and EVENT is not how it is intended. The handling of multiple clients was removed to make the example code as short as possible.
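A sketch of an alternative shape for the same idea (names such as CLIENTS and broadcast_loop are illustrative, not from the original code): give every handler its own asyncio.Queue, let the main loop push to all registered queues, and let each handler discover the disconnect itself when its write fails.
from asyncio import Queue, sleep

from aiohttp import web
from aiohttp.web import Request, StreamResponse

routes = web.RouteTableDef()
CLIENTS = set()  # one Queue per connected client


@routes.get('/telemetry/json')
async def handler(request: Request):
    response = StreamResponse()
    await response.prepare(request)
    queue = Queue()
    CLIENTS.add(queue)
    try:
        while True:
            payload = await queue.get()
            await response.write(payload)
    except ConnectionResetError:
        pass  # this client went away
    finally:
        CLIENTS.discard(queue)
    return response


async def broadcast_loop():
    # Runs alongside the web app and fans the telemetry out to every client.
    while True:
        await sleep(1)
        for queue in list(CLIENTS):
            queue.put_nowait('FLUSH\n'.encode('utf-8'))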
I have a web app. That app has an endpoint that pushes some object data to a Redis channel.
Another endpoint handles the websocket connection, where that data is fetched from the channel and sent to the client via the websocket.
When I connect via the websocket, only the first connected client gets the messages.
How can messages from a Redis channel be read by multiple clients without creating a new subscription for each one?
The websocket handler.
Here I subscribe to the channel and save it to the app (init_tram_channel). Then I run a job in which I listen to the channel and send messages (run_tram_listening).
@routes.get('/tram-state-ws/{tram_id}')
async def tram_ws(request: web.Request):
    ws = web.WebSocketResponse()
    await ws.prepare(request)
    tram_id = int(request.match_info['tram_id'])
    channel_name = f'tram_{tram_id}'
    await init_tram_channel(channel_name, request.app)
    tram_job = await run_tram_listening(
        request=request,
        ws=ws,
        channel=request.app['tram_producers'][channel_name]
    )
    request.app['websockets'].add(ws)
    try:
        async for msg in ws:
            if msg.type == aiohttp.WSMsgType.TEXT:
                if msg.data == 'close':
                    await ws.close()
                    break
            if msg.type == aiohttp.WSMsgType.ERROR:
                logging.error(f'ws connection was closed with exception {ws.exception()}')
            else:
                await asyncio.sleep(0.005)
    except asyncio.CancelledError:
        pass
    finally:
        await tram_job.close()
        request.app['websockets'].discard(ws)
    return ws
Subscribing and saving the channel.
Every channel is related to a unique object, and in order not to create many channels related to the same object, I save only one to the app.
app['tram_producers'] is a dict.
async def init_tram_channel(
        channel_name: str,
        app: web.Application
):
    if channel_name not in app['tram_producers']:
        channel, = await app['redis'].subscribe(channel_name)
        app['tram_producers'][channel_name] = channel
Running the coroutine that listens to the channel.
I run it via aiojobs:
async def run_tram_listening(
        request: web.Request,
        ws: web.WebSocketResponse,
        channel: Channel
):
    """
    :return: aiojobs._job.Job object
    """
    listen_redis_job = await spawn(
        request,
        _read_tram_subscription(
            ws,
            channel
        )
    )
    return listen_redis_job
The coroutine where I listen and send messages:
async def _read_tram_subscription(
        ws: web.WebSocketResponse,
        channel: Channel
):
    try:
        async for msg in channel.iter():
            tram_data = msg.decode()
            await ws.send_json(tram_data)
    except asyncio.CancelledError:
        pass
    except Exception as e:
        logging.error(msg=e, exc_info=e)
The following code was found in an aioredis GitHub issue (I've adapted it to my task).
class TramProducer:
    def __init__(self, channel: aioredis.Channel):
        self._future = None
        self._channel = channel

    def __aiter__(self):
        return self

    def __anext__(self):
        return asyncio.shield(self._get_message())

    async def _get_message(self):
        if self._future:
            return await self._future

        self._future = asyncio.get_event_loop().create_future()
        message = await self._channel.get_json()
        future, self._future = self._future, None
        future.set_result(message)
        return message
So, how does it work? TramProducer wraps the way we get messages.
As @Messa said,
message is received from one Redis subscription only once.
So only one client of TramProducer actually retrieves messages from Redis, while the other clients wait for a future whose result is set after a message is received from the channel.
If self._future is initialized, it means that somebody is already waiting for a message from Redis, so we just await the self._future result.
TramProducer usage (I've taken the example from my question):
async def _read_tram_subscription(
        ws: web.WebSocketResponse,
        tram_producer: TramProducer
):
    try:
        async for msg in tram_producer:
            await ws.send_json(msg)
    except asyncio.CancelledError:
        pass
    except Exception as e:
        logging.error(msg=e, exc_info=e)
TramProducer initialization:
async def init_tram_channel(
        channel_name: str,
        app: web.Application
):
    if channel_name not in app['tram_producers']:
        channel, = await app['redis'].subscribe(channel_name)
        app['tram_producers'][channel_name] = TramProducer(channel)
I think it may be helpful for somebody.
The full project is here: https://gitlab.com/tram-emulator/tram-server
I guess a message is received from one Redis subscription only once, and if there is more than one listener in your app, then only one of them will get it.
So you need to create something like a mini pub/sub inside the application to distribute the messages to all listeners (websocket connections in this case).
Some time ago I made an aiohttp websocket chat example - not with Redis, but at least the cross-websocket distribution is there: https://github.com/messa/aiohttp-nextjs-demo-chat/blob/master/chat_web/views/api.py
The key is to have an application-wide message_subscriptions, where every websocket connection registers itself, or perhaps its own asyncio.Queue (I've used an Event in my example, but that's suboptimal), and whenever a message comes from Redis, it is pushed to all relevant queues.
Of course, when a websocket connection ends (client unsubscribes, disconnects, fails...), its queue should be removed (and the Redis subscription possibly cancelled if it was the last connection listening to it).
Asyncio doesn't mean we should forget about queues :) Also it's good to get familiar with combining multiple tasks at once (reading from the websocket, reading from the message queue, perhaps reading from some notification queue...). Using queues can also help you handle client reconnects more cleanly (without losing any messages).
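A rough sketch of that pattern with aiohttp (names such as message_subscriptions and redis_reader are assumptions here, not taken from the linked chat example or the tram project): a single reader per Redis channel fans messages out to per-websocket queues, and each handler waits on its own queue and on the websocket at the same time.
import asyncio

import aiohttp
from aiohttp import web

message_subscriptions = {}  # channel_name -> set of per-websocket asyncio.Queue


async def redis_reader(channel_name, channel):
    # The single subscription for this channel; fan every message out.
    async for msg in channel.iter():
        for queue in message_subscriptions.get(channel_name, ()):
            queue.put_nowait(msg.decode())


async def tram_ws(request):
    ws = web.WebSocketResponse()
    await ws.prepare(request)
    channel_name = f"tram_{request.match_info['tram_id']}"
    queue = asyncio.Queue()
    message_subscriptions.setdefault(channel_name, set()).add(queue)
    ws_task = asyncio.ensure_future(ws.receive())
    queue_task = asyncio.ensure_future(queue.get())
    try:
        while True:
            # Wait for whichever source produces something first.
            done, _ = await asyncio.wait(
                {ws_task, queue_task}, return_when=asyncio.FIRST_COMPLETED)
            if queue_task in done:
                await ws.send_json(queue_task.result())
                queue_task = asyncio.ensure_future(queue.get())
            if ws_task in done:
                msg = ws_task.result()
                if msg.type != aiohttp.WSMsgType.TEXT:
                    break  # client closed the connection or an error occurred
                ws_task = asyncio.ensure_future(ws.receive())
    finally:
        message_subscriptions[channel_name].discard(queue)
        ws_task.cancel()
        queue_task.cancel()
    return ws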