Execute a coroutine after asyncio server is started - python

I am working on a controller application that monitors and controls subprocesses, which are independent Python executables.
Basically, what I want is for controller.py to run an asyncio.start_server. After the server is up and running, controller.py should execute other Python files as clients that connect to it. The controller server runs forever, creates new client instances, and sends shutdown messages to them when necessary.
Unfortunately this does not work. No error is received; it just hangs.
controller.py:
import asyncio
import subprocess

async def handleClient(reader, writer):
    # handling a connection
    addr = writer.get_extra_info("peername")
    print(f"connection from {addr}")
    data_ = await reader.readline()
    ...

async def startClient(client_py_file, host, port):
    # this executes another py file that will connect to this server
    await asyncio.sleep(0.1)
    subprocess.run(["python.exe", client_py_file, host, port])

async def main():
    server = await asyncio.start_server(handleClient, "127.0.0.1", 4000)
    await asyncio.ensure_future(startClient("client.py", "127.0.0.1", 4000))
    await server.wait_closed()

asyncio.run(main())
It seems to execute client.py, which starts and connects to the server without any error.
client.py:
import asyncio
import json

async def async_client(loop):
    reader, writer = await asyncio.open_connection(host, port, loop=loop)
    writer.writelines([json.dumps({"key": idstr, "msg": "this is my message"}).encode(), b"\n"])
    await writer.drain()
    while True:
        data = await reader.readline()
        ....
Now the client hangs and waits for a response from the server, but on the server the handleClient handler is never triggered. I have no idea what goes wrong. Could you please help me?
Thank you in advance!

The problem is that subprocess.run is a blocking function which waits for the client to finish. During this wait the event loop is blocked and unable to service incoming connections.
The simplest fix is to replace subprocess.run(...) with subprocess.Popen(...), which does the same thing but returns a handle to the subprocess without waiting for it to finish. If you need to communicate with the subprocess, you can also use asyncio.create_subprocess_exec(...), which also returns a handle, but one whose methods like wait() are coroutines.
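For illustration, a minimal non-blocking variant of startClient might look like the sketch below (an assumption, not code from the question; sys.executable replaces the hard-coded "python.exe"):

import asyncio
import sys

async def startClient(client_py_file, host, port):
    # create_subprocess_exec returns as soon as the child process is spawned,
    # so the event loop stays free to accept the client's incoming connection
    proc = await asyncio.create_subprocess_exec(
        sys.executable, client_py_file, host, str(port)
    )
    return proc  # later: await proc.wait() if the exit code is needed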

Related

asyncio loop blocking django http and websocket requests

I've been trying to build an application where I need to listen to Postgres LISTEN/NOTIFY constantly and send those messages over a websocket, while also serving the usual Django APIs. I am able to run both separately, but as soon as I initiate the DB listener it starts blocking the HTTP or websocket requests.
connection = get_db_conn()
connection.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
cur = connection.cursor()
cur.execute("LISTEN new_item_added;")

async def db_listen():
    print('Started Listening to DB notify ...')
    while True:
        await asyncio.sleep(1)
        data = await queue.get()
        await NotificationConsumer.send_data(data.payload)  # class method to send over websocket
        print("message received: ", data.payload)

def listen_callback():
    connection.poll()
    queue.put_nowait(connection.notifies.pop(0))

# calling this function from django app's __init__.py file
def initiate_db_listener():
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    loop.add_reader(connection, listen_callback)
    loop.run_until_complete(db_listen())

if __name__ == '__main__':
    initiate_db_listener()
I tried many variations of it; all of them block the main thread.
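One common pattern (a sketch, not from the original post) is to run the listener's event loop in a daemon thread, so Django's main thread is never blocked by run_until_complete. It reuses the connection, listen_callback and db_listen definitions above:

import asyncio
import threading

def initiate_db_listener():
    loop = asyncio.new_event_loop()
    loop.add_reader(connection, listen_callback)

    def _run():
        # the listener loop lives entirely in this background thread
        asyncio.set_event_loop(loop)
        loop.run_until_complete(db_listen())

    threading.Thread(target=_run, daemon=True).start()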

Python Websocket and Async using

I am trying to create a websocket server and want a client to exchange data with it. The data comes from another process, so I need a queue to accept data from that process, but reading the queue blocks my main websocket function. The end result is that the client cannot reconnect after its connection breaks; I think it is getting blocked in the queue code.
Here is my part of my code:
class RecorderEventHook(object):
    def __init__(self, high_event_mq):
        self.high_event_mq = high_event_mq
        self.msg = None
        self.loop = None

    # @wrap_keep_alive
    async def on_msg_event(self, websocket):
        try:
            # async for message in websocket:
            while True:
                msg = self.high_event_mq.get()
                await websocket.send(json.dumps(msg))
                # msg
        except Exception as error:
            print(error)

    async def event_controller(self):
        await websockets.serve(self.on_msg_event, 'localhost', 8888)

    def start(self):
        loop = asyncio.new_event_loop()
        loop.create_task(self.event_controller())
        loop.run_forever()
I tried saving the connected websocket object and using it in another thread (in the same process), but it failed with a warning like
"xxxx" function was never awaited
I want to be able to receive data from other processes without affecting the normal reconnection of the client.
Any help would be greatly appreciated.
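One possible fix (a sketch, not from the original post), assuming high_event_mq is a standard blocking queue: hand the blocking get() off to a thread pool with run_in_executor so the event loop keeps serving other connections and reconnects:

async def on_msg_event(self, websocket):
    loop = asyncio.get_event_loop()
    while True:
        # block a worker thread on the queue instead of the event loop
        msg = await loop.run_in_executor(None, self.high_event_mq.get)
        await websocket.send(json.dumps(msg))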

Python asyncio: Cancel streams server and all clients

I have a streams server that handles multiple independent clients. When I shut it down I want to notify all clients that the server has shut down.
I figured out how to close the server to new connections, but not how to cancel the specific handlers waiting for client data.
So far the only solution I found is to cancel all tasks in the loop, but this doesn't work for me as I have other tasks that must finish their jobs first.
Does asyncio provide some interface for this or do I have to keep track of all connections myself and cancel them once the server shuts down? I would prefer if the connection handler catches an exception when it calls await reader.readuntil() and not in the middle of execution, but this is not required.
Right now the client loses the connection without warning, so it cannot tell whether it was a network issue or whether the server shut down.
import asyncio
import signal

server = None
shutdown = False

async def important_task():
    while not shutdown:
        await asyncio.sleep(10)
        print("I am important")

async def handle_conn(reader, writer):
    print("Got connection")
    try:
        while True:
            text = await reader.readuntil(b'\n')
            # Do stuff
            writer.write(text)  # Echo example
            await writer.drain()
    except serverShutdownException:  # How do I cause something like this?
        writer.write(b"Goodbye")
        await writer.drain()
    finally:
        writer.close()
        await writer.wait_closed()

def handle_sig(num, frame):
    global shutdown
    print(f"Caught {num}")
    server.close()
    shutdown = True

async def serve():
    global server
    server = await asyncio.start_server(handle_conn, "127.0.0.1", 8080)
    try:
        await server.serve_forever()
    except asyncio.CancelledError:
        pass
    await server.wait_closed()
    # wait for all handlers to be done?

def main():
    signal.signal(signal.SIGINT, handle_sig)
    loop = asyncio.get_event_loop()
    t1 = loop.create_task(serve())
    t2 = loop.create_task(important_task())
    loop.run_until_complete(asyncio.gather(t1, t2))

main()
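One approach (a sketch built on the code above, not a definitive answer): keep a set of per-connection handler tasks and cancel them on shutdown, so each pending reader.readuntil() raises asyncio.CancelledError inside its own handler. It reuses the server and shutdown globals from the code above:

connections = set()

async def handle_conn(reader, writer):
    connections.add(asyncio.current_task())
    try:
        while True:
            text = await reader.readuntil(b'\n')
            writer.write(text)
            await writer.drain()
    except asyncio.CancelledError:
        writer.write(b"Goodbye\n")  # tell the client this is a deliberate shutdown
    finally:
        connections.discard(asyncio.current_task())
        writer.close()

def handle_sig(num, frame):
    global shutdown
    shutdown = True
    server.close()
    for task in connections:
        task.cancel()  # interrupts the await inside each handler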

Python sockets server and client in one script

I have a seemingly simple task that I can't quite wrap my brains around.
Here is what I need to do. Using socket module, start a server, use a client to start a connection, stop the server, return connection data - all in one script. I can do it when I run the two from two terminals but I need to put both server and client code in one script for automation. My problem is that socket.accept() is a blocking call and the script hangs before I can invoke the client. Tried playing with socket.setblocking(False) but it still blocks. I intuitively feel that I can accomplish this with asyncio module, but I have no experience with it and the examples I've seen don't seem to fit my task. Thanks much.
I need to put both server and client code in one script for automation. My problem is that socket.accept() is a blocking call and the script hangs before I can invoke the client. [...] I intuitively feel that I can accomplish this with asyncio module
Asyncio indeed makes it easy to start several tasks "in the background" (see asyncio.create_task) or "in parallel" (see asyncio.gather).
In fact, since the start_server API runs the server "in the background" to begin with (sort of how a server forks to daemonize itself, and you don't have to add & when starting it from a shell), you don't even need to do anything special to start the client and the server in parallel - just start the server, await the client coroutine, and stop the server.
As an example, starting with the echo client/server examples from the documentation, I quickly arrived at something like this:
import asyncio

async def connect():
    print('connecting...')
    reader, writer = await asyncio.open_connection('127.0.0.1', 8888)
    writer.write(b'hello world')
    data = await reader.read(100)
    assert data == b'hello world'
    writer.close()
    await writer.wait_closed()
    print('closed connection')
    return data

async def handle_client(reader, writer):
    print('incoming connection')
    while True:
        data = await reader.read(100)
        if data == b'':
            break
        writer.write(data)
        await writer.drain()
    print('incoming connection closed')

async def main():
    server = await asyncio.start_server(handle_client, '127.0.0.1', 8888)
    print('server now set up')
    await connect()
    server.close()
    await server.wait_closed()

asyncio.run(main())

listen to multiple sockets with websockets and asyncio

I am trying to create a script in Python that listens to multiple sockets using websockets and asyncio; the problem is that no matter what I do, it only listens to the first socket I call.
I think it's the infinite loop. What are my options to solve this? Using threads for each socket?
async def start_socket(self, event):
    payload = json.dumps(event)
    loop = asyncio.get_event_loop()
    self.tasks.append(loop.create_task(
        self.subscribe(event)))
    # this should not block the rest of the code
    await asyncio.gather(*tasks)

def test(self):
    # I want to be able to add coroutines at a different time
    self.start_socket(event1)
    # some code
    self.start_socket(event2)
This is what I did eventually; this way it does not block the main thread and all subscriptions work in parallel.
def subscribe(self, payload):
    ws = websocket.WebSocket(sslopt={"cert_reqs": ssl.CERT_NONE})
    ws.connect(url)
    ws.send(payload)
    while True:
        result = ws.recv()
        print("Received '%s'" % result)

def start_thread(self, loop):
    asyncio.set_event_loop(loop)
    loop.run_forever()

def start_socket(self, **kwargs):
    worker_loop = asyncio.new_event_loop()
    worker = Thread(target=self.start_thread, args=(worker_loop,))
    worker.start()
    worker_loop.call_soon_threadsafe(self.subscribe, payload)

def listen(self):
    self.start_socket(payload1)
    # code
    self.start_socket(payload2)
    # code
    self.start_socket(payload3)
Your code appears incomplete, but what you've shown has two issues. One is that run_until_complete accepts a coroutine object (or other kind of future), not a coroutine function. So it should be:
# note parentheses after your_async_function()
asyncio.get_event_loop().run_until_complete(your_async_function())
The problem is that no matter what I do it only listens to the first socket I call. I think it's the infinite loop; what are my options to solve this? Using threads for each socket?
The infinite loop is not the problem; asyncio is designed to support such "infinite loops". The problem is that you are trying to do everything in one coroutine, whereas you should be creating one coroutine per websocket. That is not an issue, as coroutines are very lightweight.
For example (untested):
async def subscribe_all(self, payload):
    loop = asyncio.get_event_loop()
    tasks = []
    # create a task for each URL
    for url in url_list:
        tasks.append(loop.create_task(self.subscribe_one(url, payload)))
    # run all tasks in parallel
    await asyncio.gather(*tasks)

async def subscribe_one(self, url, payload):
    async with websockets.connect(url) as websocket:
        await websocket.send(payload)
        while True:
            msg = await websocket.recv()
            print(msg)
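A hypothetical usage sketch (client and payload are stand-in names, not part of the answer), tying back to the earlier point about passing a coroutine object to run_until_complete:

# `client` is an instance of the class defining subscribe_all / subscribe_one
asyncio.get_event_loop().run_until_complete(client.subscribe_all(payload))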
One way to efficiently listen to multiple websocket connections from a websocket server is to keep a list of connected clients and essentially juggle multiple conversations in parallel.
E.g. a simple server that sends a random number to each connected client every few seconds:
import os
import asyncio
import websockets
import random

websocket_clients = set()

async def handle_socket_connection(websocket, path):
    """Handles the whole lifecycle of each client's websocket connection."""
    websocket_clients.add(websocket)
    print(f'New connection from: {websocket.remote_address} ({len(websocket_clients)} total)')
    try:
        # This loop will keep listening on the socket until its closed.
        async for raw_message in websocket:
            print(f'Got: [{raw_message}] from socket [{id(websocket)}]')
    except websockets.exceptions.ConnectionClosedError as cce:
        pass
    finally:
        print(f'Disconnected from socket [{id(websocket)}]...')
        websocket_clients.remove(websocket)

async def broadcast_random_number(loop):
    """Keeps sending a random # to each connected websocket client"""
    while True:
        for c in websocket_clients:
            num = str(random.randint(10, 99))
            print(f'Sending [{num}] to socket [{id(c)}]')
            await c.send(num)
        await asyncio.sleep(2)

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    try:
        socket_server = websockets.serve(handle_socket_connection, 'localhost', 6789)
        print(f'Started socket server: {socket_server} ...')
        loop.run_until_complete(socket_server)
        loop.run_until_complete(broadcast_random_number(loop))
        loop.run_forever()
    finally:
        loop.close()
        print(f"Successfully shutdown [{loop}].")
A simple client that connects to the server and listens for the numbers:
import asyncio
import random
import websockets

async def handle_message():
    uri = "ws://localhost:6789"
    async with websockets.connect(uri) as websocket:
        msg = 'Please send me a number...'
        print(f'Sending [{msg}] to [{websocket}]')
        await websocket.send(msg)
        while True:
            got_back = await websocket.recv()
            print(f"Got: {got_back}")

asyncio.get_event_loop().run_until_complete(handle_message())
Mixing threads and asyncio is more trouble than it's worth, and you still end up with code that blocks on the most wasteful steps, like network I/O (avoiding which is the essential benefit of using asyncio).
You need to run each coroutine asynchronously in an event loop, await any blocking calls, and define each method that performs awaitable interactions with async def.
See a working e.g.: https://github.com/adnantium/websocket_client_server
