TCP client reconnect to server if the server is down (using Asyncio) - python

I am writing a program using asyncio in Python where a client connects to a server, they exchange some messages, and the server closes the connection.
What I also need to implement is a retry mechanism: if the server is down, the client should keep trying to reconnect every 5 seconds.
Being new to Python and to asyncio in general, I need help understanding how to implement that.
Below is a snippet of my code where I start the connection with the server and where I handle a socket closure on the server side.
async def main():
    # Get a reference to the event loop as we plan to use
    # low-level APIs.
    format = "%(asctime)s: %(message)s"
    logging.basicConfig(format=format, level=logging.INFO,
                        datefmt="%H:%M:%S")

    executor1 = concurrent.futures.ThreadPoolExecutor(max_workers=2)
    future1 = executor1.submit(init_thread)

    loop = asyncio.get_running_loop()
    on_con_lost = loop.create_future()
    message = future1.result()
    message = struct.pack(">I", len(message)) + bytes(message, "utf-8")

    transport, protocol = await loop.create_connection(
        lambda: EchoClientProtocol(message, on_con_lost),
        '127.0.0.1', 9000)

    # Wait until the protocol signals that the connection
    # is lost and close the transport.
    try:
        await on_con_lost
    finally:
        transport.close()

asyncio.run(main())
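For the retry requirement itself, one option (not part of the snippet above) is to wrap the create_connection call in a loop that catches the connection error and sleeps for 5 seconds before trying again. A minimal sketch, where connect_with_retry is a hypothetical helper and EchoClientProtocol and message are the ones from the snippet:

async def connect_with_retry(loop, message, host='127.0.0.1', port=9000, delay=5):
    # Keep trying until the server accepts the connection.
    while True:
        on_con_lost = loop.create_future()
        try:
            transport, protocol = await loop.create_connection(
                lambda: EchoClientProtocol(message, on_con_lost),
                host, port)
            return transport, on_con_lost
        except OSError:
            # Server is down (connection refused, host unreachable, ...);
            # wait 5 seconds and try again.
            await asyncio.sleep(delay)

main() would then call transport, on_con_lost = await connect_with_retry(loop, message) instead of calling create_connection directly; the same pattern can be repeated after on_con_lost resolves if the client should also reconnect when the server closes an established connection.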

Related

Python multithreading for simultaneous input and socket connection

A friend of mine and I have started to dig deeper into network programming. Concurrency and parallelism are a big part of this.
We have created a server and a client and connected them, and this works fine. Now we want to create a thread in the server that checks for keyboard input while listening for connections on the socket. Maybe we have got something totally wrong, but we tried it with the code below and a ThreadPoolExecutor, and the program gets stuck at the first await call:
i = await ainput.asyncInput()
We thought that after the await, a thread would wait for input while the main thread went on executing, but that seems to be wrong.
Here is the server module:
import socket
import asyncio
import asyncron_Input as ainput

def closeServer():
    exit()

server_address = ('localhost', 6969)

async def main():
    # create TCP socket
    serverSock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

    # print the address and the port of the server
    print("starting up on: ")
    print(server_address[0])
    print("port: ")
    print(server_address[1])

    # bind the socket to the given address
    serverSock.bind(server_address)

    # listen for connections
    serverSock.listen(1)

    print("End server with 1")
    while True:
        # close server with async input
        i = await ainput.asyncInput()
        if i == "1":
            closeServer()

        # wait for a connection
        print("waiting for connection")
        conn, client_address = serverSock.accept()
        try:
            print("connected to", client_address)
            while True:
                data = conn.recv(16)
                if data:
                    print("received:", data)
                    data = "successful"
                else:
                    print("no data")
                    break
        finally:
            # close connection
            conn.close()

asyncio.run(main())
Here is the async input:
import asyncio
from concurrent.futures import ThreadPoolExecutor

async def asyncInput():
    with ThreadPoolExecutor(1, 'Async Input') as executor:
        return await asyncio.get_event_loop().run_in_executor(executor, input)
Thanks for your help in advance
There are two problems with your code:
1. You wait for input before accepting any socket connections. You can't use await if you want the code following it to happen concurrently; you need to use a Task.
2. You're using blocking sockets. sock.accept and sock.recv are blocking by default, so they'll halt execution of your event loop. You need to use them in an await expression, which means making your server socket non-blocking and then using the asyncio-specific socket methods.
To fix this, you'll need to wrap listening for input in a task, make your server socket non-blocking, get the running event loop, and then use the sock_accept and sock_recv methods of the event loop. Putting this all together, your code will look something like this:
import asyncio
import socket
from concurrent.futures import ThreadPoolExecutor

async def asyncInput():
    with ThreadPoolExecutor(1, 'Async Input') as executor:
        return await asyncio.get_event_loop().run_in_executor(executor, input)

def closeServer():
    exit()

server_address = ('localhost', 8000)

async def loop_for_input():
    while True:
        # close server with async input
        i = await asyncInput()
        if i == "1":
            closeServer()

async def main():
    # create TCP socket
    serverSock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    serverSock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)

    # print the address and the port of the server
    print("starting up on: ")
    print(server_address[0])
    print("port: ")
    print(server_address[1])

    # bind the socket to the given address
    serverSock.bind(server_address)
    serverSock.setblocking(False)  # make your socket non-blocking

    # listen for connections
    serverSock.listen(1)

    print("End server with 1")
    loop = asyncio.get_running_loop()  # get the running event loop
    input_task = asyncio.create_task(loop_for_input())  # create a task to run your input loop
    while True:
        # wait for a connection
        print("waiting for connection")
        conn, client_address = await loop.sock_accept(serverSock)  # use the sock_accept coroutine to asynchronously accept connections
        try:
            print("connected to", client_address)
            while True:  # you may also want to create a task for this loop.
                data = await loop.sock_recv(conn, 16)  # use the sock_recv coroutine to asynchronously receive data
                if data:
                    print("received:", data)
                    data = "successful"
                else:
                    print("no data")
                    break
        finally:
            # close connection
            conn.close()

asyncio.run(main())
There's potentially a third problem: your code can only handle one client at a time, since it enters an infinite loop for the first connection that comes in. That means any additional clients that connect will be blocked. If you want to solve that, create a new Task to listen for data from a client whenever one connects, similar to what the code above does with asyncInput(); see the sketch below.
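A minimal sketch of that per-client pattern, assuming the same loop and serverSock as in the code above and a hypothetical handle_client coroutine:

async def handle_client(loop, conn):
    # Receive data from one client without blocking the accept loop.
    try:
        while True:
            data = await loop.sock_recv(conn, 16)
            if not data:
                print("no data")
                break
            print("received:", data)
    finally:
        conn.close()

# inside main(), instead of the nested receive loop:
#     conn, client_address = await loop.sock_accept(serverSock)
#     asyncio.create_task(handle_client(loop, conn))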

Sending data to multiple websocket connections in Python

I have a server that gathers data from a bunch of GPS trackers, and I want to ship this data out in real time to X connected clients via WebSockets. The trackers connect over TCP (each in their own thread) and send data regularly to the server. The data is merged in a thread called data_merger and put in that thread's queue(). This mechanism works nicely and as intended; however, I'm running into issues when I want to send this data to the websocket connections.
I tried basing my solution on the websocket synchronization example, as this seemed to apply to my use case. I have a thread called outbound_worker that handles the websocket code. From thread.run():
def run(self):
    self.data_merger.name = 'data_merger'
    self.data_merger.start()

    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    start_server = websockets.serve(self.handle_clients, 'localhost', self.port)
    print("WebSocket server started for port %s at %s" % (self.port, datetime.now()))
    loop.run_until_complete(start_server)
    asyncio.get_event_loop().run_forever()
Then the handler method for the websocket server:
async def handle_clients(self, websocket, path):
    while True:
        try:
            # Register the websocket to the connected set
            await self.register(websocket)
            data = self.data_merger.queue.get()
            await send_to_clients(data)
            await asyncio.sleep(0.1)
        except websockets.ConnectionClosed:
            print("Connection closed")
            await self.unregister(websocket)
            break

async def send_to_clients(self, data):
    data = json.dumps(data)
    if self.connected:
        await asyncio.wait([ws.send(data) for ws in self.connected])
The register() and unregister() methods are identical to the example I linked above. The client I'm using is a basic loop that prints the data received:
async def hello():
    uri = "ws://localhost:64000"
    while True:
        async with websockets.connect(uri) as websocket:
            print("Awaiting data...")
            data = await websocket.recv()
            #print(f"{data}")
            print(f"{json.loads(data)}")

asyncio.get_event_loop().run_until_complete(hello())
As I am new to asynchronous calls in Python and to websockets in general, I'm not sure if my approach is correct here. Since I am trying to push data right after registering a new connection, the code seems to halt at the await send_to_clients(data) line. Should I rather handle this in the data_merger thread and pass in the connected set?
Another issue is that if I simply use the client handler to register() and unregister() the new connections, it seems to just loop over the register() part and I'm unable to connect a second client.
I guess my questions can be condensed into the following:
How do I accept and manage multiple open connections over websockets, similar to a multithreaded socket server?
Is there a way to trigger a function call (for instance register()) only on new websocket connections, similar to socket.listen() and socket.accept()?
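For what it's worth, websockets.serve already invokes the handler coroutine once per new connection, so per-connection registration can simply happen at the top of the handler; the part that stalls the event loop is the plain queue.get() call inside a coroutine. A rough sketch of one way to separate the two concerns (handler, broadcast and connected are illustrative names, not from the code above):

connected = set()

async def handler(websocket, path):
    # websockets.serve calls this once for every new client connection.
    connected.add(websocket)
    try:
        await websocket.wait_closed()
    finally:
        connected.remove(websocket)

async def broadcast(queue):
    # Pull merged data off the thread-safe queue without blocking the event
    # loop, then fan it out to every currently connected client.
    loop = asyncio.get_event_loop()
    while True:
        data = await loop.run_in_executor(None, queue.get)
        message = json.dumps(data)
        for ws in set(connected):
            try:
                await ws.send(message)
            except websockets.ConnectionClosed:
                pass

broadcast(queue) would be scheduled as a task on the same loop that runs the server, rather than calling queue.get() inside the connection handler.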

Python asyncio - starting coroutines in an infinite loop

I am making a simple server/client chat program in Python. This program should allow for multiple users to connect at once, and then execute their requests concurrently. For this, I am using the asyncio module and sockets.
async def accept_new_connections(socket):
    socket.listen(1)
    while True:
        connection, client_address = sock.accept()
        print("accepted conn")
        asyncio.create_task(accept_commands(socket, connection))

async def accept_commands(socket, connection):
    print("accept cmd started")
    while True:
        # get and execute commands

def main():
    asyncio.run(accept_new_connections(socket))

main()
What I would hope to do is run accept_commands for each of the connections, which would then execute commands concurrently. However, the current code only starts accept_commands for the first connection and blocks the while loop (the one in accept_new_connections).
Any idea what I need to change to have accept_commands started for each of the connections instead?
It is tough to tell because your example doesn't include the implementation of accept_commands, but based on your issue it is likely you need to use the async socket methods on the event loop itself so that your coroutine can yield execution and let something else happen.
The example below shows how to do this. It starts a socket on port 8080 and sends any data it receives back to the client. You can see it work concurrently by connecting two clients with netcat or telnet and sending data.
import asyncio
import socket

socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
socket.bind(('localhost', 8080))
socket.listen(1)
socket.setblocking(False)

loop = asyncio.new_event_loop()

async def main():
    while True:
        connection, client_address = await loop.sock_accept(socket)
        print('connected')
        loop.create_task(accept_commands(connection))

async def accept_commands(connection):
    while True:
        request = await loop.sock_recv(connection, 16)
        await loop.sock_sendall(connection, request)

loop.run_until_complete(main())

Python3 asyncio: using infinite loop for multiple connections and proper connection closing

I have a server where I need to keep the connection with each client open as long as possible, and I need to allow multiple clients to connect to this server. Code:
class LoginServer(BaseServer):
    def __init__(self, host, port):
        super().__init__(host, port)

    async def handle_connection(self, reader: StreamReader, writer: StreamWriter):
        peername = writer.get_extra_info('peername')
        Logger.info('[Login Server]: Accepted connection from {}'.format(peername))
        auth = AuthManager(reader, writer)
        while True:
            try:
                await auth.process()
            except TimeoutError:
                continue
            finally:
                await asyncio.sleep(1)
        Logger.warning('[Login Server]: closing...')
        writer.close()

    @staticmethod
    def create():
        Logger.info('[Login Server]: init')
        return LoginServer(Connection.LOGIN_SERVER_HOST.value, Connection.LOGIN_SERVER_PORT.value)
The problem: currently only one client can connect to this server. The socket does not seem to close properly, and because of this even the previous client cannot reconnect. I think this is because of the infinite loop. How do I fix this problem?
The while loop is correct.
If you wanted a server that waits on data from a client, you would have the following loop in your handle_connection:
while 1:
    data = await reader.read(100)
    # Do something with the data
See the example echo server here for more details on reading / writing.
https://asyncio.readthedocs.io/en/latest/tcp_echo.html
Your problem is likely that this function doesn't return and keeps looping without awaiting anything. That would mean the asyncio event loop never regains control, so new connections could not be made:
await auth.process()
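For reference, a minimal sketch of the pattern the linked echo-server example uses: handle_connection awaits each read (yielding to the event loop) and returns once the client is done, so other connections can be served concurrently (the host and port here are placeholders):

import asyncio

async def handle_connection(reader, writer):
    # Each client connection gets its own invocation of this coroutine.
    while True:
        data = await reader.read(100)   # yields to the event loop while waiting
        if not data:                    # client closed the connection
            break
        # do something with the data ...
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle_connection, '127.0.0.1', 8888)
    async with server:
        await server.serve_forever()

asyncio.run(main())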

How to create TCP proxy server with asyncio?

I found these examples of a TCP client and server with asyncio: tcp server example. But how do I connect them to get a TCP proxy server that receives data and sends it on to another address?
You can combine both the TCP client and server examples from the user documentation.
You then need to connect the streams together using this kind of helper:
async def pipe(reader, writer):
    try:
        while not reader.at_eof():
            writer.write(await reader.read(2048))
    finally:
        writer.close()
Here's a possible client handler:
async def handle_client(local_reader, local_writer):
    try:
        remote_reader, remote_writer = await asyncio.open_connection(
            '127.0.0.1', 8889)
        pipe1 = pipe(local_reader, remote_writer)
        pipe2 = pipe(remote_reader, local_writer)
        await asyncio.gather(pipe1, pipe2)
    finally:
        local_writer.close()
And the server code:
# Create the server
loop = asyncio.get_event_loop()
coro = asyncio.start_server(handle_client, '127.0.0.1', 8888)
server = loop.run_until_complete(coro)

# Serve requests until Ctrl+C is pressed
print('Serving on {}'.format(server.sockets[0].getsockname()))
try:
    loop.run_forever()
except KeyboardInterrupt:
    pass

# Close the server
server.close()
loop.run_until_complete(server.wait_closed())
loop.close()
