Testing asynchronous sockets in Python

I am studying asynchronous sockets in Python these days for a bigger project. I used the asyncio module and referred to the official streams documentation. For test purposes I created a server and a client such that the server handles a single connected client, and once the client is connected the two can chat with each other.
server.py
import asyncio

async def handle(reader, writer):
    while True:
        data = await reader.read(100)
        message_received = data.decode()
        addr = writer.get_extra_info('peername')
        print(f'{addr}::::{message_received}')
        message_to_send = input('>>')
        writer.write(message_to_send.encode())
        await writer.drain()

async def main():
    server = await asyncio.start_server(handle, '127.0.0.1', 10001)
    addr = ', '.join(str(sock.getsockname()) for sock in server.sockets)
    print(f'Serving on {addr}')
    async with server:
        await server.serve_forever()

asyncio.run(main())
client.py
import asyncio

async def client():
    reader, writer = await asyncio.open_connection('127.0.0.1', 10001)
    while True:
        message = input('>>')
        writer.write(message.encode())
        data = await reader.read(100)
        print(f'Received: {data.decode()}')

asyncio.run(client())
This is working fine. But now I have a few questions.
How can I check whether it is working asynchronously?
Is it OK to use while loops like I did? (I ask because I feel like once I use a while loop, that loop becomes a synchronous part of the program.)
Is this the correct way to code a simple client and server, or are there better ways of doing it?
I would highly appreciate it if someone experienced with this could help me.
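On the second question: the while loops themselves are not the problem, because every await inside the loop hands control back to the event loop. What does make the handler feel synchronous is the blocking input() call, which freezes the whole loop while it waits for keyboard input. A minimal sketch of one way around that (untested, and it assumes Python 3.9+ for asyncio.to_thread):

import asyncio

async def handle(reader, writer):
    while True:
        data = await reader.read(100)      # suspends, letting other tasks run
        if not data:                       # client closed the connection
            break
        addr = writer.get_extra_info('peername')
        print(f'{addr}::::{data.decode()}')
        # input() would block the event loop; running it in a thread keeps
        # other coroutines (for example, additional clients) responsive.
        reply = await asyncio.to_thread(input, '>>')
        writer.write(reply.encode())
        await writer.drain()

One quick way to check that things really run asynchronously is to connect a second client while the first is idle: with the blocking input() the whole server stalls at the prompt, while with the threaded version the event loop stays free to accept and read from other connections.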

Related

Python: Get data from redis-pubsub, send to multiple users using websocket

My goal is to retrieve data from redis using pub-sub and then broadcast it to multiple users using websockets, however I can't get it to work using my knowledge of python and asyncio. So far, my code is the following:
import redis
import asyncio
import websockets

redis_url = 'redis://localhost:6379/0'
channel = 'app:notifications'

connection = redis.StrictRedis.from_url(redis_url, decode_responses=True)
pubsub = connection.pubsub(ignore_subscribe_messages=False)
pubsub.subscribe(channel)

async def listen():
    for item in pubsub.listen():
        message = item['data']
        if type(message) != int:
            print(message)
            yield message

async def server(websocket, path):
    while True:
        msg = await listen()
        websocket.send(msg)

start_server = websockets.serve(server, "localhost", 8000)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()
Maybe I could have a pubsub coroutine running which sends the messages to all connected users, but I do not know how I could make that work. Can anyone help me? Thanks in advance.
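One possible shape for that is a single broadcaster coroutine that reads from pub/sub and fans messages out to a set of connected websockets. A rough, untested sketch, assuming the synchronous redis client is kept and its blocking get_message() call is pushed onto a worker thread so the event loop stays free (the names connected, handler and broadcaster are just illustrative):

import asyncio
import redis
import websockets

connected = set()

async def handler(websocket, path):
    # register each client so the broadcaster can reach it
    connected.add(websocket)
    try:
        await websocket.wait_closed()
    finally:
        connected.discard(websocket)

async def broadcaster():
    conn = redis.StrictRedis.from_url('redis://localhost:6379/0', decode_responses=True)
    pubsub = conn.pubsub(ignore_subscribe_messages=True)
    pubsub.subscribe('app:notifications')
    loop = asyncio.get_running_loop()
    while True:
        # get_message(timeout=...) blocks briefly, so run it in a worker thread
        message = await loop.run_in_executor(
            None, lambda: pubsub.get_message(timeout=1.0))
        if message and message['type'] == 'message':
            for ws in set(connected):
                await ws.send(message['data'])

async def main():
    asyncio.create_task(broadcaster())
    async with websockets.serve(handler, 'localhost', 8000):
        await asyncio.Event().wait()  # run until interrupted

asyncio.run(main())

Note that websocket.send() is a coroutine in the websockets library and has to be awaited.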

Update Server every n seconds by looping function every n seconds? Python Sockets

I'm running this server, which receives data. However, I want it to update every second. This asyncio loop says it's running forever, but it only receives data once.
What loops can I execute to update message retrieval every n seconds, and where should I place them? I've tried threading, for/while loops, etc., but I may have been placing them in the wrong places.
What should I do?
import asyncio
import websockets
import socket

UDP_IP = socket.gethostname()
UDP_PORT = 5225

sock = socket.socket(socket.AF_INET,     # Internet
                     socket.SOCK_DGRAM)  # UDP
sock.bind((UDP_IP, UDP_PORT))

while True:
    data, addr = sock.recvfrom(1024)  # buffer size is 1024 bytes
    # print(str(data))
    x = 1

async def echo(websocket, path):
    async for message in websocket:
        await asyncio.sleep(1)
        await websocket.send(str(data))  # FontWeight Value
        print(bytes(data))

start_server = websockets.serve(echo, "localhost", 9090)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()
# loop.run_forever(start_server)
You can't use ordinary sockets in asyncio because their blocking recv stalls the event loop. You need to use something like this:
data = None

class ServerProtocol(asyncio.DatagramProtocol):
    def datagram_received(self, newdata, addr):
        global data
        data = newdata

async def serve_udp():
    loop = asyncio.get_running_loop()
    # UDP endpoints are set up with create_datagram_endpoint (create_server is TCP-only)
    transport, protocol = await loop.create_datagram_endpoint(
        ServerProtocol, local_addr=(UDP_IP, UDP_PORT))
    try:
        await asyncio.Event().wait()  # keep receiving datagrams
    finally:
        transport.close()
Then you need to integrate it with the websocket serving code. For example:
async def ws_echo(websocket, path):
    async for message in websocket:
        await asyncio.sleep(1)
        await websocket.send(str(data))

async def main():
    asyncio.create_task(serve_udp())
    await websockets.serve(ws_echo, "localhost", 9090)
    await asyncio.Event().wait()  # prevent main() from returning

asyncio.run(main())

Why use asyncio Server as context manager *and* call serve_forever()?

The Python 3.7 documentation for asyncio streams includes a TCP echo server example:
import asyncio

async def handle_echo(reader, writer):
    data = await reader.read(100)
    message = data.decode()
    addr = writer.get_extra_info('peername')

    print(f"Received {message!r} from {addr!r}")

    print(f"Send: {message!r}")
    writer.write(data)
    await writer.drain()

    print("Close the connection")
    writer.close()

async def main():
    server = await asyncio.start_server(
        handle_echo, '127.0.0.1', 8888)

    addr = server.sockets[0].getsockname()
    print(f'Serving on {addr}')

    async with server:
        await server.serve_forever()

asyncio.run(main())
This is the fragment that I'm particularly interested in:
async with server:
    await server.serve_forever()
So we are doing two things:
We are using asyncio.Server as a context manager, for which (from that page) "it’s guaranteed that the Server object is closed and not accepting new connections when the async with statement is completed".
We are calling Server.serve_forever(). This usually starts listening (if not already started) and ensures the server is closed when the coroutine is cancelled. By the time we reach these lines we have already called start_server (with the default start_serving=True) so the only effect is ensuring the server is closed.
It seems like these are doing essentially the same thing. Why are both lines included in the example? Are most reasonable practical applications likely to include both?
The answer is that this is indeed an unnecessary redundancy, as the documentation itself shows: serve_forever() can be called even if the server is already accepting connections, so the async with block adds nothing in this example.
In the Server object methods description, the serve_forever coroutine example is as follows:
async def client_connected(reader, writer):
    # Communicate with the client with
    # reader/writer streams. For example:
    await reader.readline()

async def main(host, port):
    srv = await asyncio.start_server(
        client_connected, host, port)
    await srv.serve_forever()

asyncio.run(main('127.0.0.1', 0))
More information can be found in the official documentation.

Python sockets server and client in one script

I have a seemingly simple task that I can't quite wrap my brain around.
Here is what I need to do. Using socket module, start a server, use a client to start a connection, stop the server, return connection data - all in one script. I can do it when I run the two from two terminals but I need to put both server and client code in one script for automation. My problem is that socket.accept() is a blocking call and the script hangs before I can invoke the client. Tried playing with socket.setblocking(False) but it still blocks. I intuitively feel that I can accomplish this with asyncio module, but I have no experience with it and the examples I've seen don't seem to fit my task. Thanks much.
I need to put both server and client code in one script for automation. My problem is that socket.accept() is a blocking call and the script hangs before I can invoke the client. [...] I intuitively feel that I can accomplish this with asyncio module
Asyncio indeed makes it easy to start several tasks "in the background" (see asyncio.create_task) or "in parallel" (see asyncio.gather).
In fact, since the start_server API runs the server "in the background" to begin with (sort of like how a server forks to daemonize itself, so you don't have to add & when starting it from a shell), you don't even need to do anything special to start the client and the server in parallel - just start the server, await the client coroutine, and stop the server.
As an example, starting with the echo client/server examples from the documentation, I quickly arrived at something like this:
import asyncio

async def connect():
    print('connecting...')
    reader, writer = await asyncio.open_connection('127.0.0.1', 8888)
    writer.write(b'hello world')
    data = await reader.read(100)
    assert data == b'hello world'
    writer.close()
    await writer.wait_closed()
    print('closed connection')
    return data

async def handle_client(reader, writer):
    print('incoming connection')
    while True:
        data = await reader.read(100)
        if data == b'':
            break
        writer.write(data)
        await writer.drain()
    print('incoming connection closed')

async def main():
    server = await asyncio.start_server(handle_client, '127.0.0.1', 8888)
    print('server now set up')
    await connect()
    server.close()
    await server.wait_closed()

asyncio.run(main())
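The same script can also drive several clients concurrently, which is where the asyncio.gather mentioned above comes in. A small untested variant of main() that runs three clients against the same server in parallel:

async def main():
    server = await asyncio.start_server(handle_client, '127.0.0.1', 8888)
    print('server now set up')
    # the three connect() coroutines run concurrently on the one event loop
    await asyncio.gather(connect(), connect(), connect())
    server.close()
    await server.wait_closed()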

Listen to multiple sockets with websockets and asyncio

I am trying to create a script in Python that listens to multiple sockets using websockets and asyncio; the problem is that no matter what I do it only listens to the first socket I call.
I think it's the infinite loop. What are my options to solve this? Using threads for each socket?
async def start_socket(self, event):
    payload = json.dumps(event)
    loop = asyncio.get_event_loop()
    self.tasks.append(loop.create_task(
        self.subscribe(event)))
    # this should not block the rest of the code
    await asyncio.gather(*tasks)

def test(self):
    # I want to be able to add coroutines at a different time
    self.start_socket(event1)
    # some code
    self.start_socket(event2)
This is what I did eventually; that way it's not blocking the main thread and all subscriptions work in parallel.
def subscribe(self, payload):
    ws = websocket.WebSocket(sslopt={"cert_reqs": ssl.CERT_NONE})
    ws.connect(url)
    ws.send(payload)
    while True:
        result = ws.recv()
        print("Received '%s'" % result)

def start_thread(self, loop):
    asyncio.set_event_loop(loop)
    loop.run_forever()

def start_socket(self, **kwargs):
    worker_loop = asyncio.new_event_loop()
    worker = Thread(target=self.start_thread, args=(worker_loop,))
    worker.start()
    worker_loop.call_soon_threadsafe(self.subscribe, payload)

def listen(self):
    self.start_socket(payload1)
    # code
    self.start_socket(payload2)
    # code
    self.start_socket(payload3)
Your code appears incomplete, but what you've shown has two issues. One is that run_until_complete accepts a coroutine object (or other kind of future), not a coroutine function. So it should be:
# note parentheses after your_async_function()
asyncio.get_event_loop().run_until_complete(your_async_function())
the problem is that no matter what I do it only listens to the first socket I call. I think it's the infinite loop. What are my options to solve this? Using threads for each socket?
The infinite loop is not the problem, asyncio is designed to support such "infinite loops". The problem is that you are trying to do everything in one coroutine, whereas you should be creating one coroutine per websocket. This is not a problem, as coroutines are very lightweight.
For example (untested):
async def subscribe_all(self, payload):
    loop = asyncio.get_event_loop()
    # create a task for each URL
    tasks = []
    for url in url_list:
        tasks.append(loop.create_task(self.subscribe_one(url, payload)))
    # run all tasks in parallel
    await asyncio.gather(*tasks)

async def subscribe_one(self, url, payload):
    async with websockets.connect(url) as websocket:
        await websocket.send(payload)
        while True:
            msg = await websocket.recv()
            print(msg)
One way to efficiently listen to multiple websocket connections from a websocket server is to keep a list of connected clients and essentially juggle multiple conversations in parallel.
E.g., a simple server that sends a random number to each connected client every few seconds:
import os
import asyncio
import websockets
import random

websocket_clients = set()

async def handle_socket_connection(websocket, path):
    """Handles the whole lifecycle of each client's websocket connection."""
    websocket_clients.add(websocket)
    print(f'New connection from: {websocket.remote_address} ({len(websocket_clients)} total)')
    try:
        # This loop will keep listening on the socket until its closed.
        async for raw_message in websocket:
            print(f'Got: [{raw_message}] from socket [{id(websocket)}]')
    except websockets.exceptions.ConnectionClosedError as cce:
        pass
    finally:
        print(f'Disconnected from socket [{id(websocket)}]...')
        websocket_clients.remove(websocket)

async def broadcast_random_number(loop):
    """Keeps sending a random # to each connected websocket client"""
    while True:
        for c in websocket_clients:
            num = str(random.randint(10, 99))
            print(f'Sending [{num}] to socket [{id(c)}]')
            await c.send(num)
        await asyncio.sleep(2)

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    try:
        socket_server = websockets.serve(handle_socket_connection, 'localhost', 6789)
        print(f'Started socket server: {socket_server} ...')
        loop.run_until_complete(socket_server)
        loop.run_until_complete(broadcast_random_number(loop))
        loop.run_forever()
    finally:
        loop.close()
        print(f"Successfully shutdown [{loop}].")
A simple client that connects to the server and listens for the numbers:
import asyncio
import random
import websockets

async def handle_message():
    uri = "ws://localhost:6789"
    async with websockets.connect(uri) as websocket:
        msg = 'Please send me a number...'
        print(f'Sending [{msg}] to [{websocket}]')
        await websocket.send(msg)
        while True:
            got_back = await websocket.recv()
            print(f"Got: {got_back}")

asyncio.get_event_loop().run_until_complete(handle_message())
Mixing up threads and asyncio is more trouble than it's worth, and you still end up with code that blocks on the most wasteful steps, like network IO (avoiding that is the essential benefit of using asyncio).
You need to run each coroutine in the event loop, await any blocking calls, and define every method that performs awaitable interactions with async def.
See a working example: https://github.com/adnantium/websocket_client_server
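For reference, here is what a fully asynchronous version of the threaded subscribe code above could look like: an untested sketch, assuming the websockets package and a single endpoint URL, where each call to start_socket simply schedules another subscriber task on the running loop, so subscriptions can still be added at different times:

import asyncio
import websockets

class Subscriber:
    def __init__(self, url):
        self.url = url        # placeholder endpoint, adjust as needed
        self.tasks = []

    async def subscribe(self, payload):
        # one coroutine per websocket; recv() suspends instead of blocking
        async with websockets.connect(self.url) as ws:
            await ws.send(payload)
            while True:
                result = await ws.recv()
                print("Received '%s'" % result)

    def start_socket(self, payload):
        # schedule a new subscriber on the already-running event loop
        self.tasks.append(asyncio.create_task(self.subscribe(payload)))

    async def listen(self, payloads):
        for payload in payloads:
            self.start_socket(payload)   # can be called at different times
        await asyncio.gather(*self.tasks)

# usage (payloads are whatever the subscriptions expect):
# asyncio.run(Subscriber('wss://example.invalid/ws').listen([payload1, payload2]))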
