My goal is to retrieve data from Redis using pub/sub and then broadcast it to multiple users over websockets, but I can't get it to work with my knowledge of Python and asyncio. So far, my code is the following:
import redis
import asyncio
import websockets

redis_url = 'redis://localhost:6379/0'
channel = 'app:notifications'

connection = redis.StrictRedis.from_url(redis_url, decode_responses=True)
pubsub = connection.pubsub(ignore_subscribe_messages=False)
pubsub.subscribe(channel)

async def listen():
    for item in pubsub.listen():
        message = item['data']
        if type(message) != int:
            print(message)
            yield message

async def server(websocket, path):
    while True:
        msg = await listen()
        websocket.send(msg)

start_server = websockets.serve(server, "localhost", 8000)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()
Maybe I could have a pub/sub coroutine running that sends the messages to all connected users, but I do not know how to make that work. Can anyone help me? Thanks in advance.
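One possible shape for this, as an untested sketch rather than a verified solution: keep a set of connected clients and run a single Redis listener as a background coroutine that broadcasts each message to all of them. It assumes redis-py's asyncio support (redis.asyncio) and a recent websockets release that provides websockets.broadcast(); the names CLIENTS, handler and reader are illustrative.

import asyncio
import redis.asyncio as aioredis
import websockets

CLIENTS = set()  # all currently connected websockets

async def handler(websocket):
    # Register the client and keep the connection open until it closes
    CLIENTS.add(websocket)
    try:
        await websocket.wait_closed()
    finally:
        CLIENTS.remove(websocket)

async def reader():
    connection = aioredis.Redis.from_url('redis://localhost:6379/0', decode_responses=True)
    pubsub = connection.pubsub(ignore_subscribe_messages=True)
    await pubsub.subscribe('app:notifications')
    async for item in pubsub.listen():
        # Fan the payload out to every connected client
        websockets.broadcast(CLIENTS, item['data'])

async def main():
    async with websockets.serve(handler, "localhost", 8000):
        await reader()

asyncio.run(main())

Depending on the websockets version installed, the handler may still need the extra path argument used in the question.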
I am studying asynchronous sockets in Python these days for a bigger project. I used the asyncio module and referred to the official streams documentation. For test purposes I created a server and a client: the server handles a single connected client, and once the client is connected the two can chat with each other.
server.py
import asyncio

async def handle(reader, writer):
    while True:
        data = await reader.read(100)
        message_recieved = data.decode()
        addr = writer.get_extra_info('peername')
        print(f'{addr}::::{message_recieved}')
        message_toSend = input('>>')
        writer.write(message_toSend.encode())
        await writer.drain()

async def main():
    server = await asyncio.start_server(handle, '127.0.0.1', 10001)
    addr = ', '.join(str(sock.getsockname()) for sock in server.sockets)
    print(f'Serving on {addr}')
    async with server:
        await server.serve_forever()

asyncio.run(main())
client.py
import asyncio

async def client():
    reader, writer = await asyncio.open_connection('127.0.0.1', 10001)
    while True:
        message = input('>>')
        writer.write(message.encode())
        data = await reader.read(100)
        print(f'Recieved: {data.decode()}')

asyncio.run(client())
This is working fine, but now I have a few questions.
How can I check whether it is actually working asynchronously?
Is it OK to use while loops like I did? (I ask because I feel like the while loop makes that part of the code synchronous.)
Is this the correct way to code a simple client and server, or are there better ways of doing it?
I would highly appreciate it if someone experienced with this could help me.
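One detail relevant to the second question above: the while loops themselves are fine, but input() is a blocking call, so while handle() or client() sits in input() the whole event loop is stalled and nothing else runs. A minimal, untested sketch of one way around this, pushing input() onto a worker thread (the ainput helper name is illustrative):

import asyncio

async def ainput(prompt: str = '') -> str:
    # Run the blocking input() call in a thread so the event loop stays free
    return await asyncio.to_thread(input, prompt)

async def client():
    reader, writer = await asyncio.open_connection('127.0.0.1', 10001)
    while True:
        message = await ainput('>>')
        writer.write(message.encode())
        await writer.drain()
        data = await reader.read(100)
        print(f'Received: {data.decode()}')

asyncio.run(client())

asyncio.to_thread() needs Python 3.9+; on older versions, loop.run_in_executor(None, input, prompt) does the same job.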
I am trying to experiment with the websockets module, and after reading the main page:
https://websockets.readthedocs.io/en/stable/intro.html
I did the following:
SERVER
# SERVER
import asyncio
import websockets
import nest_asyncio

USERS = {}

async def set_online(websocket, user_name):
    USERS[user_name] = websocket
    await notify()

async def set_offline(websocket, user_name):
    USERS.pop(user_name, None)
    await notify()

async def notify():
    if USERS:
        message = "Online users: {}\n".format(len(USERS))
        print(message)
        #await asyncio.wait([user.send(message) for user in USERS])
    else:
        message = "Online users: 0\n"
        print(message)

async def server(websocket, path):
    user_name = await websocket.recv()
    await set_online(websocket, user_name)
    try:
        async for message in websocket:
            for user_name, user_ws in USERS.items():
                if websocket == user_ws:
                    print(f"{user_name}: {message}")
    finally:
        await set_offline(websocket, user_name)

start_server = websockets.serve(server, "localhost", 3000, ping_interval=None)

nest_asyncio.apply()
loop = asyncio.get_event_loop()
loop.run_until_complete(start_server)
loop.run_forever()
and also:
CLIENT
# CLIENT
import asyncio
import websockets
import nest_asyncio

async def client(localhost, port):
    uri = "ws://{0}:{1}".format(localhost, str(port))
    async with websockets.connect(uri) as websocket:
        user_name = input("set your name: ")
        await websocket.send(f"{user_name}")
        while True:
            message = input("> ")
            if message == "/quit":
                break
            else:
                await websocket.send(message)

host = "localhost"
port = 3000

nest_asyncio.apply()
loop = asyncio.get_event_loop()
loop.run_until_complete(client(host, port))
All of this works as expected, but I would also like each user to receive the messages sent by the other users.
I found there is a conflict when I try to call websocket.send(message) inside the async for message in websocket: loop on the SERVER side.
The page I linked above has, I think, a solution, but I am struggling to figure out how to use it properly in my script.
I believe I need to create two tasks (send and recv) that run in parallel.
Like:
async def handler(websocket, path):
    consumer_task = asyncio.ensure_future(consumer_handler(websocket, path))
    producer_task = asyncio.ensure_future(producer_handler(websocket, path))
    done, pending = await asyncio.wait(
        [consumer_task, producer_task],
        return_when=asyncio.FIRST_COMPLETED,
    )
    for task in pending:
        task.cancel()
This snippet is shown on the page I linked above; the only thing that needs to be changed is asyncio.ensure_future to asyncio.create_task. I implemented the handler, producer, consumer, producer_handler and consumer_handler functions to make it work, but had no luck.
Could someone provide an example of how this should be set up correctly?
I believe asyncio.create_task should be used on both sides (SERVER and CLIENT) so that each of them can receive and send at the same time.
This got pretty long, but I hope someone can help me with it, and maybe this part of my script will be handy for someone else as well!
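For reference, here is one way the two-task handler could be wired up, as an untested sketch rather than a drop-in fix: each connection gets its own outgoing asyncio.Queue, the consumer fans incoming messages out to every other user's queue, and the producer drains its own queue. The USERS mapping and the queue-per-connection layout are illustrative choices, and the handler signature without path assumes a recent websockets release.

import asyncio
import websockets

USERS = {}  # websocket -> asyncio.Queue of messages waiting to be sent

async def consumer_handler(websocket):
    # Receive from this client and queue the message for everyone else
    async for message in websocket:
        for other_ws, queue in USERS.items():
            if other_ws is not websocket:
                await queue.put(message)

async def producer_handler(websocket):
    # Send whatever lands in this client's queue
    queue = USERS[websocket]
    while True:
        message = await queue.get()
        await websocket.send(message)

async def handler(websocket):
    USERS[websocket] = asyncio.Queue()
    try:
        consumer_task = asyncio.create_task(consumer_handler(websocket))
        producer_task = asyncio.create_task(producer_handler(websocket))
        done, pending = await asyncio.wait(
            [consumer_task, producer_task],
            return_when=asyncio.FIRST_COMPLETED,
        )
        for task in pending:
            task.cancel()
    finally:
        del USERS[websocket]

async def main():
    async with websockets.serve(handler, "localhost", 3000, ping_interval=None):
        await asyncio.Future()  # run forever

asyncio.run(main())

The same idea applies on the CLIENT side: one task reading user input (ideally via a thread, since input() blocks the loop) and sending, and another task awaiting websocket.recv() and printing, combined with asyncio.wait or asyncio.gather.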
In Python, I'm using the "websockets" library as a websocket client.
import asyncio
import websockets

async def init_sma_ws():
    uri = "wss://echo.websocket.org/"
    async with websockets.connect(uri) as websocket:
        name = input("What's your name? ")
        await websocket.send('name')
        greeting = await websocket.recv()
The problem is the client websocket connection is disconnected once a response is received. I want the connection to remain open so that I can send and receive messages later.
What changes do I need to make so that the websocket stays open and I can send and receive messages later?
I think your websocket is disconnected because the code exits the async with context manager right after recv().
Code like this works:
import asyncio
import websockets

async def init_sma_ws():
    uri = "wss://echo.websocket.org/"
    async with websockets.connect(uri) as websocket:
        while True:
            name = input("What's your name? ")
            if name == 'exit':
                break
            await websocket.send(name)
            print('Response:', await websocket.recv())

asyncio.run(init_sma_ws())
In your approach you used an asynchronous context manager, which closes the connection as soon as the code in the block has finished. In the example below, an infinite asynchronous iterator is used instead, which keeps the connection open and reconnects automatically if it drops.
import asyncio
import websockets

async def main():
    async for websocket in websockets.connect(...):
        try:
            ...
        except websockets.ConnectionClosed:
            continue

asyncio.run(main())
More info in the library's docs.
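As a rough, untested illustration of that reconnecting pattern (the URI and the loop body are placeholders, not part of the docs snippet):

import asyncio
import websockets

async def main():
    # The iterator yields a fresh connection whenever the previous one is lost
    async for websocket in websockets.connect("wss://echo.websocket.org/"):
        try:
            await websocket.send("hello")
            # Keep reading until the connection drops, then reconnect
            async for message in websocket:
                print("Received:", message)
        except websockets.ConnectionClosed:
            continue

asyncio.run(main())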
I have multiple servers; each server is an instance returned by asyncio.start_server. I need my web server to work with websockets, so that my JavaScript client can fetch data. As far as I can see, asyncio only provides plain TCP sockets, not websockets. Maybe I missed something? I want to implement a websocket server that I can use in asyncio.gather like below:
loop = asyncio.get_event_loop()

login_server = LoginServer.create()
world_server = WorldServer.create()
web_server = WebServer.create()

loop.run_until_complete(
    asyncio.gather(
        login_server.get_instance(),
        world_server.get_instance(),
        web_server.get_instance()
    )
)

try:
    loop.run_forever()
except KeyboardInterrupt:
    pass

loop.close()
I do not want to use aiohttp because, used as in the code above, it simply blocks the other tasks. I need something non-blocking that still has access to the data of the other servers (login and world). Is this possible with asyncio? Does asyncio provide something like websockets? How do I implement a websocket server for use in asyncio.gather?
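For context: asyncio itself only ships TCP and UDP transports; the websocket layer comes from third-party libraries. With the websockets package, for example, the server is just another awaitable running in the same loop, so it could sit directly in the gather call. A rough, untested sketch (the echo handler, port and run_web_server name are illustrative, and the handler signature assumes a recent websockets release):

import asyncio
import websockets

async def ws_handler(websocket):
    # Illustrative handler: echo every message back to the JavaScript client
    async for message in websocket:
        await websocket.send(message)

async def run_web_server():
    async with websockets.serve(ws_handler, "0.0.0.0", 8080):
        await asyncio.Future()  # serve until cancelled

# e.g. asyncio.gather(login_server.get_instance(),
#                     world_server.get_instance(),
#                     run_web_server())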
Well, I finally implemented a WebServer that runs in an executor alongside the other asyncio servers. The code (WebServer code):
from aiohttp import web

class WebServer(BaseServer):

    def __init__(self, host, port):
        super().__init__(host, port)

    @staticmethod
    async def handle_connection(request: web.Request):
        ws = web.WebSocketResponse()
        await ws.prepare(request)
        async for msg in ws:
            Logger.debug('[Web Server]: {}'.format(msg))
        return ws

    @staticmethod
    def run():
        app = web.Application()
        # the handler has to be registered on a route; the '/ws' path is an arbitrary choice
        app.add_routes([web.get('/ws', WebServer.handle_connection)])
        web.run_app(app, host=Connection.WEB_SERVER_HOST.value, port=Connection.WEB_SERVER_PORT.value)
And how to run:
from concurrent.futures import ProcessPoolExecutor

executor = ProcessPoolExecutor()

loop.run_until_complete(
    asyncio.gather(
        login_server.get_instance(),
        world_server.get_instance(),
        loop.run_in_executor(executor, WebServer.run)
    )
)
Recently I've gotten into the "crypto mania" and have started writing my own wrappers around the APIs of some exchanges.
Binance in particular has a streaming websocket endpoint where you can stream data.
I thought I'd try this out on my own using Sanic.
Here is my websocket route:
@ws_routes.websocket("/hello")
async def hello(request, ws):
    while True:
        await ws.send("hello")
Now I have 2 clients on 2 different machines connecting to it:
async def main():
    async with aiohttp.ClientSession() as session:
        ws = await session.ws_connect("ws://192.168.86.31:8000/hello")
        while True:
            data = await ws.receive()
            print(data)
However, only one of the clients can connect and receive the data sent by the server. I'm assuming the while loop blocks and prevents the other connection from being accepted because it never yields?
How do we make it stream to multiple clients without blocking the other connections?
I looked into adding more workers and it seems to do the trick, but that is not a very scalable solution, because each client would need its own worker: with thousands, or even just 10 clients, that would be 10 workers, one per client.
So how does Binance do their websocket streaming? Or, for that matter, how does the Twitter stream endpoint work?
How is it able to serve an infinite stream to multiple concurrent clients?
Because ultimately that's what I'm trying to do.
The way to solve this would be something like the following.
I am using the Sanic framework.
class Stream:
    def __init__(self):
        self._connected_clients = set()

    async def __call__(self, *args, **kwargs):
        await self.stream(*args, **kwargs)

    async def stream(self, request, ws):
        self._connected_clients.add(ws)
        while True:
            # collect disconnected clients in a list first, because removing
            # from the set while iterating over it would raise an error
            disconnected_clients = []
            for client in self._connected_clients:
                if client.state == 3:
                    disconnected_clients.append(client)
            # remove disconnected clients
            for client in disconnected_clients:
                self._connected_clients.remove(client)
            await asyncio.wait([client.send("Hello") for client in self._connected_clients])

ws_routes.add_websocket_route(Stream(), "/stream")
1. Keep track of each websocket session.
2. Append it to a list or set.
3. Check for invalid websocket sessions and remove them from your websocket sessions container.
4. Do an await asyncio.wait([ws_session.send() for ws_session in [list of valid sessions]]), which is basically a broadcast.
5. Profit!
This is basically the pub/sub design pattern.
Something like this maybe?
import aiohttp
import asyncio

loop = asyncio.get_event_loop()

async def main():
    async with aiohttp.ClientSession() as session:
        ws = await session.ws_connect("ws://192.168.86.31:8000/hello")
        while True:
            data = await ws.receive()
            print(data)

multiple_coroutines = [main() for _ in range(10)]
loop.run_until_complete(asyncio.gather(*multiple_coroutines))