I am using the websocket library in Python and I am new to this.
I want to create multiple different connections to websockets. This happens through my custom WebsocketProcess class, which opens the connection, receives the event, keeps a record Id, and then calls an API to grab the information for that particular record.
I am having trouble running them in parallel.
Please see below (and ignore the numerous imports).
Main:
#if __name__ == "__main__":
async def main():
    # The length of AccessTokens, ClientDescriptions and SQLTablesFix determines how many websockets we need to open
    L = await asyncio.gather(
        properties[0].websocket_starting(AccessTokens[0], ClientDescriptions[0], SQLTablesFix[0]),
        properties[1].websocket_starting(AccessTokens[1], ClientDescriptions[1], SQLTablesFix[1]),
        ...
        ...
    )

asyncio.run(main())
The WebsocketProcess class is as follows:
class WebsocketProcess:
    """description of class"""

    def on_error(self, ws, error):
        #{Relevant Code Here}

    def on_open(self, ws):
        print("\nOn Open\n")
        def run(*args):
            while True:
                try:
                    time.sleep(1)
                except TimeoutError:
                    pass
            ws.close()

    def on_close(self):
        #{Relevant Code Here}

    def on_message(self, ws, message):
        #{Relevant Code Here}
        ws.close()

    def connect_websocket(self, AccessToken, ClientDescription, SQLTablesFix):
        ws = websocket.WebSocketApp("_______url_here_____",
                                    on_open = self.on_open,
                                    on_message = self.on_message,
                                    on_error = self.on_error,
                                    on_close = self.on_close,
                                    cookie = "ClientToken=_______; AccessToken=%s" % AccessToken)
        ws.run_forever()

    async def websocket_starting(self, AccessToken, ClientDescription, SQLTablesFix):
        print("\nwebsocket_starting")
        self.AccessToken = AccessToken
        self.ClientDescription = ClientDescription
        self.SQLTablesFix = SQLTablesFix
        self.connect_websocket(self.AccessToken, self.ClientDescription, self.SQLTablesFix)
As you can see, I changed main to be asynchronous so that I can run multiple instances of the WebsocketProcess class in parallel. It opens a connection to the first websocket and stops there waiting for events, without proceeding to open a second websocket.
I also tried making the WebsocketProcess class entirely asynchronous, but then I received an error saying that coroutine 'run' was never awaited (in the connect_websocket method).
Do you have any suggestions on how to run multiple instances of the WebsocketProcess class in parallel?
Thanks!
Your websocket operations are blocking: websocket-client's run_forever never yields control back to the event loop, so asyncio never gets a chance to start the other connections. To use websockets with asyncio, use an async library instead, such as websockets or Tornado.
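For illustration, here is a minimal sketch of that approach using the websockets library and asyncio.gather. The URL, the token list, and the cookie header are placeholders for your real values, and the extra_headers argument name assumes the classic websockets client API:

import asyncio
import websockets  # async library, unlike the blocking websocket-client

URL = "wss://example.com/endpoint"        # placeholder
ACCESS_TOKENS = ["token-1", "token-2"]    # placeholder for your AccessTokens list

async def listen(access_token):
    # One coroutine per connection; the cookie is an assumption about your server
    async with websockets.connect(
            URL, extra_headers={"Cookie": "AccessToken=%s" % access_token}) as ws:
        async for message in ws:
            # Handle the event / call your API for the record Id here
            print(access_token, message)

async def main():
    # All connections run concurrently on a single event loop
    await asyncio.gather(*(listen(token) for token in ACCESS_TOKENS))

asyncio.run(main())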
Related
I want to implement a service based on web sockets in the Tornado framework. When a user closes a web socket, I want to notify the other users about this. However, on_close is apparently a blocking function and my _broadcast(str) -> None function is async.
How can I call this function anyway?
import logging

from tornado import ioloop, web, websocket

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

class SocketHandler(websocket.WebSocketHandler):
    async def open(self, *args, conns, **kwargs):
        logger.info(f"Opened a new connection to client {id(self)}")
        self._conns = conns

    async def on_message(self, message):
        logger.info(f"Client {id(self)} sent message: {message}")
        await self._broadcast(message)

    def on_close(self):
        logger.info(f"Client {id(self)} has left the scene")
        self._conns.remove(self)
        self._broadcast("something")  # TODO

    async def _broadcast(self, msg):
        for conn in self._conns:
            try:
                await conn.write_message(msg)
            except websocket.WebSocketClosedError:
                pass

app = web.Application([
    (r'/ws', SocketHandler)
])

if __name__ == '__main__':
    app.listen(9000)
    ioloop.IOLoop.instance().start()
The simple solution you're looking for is to use asyncio.create_task when calling the coroutine:
def on_close(self):
    logger.info(f"Client {id(self)} has left the scene")
    self._conns.remove(self)
    asyncio.create_task(self._broadcast("something"))
(the legacy Tornado version of this function is tornado.gen.convert_yielded, but now that Tornado and asyncio are integrated there's no reason not to use the asyncio version for native coroutines)
But for this particular problem, the use of await in your _broadcast function is not ideal. Awaiting a write_message is used to provide flow control, but create_task doesn't do anything useful with the backpressure provided by await. (write_message is fairly unusual in that it is fully supported to call it both with and without await). In fact, it applies backpressure to the wrong things - one slow connection will slow notifications to all the others that come after it.
So in this case I'd advise making _broadcast a regular synchronous function:
def _broadcast(self, msg):
    for conn in self._conns:
        try:
            conn.write_message(msg)
        except websocket.WebSocketClosedError:
            pass
If you want better control over memory usage (via the flow control provided by await write_message), you'll need a more complicated solution, probably involving a bounded queue for each connection: in on_close, use put_nowait to add the message to every connection's queue, then have a per-connection task that reads from the queue and writes the message with await write_message.
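For what it's worth, here is a rough sketch of that queue-per-connection approach. The helper names (queue_message, _drain_queue) are mine, not Tornado's; only open, on_close, write_message and WebSocketClosedError come from Tornado:

import asyncio
from tornado import websocket

class SocketHandler(websocket.WebSocketHandler):
    def open(self, *args, **kwargs):
        # A bounded queue per connection provides flow control for this client only
        self._queue = asyncio.Queue(maxsize=100)
        self._writer_task = asyncio.create_task(self._drain_queue())

    def queue_message(self, msg):
        # Call this instead of write_message when broadcasting
        try:
            self._queue.put_nowait(msg)
        except asyncio.QueueFull:
            pass  # or drop/close the slow connection

    async def _drain_queue(self):
        # One task per connection; a slow client only slows itself down
        while True:
            msg = await self._queue.get()
            try:
                await self.write_message(msg)
            except websocket.WebSocketClosedError:
                return

    def on_close(self):
        self._writer_task.cancel()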
I think a solution that involves using an asyncio.Queue should work for you.
I made a small class as a mock-up to test this out:
import asyncio
import time

class Thing:
    on_close_q = asyncio.Queue()

    def __init__(self):
        self.conns = range(3)

    def on_close(self, id):
        time.sleep(id)
        print(f'closing {id}')
        self.on_close_q.put_nowait((self, id))

    async def notify(self, msg):
        print('in notify')
        for conn in range(3):
            print(f'notifying {conn} {msg}')

async def monitor_on_close():
    print('monitoring')
    while True:
        instance, id = await Thing.on_close_q.get()
        await instance.notify(f'{id} is closed')
From there, you'll need to run monitor_on_close in the ioloop you get from Tornado. I've never used Tornado, but I think adding something like this to your __main__ block should work:
ioloop.IOLoop.current().add_callback(monitor_on_close)
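In context, the __main__ block from the question would then look something like this (just a sketch, I haven't run it against Tornado either):

if __name__ == '__main__':
    app.listen(9000)
    ioloop.IOLoop.current().add_callback(monitor_on_close)
    ioloop.IOLoop.current().start()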
I'm trying to implement a websocket client in Python using websockets and the apparently mandatory asyncio, which I have never used before (and have a hard time understanding...).
I've read a lot on the subject and saw (too) many examples here and everywhere, but I can't find a way to properly make a websocket client with a persistent connection.
I need to have a persistent connection because the commands need to be requested on the same connection, the first one being an authentication command.
The remote server is a 3rd party API I don't have any control over.
I suppose I could run an authentication request along with each command my program sends, but it does not feel right to open > auth > request > close for every command instead of keeping one connection alive for the whole program's life.
My implementation is a library made up of many classes, and I need to wrap the websocket connector/handler in one of them.
Here's what I have right now, based on examples I found here and there (with some obfuscated data):
import json
import asyncio
from websockets import connect

URL = 'wss://server.com/endpoint'

class Websocket:
    async def __aenter__(self):
        self._conn = connect(URL)
        self.websocket = await self._conn.__aenter__()
        return self

    async def __aexit__(self, *args, **kwargs):
        await self._conn.__aexit__(*args, **kwargs)

    async def send(self, message):
        await self.websocket.send(message)

    async def receive(self):
        return await self.websocket.recv()

class Handler:
    def __init__(self):
        self.wws = Websocket()
        self.loop = asyncio.get_event_loop()

    def command(self, cmd):
        return self.loop.run_until_complete(self.__async__command(cmd))

    async def __async__command(self, cmd):
        async with self.wws as echo:
            await echo.send(json.dumps(cmd))
            return await echo.receive()

def main():
    handler = Handler()
    foo = handler.command('authentication command')
    print('auth: ', foo)
    bar = handler.command('another command to run depending on the first authentication')
    print('command: ', bar)

if __name__ == '__main__':
    main()
Basically, right now I get these answers (simplified and obfuscated):
auth: Ok, authenticated
command: Command refused, not authenticated
I suppose my problem is that the block async with self.wws as echo: creates the connection, runs its code, and then drops it instead of keeping the connection alive. Since we are not using a usual __init__ here but some asyncio voodoo I don't understand, I'm kind of stuck.
I think your diagnosis is correct: the problem is that the async context manager is creating and closing a connection for each call of Handler.command ... really not what you want.
Instead you could synchronously establish the websocket connection during the init of Handler and then store the connected websocket (an instance of WebSocketClientProtocol) as a class member for later use, as in this sample code:
import json
import asyncio
from websockets import connect

URL = 'ws://localhost:8000'

class Handler:
    def __init__(self):
        self.ws = None
        self.loop = asyncio.get_event_loop()
        # perform a synchronous connect
        self.loop.run_until_complete(self.__async__connect())

    async def __async__connect(self):
        print("attempting connection to {}".format(URL))
        # perform async connect, and store the connected WebSocketClientProtocol
        # object, for later reuse for send & recv
        self.ws = await connect(URL)
        print("connected")

    def command(self, cmd):
        return self.loop.run_until_complete(self.__async__command(cmd))

    async def __async__command(self, cmd):
        await self.ws.send(json.dumps(cmd))
        return await self.ws.recv()

def main():
    handler = Handler()
    foo = handler.command('authentication command')
    print('auth: ', foo)
    bar = handler.command('another command to run depending on the first authentication')
    print('command: ', bar)

if __name__ == '__main__':
    main()
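One thing this sample leaves out is shutting the connection down when the program is finished. A possible addition to Handler (the method name is my choice; self.ws.close() is the websockets coroutine that closes the connection):

    def close(self):
        # Synchronously close the stored websocket when you are done with it
        self.loop.run_until_complete(self.ws.close())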
I plan to write a WebsocketHandler class that wraps the websockets package.
This is the code:
import asyncio
import websockets

class WebsocketHandler:
    __connection = None

    def __init__(self):
        asyncio.run(self.__setConnection())

    async def __setConnection(self):
        async with websockets.connect("ws://localhost/your/path") as websocket:
            self.__connection = websocket
            print("Connected")

    def send(self, msg):
        self.__connection.send(msg)
        print("message Send")

ws = WebsocketHandler()
ws.send("message")
For the server part I have another, finished script that works (tested with clients written in other languages); it tells me when there is a new connection, when it receives a message, and when a client disconnects.
When I try it, the script connects successfully to my websocket server (the script prints Connected and the server reports a new connection).
I then get a warning in my script:
RuntimeWarning: coroutine 'WebSocketCommonProtocol.send' was never awaited
self.__connection.send(msg)
My script then prints "message Send" and stops.
The problem is that on the server side I never see the message arrive, only the notification that the client has disconnected. Basically the script does not send the message and does not produce an error.
Does anyone have any idea what the problem is?
The problem is that the WebsocketHandler class has only one async method, namely __setConnection, which is run by calling the asyncio.run function. The docs say that
This function always creates a new event loop and closes it at the end.
That is, this is the only place where the async code could be running. Your code creates a websocket connection and closes it right after it prints "Connected". That happens because you call websockets.connect with async with as an asynchronous context manager, which closes the connection automatically on exit. That is the first flaw; the second is that self.__connection.send is a coroutine, and it won't run until you await it. This is exactly what the warning is telling you. Here is how you can fix the websocket handler class:
import asyncio
import websockets

class WebsocketHandler(object):
    def __init__(self):
        self.conn = None

    async def connect(self, url):
        self.conn = await websockets.connect(url)

    async def send(self, msg):
        await self.conn.send(msg)

    async def close(self):
        await self.conn.close()

async def main():
    handler = WebsocketHandler()
    await handler.connect('ws://localhost:8765')
    await handler.send('hello')
    await handler.send('world')
    await handler.close()

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
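On Python 3.7+ the last two lines can be replaced with the simpler asyncio.run entry point, which creates and closes the event loop for you:

asyncio.run(main())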
I have one function orderbook(pair) that opens up a websocket, subscribes to a channel and listens for messages. I want to be able to subscribe to multiple channels and listen to messages in parallel. Is there any way to do this?
As of now, I am attempting the following:
from dev.orderbook import *
from multiprocessing import Process

def main():
    process01 = Process(target=orderbook('BTC-USD'))
    process02 = Process(target=orderbook('ETH-USD'))
    process01.start()
    process02.start()

if __name__ == '__main__':
    main()
However, all this does is run process01; only when I stop the script does it run process02, which is not the behavior I am looking for. Is there any way I can subscribe to both these channels at once and listen for and print messages/responses concurrently? If not, is there any way I can start one process, let it run for a specified time, and then start the next process?
My websocket mirrors the following:
import websocket

class orderbook(object):
    def __init__(self, pair):
        self.pair = pair
        websocket.enableTrace(True)
        ws = websocket.WebSocketApp("ws://echo.websocket.org/",
                                    on_message = self.on_message,
                                    on_error = self.on_error,
                                    on_close = self.on_close)
        ws.on_open = self.on_open
        ws.run_forever()

    def on_message(self, ws, message):
        print(message)

    def on_error(self, ws, error):
        print(error)

    def on_close(self, ws):
        print("### closed ###")

    def on_open(self, ws):
        ws.send(self.pair)
Your Process constructors include actual calls to your target function. So the function is called in the parent process while the constructor line is being evaluated, and its result is what gets passed to Process() as target.
You should rather pass the function itself, and pass its arguments separately:
process01 = Process(target=orderbook, args=('BTC-USD',))
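Putting it together, a sketch of the corrected main (with the import written out explicitly and join calls added so the parent waits for both children) could look like this:

from multiprocessing import Process

from dev.orderbook import orderbook

def main():
    # Pass the callable and its arguments separately; each child process then
    # constructs its own orderbook instance, which runs run_forever there.
    process01 = Process(target=orderbook, args=('BTC-USD',))
    process02 = Process(target=orderbook, args=('ETH-USD',))
    process01.start()
    process02.start()
    process01.join()
    process02.join()

if __name__ == '__main__':
    main()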
First I should say I am completely new to asyncio and its paradigm.
I am trying to implement an asyncio (Python 3.4) client with asyncio.open_connection() that is able to send requests over a TCP socket (telnet), read the responses, and at the same time listen for whatever the other side may send.
In other words, I need bi-directional communication which I initiate, so I am the client. Yet all the examples I have found so far using StreamReader and StreamWriter break the loop after reading an empty line. I also need some way to tell whether an incoming message (every message is exactly one line) is a response to a previous request or an unsolicited message from the other side.
I was thinking something like this might work.
import asyncio

class MyHandler:
    @asyncio.coroutine
    def connect(self):
        self.reader, self.writer = asyncio.open_connection('localhost', 2020)
        while True:
            msg = self.reader.readline()
            if msg is None:
                asyncio.sleep(1)
                continue
            self.handle_msg(msg)

    @asyncio.coroutine
    def request(self, msg):
        self.writer.write(msg)
        return self.reader.readline()

if __name__ == '__main__':
    h = MyHandler()
    loop = asyncio.get_event_loop()
    loop.run_until_complete(h.connect)
    loop.close()
I wrote this code only while creating this question. Now that I run it, it fails before even getting to the main point: somehow loop.run_until_complete(h.connect) fails with TypeError: A Future or coroutine is required.
h.connect is a function, whereas h.connect() is a coroutine. This means the correct code would be:
import asyncio

class MyHandler:
    @asyncio.coroutine
    def connect(self):
        # open_connection is a coroutine as well, so it also needs yield from
        self.reader, self.writer = yield from asyncio.open_connection('localhost', 2020)
        while True:
            msg = yield from self.reader.readline()  # Yield from since it's a coroutine
            if msg.strip() is None:
                yield from asyncio.sleep(1)  # Also a coroutine
                continue
            self.handle_msg(msg)

    @asyncio.coroutine
    def request(self, msg):
        self.writer.write(msg)
        return (yield from self.reader.readline())

if __name__ == '__main__':
    h = MyHandler()
    loop = asyncio.get_event_loop()
    loop.run_until_complete(h.connect())
    loop.close()