Python SocketIO: run a Socket.IO server in the background

How can I start a SocketIO server and have it be listening in the background while I continue to execute code on the main thread?
Right now I am using the python-socketio package. I have also tested flask-socketio, which uses python-socketio under the hood anyway.
https://python-socketio.readthedocs.io/en/latest/server.html
What I've mostly tried is starting the server with sio.start_background_task.
For example:
import socketio
from aiohttp import web

class SocketIOServer(socketio.AsyncNamespace):
    def __init__(self):
        super().__init__()
        self.sio = socketio.AsyncServer()
        self.sio.register_namespace(self)
        self.app = web.Application()
        self.sio.attach(self.app)
        self.task = None

    def run(self):
        web.run_app(self.app, port=8080)

    def start(self):
        self.task = self.sio.start_background_task(self.run)
I tried the above and multiple variations like using Flask, Tornado, etc.
To be clearer, this is basically what I want to do:
if __name__ == '__main__':
    # ...
    # start app, e.g. -> web.run_app(self.app, port=8080)
    # I want to continue to execute code here
I don't fully understand how everything works; am I asking a stupid question?

The problem is as you described it. You do not fully understand how everything works.
The most important thing that you are missing is that the python-socketio package is not a web server. This package just implements the Socket.IO logic, but you have to add a web server through which your Socket.IO code is available.
From the code that you included in your question, it appears that you have selected to attach the Socket.IO code to a web application built with the aiohttp framework, and use its own web server. Correct?
Then the real question that you have is how to run the aiohttp web server in a non-blocking way.
And it turns out that aiohttp covers this in its documentation: https://docs.aiohttp.org/en/stable/web_advanced.html#application-runners. The example they have does this:
runner = web.AppRunner(app)
await runner.setup()
site = web.TCPSite(runner, 'localhost', 8080)
await site.start() # <-- starts the server without blocking
# you can do whatever you want here!
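Putting the two together, here is a minimal sketch of how this could apply to the Socket.IO setup from the question. It assumes the AsyncServer/aiohttp combination shown above; the 'heartbeat' event and the endless loop are only illustrative placeholders for "continue to execute code here":

import asyncio
import socketio
from aiohttp import web

sio = socketio.AsyncServer()
app = web.Application()
sio.attach(app)

async def main():
    # Start the aiohttp server without blocking this coroutine.
    runner = web.AppRunner(app)
    await runner.setup()
    site = web.TCPSite(runner, 'localhost', 8080)
    await site.start()

    # The server is now listening; keep doing other work on the same loop.
    while True:
        await sio.emit('heartbeat', {'alive': True})  # illustrative event
        await asyncio.sleep(5)

if __name__ == '__main__':
    asyncio.run(main())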

Related

Running a Tornado Server within a Jupyter Notebook

Taking the standard Tornado demonstration and pushing the IOLoop into a background thread allows querying of the server within a single script. This is useful when the Tornado server is an interactive object (see Dask or similar).
import asyncio
import requests
import tornado.ioloop
import tornado.web
from concurrent.futures import ThreadPoolExecutor

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.write("Hello, world")

def make_app():
    return tornado.web.Application([
        (r"/", MainHandler),
    ])

pool = ThreadPoolExecutor(max_workers=2)
loop = tornado.ioloop.IOLoop()
app = make_app()
app.listen(8888)
fut = pool.submit(loop.start)

print(requests.get("http://localhost:8888"))
The above works just fine in a standard Python script (though it is missing a safe shutdown). Jupyter notebooks are an optimal environment for these interactive Tornado servers. However, in Jupyter this idea breaks down, as there is already an active running loop:
>>> import asyncio
>>> asyncio.get_event_loop()
<_UnixSelectorEventLoop running=True closed=False debug=False>
This is seen when running the above script in a Jupyter notebook: both the server and the request client try to open a connection in the same thread, and the code hangs. Building a new asyncio loop and/or Tornado IOLoop does not seem to help, and I suspect I am missing something in Jupyter itself.
The question: Is it possible to have a live Tornado server running in the background within a Jupyter notebook so that standard python requests or similar can connect to it from the primary thread? I would prefer to avoid asyncio in the code presented to users if possible, due to its relative complexity for novice users.
Based on my recent PR to streamz, here is something that works, similar to your idea:
class InNotebookServer(object):
    def __init__(self, port):
        self.port = port
        self.loop = get_ioloop()
        self.start()

    def _start_server(self):
        from tornado.web import Application, RequestHandler
        from tornado.httpserver import HTTPServer
        from tornado import gen

        class Handler(RequestHandler):
            source = self

            @gen.coroutine
            def get(self):
                self.write('Hello World')

        application = Application([
            ('/', Handler),
        ])
        self.server = HTTPServer(application)
        self.server.listen(self.port)

    def start(self):
        """Start HTTP server and listen"""
        self.loop.add_callback(self._start_server)

_io_loops = []

def get_ioloop():
    from tornado.ioloop import IOLoop
    import threading
    if not _io_loops:
        loop = IOLoop()
        thread = threading.Thread(target=loop.start)
        thread.daemon = True
        thread.start()
        _io_loops.append(loop)
    return _io_loops[0]
To call it in the notebook:
In [2]: server = InNotebookServer(9005)
In [3]: import requests
requests.get('http://localhost:9005')
Out[3]: <Response [200]>
Part 1: Let's get nested tornado(s)
To find the information you need, you would have had to follow a trail of breadcrumbs, starting with what is described in the release notes of IPython 7.
In particular, it points you to more information in the async and await sections of the documentation, and to this discussion,
which suggests the use of nest_asyncio.
The crux is the following:
A) Either you trick Python into running two nested event loops (this is what nest_asyncio does), or
B) you schedule coroutines on the already running event loop (I'm not sure how to do that with Tornado).
I'm pretty sure you know all that, but I'm sure other readers will appreciate it.
There is unfortunately no way to make this totally transparent to users – well, unless you control the deployment (like on a JupyterHub) and can add these lines to the IPython startup scripts that are loaded automatically. But I think the following is simple enough.
import nest_asyncio
nest_asyncio.apply()
# rest of your tornado setup and start code.
Part 2: Gotcha – synchronous code blocks the event loop.
The previous section only takes care of being able to run the Tornado app. But note that any synchronous code will block the event loop; thus, when running print(requests.get("http://localhost:8000")), the server will appear not to work, because you are blocking the event loop, which will only resume once the code finishes executing; but that code is itself waiting for the event loop to resume... (understanding this deadlock is an exercise left to the reader). You need to either issue print(requests.get("http://localhost:8000")) from another kernel, or use aiohttp.
Here is how to use aiohttp in a similar way as requests.
import aiohttp
session = aiohttp.ClientSession()
await session.get('http://localhost:8889')
In this case, as aiohttp is non-blocking, things will appear to work properly. Here you can also see some extra IPython magic, where async code is autodetected and run on the current event loop.
A cool exercise could be to run requests.get in a loop in another kernel, run sleep(5) in the kernel where Tornado is running, and see that we stop processing requests...
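Putting Parts 1 and 2 together, a minimal notebook sketch might look like the following. It assumes nest_asyncio is installed, uses the handler from the question and port 8000, and relies on IPython's autoawait to run the await at the top level of a cell; it is a sketch, not a hardened recipe:

# Cell 1: allow nested event loops and start the Tornado app on Jupyter's loop
import nest_asyncio
nest_asyncio.apply()

import tornado.web

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.write("Hello, world")

app = tornado.web.Application([(r"/", MainHandler)])
app.listen(8000)

# Cell 2: query it with a non-blocking client so the loop keeps serving
import aiohttp
session = aiohttp.ClientSession()
resp = await session.get("http://localhost:8000")
print(resp.status, await resp.text())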
Part 3: Disclaimer and other routes:
This is quite tricky, and I would advise not to use it in production and to warn your users that this is not the recommended way of doing things.
It does not completely solve your case either: you would need to run things outside the main thread, which I'm not sure is possible.
You can also try to play with other loop runners like trio and curio; they might allow you to do things you can't do with asyncio by default, like nesting, but here be dragons. I highly recommend trio and the multiple blog posts around its creation, especially if you are teaching async.
Enjoy, hope that helped, and please report bugs, as well as things that did work.
You can make the Tornado server run in the background using the %%script --bg magic command. The --bg option tells Jupyter to run the code of the current cell in the background.
Just create a Tornado server in one cell along with the magic command and run that cell.
Example:
%%script python --bg

import tornado.ioloop
import tornado.web

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.write("Hello, world")

def make_app():
    return tornado.web.Application([
        (r"/", MainHandler),
    ])

loop = tornado.ioloop.IOLoop.current()
app = make_app()
app.listen(8000)  # 8888 was being used by jupyter in my case
loop.start()
And then you can use requests in a separate cell to connect to the server:
import requests
print(requests.get("http://localhost:8000"))
# prints <Response [200]>
One thing to note here is that if you stop/interrupt the kernel on any cell, the background script will also stop. So you'll have to run this cell again to start the server.

(flask + socket.IO) Result of emit callback is the response of my REST endpoint

Just to give some context here: I'm a Node.js developer, but I'm on a project where I need to work with Python using the Flask framework.
The problem is: when a client makes a request to an endpoint of my REST Flask app, I need to emit an event using Socket.IO and get some data back from the socket server, and this data is the response of the endpoint. But I haven't figured out how to send it, because Flask needs a return statement saying what the response is, and my callback is in another context.
Sample of what I'm trying to do (there are some comments explaining):
import socketio
import eventlet
from flask import Flask, request

sio = socketio.Server()
app = Flask(__name__)

@app.route('/test/<param>')
def get(param):
    def ack(data):
        print(data)  # Should be the response

    sio.emit('event', param, callback=ack)  # Socket server calls my ack function
    # Without a return statement, the endpoint returns 500

if __name__ == '__main__':
    app = socketio.Middleware(sio, app)
    eventlet.wsgi.server(eventlet.listen(('', 8000)), app)
Maybe, the right question here is: Is this possible?
I'm going to give you one way to implement what you want specifically, but I believe you have an important design flaw in this, as I explain in a comment above. In the way you have this coded, your socketio.Server() object will broadcast to all your clients, so it will not be able to get a callback. If you want to emit to one client (hopefully not the same one that sent the HTTP request), then you need to add a room=client_sid argument to the emit. Or, if you are contacting a Socket.IO server, then you need to use a Socket.IO client here, not a server.
In any case, to block your HTTP route until the callback function is invoked, you can use an Event object. Something like this:
from threading import Event
from flask import jsonify

@app.route('/test/<param>')
def get(param):
    ev = Event()
    result = None

    def ack(data):
        nonlocal result
        nonlocal ev
        result = {'data': data}
        ev.set()  # unblock the HTTP route

    sio.emit('event', param, room=some_client_sid, callback=ack)
    ev.wait()  # blocks until ev.set() is called
    return jsonify(result)
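If what you are contacting really is a separate Socket.IO server, as noted above, then a Socket.IO client is the right tool on the Flask side. A minimal sketch using python-socketio's Client follows; the upstream URL, port and event name are assumptions made for illustration:

import socketio
from flask import Flask, jsonify

upstream = socketio.Client()
upstream.connect('http://localhost:9000')  # hypothetical upstream Socket.IO server

app = Flask(__name__)

@app.route('/test/<param>')
def get(param):
    # call() emits the event and blocks until the upstream server acknowledges it
    data = upstream.call('event', param, timeout=10)
    return jsonify({'data': data})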
I had a similar problem using FastAPI + socketio (async version) and I was stuck at the exact same point. No eventlet, so I could not try out the monkey-patching option.
After a lot of head banging it turns out that, for some reason, adding asyncio.sleep(.1) just before ev.wait() made everything work smoothly. Without that, the emitted event actually never reaches the other side (the socketio client, in my scenario).
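A minimal async sketch of what that comment describes, assuming FastAPI with python-socketio's AsyncServer mounted as an ASGI app and a hypothetical some_client_sid; the 0.1-second sleep is the workaround reported above, not something the libraries document:

import asyncio
import socketio
from fastapi import FastAPI

sio = socketio.AsyncServer(async_mode='asgi')
fastapi_app = FastAPI()
app = socketio.ASGIApp(sio, other_asgi_app=fastapi_app)  # serve this with an ASGI server such as uvicorn

@fastapi_app.get('/test/{param}')
async def get(param: str):
    ev = asyncio.Event()
    result = None

    def ack(data):
        nonlocal result
        result = {'data': data}
        ev.set()  # unblock the HTTP route

    await sio.emit('event', param, room=some_client_sid, callback=ack)
    await asyncio.sleep(0.1)  # reported workaround: yield so the emit actually goes out
    await ev.wait()
    return result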

Aiohttp WSGI handler doesn't work

I need to wrap my Flask application with aiohttp. When I start it, there is an error:
This webpage has a redirect loop
ERR_TOO_MANY_REDIRECTS
The webpage at http://localhost:5000/ has resulted in too many redirects. Clearing your cookies for this site or allowing third-party cookies may fix the problem. If not, it is possibly a server configuration issue and not a problem with your computer.
code:
import asyncio
from flask import Flask
from aiohttp import web
from aiohttp_wsgi import WSGIHandler

app = Flask(__name__)

@app.route('/')
def login():
    return 'Hello World'

@asyncio.coroutine
def init(loop):
    wsgi_flask_app = WSGIHandler(app)
    aio_app = web.Application(loop=loop)
    aio_app.router.add_route('*', '/{path_info:.*}', wsgi_flask_app)
    srv = yield from loop.create_server(
        aio_app.make_handler(), '127.0.0.1', 5000)
    return srv

if __name__ == '__main__':
    io_loop = asyncio.get_event_loop()
    io_loop.run_until_complete(init(io_loop))
    try:
        io_loop.run_forever()
    except KeyboardInterrupt:
        print('Interrupted')
When I change the route, like in this example, to
aio_app.router.add_route('*', '{path_info:.*}', wsgi_flask_app)
it raises ValueError: path should be started with /. What am I doing wrong?
The "add_route" method in aiohttp.router can be worked around with following construction:
wsgi_route = DynamicRoute('*', wsgi_flask_app, 'wsgi_flask_app',
                          re.compile('^(?P<path_info>.*)$'), '{path_info}')
app.router.register_route(wsgi_route)
But it's not a pretty solution IMHO. It looks like a backwards-incompatible change in aiohttp, and a better solution is to use another aiohttp version.
UPDATE:
Since aiohttp-wsgi version 0.2.5 you can add routes that start with '/'.

Websocket connection between socket.io client and tornado python server

I'm trying to get websockets to work between two machines: one PC and one Raspberry Pi, to be exact.
On the PC I'm using socket.io as a client to connect to the server on the Raspberry Pi.
With the following code I initiate the connection and try to send predefined data.
var socket = io.connect(ip + ':8080');
socket.send('volumes', { data: data });
On the raspberry pi, the websocket server looks like this:
from tornado import web, ioloop
from sockjs.tornado import SockJSRouter, SockJSConnection

class EchoConnection(SockJSConnection):
    def on_message(self, msg):
        self.send(msg)

    def check_origin(self, origin):
        return True

if __name__ == '__main__':
    EchoRouter = SockJSRouter(EchoConnection, '/echo')
    app = web.Application(EchoRouter.urls)
    app.listen(8080)
    ioloop.IOLoop.instance().start()
But the connection is never established. And I don't know why. In the server log I get:
WARNING:tornado.access:404 GET /socket.io/1/?t=1412865634790 (192.168.0.16) 9.01ms
And in the Inspector on the PC there is this error message:
XMLHttpRequest cannot load http://192.168.0.10:8080/socket.io/1/?t=1412865634790. Origin sp://793b6d4588ead99e1780e35b71d24d1b285328f8.hue is not allowed by Access-Control-Allow-Origin.
I am out of ideas and don't know what to do. Can you help me?
Thank you!
Well, the solution for your problem has to do with the internal design of the sockjs-tornado library more than with the socket.io library.
Basically, your problem has to do with cross-origin requests, i.e. the HTML that generates the request to the websocket server is not at the same origin as the websocket server. I can see from your code that you already identified the problem (and tried to solve it by redefining the "check_origin" method), but you didn't find the proper way to do it, basically because within this library it is not the SockJSConnection class that extends Tornado's WebSocketHandler, so redefining its "check_origin" is useless. If you dig a little bit into the code, you will see that there is a class called SockJSWebSocketHandler that redefines that method itself: it relies on the Tornado implementation if that returns true, but it also allows you to skip the check through a settings parameter:
class SockJSWebSocketHandler(websocket.WebSocketHandler):

    def check_origin(self, origin):
        ***
        allow_origin = self.server.settings.get("websocket_allow_origin", "*")
        if allow_origin == "*":
            return True
So, to summarize, you just need to include the setting "websocket_allow_origin"="*" in the server settings and everything should work properly =D
if __name__ == '__main__':
    EchoRouter = SockJSRouter(EchoConnection, '/echo',
                              user_settings={"websocket_allow_origin": "*"})

Websocket/event-source/... implementation to expose a two-way RPC to a python/django application

For a Django application I'm working on, I need to implement a two-way RPC so that:
the clients can call RPC methods from the platform and
the platform can call RPC methods from each client.
As the clients will mostly be behind NATs (which means no public IPs and unpredictable, weird firewalling policies), the platform-to-client direction has to be initiated by the client.
I have a pretty good idea of how I could write this from scratch; I also think I could work something out of Twisted's publisher/subscriber model, but I've learned that there is always a best way to do it in Python.
So I'm wondering what would be the best way to do it that would also integrate best with Django. The code will have to cope with hundreds of clients in the short term, and (we hope) with thousands of clients in the medium/long term.
So what library/implementation would you advise me to use?
I'm mostly looking for starting points to RTFM!
WebSocket is a moving target, with new specifications from time to time. Brave developers implement server-side libraries, but few implement the client side: the client for WebSocket is a web browser.
WebSocket is not the only way for a server to talk to a client; EventSource is a simple and pragmatic way to push information to a client. It's just a never-ending page. The Twitter firehose used this trick before it was specified. The client opens an HTTP connection and waits for events. The connection is kept open, and reopened if there is trouble (a cut connection, something like that).
There is no timeout, and you can send many events over one connection.
The difference between WebSocket and EventSource is simple. WebSocket is bidirectional and hard to implement. EventSource is unidirectional and simple to implement, on both the client and the server side.
You can use EventSource as a zombie controller. Each client connects (and reconnects) to the master and waits for instructions. When an instruction is received, the zombie acts and, if needed, can talk to its master with a classical HTTP connection targeting the Django app.
EventSource keeps the connection open, so you need an async server, like Tornado. Django needs a sync server, so you need both, with a dispatcher like nginx in front. Django, or a cron-like action, talks to the async server, which talks to the right zombie. The zombie talks to Django, so the async server doesn't need any persistence; it's just a hub with zombies plugged in.
Gevent is able to handle such an HTTP server, but there are no decent docs or examples on this point. It's a shame. I want a car, you give me a screw.
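A minimal sketch of the zombie side described above, using a streaming HTTP connection with requests; the hub and Django URLs and the event format are illustrative, not part of the original answer:

import requests

# Keep a never-ending HTTP connection open to the async hub and wait for events.
with requests.get('http://hub.example.com/events', stream=True) as resp:
    for line in resp.iter_lines():
        if not line:
            continue  # blank lines separate server-sent events
        instruction = line.decode()
        if instruction.startswith('data: '):
            instruction = instruction[len('data: '):]
        # Act on the instruction, then report back to the Django app
        # over a classical HTTP connection.
        requests.post('http://django.example.com/report', data={'result': instruction})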
You can also use Tornado + Tornadio + Socket.io. That's what we are using right now for notifications, and the amount of code you have to write is not that much.
from tornadio2 import SocketConnection, TornadioRouter, SocketServer

class RouterConnection(SocketConnection):
    __endpoints__ = {'/chat': ChatConnection,
                     '/ping': PingConnection,
                     '/notification': NotificationConnection
                     }

    def on_open(self, info):
        print 'Router', repr(info)

MyRouter = TornadioRouter(RouterConnection)

# Create socket application
application = web.Application(
    MyRouter.apply_routes([(r"/", IndexHandler),
                           (r"/socket.io.js", SocketIOHandler)]),
    flash_policy_port=843,
    flash_policy_file=op.join(ROOT, 'flashpolicy.xml'),
    socket_io_port=3001,
    template_path=os.path.join(os.path.dirname(__file__), "templates/notification")
)

class PingConnection(SocketConnection):
    def on_open(self, info):
        print 'Ping', repr(info)

    def on_message(self, message):
        now = dt.utcnow()
        message['server'] = [now.hour, now.minute, now.second, now.microsecond / 1000]
        self.send(message)

class ChatConnection(SocketConnection):
    participants = set()
    unique_id = 0

    @classmethod
    def get_username(cls):
        cls.unique_id += 1
        return 'User%d' % cls.unique_id

    def on_open(self, info):
        print 'Chat', repr(info)
        # Give user unique ID
        self.user_name = self.get_username()
        self.participants.add(self)

    def on_message(self, message):
        pass

    def on_close(self):
        self.participants.remove(self)

    def broadcast(self, msg):
        for p in self.participants:
            p.send(msg)
Here is a really simple solution I came up with:
import tornado.ioloop
import tornado.web
import time

class MainHandler(tornado.web.RequestHandler):
    @tornado.web.asynchronous
    def get(self):
        self.set_header("Content-Type", "text/event-stream")
        self.set_header("Cache-Control", "no-cache")
        self.write("Hello, world")
        self.flush()
        for i in range(0, 5):
            msg = "%d<br>" % i
            self.write("%s\r\n" % msg)  # content
            self.flush()
            time.sleep(5)

application = tornado.web.Application([
    (r"/", MainHandler),
])

if __name__ == "__main__":
    application.listen(8888)
    tornado.ioloop.IOLoop.instance().start()
and
curl http://localhost:8888
gives output as it arrives!
Now I'll just have to implement the full event-source spec and some kind of data serialization between the server and the clients, but that's trivial. I'll post a URL to the lib I write here when it's done.
I've recently played with Django, Server-Sent Events and WebSocket, and I've written an article about it at http://curella.org/blog/2012/jul/17/django-push-using-server-sent-events-and-websocket/
Of course, this comes with the usual caveats that Django probably isn't the best fit for evented stuff, and both protocols are still drafts.
