Python. Django. Persistent socket.io connection

Is there any way to connect to a socket.io server and keep the connection alive, the way the JS client does, from a Django backend?
Code like this:
from socketIO_client import SocketIO
from django.conf import settings
def trigger_socket():
    socket_client = SocketIO(settings.SIO_HOST, settings.SIO_PORT)
    socket_client.emit('cool_task', {})
Works fine, but it opens a new connection on every task emit. Is there any way to avoid that? Or am I missing something about sockets?
UPD: FYI, using socket.io to communicate between servers turned out to be a bad idea. Use tornadorpc / xmlrpclib for this kind of task, and leave socket.io to the JS client side only.

Brief:
It's not possible in the way you described. If you want to use Django to connect to something like socket.io, take a look at gevent-socketio.
Detailed:
When deployed with a traditional web server, Django is not suitable for handling long-lived connections because of its process/thread model. To handle long-lived connections you need to adopt an event-driven web server, such as gunicorn with async workers.

It's possible with:
from socketIO_client import SocketIO
from django.conf import settings
def trigger_socket():
    with SocketIO(settings.SIO_HOST, settings.SIO_PORT) as socketIO:
        socketIO.emit('cool_task', {})
        socketIO.wait(seconds=0)
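Another common workaround (a sketch of my own, not from the socketIO_client docs) is to keep one lazily created module-level client and reuse it across task emits. `FakeClient` below is a hypothetical stand-in for `SocketIO(settings.SIO_HOST, settings.SIO_PORT)`, since the real class needs a live server:

```python
# Sketch: a module-level client that is created once and reused,
# instead of reconnecting on every emit. FakeClient is a stand-in
# for socketIO_client.SocketIO, which needs a running server.
class FakeClient:
    instances = 0  # counts how many "connections" were opened

    def __init__(self):
        FakeClient.instances += 1
        self.sent = []

    def emit(self, event, payload):
        self.sent.append((event, payload))

_client = None

def get_client():
    global _client
    if _client is None:        # connect only on first use
        _client = FakeClient()
    return _client

def trigger_socket():
    get_client().emit('cool_task', {})
```

Calling trigger_socket() repeatedly then reuses a single connection. Note that with several worker processes each one still holds its own connection, and a real client would also need reconnect handling.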


Share a background process in django?

My django app talks to a SOAP service using the suds-jurko library
from suds.client import Client

try:
    URL = "http://192.168.12.11/xdwq/some_service.asmx?WSDL"
    client = Client(URL, timeout=30)
except:
    # Fallback mode
    pass

def get_data(ID):
    try:
        response = client.service.GetData(ID)
        data = response.diffgram.NewDataSet.master
        return data
    except:
        return None
In my views:
    data = get_data(ID)
The problem is that the service takes quite some time to initialize (~20 seconds). Subsequent requests take up to 3 seconds to return. Whenever the page is requested, the web server (Apache with mod_wsgi) takes quite a while to respond on some requests.
In my apache configuration:
    WSGIDaemonProcess www.example.com user=hyde group=hyde threads=15 maximum-requests=10000
How do I write my code so that Apache (or Django) can share a single background process for the SOAP service and minimize the initialization penalty?
I have been reading about celery and other such methods but am unsure how to proceed. Please advise.
You must create a separate background process, using pure Python or some third-party module (for example Celery, as mentioned), and communicate with that process from Django views (using Unix or TCP sockets, for example).
Alternatively, instead of mod_wsgi you can serve the Django application in a different way (gunicorn, uWSGI) so that it persists in memory, but this is a rather dirty solution and I don't recommend it.
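As a rough illustration of that first suggestion (a Python 3 sketch of my own; all names are hypothetical, and the expensive suds Client is faked with a dict so the example is self-contained), a background process can own the slow-to-initialize client and answer Django's queries over a line-based TCP protocol:

```python
# The background process owns the expensive client; Django views talk to
# it over TCP with one JSON request and one JSON reply per line. The
# slow suds Client is faked with a dict here.
import json
import socket
import socketserver
import threading

EXPENSIVE_CLIENT = {"42": "master data"}  # stands in for the slow SOAP client

class SoapProxyHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # one JSON object per line in, one per line out
        request = json.loads(self.rfile.readline())
        reply = {"data": EXPENSIVE_CLIENT.get(request["id"])}
        self.wfile.write((json.dumps(reply) + "\n").encode())

def get_data(host, port, ID):
    # What a Django view would call instead of hitting SOAP directly.
    with socket.create_connection((host, port)) as conn:
        conn.sendall((json.dumps({"id": ID}) + "\n").encode())
        return json.loads(conn.makefile().readline())["data"]

if __name__ == "__main__":
    server = socketserver.ThreadingTCPServer(("127.0.0.1", 0), SoapProxyHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    print(get_data("127.0.0.1", server.server_address[1], "42"))
    server.shutdown()
```

The 20-second initialization then happens once, in the background process, rather than in every Apache worker.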

Python gevent+bottle. Querying an API. How to use gevent to prevent timeout locks?

I'm using gevent + bottle for the following:
call API method on remote server
Process result from the API
return HTML
I've set a timeout for the API call (httplib/socket), but if it's set to 5 seconds (for example), my Python script is busy for that time and can't return any other pages (which is normal).
Question:
Can I somehow make a clever use of gevent (in a separate script, maybe?) to handle such long requests?
I was thinking of starting a separate API-interrogating script on localhost:8080 and putting it behind a load balancer (as "Internet" suggested), but I'm sure there must be a better way.
I am not an experienced programmer, so thank you for your help!
Actually, your problem should not exist. The gevent server backend can handle any number of requests at the same time. If one is blocked for 5 seconds, that does not affect the other requests arriving at the server. That's the point of the gevent server backend.
1) Are you sure that you use the gevent server backend properly? And not just a monkey-patched version of the wsgiref default server (which is single-threaded)?
2) Did you start the server via bottle.py --server gevent? If not, did you call gevent.monkey.patch_all() before importing all the other socket-related stuff (including bottle)?
Example:
from gevent import monkey
monkey.patch_all()  # must run before any other socket-related imports

import bottle
import urllib2

@bottle.route(...)
def callback():
    urllib2.urlopen(...)

bottle.run(server='gevent')

How to add authentication to a (Python) twisted xmlrpc server

I am trying to add authentication to an XML-RPC server (which will be running on nodes of a P2P network) without using user:password@host, as this would reveal the password to all attackers. The point of the authentication is basically to create a private network, preventing unauthorised users from accessing it.
My solution to this was to create a challenge response system very similar to this but I have no clue how to add this to the xmlrpc server code.
I found a similar question (Where custom authentication was needed) here.
So I tried creating a module that would be called whenever a client connected to the server. It would connect to a challenge-response server running on the client and, if the client responded correctly, return True. The only problem was that I could only call the module once; after that I got a "reactor cannot be restarted" error. So is there some way of having a class such that whenever its check() function is called, it will connect and do this?
Would the simplest thing be to connect using SSL? Would that protect the password? That solution would not be optimal, though, as I am trying to avoid having to generate SSL certificates for all the nodes.
Don't invent your own authentication scheme. There are plenty of great schemes already, and you don't want to become responsible for doing the security research into what vulnerabilities exist in your invention.
There are two very widely supported authentication mechanisms for HTTP (over which XML-RPC runs, therefore they apply to XML-RPC). One is "Basic" and the other is "Digest". "Basic" is fine if you decide to run over SSL. Digest is more appropriate if you really can't use SSL.
Both are supported by Twisted Web via twisted.web.guard.HTTPAuthSessionWrapper, with copious documentation.
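For context on why "Basic" needs SSL: the credentials travel as nothing more than base64-encoded user:password, so anyone on the wire can decode them. A small stdlib illustration (the helper names are mine, not Twisted's API):

```python
import base64

def make_basic_header(user, password):
    # What a client sends: "Authorization: Basic <base64(user:password)>"
    token = base64.b64encode("{0}:{1}".format(user, password).encode()).decode()
    return "Basic " + token

def check_basic_header(header, expected_user, expected_password):
    # What the server does to verify it. Note that base64 is an
    # encoding, not encryption - hence the need for SSL underneath.
    scheme, _, token = header.partition(" ")
    if scheme != "Basic":
        return False
    user, _, password = base64.b64decode(token).decode().partition(":")
    return (user, password) == (expected_user, expected_password)
```

Digest avoids sending the password itself by exchanging hashes of it with a server-supplied nonce, which is why it is the better fit when SSL is unavailable.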
Based on your problem description, it sounds like the Secure Remote Password Protocol might be what you're looking for. It's a password-based mechanism that provides strong, mutual authentication without the complexity of SSL certificate management. It may not be quite as flexible as SSL certificates but it's easy to use and understand (the full protocol description fits on a single page). I've often found it a useful tool for situations where a trusted third party (aka Kerberos/CA authorities) isn't appropriate.
For anyone who was looking for a full example, below is mine (thanks to Rakis for pointing me in the right direction). In it, the user and password are stored in a file called 'passwd' (see the first useful link for more details and how to change that).
Server:
#!/usr/bin/env python
import bjsonrpc
from SRPSocket import SRPSocket
import SocketServer
from bjsonrpc.handlers import BaseHandler
import time
class handler(BaseHandler):
    def time(self):
        return time.time()

class SecureServer(SRPSocket.SRPHost):
    def auth_socket(self, socket):
        server = bjsonrpc.server.Server(socket, handler_factory=handler)
        server.serve()
s = SocketServer.ForkingTCPServer(('', 1337), SecureServer)
s.serve_forever()
Client:
#! /usr/bin/env python
import bjsonrpc
from bjsonrpc.handlers import BaseHandler
from SRPSocket import SRPSocket
import time
class handler(BaseHandler):
    def time(self):
        return time.time()
socket, key = SRPSocket.SRPSocket('localhost', 1337, 'dht', 'testpass')
connection = bjsonrpc.connection.Connection(socket, handler_factory=handler)
test = connection.call.time()
print test
time.sleep(1)
Some useful links:
http://members.tripod.com/professor_tom/archives/srpsocket.html
http://packages.python.org/bjsonrpc/tutorial1/index.html

How do you use tornado.testing for creating WebSocket unit tests?

I'm working on a project that uses Tornado's websocket functionality. I see a decent amount of documentation for working with asynchronous code, but nothing on how it can be used to create unit tests that work with their WebSocket implementation.
Does tornado.testing provide the functionality to do this? If so, could someone provide a brief example of how to make it happen?
Thanks in advance.
As @Vladimir said, you can still use AsyncHTTPTestCase to create/manage the test webserver instance, and you can test WebSockets in much the same way as you would normal HTTP requests - there's just no syntactic sugar to help you.
Tornado also has its own WebSocket client so there's no need (as far as I've seen) to use a third party client - perhaps it's a recent addition though. So try something like:
import tornado.testing
import tornado.web
import tornado.websocket

class TestWebSockets(tornado.testing.AsyncHTTPTestCase):
    def get_app(self):
        # Required override for AsyncHTTPTestCase, sets up a dummy
        # webserver for this test.
        app = tornado.web.Application([
            (r'/path/to/websocket', MyWebSocketHandler)
        ])
        return app

    @tornado.testing.gen_test
    def test_websocket(self):
        # self.get_http_port() gives us the port of the running test server.
        ws_url = "ws://localhost:" + str(self.get_http_port()) + "/path/to/websocket"
        # We need ws_url so we can feed it into our WebSocket client.
        # ws_url will read (eg) "ws://localhost:56436/path/to/websocket"
        ws_client = yield tornado.websocket.websocket_connect(ws_url)
        # Now we can run a test on the WebSocket.
        ws_client.write_message("Hi, I'm sending a message to the server.")
        response = yield ws_client.read_message()
        self.assertEqual(response, "Hi client! This is a response from the server.")
        # ...etc
Hopefully that's a good starting point anyway.
I've attempted to implement some unit tests on tornado.websocket.WebSocketHandler-based handlers and got the following results:
First of all, AsyncHTTPTestCase definitely lacks web socket support.
Still, one can use it at least to manage the IOLoop and application stuff, which is significant. Unfortunately, there is no WebSocket client provided with Tornado, so here a side-developed library enters.
Here is a unit test on WebSockets using Jef Balog's tornado websocket client.
This answer (and the question) may be of interest; I use ws4py for the client and Tornado's AsyncTestCase, which simplifies the whole thing.

Python, Twisted, Django, reactor.run() causing problem

I have a Django web application. I also have a spell server written using Twisted, running on the same machine as Django (on localhost:8090). The idea is that when the user does some action, a request comes to Django, which in turn connects to this Twisted server, and the server sends data back to Django. Finally, Django puts this data in an HTML template and serves it back to the user.
Here's where I am having a problem. In my Django app, when the request comes in, I create a simple Twisted client to connect to the locally running Twisted server.
...
factory = Spell_Factory(query)
reactor.connectTCP(AS_SERVER_HOST, AS_SERVER_PORT, factory)
reactor.run(installSignalHandlers=0)
print factory.results
...
The reactor.run() is causing a problem: since it's an event loop, the next time this same code is executed by Django, I am unable to connect to the server. How does one handle this?
The above two answers are correct. However, considering that you've already implemented a spelling server, run it as one. You can start by running it on the same machine as a separate process, at localhost:PORT. It seems you already have a very simple binary protocol interface, so you can implement an equally simple Python client using the standard library's socket interface in blocking mode.
However, I suggest playing around with twisted.web and exposing a simple web interface. You can use JSON to serialize and deserialize data, which is well supported by Django. Here's a very quick example:
import json

from twisted.web import server, resource
from twisted.python import log

class Root(resource.Resource):
    def getChild(self, path, request):
        # represents / on your web interface
        return self

class WebInterface(resource.Resource):
    isLeaf = True

    def render_GET(self, request):
        log.msg('GOT a GET request.')
        # read request.args if you need to process query args
        # ... call some internal service and get output ...
        return json.dumps(output)

class SpellingSite(server.Site):
    def __init__(self, *args, **kwargs):
        self.root = Root()
        server.Site.__init__(self, self.root, **kwargs)
        self.root.putChild('spell', WebInterface())
And to run it you can use the following skeleton .tac file:
from twisted.application import service, internet
site = SpellingSite()
application = service.Application('WebSpell')
# attach the service to its parent application
service_collection = service.IServiceCollection(application)
internet.TCPServer(PORT, site).setServiceParent(service_collection)
Running your service as another first-class service allows you to run it on another machine one day if you find the need. Exposing a web interface also makes it easy to scale it horizontally behind a reverse-proxying load balancer.
reactor.run() should be called only once in your whole program. Don't think of it as "start this one request I have", think of it as "start all of Twisted".
Running the reactor in a background thread is one way to get around this; your Django application can then use blockingCallFromThread and call any Twisted API as it would a blocking API. You will need a little cooperation from your WSGI container, though, because you must make sure this background Twisted thread is started and stopped at the appropriate times (when your interpreter is initialized and torn down, respectively).
You could also use Twisted as your WSGI container, and then you don't need to start or stop anything special; blockingCallFromThread will just work immediately. See the command-line help for twistd web --wsgi.
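The shape of that background-thread arrangement can be sketched without Twisted at all: one long-lived loop thread owns the work (standing in for the reactor), and synchronous callers such as Django views hand it tasks and block for the result, which is what blockingCallFromThread does. All names below are my own, not Twisted's:

```python
# Generic sketch of "the reactor in a background thread": the loop
# thread is started once and owns all work; callers submit tasks and
# block until the result comes back.
import queue
import threading

class LoopThread:
    def __init__(self):
        self._tasks = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()            # started once, like reactor.run()

    def _run(self):
        while True:
            func, args, result_q = self._tasks.get()
            if func is None:            # shutdown sentinel
                break
            try:
                result_q.put((True, func(*args)))
            except Exception as exc:
                result_q.put((False, exc))

    def blocking_call(self, func, *args):
        # Analogue of blockingCallFromThread: hand work to the loop
        # thread and block until its result (or exception) comes back.
        result_q = queue.Queue()
        self._tasks.put((func, args, result_q))
        ok, value = result_q.get()
        if not ok:
            raise value
        return value

    def stop(self):
        self._tasks.put((None, None, None))
```

The real blockingCallFromThread additionally marshals the call onto the reactor's own event loop, but the caller-side contract is the same: submit, block, get the result or the raised exception.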
You should stop the reactor after you get results from the Twisted server or some error/timeout happens. So on each Django request that requires querying your Twisted server, you would have to run the reactor and then stop it. But that's not supported by the Twisted library - the reactor is not restartable. Possible solutions:
Use a separate thread for the Twisted reactor, but you will need to deploy your Django app with a server that supports long-running threads (I don't know any of these, but you can write your own easily :-)).
Don't use Twisted for implementing the client protocol; just use the plain stdlib socket module.
