I've created a remote objects monitoring application.
Application description: Twisted communicates with remote controllers and saves the measured data to a database via adbapi. Django is used as the web interface.
Problem: Django uses models for database access, while Twisted uses raw SQL queries and generates alerts and performs other logic before saving. I want to move all of this logic from Twisted to Django, so that Twisted becomes just a layer between the remote controllers and Django, and Django uses its models to save the measured data, perform the various operations, and so on.
Question: What is the best way to organize communication between Twisted and Django, i.e. a two-way communication bus between them? At the moment I see these options:
1. Perform an HTTP POST request via getPage from twisted.web.client and handle it with a Django view on the other side (a sketch follows below).
2. Use RabbitMQ with the pika module as the transport on the Twisted side, and a standalone Django app running as a daemon on the other side.
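To make option 1 concrete, the Twisted side would look roughly like the sketch below. The endpoint URL and payload fields are just illustrative assumptions, and note that getPage has since been deprecated in favour of twisted.web.client.Agent.

```python
# Sketch only: push one measurement to an assumed Django endpoint via HTTP POST.
import json

from twisted.python import log
from twisted.web.client import getPage

def push_measurement(controller_id, value):
    payload = json.dumps({"controller": controller_id, "value": value}).encode("utf-8")
    d = getPage(
        b"http://127.0.0.1:8000/api/measurements/",   # assumed Django view/URL
        method=b"POST",
        postdata=payload,
        headers={b"Content-Type": b"application/json"},
    )
    d.addErrback(log.err)  # log failures instead of breaking the reactor loop
    return d
```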
I believe there should be a better solution than what I've come up with.
P.S.:
Answer to my own question: for now I've used RabbitMQ as the transport. On the Twisted side I used the pika module via this adapter. On the Django side I created a standalone Django script that runs as a daemon and waits endlessly for new messages from RabbitMQ via pika.
Please let me know if there is a better solution.
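Roughly, the daemon has this shape. This is a sketch only: the queue name, settings module and Measurement model are illustrative, not my actual code.

```python
# Standalone Django script: block on a RabbitMQ queue via pika and save via the ORM.
import json
import os

import django
import pika

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
django.setup()  # configure Django so models can be imported and used below

from monitoring.models import Measurement  # assumed app/model

def on_message(channel, method, properties, body):
    data = json.loads(body)
    # All the "before save" logic now lives here, next to the ORM
    Measurement.objects.create(controller_id=data["controller"], value=data["value"])
    channel.basic_ack(delivery_tag=method.delivery_tag)

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="measurements", durable=True)
channel.basic_consume(queue="measurements", on_message_callback=on_message)
channel.start_consuming()  # blocks forever, waiting for new messages
```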
Related
Is there some way I could set up my architecture like this for my web application:
Backend - Django + DRF (REST API)
Frontend - React
And on the backend set up a WebSocket client to listen to an external WebSocket server, then forward the data from that client to a new server that I will create, so that in React I could listen to this WebSocket server I have created?
I tried implementing this in React by listening to the external WebSocket server directly, but it just gave me headaches trying to use a proxy to avoid CORS problems.
How should I approach this? Am I thinking straight here?
Hello Marcus C and welcome to StackOverflow!
Since you didn't post any code yourself I can't give you concrete examples, but I can point you in the right direction. As you said yourself, trying to use an external WebSockets server (such as Node.js with socket.io) is a pain. For this purpose the Django Channels library exists. It is really useful, as it gives you direct access to the database and other Django-related functionality from the asynchronous side.
If you run Django in a Docker container, the best way to use Channels is to run two separate containers: one with, say, gunicorn or uWSGI for the synchronous part, and another with Channels' recommended Daphne server for the asynchronous part, both proxied by nginx. The standard (or rather common) convention is to use a /ws path prefix for the asynchronous endpoints.
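To give you an idea of the Channels side, here is a minimal sketch; the consumer name, the "updates" group and the /ws/updates/ path are illustrative, not from your project.

```python
# consumers.py -- a WebSocket consumer the React frontend can connect to
import json

from channels.generic.websocket import AsyncWebsocketConsumer

class UpdatesConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        # Join a group so the backend can broadcast to every connected browser
        await self.channel_layer.group_add("updates", self.channel_name)
        await self.accept()

    async def disconnect(self, close_code):
        await self.channel_layer.group_discard("updates", self.channel_name)

    async def forward_update(self, event):
        # Invoked for messages sent to the group with {"type": "forward.update", ...}
        await self.send(text_data=json.dumps(event["payload"]))

# routing.py -- mounted under the /ws prefix mentioned above
from django.urls import re_path

websocket_urlpatterns = [
    re_path(r"^ws/updates/$", UpdatesConsumer.as_asgi()),
]
```

The part of your backend that listens to the external WebSocket server can then run as a separate async task or management command and push incoming data into the "updates" group through the channel layer's group_send.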
I have a Django application, and I would like to make ZeroMQ calls during views. I would like to initialize the context once and have it globally available.
My question is twofold:
How can I initialize a context at the start-up of Django and cause it to be globally shared?
Does the multi-processing of uwsgi/nginx cause n completely separate instances, or does it do a fork, causing me to require n separate contexts?
Use a process manager to run the application server (Django) and the ZeroMQ server as separate processes, with the ZeroMQ server living in a custom manage.py command. That will solve your situation cleanly.
NOTE: From a custom management command that runs the ZeroMQ server (as a subscriber, or any other persistent daemon-style ZeroMQ device profile), you can access Django models and APIs directly.
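A rough sketch of such a command; the module path, endpoint and Measurement model are illustrative assumptions.

```python
# myapp/management/commands/zmq_subscriber.py
import zmq
from django.core.management.base import BaseCommand

from myapp.models import Measurement  # assumed model

class Command(BaseCommand):
    help = "Run a persistent ZeroMQ subscriber that stores incoming messages"

    def handle(self, *args, **options):
        ctx = zmq.Context()                         # one context for this process
        sock = ctx.socket(zmq.SUB)
        sock.connect("tcp://127.0.0.1:5556")        # assumed publisher endpoint
        sock.setsockopt_string(zmq.SUBSCRIBE, "")   # subscribe to all topics
        self.stdout.write("ZeroMQ subscriber running")
        while True:
            msg = sock.recv_json()                  # blocks until a message arrives
            # The ORM is fully available here, since this runs inside Django
            Measurement.objects.create(source=msg["source"], value=msg["value"])
```

Run it with `python manage.py zmq_subscriber` under supervisord, systemd or another process manager, alongside the uWSGI workers that serve your views.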
I have a Django project running on a gunicorn server behind an nginx proxy.
I would like to add real-time chat functionality to this project, and for this purpose I'm considering the Tornado WebSocket API.
But the problem is that chat messages need to be authenticated against the Django user, as with request.user.is_authenticated(). Similarly, I need to use two Django models for handling chat in Tornado. I'm unable to figure out how to handle this situation while keeping the gunicorn and Tornado servers separate.
I've considered a few options:
1. Creating a dedicated API on the gunicorn server that listens for special messages from the Tornado server. This can be done, but it puts extra overhead on the gunicorn server.
2. Session authentication can be done via the gunicorn server as an API, and the rest of the database handling can be done in Tornado itself.
Are there better ways to handle this, without creating an extra API, that integrate the two seamlessly?
It's possible to run your Django app on top of Tornado instead of gunicorn (see https://github.com/bdarnell/django-tornado-demo). This isn't the best option for performance, but it will let the Django and Tornado apps interact without building out a separate API.
Personally I recommend running Django and Tornado in the same process only as a stepping stone towards a transition to 100% Tornado. If you intend to keep the Django server, then it is probably best to just build an authentication API that you can call from Tornado.
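If you go the API route, the Tornado side could look roughly like the sketch below. The /api/whoami/ endpoint and the cookie forwarding are assumptions for illustration (the real check depends on how you expose Django's session auth), and recent Tornado versions allow open() to be a coroutine.

```python
# A Tornado WebSocket handler that authenticates a client by forwarding the
# Django session cookie to an assumed internal endpoint served by gunicorn.
import json

import tornado.websocket
from tornado.httpclient import AsyncHTTPClient

class ChatHandler(tornado.websocket.WebSocketHandler):
    async def open(self):
        session_id = self.get_cookie("sessionid")
        if not session_id:
            self.close(code=4001, reason="not authenticated")
            return
        resp = await AsyncHTTPClient().fetch(
            "http://127.0.0.1:8000/api/whoami/",          # assumed Django endpoint
            headers={"Cookie": "sessionid=" + session_id},
            raise_error=False,
        )
        if resp.code != 200:
            self.close(code=4001, reason="not authenticated")
            return
        self.username = json.loads(resp.body)["username"]

    def on_message(self, message):
        # Echo for illustration; a real chat server would broadcast and persist it
        self.write_message("%s: %s" % (self.username, message))
```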
I'm currently researching WebSocket support in Python and am a bit confused by the offerings.
On one hand it's possible to use Flask + gevent. On the other hand, uWSGI has WebSocket support, and finally there is an extension that bundles both uWSGI and gevent.
What's the problem with implementing websockets with only one of these? What do I win by mixing them?
Changing the question
What does adding gevent do that threaded uwsgi won't?
With regular HTTP requests the connection between client and server is short-lived: the client connects to the server, sends a request, receives the response and then closes the connection. In this model the server can serve a large number of clients using a small number of workers. The concurrency model in this situation is typically based on threads, processes, or a combination of both.
When you use WebSocket the problem is more complex, because a WebSocket connection stays open for a long period of time, so the server cannot use a small pool of workers to serve a large number of clients; each client needs its own dedicated worker. If you use threads and/or processes, your app will not scale to support a large number of clients, because you can't have that many threads/processes.
This is where gevent enters the picture. Gevent has a concurrency model based on greenlets, which scale much better than threads or processes. So serving WebSocket connections with a gevent-based server lets you support many more clients, thanks to the lightweight nature of greenlets. With uWSGI you have a choice of concurrency models to use with WebSockets, and that includes the greenlet-based model from gevent. You can also use gevent's web server standalone if you want.
But note that gevent does not know anything about WebSockets; it is just a server. To handle WebSocket connections you have to add an implementation of the WebSocket server on top of it.
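For example, gevent-websocket plugs such an implementation into gevent's WSGI server. A bare-bones echo app, purely as an illustration, looks roughly like this:

```python
# A standalone gevent WSGI server with the gevent-websocket handler (echo only).
from gevent import pywsgi
from geventwebsocket.handler import WebSocketHandler

def app(environ, start_response):
    ws = environ.get("wsgi.websocket")
    if ws is None:
        start_response("400 Bad Request", [("Content-Type", "text/plain")])
        return [b"expected a WebSocket connection"]
    while True:
        message = ws.receive()
        if message is None:   # client closed the connection
            break
        ws.send(message)      # echo the message back
    return []

server = pywsgi.WSGIServer(("0.0.0.0", 8000), app, handler_class=WebSocketHandler)
server.serve_forever()
```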
There are two extensions for Flask that simplify the use of WebSockets. The Flask-Sockets extension by Kenneth Reitz is a wrapper for gevent and gevent-websocket. The Flask-SocketIO extension (shameless plug, as I'm the author) is a wrapper for gevent and gevent-socketio on the server, plus Socket.IO on the client. Socket.IO is a higher-level socket protocol that can use WebSocket when available but can also fall back to other transport mechanisms on older browsers.
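To give you an idea, a minimal Flask-SocketIO app looks roughly like this; the event names and port are just illustrative.

```python
# A tiny Flask-SocketIO app; install gevent and gevent-websocket to run it on gevent.
from flask import Flask
from flask_socketio import SocketIO, emit

app = Flask(__name__)
socketio = SocketIO(app)

@socketio.on("my_event")
def handle_my_event(data):
    # Reply only to the client that sent the event
    emit("my_response", {"echo": data})

if __name__ == "__main__":
    socketio.run(app, host="0.0.0.0", port=5000)
```

On the client, the standard Socket.IO JavaScript library connects to the same host and port and emits "my_event".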
I have been working with Django for some time now and have written several apps on a setup that uses Apache 2 with mod_wsgi and a PostgreSQL database on Ubuntu.
I have an app that uses X-Sendfile to serve files from Apache via a Django view, and that also allows users to upload files via a form. All this is working great, but I now want to ramp up the features (and, I'm sure, the complexity) by allowing users to chat and to see when new files have been uploaded without refreshing their browser.
As I want this to be scalable, I don't want to poll continually with AJAX, as this is going to get very heavy with large numbers of users.
I have read more posts, sites and blogs than I can count on integrating comet functionality into a Django app, but there are so many different opinions out there on how to do this that I am now completely confused.
Should I be using Orbited, gevent, Socket.IO?
Where does Tornado fit into this debate?
I also want the messages to be stored in the database, so do I need any special configuration to prevent my application from blocking when writing to the database?
Will running a chat server with Django have any impact on my ability to serve files from Apache?
I'd recommend using WebSockets for bidirectional realtime communication. Keep running Django as is and run a WebSocket server on another port. As for your database blocking: yes, you'll need to keep that in mind as you write your WebSocket server, and either use a non-blocking database driver or address it in some other way.
Client-side you'll want to use Socket.IO or web-socket-js to get a Flash fallback for older browsers which don't support WebSockets natively.
For the server, I would lean towards gevent or Tornado, personally. For gevent there are gevent-websocket and gevent-socketio; for Tornado you get built-in WebSocket support and can use TornadIO if you want to use Socket.IO. Eventlet and Twisted both support WebSockets as well. There is also a pretty cool new project called Autobahn which is built on Twisted, and Meinheld has WebSocket middleware you can use.
WebSockets are pretty exciting, and as such there are tons of great posts out there on the subject. I found these posts useful:
http://gehrcke.de/2011/06/the-best-and-simplest-tools-to-create-a-basic-websocket-application-with-flash-fallback-and-python-on-the-server-side/
http://codysoyland.com/2011/feb/6/evented-django-part-one-socketio-and-gevent/
http://toastdriven.com/blog/2011/jul/31/gevent-long-polling-you/
http://blog.jupo.org/post/8858247674/real-time-web-apps-with-django-and-websockets/
Instead of Apache + X-Sendfile you could use Nginx + X-Accel-Redirect. That way you can run a gevent/wsgi/django server behind Nginx with views that provide long-polling. No need for a separate websockets server.
I've used both Apache + X-Sendfile and Nginx + X-Accel-Redirect to serve (access-protected) content on Webfaction without any problems.
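For completeness, a protected download view using X-Accel-Redirect is roughly the sketch below; the /protected/ internal nginx location and the Document model are illustrative assumptions.

```python
# views.py -- Django authorizes the request, nginx does the actual file transfer
from django.contrib.auth.decorators import login_required
from django.http import HttpResponse
from django.shortcuts import get_object_or_404

from myapp.models import Document  # assumed model with a `filename` field

@login_required
def download(request, pk):
    doc = get_object_or_404(Document, pk=pk)
    response = HttpResponse()
    # nginx intercepts this header and serves the file from its internal location
    response["X-Accel-Redirect"] = "/protected/{}".format(doc.filename)
    response["Content-Disposition"] = 'attachment; filename="{}"'.format(doc.filename)
    return response
```

On the nginx side this pairs with an `internal` location block that maps /protected/ to the directory holding the files on disk.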