What should I use for a medium to large Python WSGI application: Apache + mod_wsgi or Nginx + mod_wsgi?
Which combination will need more memory and CPU time?
Which one is faster?
Which is known for being more stable than the other?
I am also thinking of using CherryPy's WSGI server, but I hear it's not well suited to very high-load applications. What do you know about this?
Note: I didn't use any Python web framework; I wrote the whole thing from scratch.
Note 2: Other suggestions are also welcome.
For nginx/mod_wsgi, ensure you read:
http://blog.dscpl.com.au/2009/05/blocking-requests-and-nginx-version-of.html
Because nginx is an event-driven system underneath, it has behavioural characteristics which are detrimental to blocking applications such as WSGI-based applications. The worst-case scenario is that with a multiprocess nginx configuration, you can see user requests blocked even though some nginx worker processes may be idle. Apache/mod_wsgi doesn't have this issue, as Apache processes will only accept requests when they have the resources to actually handle them. Apache/mod_wsgi will thus give more predictable and reliable behaviour.
The author of nginx mod_wsgi explains some differences from Apache mod_wsgi in this mailing list message.
The main difference is that nginx is built to handle large numbers of connections in a much smaller memory space. This makes it very well suited to apps that use comet-like connections, which can have many idle open connections, and it also gives nginx a considerably smaller memory footprint.
From a raw performance perspective, nginx is faster, but not so much faster that I would include that as a determining factor.
Apache has the advantage in the area of modules available, and the fact that it is pretty much standard. Any web host you go with will have it installed, and most techs are going to be very familiar with it.
Also, if you use mod_wsgi, it is your WSGI server, so you don't even need CherryPy.
Other than that, the best advice I can give is try setting up your app under both and do some benchmarking, since no matter what any one tells you, your mileage may vary.
One thing that CherryPy's webserver has going for it is that it's a pure python webserver (AFAIK), which may or may not make deployment easier for you. Plus, I could see the benefits of using it if you're just using a server for WSGI and static content.
(shameless plug warning: I wrote the WSGI code that I'm about to mention)
Kamaelia will have WSGI support coming in the next release. The cool thing is that you'll likely be able to either use the pre-made one or build your own using the existing HTTP and WSGI code.
(end shameless plug)
With that said, given the current options, I would personally probably go with CherryPy, because it seems the simplest to configure and I can understand Python code far more readily than I can understand C code.
You may do best to try each of them out and see what the pros and cons of each one are for your specific application though.
I am new to the Python web dev world and kind of confused about why we need an Apache environment when we could run a Python web app with its built-in HTTP server. Also, from my experience, I could run a Django app without setting up anything else, so why do we still need Apache + mod_wsgi? For performance?
Actually, what really confuses me is how my entry-point code should be written. E.g. I've heard there are other advanced 'web servers' as well, like CherryPy and Tornado, and every single one of them requires different entry-point code. So I wonder whether Apache (+ mod_wsgi) overlaps with those other web frameworks (which I called web servers above)? Should we (in most cases) be using Apache in production and the others as an 'add-on'? Thanks.
Performance, stability, scalability, security, ...
The built-in HTTP server is useful for simple testing or quickly running a web app on your development machine, but it is in no way as scalable as Apache. Security will also likely be less hardened on the built-in one.
Also, Apache allows you to handle many extra things, such as vhosts, multiple kinds of server-side platforms (for instance, a Ruby on Rails app and a Django one on the same port/IP), which are harder to achieve with the built-in server.
Apache is way better than Python's SimpleHTTPServer.
For one thing, SimpleHTTPServer is single-threaded, while Apache can easily handle multiple threads. Apache can also be configured in many ways that SimpleHTTPServer cannot, and it has easy-to-use request logging, which is helpful for debugging.
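To illustrate the single-threaded point: even staying pure-Python, you have to do a little extra work to get threading. A minimal sketch using the standard library's wsgiref plus SocketServer.ThreadingMixIn (Python 2 module names assumed):

import SocketServer
from wsgiref.simple_server import make_server, WSGIServer

# mix in ThreadingMixIn so each request is handled in its own thread
class ThreadingWSGIServer(SocketServer.ThreadingMixIn, WSGIServer):
    pass

def app(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return ['hello\n']

httpd = make_server('', 8000, app, server_class=ThreadingWSGIServer)
httpd.serve_forever()

Apache gives you threading, process management and logging out of the box.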
These days the standard entry point is a WSGI application object. Pretty well everything supports it. How each web framework exposes one, and how you set up each WSGI hosting mechanism to then use it is different. At the core though, the actual interface between web server and application is the same.
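Concretely, a WSGI application object is just a callable of this shape (a minimal sketch):

def application(environ, start_response):
    # environ is a dict of CGI-like request variables;
    # start_response takes a status string and a list of header tuples
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return ['Hello, WSGI!\n']

Apache/mod_wsgi, CherryPy, Paste, flup and the frameworks' built-in servers all ultimately call something of exactly this shape; only the way you point each server at it differs.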
Performance and scalability would be the reasons to go with Apache in production. SimpleHTTPServer is fine for testing and internal use though.
Yes, in general you need those for performance.
If you want to avoid the complexity of setting up Apache until you really have to (which could be reasonable if you're short on time and/or lack experience), you will probably be better off using CherryPy to serve Django. It has an all-Python web server with much better performance than the built-in one.
You can find instructions for that here.
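For reference, a minimal sketch of what that looks like, assuming CherryPy 3.x's bundled WSGI server and a hypothetical Django project named mysite:

import os
os.environ['DJANGO_SETTINGS_MODULE'] = 'mysite.settings'  # hypothetical project

from django.core.handlers.wsgi import WSGIHandler
from cherrypy import wsgiserver

# serve Django's WSGI handler on port 8080 with a pool of 10 worker threads
server = wsgiserver.CherryPyWSGIServer(('0.0.0.0', 8080), WSGIHandler(), numthreads=10)
try:
    server.start()
except KeyboardInterrupt:
    server.stop()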
I think in the past python scripts would run off CGI, which would create a new thread for each process.
I am a newbie so I'm not really sure, what options do we have?
Is the web server pipeline that Python works under any more or less efficient than, say, PHP's?
You can still use CGI if you want, but the normal approach these days is using WSGI on the Python side, e.g. through mod_wsgi on Apache or via bridges to FastCGI on other web servers. At least with mod_wsgi, I know of no inefficiencies with this approach.
BTW, your description of CGI ("create a new thread for each process") is inaccurate: what it does is create a new process for each query's service (and that process typically needs to open a database connection, import all needed modules, etc etc, which is what may make it slow even on platforms where forking a process, per se, is pretty fast, such as all Unix variants).
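For the FastCGI route, the flup bridge amounts to very little code. A sketch, assuming the front-end web server is configured to speak FastCGI to this process on port 8080:

from flup.server.fcgi import WSGIServer

def application(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return ['Hello from FastCGI!\n']

# translates FastCGI requests from the web server into WSGI calls,
# so the long-lived Python process avoids CGI's per-request startup cost
WSGIServer(application, bindAddress=('127.0.0.1', 8080)).run()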
I would suggest Django (http://www.djangoproject.com). It is very convenient to use and has everything you need for making web services. The most efficient way to use it is to run it via Apache's mod_wsgi, and make Apache itself serve the static files.
I suggest CherryPy (http://www.cherrypy.org/). It is very convenient to use, has everything you need for making web services, and is still quite simple (no mega-framework). The most efficient way to use it is to run it as a self-contained server on localhost, put it behind Apache via a Proxy statement, and make Apache itself serve the static files.
This generally has better performance than solutions such as CGI and mod-python, as the Python process running the web service runs separate from the main web server, so it can cache stuff and easily re-use resources (like DB handles).
Also, you can then tweak the number of worker threads for Apache and your web application separately, resulting in better scalability.
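The "Proxy statement" part is only a few lines of Apache config along these lines (a sketch; the paths and port are hypothetical, and mod_proxy must be enabled):

# serve static files directly, proxy everything else to CherryPy on localhost
Alias /static /srv/myapp/static
ProxyPass /static !
ProxyPass / http://127.0.0.1:8080/
ProxyPassReverse / http://127.0.0.1:8080/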
I am currently running a high-traffic Python/Django website using Apache and mod_wsgi. I'm hoping that there's a faster webserver configuration out there, and I've heard a fair number of recommendations for lighttpd and FastCGI. Is this setup faster than Apache + mod_wsgi for serving dynamic Django pages (I'm already convinced that lighttpd can serve static files better)? The benchmarks online are either poorly conducted or inconclusive, so I'm looking for some personal anecdotes. What architectural benefits does lighttpd + FastCGI provide? I understand that lighttpd uses epoll, and that a FastCGI process will be multithreaded. Also, I assume that having two separate processes, one for lighttpd and one for the Python interpreter, will be largely beneficial.
I am aware of tornado and its ability to handle thousands of file descriptors with much fewer threads using epoll and callbacks. However, I'd prefer to stick with django for now.
Thanks,
Ken
I suggest nginx with superfcgi for web sites with high load. nginx is very fast for static files. superfcgi uses multiple processes with multiple threads, which gives high stability for Python applications in spite of the GIL; just set the number of processes to the number of CPU cores on your server.
I don't have thorough benchmarks, but I'm personally convinced that, just like lighttpd can outperform apache on simpler tasks, mod_wsgi gives apache the laurel when it comes to serving Python web apps. (nginx with its own mod_wsgi seems to perform even better than apache, but, hey, you didn't ask about that!-).
This doesn't answer your question, but do you already use caching for your site, like memcached? That might give you a better performance gain than going through the mess of switching webservers.
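If you haven't, the pattern is cheap to try. A sketch using the python-memcached client (the key scheme, render function and timeout are hypothetical):

import memcache

mc = memcache.Client(['127.0.0.1:11211'])

def get_page(key):
    html = mc.get(key)               # try the cache first
    if html is None:
        html = render_page(key)      # hypothetical expensive render
        mc.set(key, html, time=300)  # cache the result for five minutes
    return html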
You can try fcgid (https://github.com/chenyf/fcgid); it's a C++ FastCGI server.
I am deploying a WSGI application. There are many ways to skin this cat. I am currently using apache2 with mod-wsgi, but I can see some potential problems with this.
So how can it be done?
1. Apache mod_wsgi (the other mod_wsgi implementations seem not to be worth it)
2. A pure Python web server, e.g. Paste, CherryPy, Spawning, Twisted.web
3. As 2, but with a reverse proxy (nginx, Apache 2, etc.) in front, with good static file handling
4. Conversion to another protocol, such as FastCGI, with a bridge (e.g. Flup), running in a conventional web server
More?
I want to know how you do it, and why it is the best way to do it. I would absolutely love you to bore me with details about the whats and the whys, application specific stuff, etc.
As always: It depends ;-)
When I don't need any Apache features, I go with a pure Python webserver like Paste etc. Which one exactly depends on your application, I guess, and can be decided by doing some benchmarks. I always wanted to do some but never got around to it. I guess Spawning might have some advantages in using non-blocking IO out of the box, but I sometimes had problems with it because of the patching it does.
You are always free to put Varnish in front as well, of course.
If Apache is required, I usually go with solution 3, so that I can keep the processes separate. You can also more easily move processes to other servers, etc. I simply like to keep things separate.
For static files, I am right now using a separate server for a project which just serves static images/CSS/JS. I am using lighttpd as the webserver, which has great performance (in this case I no longer have Varnish in front).
Another useful tool is supervisord for controlling and monitoring these services.
I am additionally using buildout for managing my deployments and development sandboxes (together with virtualenv).
I would absolutely love you to bore me with details about the whats and the whys, application specific stuff, etc
Ho. Well you asked for it!
Like Daniel I personally use Apache with mod_wsgi. It is still new enough that deploying it in some environments can be a struggle, but if you're compiling everything yourself anyway it's pretty easy. I've found it very reliable, even the early versions. Props to Graham Dumpleton for keeping control of it pretty much by himself.
However for me it's essential that WSGI applications work across all possible servers. There is a bit of a hole at the moment in this area: you have the WSGI standard telling you what a WSGI callable (application) does, but there's no standardisation of deployment; no single way to tell the web server where to find the application. There's also no standardised way to make the server reload the application when you've updated it.
The approach I've adopted is to put:
all application logic in modules/packages, preferably in classes
all website-specific customisations to be done by subclassing the main Application and overriding members
all server-specific deployment settings (eg. database connection factory, mail relay settings) as class __init__() parameters
one top-level ‘application.py’ script that initialises the Application class with the correct deployment settings for the current server, then runs the application in such a way that it can work deployed as a CGI script, a mod_wsgi WSGIScriptAlias (or Passenger, which apparently works the same way), or can be interacted with from the command line
a helper module that takes care of above deployment issues and allows the application to be reloaded when the modules the application is relying on change
So what the application.py looks like in the end is something like:
#!/usr/bin/env python
import os.path
basedir= os.path.dirname(__file__)

import MySQLdb
def dbfactory():
    return MySQLdb.connect(db= 'myappdb', unix_socket= '/var/mysql/socket', user= 'u', passwd= 'p')

def appfactory():
    import myapplication
    return myapplication.Application(basedir, dbfactory, debug= False)

import wsgiwrap
ismain= __name__=='__main__'
libdir= os.path.join(basedir, 'system', 'lib')
application= wsgiwrap.Wrapper(appfactory, libdir, 10, ismain)
The wsgiwrap.Wrapper checks every 10 seconds to see if any of the application modules in libdir have been updated, and if so does some nasty sys.modules magic to unload them all reliably. Then appfactory() will be called again to get a new instance of the updated application.
(You can also use command line tools such as
./application.py setup
./application.py daemon
to run any setup and background-tasks hooks provided by the application callable — a bit like how distutils works. It also responds to start/stop/restart like an init script.)
Another trick I use is to put the deployment settings for multiple servers (development/testing/production) in the same application.py script, and sniff ‘socket.gethostname()’ to decide which server-specific bunch of settings to use.
At some point I might package wsgiwrap up and release it properly (possibly under a different name). In the meantime if you're interested, you can see a dogfood-development version at http://www.doxdesk.com/file/software/py/v/wsgiwrap-0.5.py.
The absolute easiest thing to deploy is CherryPy. Your web application can also become a standalone webserver. CherryPy is also a fairly fast server considering that it's written in pure Python. With that said, it's not Apache. Thus, I find that CherryPy is a good choice for lower volume webapps.
Other than that, I don't think there's any right or wrong answer to this question. Lots of high-volume websites have been built on the technologies you mention, and I don't think you can go too wrong with any of those approaches (although I will say that I agree that mod_wsgi is not up to snuff on every non-Apache server).
Also, I've been using isapi_wsgi to deploy python apps under IIS. It's a less than ideal setup, but it works and you don't always get to choose otherwise when you live in a windows-centric world.
Nginx for reverse proxying and static file serving + XSendfile + uploadprogress_module. Nothing beats it for the purpose.
On the WSGI side, either Apache + mod_wsgi or the CherryPy server. I like to use the CherryPy WSGI server for applications on servers with less memory and fewer requests.
Reasoning:
I've done benchmarks with different tools for different popular solutions.
I have more experience with lower level TCP/IP than web development, especially http implementations. I'm more confident that I can recognize a good http server than I can recognize a good web framework.
I know Twisted much better than Django or Pylons. The HTTP stack in Twisted is still not up to this, but it will get there.
I'm using Google App Engine for an application I'm developing. It runs WSGI applications.
Here are a couple of bits of info on it.
This is the first web-app I've ever really worked on, so I don't have a basis for comparison, but if you're a Google fan, you might want to look into it. I've had a lot of fun using it as my framework for learning.
TurboGears (2.0)
TurboGears 2.0 is leaving beta within the next month (it has been in beta for plenty of time). 2.0 improves upon the 1.0 series and attempts to give you a best-of-breed WSGI stack, so it makes some default choices for you if you want the least fuss.
The 1.x series had the tg* tools for testing and deployment; these have now been transformed into paster equivalents in the 2.0 series, which should seem familiar if you've experimented with Pylons.
tg-admin quickstart -> paster quickstart
tg-admin info -> paster tginfo
tg-admin toolbox -> paster toolbox
tg-admin shell -> paster shell
tg-admin sql create -> paster setup-app development.ini
Pylons
If you'd like more flexibility in your WSGI stack (choice of ORM, choice of templating engine, choice of form handling), Pylons is becoming the consolidated choice. It would be my recommendation, since it offers excellent documentation and allows you to experiment with different components.
It is a pleasure to work with as a result, and it runs under Apache (for production deployment) or stand-alone (helpful for the testing and experimenting stage).
So it follows that you can do both with Pylons:
option 2 for the testing stage (stand-alone Python server)
option 4 for scalable production purposes (FastCGI, assuming the database you choose can keep up)
The Pylons admin interface is very similar to TurboGears. Here's a toy standalone example:
$ paster create -t pylons helloworld
$ cd helloworld
$ paster serve --reload development.ini
For production-class deployment, a setup guide for Apache + FastCGI + mod_rewrite is available here; this would scale up to most needs.
Apache httpd + mod_fcgid using web.py (which is a wsgi application).
Works like a charm.
We are using pure Paste for some of our web services. It is easy to deploy (with our internal deployment mechanism; we're not using Paste Deploy or anything like that) and it is nice to minimize the difference between production systems and what's running on developers' workstations. Caveat: we don't expect low latency out of Paste itself because of the heavyweight nature of our requests. In some crude benchmarking we did we weren't getting fantastic results; it just ended up being moot due to the expense of our typical request handler. So far it has worked fine.
Static data has been handled by completely separate (and somewhat "organically" grown) stacks, including the use of S3, Akamai, Apache and IIS, in various ways.
Apache + mod_wsgi.
Simple and clean (only four lines of webserver config), and easy for other sysadmins to get their heads around.
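For the record, those few lines look roughly like this. A sketch with hypothetical paths, using daemon mode; app.wsgi must define a module-level 'application' callable:

# run the app in its own daemon processes, separate from Apache's workers
WSGIDaemonProcess myapp processes=2 threads=15
WSGIProcessGroup myapp
WSGIScriptAlias / /srv/myapp/app.wsgi
<Directory /srv/myapp>
    Order allow,deny
    Allow from all
</Directory>

(Apache 2.2-era access-control syntax assumed.)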
Good morning.
As the title indicates, I've got some questions about using python for web development.
What is the best setup for a development environment? More specifically, what webserver should I use, and how do I bind Python to it? Preferably, I'd like it to work in both *nix and Windows environments.
My major concern when I last tried Apache + mod_python + CherryPy was having to reload the webserver to see the changes. Is that considered normal? For some reason CherryPy's autoreload didn't work at all.
What is the best setup to deploy a working Python app to production, and why? I'm now using lighttpd for my PHP web apps, but how would it do for Python, compared to nginx for example?
Is it worth diving straight in with a framework, or rolling something simple of my own? I see that Django has quite a lot of fans, but I'm thinking it would be overkill for my needs, so I've started looking into CherryPy.
How exactly are Python apps served if I have to reload httpd to see the changes? Something like a permanent process spawning child processes, with all the major file includes happening on server start and then just lazy-loading needed resources?
Python supports multithreading; do I need to look into using that for a benefit when developing web apps? What would that benefit be, and in what situations?
Big thanks!
What is the best setup for a development environment?
Doesn't much matter. We use Django, which runs in Windows and Unix nicely. For production, we use Apache in Red Hat.
Is having to reload webserver to see the changes considered normal?
Yes. Not clear why you'd want anything different. Web application software shouldn't be dynamic. Content yes. Software no.
In Django, we develop without using a web server of any kind on our desktop. The Django "runserver" command reloads the application under most circumstances. For development, this works great. The times when it won't reload are when we've damaged things so badly that the app doesn't run properly.
What is the best setup to deploy a working Python app to production and why?
"Best" is undefined in this context. Therefore, please provide some qualification for "nest" (e.g., "fastest", "cheapest", "bluest")
Is it worth diving straight with a framework or to roll something simple of my own?
Don't waste time rolling your own. We use Django because of the built-in admin page that we don't have to write or maintain. Saves mountains of work.
How exactly are Python apps served if I have to reload httpd to see the changes?
Two methods:
Daemon - mod_wsgi or mod_fastcgi have a Python daemon process to which they connect. Change your software. Restart the daemon.
Embedded - mod_wsgi or mod_python have an embedded mode in which the Python interpreter is inside the mod, inside Apache. You have to restart httpd to restart that embedded interpreter.
Do I need to look into using multi-threaded?
Yes and no. Yes you do need to be aware of this. No, you don't need to do very much. Apache and mod_wsgi and Django should handle this for you.
So here are my thoughts about it:
I am using Python Paste for developing my app and eventually also running it (or any other python web server). I am usually not using mod_python or mod_wsgi as it makes development setup more complex.
I am using zc.buildout for managing my development environment and all dependencies together with virtualenv. This gives me an isolated sandbox which does not interfere with any Python modules installed system wide.
For deployment I am also using buildout/virtualenv, eventually with a different buildout.cfg. I am also using Paste Deploy and its configuration mechanism, where I have different config files for development and deployment.
As I am usually running Paste/CherryPy etc. standalone, I put Apache, nginx, or maybe just Varnish alone in front of it. It depends on what configuration options you need. E.g. if no virtual hosting, rewrite rules etc. are needed, then I don't need a full-featured web server in front. When using a web server, I usually use ProxyPass or some more complex rewriting using mod_rewrite.
The Python web framework I use at the moment is repoze.bfg, btw.
As for your questions about reloading: I know about these problems when running with e.g. mod_python, but when using a standalone "paster serve ... --reload" etc., it has so far worked really well. repoze.bfg additionally has a setting for automatically reloading templates when they change. Whether the framework you use has that should be documented.
As for multithreading, that's usually handled inside the Python web server. As CherryPy supports this, I guess you don't have to worry about it; it should be used automatically. You should just eventually run some benchmarks to find out with what number of threads your application performs best.
Hope that helps.
+1 to MrTopf's answer, but I'll add some additional opinions.
Webserver
Apache is the webserver that will give you the most configurability. Avoid mod_python because it is basically unsupported. On the other hand, mod_wsgi is very well supported and gives you better stability (in other words, easier to configure for cpu/memory usage to be stable as opposed to spikey and unpredictable).
Another huge benefit, you can configure mod_wsgi to reload your application if the wsgi application script is touched, no need to restart Apache. For development/testing servers you can even configure mod_wsgi to reload when any file in your application is changed. This is so helpful I even run Apache+mod_wsgi on my laptop during development.
Nginx and lighttpd are commonly used webservers, either serving Python apps directly through a FastCGI interface (don't bother with any WSGI interfaces on these servers yet) or acting as a front end to Apache. Calls into the app are passed through (by proxy) to Apache + mod_wsgi, while nginx/lighttpd serve the static content directly.
Nginx has the added advantage of being able to serve content directly from memcached if you want to get that sophisticated. I've heard disparaging comments about lighttpd and it does seem to have some development problems, but there are certainly some big companies using it successfully.
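A typical front-end arrangement of that kind, as an nginx config sketch (hypothetical paths and port; Apache + mod_wsgi is assumed to be listening on 8080):

server {
    listen 80;
    # serve static content directly from disk
    location /static/ {
        root /srv/myapp;
    }
    # hand everything else to Apache + mod_wsgi
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}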
Python stack
At the lowest level you can program to WSGI directly for the best performance. There are lots of helpful WSGI modules out there to help you in areas you don't want to develop yourself. At this level you'll probably want to pick third-party WSGI components to do things like URL resolving and HTTP request/response handling. A great request/response component is WebOb.
If you look at Pylons you can see their idea of "best-of-breed" WSGI components and a framework that makes it easier than Django to choose your own components like templating engine.
Django might be overkill but I don't think that's a really good argument against. Django makes the easy stuff easier. When you start to get into very complicated applications is where you really need to look at moving to lower level frameworks.
Look at Google App Engine. From their website:
Google App Engine lets you run your web applications on Google's infrastructure. App Engine applications are easy to build, easy to maintain, and easy to scale as your traffic and data storage needs grow. With App Engine, there are no servers to maintain: You just upload your application, and it's ready to serve your users.
You can serve your app using a free domain name on the appspot.com domain, or use Google Apps to serve it from your own domain. You can share your application with the world, or limit access to members of your organization.
App Engine costs nothing to get started. Sign up for a free account, and you can develop and publish your application for the world to see, at no charge and with no obligation. A free account can use up to 500MB of persistent storage and enough CPU and bandwidth for about 5 million page views a month.
Best part of all: It includes Python support, including Django. Go to http://code.google.com/appengine/docs/whatisgoogleappengine.html
When you use mod_python on a threaded Apache server (the default on Windows), CherryPy runs in the same process as Apache. In that case, you almost certainly don't want CP to restart the process.
Solution: use mod_rewrite or mod_proxy so that CherryPy runs in its own process. Then you can autoreload to your heart's content. :)