I'm working on a site on OpenShift (Django, Python 2.7, Apache), and I'd like it to always be served over HTTPS. I figured this would be a simple task using .htaccess, and OpenShift's knowledge base seems to show that it should be. However, I've been agonizing over this for days, and I can't figure out how to make it work. Basically, I can't get my .htaccess files to operate on anything but static files. I simply can't get requests for non-static resources to redirect to anything.
Here's my directory structure, inside ~/app-root/repo. It's using the "new" (March 2014 or so?) OpenShift directory structure.
/MyProject
    settings.py
    urls.py
    [etc.]
/[app directories]
/static
/wsgi
    wsgi.py
    static -> ../static
manage.py
requirements.txt
Ok, so after much frustration, I've been testing with a minimum viable .htaccess to just see if I can get anything to work. It looks like this:
RewriteEngine on
RewriteRule .* http://www.google.com
Everything should go to Google. If I put this in the wsgi directory, all my static files get redirected to google.com, but nothing else does. If I put it in ~/app-root/repo, nothing at all happens. It's like the file doesn't exist.
Similarly, if I use the commands suggested by OpenShift and put the file in wsgi, my static files get redirected to use HTTPS, but nothing else does.
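For reference, the OpenShift-suggested rules look roughly like this (the X-Forwarded-Proto check is there because requests reach the gear through OpenShift's proxy, so %{HTTPS} is never set on the backend):
RewriteEngine on
RewriteCond %{HTTP:X-Forwarded-Proto} !https
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [R,L]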
I've also tried WSGI and Django things like adding os.environ['HTTPS'] = "on" to wsgi.py as suggested by sites like this one. That's been ineffective, and I really can't say I'm knowledgeable enough about those sorts of settings to know what I'd need to add. Ideally I'd like to do this through .htaccess anyway; the request should be using SSL before it hits WSGI.
What do I need to do to make my .htaccess files work on all requests? And why is OpenShift ignoring .htaccess files not in wsgi?
I fought with this for a short time and quickly decided to use the django-ssl-redirect middleware package.
The whole reason I'm on OpenShift is to avoid fighting with these kinds of configuration issues, so I'm a little disappointed I had to resort to this.
Still, here goes:
Just add this to your MIDDLEWARE_CLASSES
'ssl_redirect.middleware.SSLRedirectMiddleware',
Add this to requirements.txt
django-ssl-redirect==1.0
Add something like this to your settings (I put it in a base settings file shared across multiple environments, hence the "if"):
if 'OPENSHIFT_REPO_DIR' in os.environ:
    SSL_ON = True
    SSL_ALWAYS = True
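Putting it together, the relevant settings fragment looks something like this (the other middleware entries are illustrative placeholders, not a complete list):
# settings.py (sketch)
import os

MIDDLEWARE_CLASSES = (
    'ssl_redirect.middleware.SSLRedirectMiddleware',
    # ... your existing middleware ...
)

# Force SSL only when running on OpenShift, not in local development
if 'OPENSHIFT_REPO_DIR' in os.environ:
    SSL_ON = True
    SSL_ALWAYS = True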
More info here: https://bitbucket.org/nextscreenlabs/django-ssl-redirect
I need to prevent local Django/runserver from serving my static files (I want to test whitenoise locally). The --nostatic flag is supposed to do just that.
python manage.py runserver --nostatic
But I still get 200s (successful) for my static files. So then I set DEBUG = False, but still got 200s! I even commented out 'django.contrib.staticfiles' from INSTALLED_APPS, but that didn't work either. How could my static files still be served successfully? It's usually not this hard to break things. Perhaps I'm misunderstanding what --nostatic actually does; I was expecting to get 404s.
Resolved in the comments, but for future reference the answer was to make sure that whitenoise.middleware.WhiteNoiseMiddleware was removed from the MIDDLEWARE list.
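For anyone hitting the same thing: with WhiteNoiseMiddleware active, whitenoise serves the static files itself at the middleware layer, so --nostatic (which only affects Django's staticfiles runserver handler) never comes into play. A sketch of the change, with placeholder entries for the rest of the list:
# settings.py
MIDDLEWARE = [
    'django.middleware.security.SecurityMiddleware',
    # 'whitenoise.middleware.WhiteNoiseMiddleware',  # comment out to get 404s under runserver --nostatic
    # ... the rest of your middleware ...
]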
I'm running a Django server with Gunicorn and Nginx hosted on DigitalOcean. I've run into a problem where adding a new file through the administrator interface produces a 403 Forbidden error. Specifically, the file in question works fine if I summon a query of it (e.g. Object.objects.all()) but can't be rendered in my templates. I've previously fixed the problem by doing chmod/chown, but the fix only applies to existing files, not new ones. Does anyone know how to make the fix permanent?
TL;DR:
FILE_UPLOAD_PERMISSIONS = 0o644 in settings.py
in bash shell: find /path/to/MEDIA_ROOT -type d -exec chmod go+rx {} +
The explanation
The files are created with permissions that are too restrictive, so the user Nginx runs as cannot read them. To fix this, you need to make sure Nginx can read the files and can get to them (i.e. traverse the directories above them).
The goal
First you need FILE_UPLOAD_PERMISSIONS to allow reading by the Nginx user. Second, MEDIA_ROOT and all subdirectories must be readable by Nginx and writeable by Gunicorn.
How to
You must ensure the directories are world readable (and executable) or the group for the directories must be a group that the Nginx process belongs to and they must be at least group readable (and executable).
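In settings terms, that goal looks like this (FILE_UPLOAD_DIRECTORY_PERMISSIONS is the companion Django setting covering directories Django creates under MEDIA_ROOT):
# settings.py
FILE_UPLOAD_PERMISSIONS = 0o644            # rw-r--r--: Gunicorn's user writes, Nginx reads
FILE_UPLOAD_DIRECTORY_PERMISSIONS = 0o755  # rwxr-xr-x: Nginx can enter newly created dirs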
As a side note, you said you've used chmod and chown before, so I assumed you were familiar with the terminology used. Since you're not, I highly recommend fully reading the linked tutorial, so you understand what the commands you used can do and can screw up.
I have a domain example.com and I run a standard Apache there (serving static files and PHP). I want to run Python served pages on subdomain.example.com.
I managed to configure Apache to do this, and there is a Flask app running at subdomain.example.com. However, in the virtual host config file, the whole subdomain is tied to this one single app. I would like to go further and run several different apps on this subdomain.
For example:
subdomain.example.com/app1/ would run /var/www/apps/app1/app.wsgi
subdomain.example.com/app2/ would run /var/www/apps/app2/app.wsgi
and so on...
Furthermore I would like this to be fully automatic, that is when I set up new folder in /var/www/apps/, I could reach the app through the Apache without further configuration.
I can see several ways of doing this:
Configure Apache to route every subdomain.example.com request to a single "meta app" in Python, which would dispatch to a specific app based on the given URL.
Do some magic with the Apache configuration that would take care of this automatically.
Maybe use nginx? I don't really have much experience with it, but someone told me it could solve the problem.
Is there any best practice about how to do this? Thank you.
It looks like you should be able to do this by providing a directory path (in your case, /var/www/apps) to the WSGIScriptAlias directive. Read more here:
https://modwsgi.readthedocs.org/en/develop/configuration-directives/WSGIScriptAlias.html
This seems like a neater solution than using a meta app.
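For the exact layout in the question (an app.wsgi inside each folder), the regex variant of the directive, WSGIScriptAliasMatch, is probably the closest fit. An untested sketch, using Apache 2.2 access syntax (2.4 would use "Require all granted"):
# In the VirtualHost for subdomain.example.com:
# /app1/... runs /var/www/apps/app1/app.wsgi, /app2/... runs app2/app.wsgi, etc.
WSGIScriptAliasMatch ^/([^/]+) /var/www/apps/$1/app.wsgi

<Directory /var/www/apps>
    Order allow,deny
    Allow from all
</Directory>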
So, I'm running Apache2 on a Linux machine, and I'm trying to serve pages with Django 1.3. I found a guide to do this here.
I have the django.wsgi configured, the settings.py configured, and the database created and successfully in sync with Django. However, when I try to visit the website, I am shown a page served by Apache, instead of Django. I get no errors/warnings at all.
I put print statements in both django.wsgi and settings.py (since they're both just Python files), but nothing gets printed.
Does anyone have any idea as to what may be going wrong or any diagnostic steps I might be able to take?
Thanks!
As everyone has said in the comments, you need to add a WSGIScriptAlias directive to the Apache configuration before anything will work. Otherwise, Apache can't possibly know to use WSGI to serve your site.
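A minimal sketch of the missing piece, using Apache 2.2 syntax from the Django 1.3 era; the paths are placeholders for wherever the guide had you put django.wsgi:
<VirtualHost *:80>
    ServerName example.com

    # Hand every request to Django via mod_wsgi
    WSGIScriptAlias / /path/to/mysite/apache/django.wsgi

    <Directory /path/to/mysite/apache>
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>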
As the title states, I'm trying to figure out the best practice for where to store application files for a Python website on the server. Document root, or no?
I come from a land of PHP. :)
EDIT - To that end, links to any material describing the best practice differences between Python and PHP are hugely appreciated.
No. WSGI containers don't require the scripts to be in the document root, and keeping them outside it means that a misconfiguration or transient server error can't expose your source code, so they shouldn't be placed there.
There's no reason to store it in the document root.
While storing the app in the doc root isn't necessarily a security problem - if configured correctly and handled carefully - storing it outside will remove a lot of headaches and configuration work.
That's the main reason not to do it.
I personally use the https://bitbucket.org/acoobe/django-starter/ layout with buildout. Apps under development go in the apps folder, and apps that are merely used go in the parts/eggs folders (parts for packages from git, mercurial, or svn, and eggs for apps hosted on PyPI).
So the answer is NO. Everything should be placed in separate, tidy folders. All your server needs to know is where the WSGI script is and where the var dir is - just like everyone else here said.
Everything has been said, I think, so I will only elaborate a bit. Here is an explanation of how Apache maps URLs to files on disk: http://httpd.apache.org/docs/2.2/urlmapping.html. As you can see, the basic rule is that only the files within DocumentRoot are exposed to the outside world. You can change that by explicitly mapping other files or folders into the URL space, using e.g. the Alias directive.
Now, you obviously don't want your Python scripts to be exposed to everyone - which means that you should keep them outside DocumentRoot and any other folder "imported" to DocumentRoot (using e.g. the mentioned Alias directive). What you want to do instead is to merely hook given URL to your Python program - if you use mod_wsgi, this can be done with WSGIScriptAlias directive. In other words, you should map the effects (result) of your script to given URL, instead of mapping the files themselves.
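A sketch of that separation in Apache terms, with hypothetical paths:
# Files under DocumentRoot are exposed as-is
DocumentRoot /var/www/example.com/public

# Alias "imports" an extra folder into the URL space
Alias /media/ /var/www/example.com/media/

# WSGIScriptAlias maps a URL to the program's output; the script itself,
# living outside DocumentRoot, is never served as a file
WSGIScriptAlias / /usr/local/myapp/app.wsgi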
So - where should you keep your Python files? I would say it's a matter of personal taste - some people advise against keeping them in a user folder (i.e. /home/xyz/) because e.g. an Apache configuration flaw may expose user folders to the outside world. What's left? E.g. /usr/local/ or /var/www - there's really no magic in picking a home folder for your scripts.