Managing multiple settings.py files [duplicate]

Possible Duplicate:
How to manage local vs production settings in Django?
I have successfully deployed a Django project on Apache with mod_wsgi.
I would like some recommendations on how to manage multiple settings.py files. Right now I have one for development and a totally different one for production (regarding DB parameters, static content locations, and things like that). My settings.py file is versioned (I don't know if this is good practice) and I deploy it with something like:
$ hg archive myproject.tbz2
$ cd /path/of/apache/web/project/location
$ bzip2 -dc /home/myself/myproject/myproject.tbz2 | tar -xvf -
It's working OK, but I find myself manipulating multiple settings.py files.
I guess my question is: what are the best practices when deploying Django projects with multiple versions of the settings.py file?

I use a settings module that is not a single file:
settings/
    __init__.py
    _base.py
    _servers.py
    development.py
    production.py
    testing.py
The __init__.py file is simple:
from _servers import get_server_type
exec("from %s import *" % get_server_type())
The _base.py file contains all of the common settings across all server types.
The _servers.py file contains a function get_server_type() that uses socket.gethostname() to determine what type of server the current machine is: it returns development, production or testing.
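For illustration, a minimal sketch of what _servers.py could look like; the hostname-to-type mapping is just an assumption, not the answerer's actual code:

# settings/_servers.py (sketch; hostname prefixes are made up)
import socket

# Map hostname prefixes to server types; adjust to your machines.
_HOST_PREFIXES = {
    'dev': 'development',
    'test': 'testing',
    'web': 'production',
}

def get_server_type():
    hostname = socket.gethostname()
    for prefix, server_type in _HOST_PREFIXES.items():
        if hostname.startswith(prefix):
            return server_type
    return 'development'  # fall back to development on unknown hosts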
Then the other files look a bit like (production.py):
DEBUG=False
TEMPLATE_DEBUG=False
from _base import *
In each of these files, I put in the settings that only apply to this server type.

The trick that seems to be the most common is to maintain both a settings.py and a local_settings.py file (one local_settings.py per environment).
Environment-agnostic settings go into settings.py, and at the bottom of that file you import from local_settings.py:
try:
    from local_settings import *
except ImportError:
    pass
You can override any of the settings.py settings in the appropriate local_settings.py.
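For example, a development machine's local_settings.py might contain something like this (the values are placeholders):

# local_settings.py -- never committed to version control
DEBUG = True
TEMPLATE_DEBUG = DEBUG

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'dev.sqlite3',
    }
}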

django-admin.py and manage.py both accept a --settings=mysite.settings option. In development you could explicitly specify --settings=dev_settings. You can also set the DJANGO_SETTINGS_MODULE environment variable in your Apache configuration.
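With mod_wsgi, one common place to set DJANGO_SETTINGS_MODULE is the WSGI script that Apache loads; a sketch for newer Django versions, where the module name mysite.prod_settings is an assumption:

# mysite/wsgi.py (sketch) -- Apache/mod_wsgi imports this file, so the
# environment variable set here selects the production settings module.
import os
from django.core.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.prod_settings")
application = get_wsgi_application()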
Personally, I simply don't check in settings.py. Instead I check in multiple settings files (dev_settings, prod_settings, etc.) and symbolically link the appropriate one to settings.py as desired. This way, if I simply check out my application it won't be runnable until I think about which settings file is appropriate and actually put that settings file in place.
Another suggestion I've heard, but don't particularly like, is having a settings.py that dynamically imports a dev_settings.py if it exists. While this may be more convenient, I'd be concerned that it makes it harder to read settings.py and know what the settings will actually be without also looking for overriding values in a dev_settings.py file that may or may not exist.

My preferred way is to load a separate ini file using ConfigParser, based on a single setting or environment variable. In any case, the Django wiki describes many different options: http://code.djangoproject.com/wiki/SplitSettings
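A minimal sketch of that approach, assuming a settings.ini next to settings.py with [django] and [database] sections (the file layout and key names are assumptions):

# settings.py (sketch)
import os
from ConfigParser import RawConfigParser  # configparser on Python 3

config = RawConfigParser()
config.read(os.path.join(os.path.dirname(__file__), 'settings.ini'))

DEBUG = config.getboolean('django', 'debug')
SECRET_KEY = config.get('django', 'secret_key')

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': config.get('database', 'name'),
        'USER': config.get('database', 'user'),
        'PASSWORD': config.get('database', 'password'),
        'HOST': config.get('database', 'host'),
    }
}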

Related

When I add settings folder with settings for different environments, I get CommandError: You must set settings.ALLOWED_HOSTS if DEBUG is False

I want to make two files with different settings for dev and prod.
I created a Python package in the app folder where my settings are, and even when I run the app with the old settings, I receive an error:
CommandError: You must set settings.ALLOWED_HOSTS if DEBUG is False.
Here is my project structure:
Well, first, for this purpose:
I want to make two files with different settings for dev and prod.
You should first move the settings.py module into a settings directory, and it's better to rename it to, for example, base.py. After that you can provide two modules called dev.py and prod.py for the different modes of your project.
Also, with these changes you must change the settings path referenced in your manage.py module (a sketch of that change follows the note below).
Note: with these changes, you can provide different values for DEBUG, ALLOWED_HOSTS, etc. in your different modes.
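A sketch of the manage.py change, assuming the new package is importable as myproject.settings with a dev module (the module path is an assumption):

#!/usr/bin/env python
# manage.py (sketch; "myproject.settings.dev" is an assumed module path)
import os
import sys

if __name__ == "__main__":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings.dev")
    from django.core.management import execute_from_command_line
    execute_from_command_line(sys.argv)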
The problem was that I had to specify the path to my settings module in my run configuration.
Also, in the settings I had to change BASE_DIR by adding one more parent:
BASE_DIR = Path(__file__).resolve().parent.parent.parent

overwrite settings and package defaults

I have created a reusable application for Django that requires you to add settings in the settings.py file. i.e.
version = '1'
url = 'api.domain.com'
However, these settings rarely change and I would much prefer them to be defaults in my package but allow the developer to overwrite them should they wish in their own settings.py.
My package is like an app, so it does not have a settings.py. How do I go about adding these settings as defaults in my package while still allowing them to be overwritten in the developer's project settings.py?
I hope that makes sense.
A common thing that Django folks do is to include a local_settings.py in your local copy for stuff that you only want on your local copy. Then, at the end of your settings.py file, put:
try:
    from local_settings import *
except ImportError:
    pass
Be sure to add local_settings.py to your .gitignore (or the equivalent for your VCS) so people aren't stepping on each other's feet by accidentally committing local stuff.
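For package-level defaults specifically (what the question asks about), a common pattern is to read each setting from django.conf.settings with a fallback; a sketch, where MYAPP_API_VERSION and MYAPP_API_URL are made-up names:

# myapp/conf.py (sketch; setting names are hypothetical)
from django.conf import settings

# Package defaults, overridable by defining MYAPP_* in the project's settings.py.
API_VERSION = getattr(settings, 'MYAPP_API_VERSION', '1')
API_URL = getattr(settings, 'MYAPP_API_URL', 'api.domain.com')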

django manage.py settings default

I have a settings.py file and a dev_settings.py file that I use to override some values for dev purposes. Every time I run the ./manage.py command, I have to specify --settings=whatever.local_settings. This becomes very tedious, and I am trying to find a way to force manage.py to load my dev_settings.py file by default so that I don't have to type that long argument every time I want to run a command.
I have tried setting DJANGO_SETTINGS_MODULE; however, it appears that manage.py overrides this option.
Is it possible to make this happen or am I doomed to always specify that argument?
manage.py sets the path to settings for you; that's why it ignores DJANGO_SETTINGS_MODULE (it's basically just a script that wraps around django-admin.py).
There are two easy ways to fix your problem:
1. Set DJANGO_SETTINGS_MODULE and use django-admin.py to run all commands instead of manage.py. This is even better if you use virtualenv.
2. Copy manage.py, name it local.py (that's the name in my case), and change every mention of settings to your dev settings module (settings_local in the example below).
For example:
#!/usr/bin/env python
from django.core.management import execute_manager
import imp

try:
    import settings_local
except ImportError:
    import sys
    sys.stderr.write("Error: Can't find the file 'settings_local.py' in the directory containing %r. It appears you've customized things.\nYou'll have to run django-admin.py, passing it your settings module.\n" % __file__)
    sys.exit(1)

if __name__ == "__main__":
    execute_manager(settings_local)
You can now run all commands via ./local.py.
The way this is typically done is you have settings.py with all settings that are common between environments (things like INSTALLED_APPS, etc.). Then, you have something like settings_local.py, that defines settings particular to the environment in context. You then import settings_local.py in settings.py.
# settings.py
from settings_local import *
settings.py gets added to your source code repository, but settings_local.py does not. (However, you would normally add something like settings_local.py.example to the repo.)
When you first move your app over to production, for example, you pull down the code base from your repo. You then copy settings_local.py.example to settings_local.py and make any necessary environment specific changes.
You then have separate settings_local.py files in each environment, and it all just works.
You can make a bash alias by adding these lines to your .bash_profile file:
mymanage()
{
    # "$@" forwards every argument to manage.py, not just the first one
    python manage.py "$@" --settings=settings_debug
}
alias mng=mymanage
Then when you run this command:
mng runserver
the settings_debug.py file will be used for settings.
You can use django-admin.py with that environment variable. The commands are interchangeable; the only difference is that django-admin.py doesn't override the variable you're trying to use.
If a settings file is common to all installations, you can just import it, e.g.
from settings_local import *
but usually settings_local is changed and tweaked per installation, and my installation script copies files directly to the target sites (without worrying about what is local and what is not), which means settings_local may get overwritten. To avoid that, I keep settings_local in the parent folder of the installation target and manually load it in settings.py, e.g.
# prevFolder points at the parent folder of the installation target (defined elsewhere)
local_settings_file = os.path.join(prevFolder, "settings_local.py")
if os.path.exists(local_settings_file):
    execfile(local_settings_file)  # Python 2; use exec(open(...).read()) on Python 3

Python - packages and settings file

I have a python package that needs to pull in settings from my project directory, here is how my project is currently structured:
~/Project/bin/mypackage
    - package files
~/Project/myproject/
    - project files
    - start.py
    - settings.py
I guess it's similar to how Django is structured, you have a settings.py file in your project directory that is somehow referenced by the Django system package in your Python directory.
So, if I am running start.py like so:
python ~/Project/myproject/start.py
...and start.py imports and utilizes the mypackage package, is there any way I can reference the settings.py file local to start.py from within the package? Would I have to load the settings file in start.py and store the values in a global? Does anyone know how this is possible?
The way I see it you have several options:
1. Look for settings and import them either from the current working directory or as determined from environment variables. This is the "django way", using DJANGO_SETTINGS_MODULE and PYTHONPATH. It is nice and magical when it works, and inconvenient when it doesn't, such as in your case where you are running from a different directory.
2. Rely on the module search path, which will include the directory of the invoking script. Nice and simple, but the settings will vary based on the caller. For example, all you need in mypackage is:
import settings
3. Pass the settings in as a variable.
The directory containing the script that was used to invoke the Python interpreter is prepended to the module search path; it is available at sys.path[0]. See http://docs.python.org/library/sys.html#sys.path
This means that settings should be importable from mypackage.mymodule simply via import settings.
However, I would consider handling the loading of settings in start.py and structuring your app so that a settings object (perhaps just a dict) is passed to it.
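A minimal sketch of that last suggestion (loading settings in start.py and passing them in), with a hypothetical configure() helper in mypackage:

# mypackage/__init__.py (sketch)
_settings = None

def configure(settings_module):
    """Remember the caller's settings so the rest of the package can use them."""
    global _settings
    _settings = settings_module


# start.py (sketch)
import settings       # the settings.py that lives next to start.py
import mypackage

mypackage.configure(settings)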

How do I set up an alternate directory structure for Django?

I'm starting on my first large django project and have realized that the default directory structure isn't going to work well.
I saw this question and answer and decided to implement something similar.
Large Django application layout
What do I need to do to make something like this work? How can I change where Django looks for apps/modules?
Python works automatically with deep directory structures; that's probably why you didn't find any instructions on how to do it. Here are some instructions on how to get classes and models to work.
If you want to have a module in the folder yourproject/apps/firstapp, you can just add it to INSTALLED_APPS by adding the line 'apps.firstapp',. You will have to add an __init__.py file to each of the directories so that they are recognized as Python packages.
When you import classes you can simply use from yourproject.apps.firstapp.filename import yourclass.
You should also make sure all template directories are listed in TEMPLATE_DIRS.
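Putting those pieces together, a sketch of the relevant settings.py lines for a hypothetical yourproject/apps/firstapp layout:

# yourproject/settings.py (sketch; app and path names are examples)
import os

INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'apps.firstapp',   # lives in yourproject/apps/firstapp/ (each folder needs __init__.py)
)

TEMPLATE_DIRS = (
    os.path.join(os.path.dirname(__file__), 'templates'),
)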
I have two methods to handle this; one for production and one for development.
In development:
Append the path to your apps in your settings.py. In my case, I have my main project in ~/hg/django/project/ and my apps are in ~/hg/django/apps/. So I use:
import os, sys   # needed near the top of settings.py

if DEVEL:
    sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', 'apps'))
I also use the production method on my development box. This has the knock-on effect of letting me roll back to the most recent production version by simply commenting out the path insertion line in settings.py.
In production:
I install the apps using distutils so that I can control deployment per server, and multiple projects running on each server can all access them. You can read about the distutils setup scripts here. Then, to install, you simply:
./setup.py install
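A minimal distutils setup script for such an app might look like this (a sketch; 'firstapp', the version, and the template glob are placeholders):

# setup.py (sketch)
from distutils.core import setup

setup(
    name='firstapp',
    version='0.1',
    packages=['firstapp'],          # the importable app package
    package_data={'firstapp': ['templates/firstapp/*.html']},
)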
if you add:
import os
PROJECT_ROOT = os.path.abspath(os.path.dirname(__file__))
to your settings.py and the following to your manage.py:
import sys
from os.path import join  # settings here is the module imported by the stock manage.py
sys.path.insert(0, join(settings.PROJECT_ROOT, "apps"))
sys.path.insert(0, join(settings.PROJECT_ROOT, "lib"))
then you can include them in your installed apps just as you would if they were in your project root:
INSTALLED_APPS = (
    # local apps
    'myapp',

    # local libs
    'piston_dev',
)
This allows you a bit more freedom to move apps around without having to redeclare imports and such.
