I am relatively new to Django and one thing that has been on my mind is changing the database that will be used when running the project.
By default, the DATABASES 'default' is used to run my test project. But in the future, I want to be able to define a 'production' DATABASES configuration and have it use that instead.
In a production environment, I won't be running "manage.py runserver", so I can't really set the settings that way.
I read a little bit about database "routing", but is there an easier way, so that I won't need to create a new router every time I have another database I want to use (e.g. a test database, a production database, and a development database)?
You can just use a different settings.py in your production environment.
Or - which is a bit cleaner - you might want to create a file settings_local.py next to settings.py, where you define the settings that are specific to the current machine (like DEBUG, DATABASES, MEDIA_ROOT etc.), and do a from settings_local import * at the beginning of your generic settings.py file. Of course settings.py must not overwrite these imported settings.
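For illustration, a machine-specific settings_local.py might look like this (the values and paths are placeholders, not a prescription):

```python
# settings_local.py -- machine-specific values; keep this file out of
# version control. The names mirror standard Django settings.
DEBUG = True

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'dev.sqlite3',
    }
}

MEDIA_ROOT = '/var/media/dev'
```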
Why do you need a test database? Django creates a test database automatically before running unit tests. And database routing doesn't fit your purpose; it's for routing read/write requests to different databases. If you want to use a development database, set up a new DATABASES config in, say, local_settings.py, and at the end of your settings.py add:
try:
    from local_settings import *
except ImportError:
    pass
There is nothing you can specify in the settings directly. The practice I use is to have additional settings files for different environments, each containing only the settings I want to override, such as database or cache settings. My project root application on a development environment, for example, would contain the following files (note the leading underscore):
...
settings.py
settings_dev.py
_settings_test.py
_settings_prod.py
...
Then in settings.py I would add the following lines of code to the beginning:
try:
    from settings_prod import *
except ImportError:
    try:
        from settings_test import *
    except ImportError:
        from settings_dev import *
Since I am on a development environment, only my settings_dev file is imported; the imports of the other two fail because their filenames start with a leading underscore.
When I then deploy to a production or testing environment, I rename the relevant file. For production: _settings_prod.py -> settings_prod.py; for testing: _settings_test.py -> settings_test.py. settings_dev.py can basically stay as is, since it will only be imported if the other two imports fail.
This last step can easily be automated with a deployment tool such as Fabric; for example, run('mv _settings_prod.py settings_prod.py') performs the rename.
Related
As the title suggests, I'm looking for best practice for using API_KEYS, CLIENT_SECRETS, etc. within the settings.py of my Django project. I can't seem to find exactly what I'm looking for, documentation-wise.
To ask an implicit question: what is the best method for storing this information in both development and in production?
The best practice for using API_KEYS and CLIENT_SECRET keys in settings.py would be to not store them there at all!
You will be better off setting environment variables in the OS and retrieving them in the settings.py file as and when needed. In this way, your keys will never touch the codebase and remain safely inside the OS.
You can do something like this in your settings.py:
import os
API_KEY = os.environ.get('API_KEY_ENVIRONMENT_VARIABLE')
CLIENT_SECRET = os.environ.get('CLIENT_SECRET_KEY_ENVIRONMENT_VARIABLE')
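A slightly more defensive variant fails loudly when a required key is missing instead of silently returning None. The helper name and fallback behaviour here are my own sketch, not a Django convention:

```python
import os

def get_env_setting(name, default=None):
    """Read a setting from the environment; raise if it is required but unset."""
    value = os.environ.get(name, default)
    if value is None:
        raise RuntimeError('Set the %s environment variable' % name)
    return value

# Demo value so this sketch runs anywhere; in a real deployment the
# variable is set by the OS / process manager, never in code.
os.environ['API_KEY_ENVIRONMENT_VARIABLE'] = 'demo-key'
API_KEY = get_env_setting('API_KEY_ENVIRONMENT_VARIABLE')
```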
I add lines like this to my settings.py (This was from Python 2):
try:
    from local import *
except ImportError:
    print 'The local settings could not be found.'
Then I create a local.py file in the same directory that contains settings that are either specific to the deployment or secret. Then I add it to the ignore file for whatever revision control tool I use to make sure it never gets committed.
I have my project settings in a settings.py file. There are database names, user names, host names, etc. defined there. In the project files I do import settings and then use the constants where needed, like settings.HOST. For unit testing I would like to use different settings. How should I override the settings? I am not using Django.
You could create a new file - say local_settings.py - in which you override the specific settings for your debugging and testing purposes.
Then you add this block at the end of your settings.py
# Override these settings with local settings if such a file exists
try:
from local_settings import *
except ImportError as e:
pass
You should add local_settings.py to your .gitignore file to exclude this file from version control (if you are using git).
This is the standard way Django does this by the way.
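Since the question isn't Django-specific, the same pattern works in any plain-Python project. A minimal self-contained sketch (the module and constant names are invented):

```python
# settings.py -- project-wide defaults.
HOST = 'db.example.com'
DB_NAME = 'appdb'

# If a local_settings.py sits next to this file (e.g. on a test machine),
# anything it defines replaces the defaults above.
try:
    from local_settings import *  # noqa: F401,F403
except ImportError:
    pass  # no overrides on this machine; keep the defaults
```

In the test environment you drop in a local_settings.py containing only the constants that differ, e.g. HOST = 'localhost'.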
I suppose the easiest way would be to move your settings.py to another folder for safekeeping and then make a new one, and edit that for debugging.
I have an application in Django 1.9.x, and I want to use multiple databases.
Is there any configuration in the settings where I can enter multiple databases, so that when the project is on a specific server it uses the correct database? For example: when I'm programming on localhost, use default; when I deploy to the test server, automatically switch to testserverdb; and on the production server, use productiondb. I tried the multi-db documentation, but it's not what I want, because that covers working with a legacy database, which is not my case.
How do I do it?
In your settings file:
try:
    from [app_name].local_settings import *
except ImportError:
    pass
Settings you define in this local_settings file will override those in settings.py. So now you can have a different local_settings file on your localhost, development, and production machines, and specify a separate database in each.
It sounds like you want to have environment specific databases, not necessarily a single app that connects to many databases. You can easily accomplish this with a custom settings module for each of these environments.
You might have a structure like the following:
myproject/
    - settings/
        - __init__.py
        - common.py
You'll want to put all your common settings in common.py. This will serve as the basis for all your other environment settings. From here, there are a few setups you can use to do what you want, but I'm going to suggest that you use common.py as a base settings module that can be overridden locally.
To do this, you can set your DJANGO_SETTINGS_MODULE to myproject.settings, and in your __init__.py:
from .common import *

try:
    from .local import *
except ImportError:
    pass
Then on each environment (production/development/etc), you'll want to include a file named local.py in myproject/settings. Any settings you put in that local.py file will override your common.py when your settings module gets loaded up.
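For example, a production machine's myproject/settings/local.py might contain nothing but the database override (the engine, names, and credentials below are placeholders):

```python
# myproject/settings/local.py -- production-only overrides.
# Everything else comes from common.py via the package __init__.py.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'productiondb',
        'HOST': 'db.internal',
        'USER': 'myproject',
        'PASSWORD': 'change-me',
    }
}
```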
I have some Django middleware code that connects to a database. I want to turn the middleware into a reusable application ("app") so I can package it for distribution into many other projects, without needing to copy-and-paste.
I don't understand where a reusable application is supposed to configure itself. Since it's intended for redistribution I don't have the ability to write the central settings.py myself. Looking at the Django documentation I see there's settings.configure but it appears to replace the entire configuration, instead of letting me "splice" a new database into DATABASES.
What's the right way to give my reusable middleware app the ability to configure its own database connection? I don't want it to interfere with the databases of applications where I'll be installing it. Thanks.
You could follow the approach of Django debug toolbar. It includes an app config class, and overrides the settings in the ready method.
You can see the code here.
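In outline, the approach looks something like this. This is a sketch only: the class, app, and alias names are hypothetical, it assumes a configured Django project, and whether settings should be mutated at ready() time at all is debatable.

```python
# apps.py of the reusable app -- mirrors the debug-toolbar trick of
# adjusting settings from AppConfig.ready().
from django.apps import AppConfig
from django.conf import settings


class MyMiddlewareConfig(AppConfig):
    name = 'mymiddleware'

    def ready(self):
        # Splice our own connection into DATABASES only if the host
        # project hasn't already configured one under this alias.
        settings.DATABASES.setdefault('mymiddleware', {
            'ENGINE': 'django.db.backends.sqlite3',
            'NAME': 'mymiddleware.sqlite3',
        })
```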
Database Connection.
Well first of all I am not quite sure it's a good idea for a middleware to connect to anything other than the default database for the project. If you really want that feature it would be best to ask the user to add it to the settings file directly because the database connection settings will vary wildly from install to install.
If your app wants to make raw queries just do
from django.db import connection
cursor = connection.cursor()
It would be better to use the ORM if you can.
App specific settings
One method is to create a file called app_settings.py in your reusable app. Then you can add code like the following into it.
from django.conf import settings
SOME_APP_SETTING = getattr(settings, 'SOME_APP_SETTING', SOME_DEFAULT)
This allows anyone installing your app to change some of the settings in the main settings.py file.
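A runnable illustration of the pattern (here a SimpleNamespace stands in for django.conf.settings so the sketch works outside a project; the MYAPP_* names are invented):

```python
# app_settings.py of the reusable app.
# In a real project the next two lines are simply:
#     from django.conf import settings
from types import SimpleNamespace
settings = SimpleNamespace(MYAPP_TIMEOUT=5)  # pretend the host project set one value

# Each app setting falls back to a default when the host project is silent.
MYAPP_TIMEOUT = getattr(settings, 'MYAPP_TIMEOUT', 30)           # host override wins
MYAPP_DB_ALIAS = getattr(settings, 'MYAPP_DB_ALIAS', 'default')  # default is used
```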
If you really wanted to get fancy, you could create a custom database settings section here and allow the user to override it in the settings.py file.
I want to store some system constants which do not change so frequently.
I have made a settings table in my database using Django models to store them, but this table will only ever have a single entry, and I change these settings with the Django admin.
Is there an alternate way to store some variables without having to create a database?
You want, I quote, some system constants which do not change so frequently. For this, you can define your own variables in the settings.py file (or in a separate file) and use them by importing them.
The most appropriate way would be to create a new file (settings_site.py, say) and import it into settings.py:
SETTING_1 = "/home/path/to/an/executable"
SETTING_2 = False
and then, in the settings.py:
from settings_site import *
It will make the SETTING_* variables (give them more useful names, though) accessible in the settings of your project, and you will be able to change the file locally even if the project is under a VCS (SVN, Git...).
Otherwise, you can still implement a solution based on a configuration file editable through a custom view, but that requires creating an application to manage it. Coupled with the cache system, it can be as efficient as using settings.py, provided you parse the file only when needed (at startup and after every change).
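A minimal sketch of that parse-only-when-changed idea, using a JSON file and its modification time as the cache key (the filename, function name, and structure are arbitrary choices for the example):

```python
import json
import os

_cache = {'mtime': None, 'data': None}

def get_site_config(path='site_config.json'):
    """Return the parsed config, re-reading the file only when it has changed."""
    mtime = os.path.getmtime(path)
    if _cache['mtime'] != mtime:
        with open(path) as fh:
            _cache['data'] = json.load(fh)
        _cache['mtime'] = mtime
    return _cache['data']
```

An admin view that edits the file simply rewrites it; the changed mtime invalidates the cache on the next call.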