I have my project settings in a settings.py file. There are database names, user names, host names etc. defined there. In my project files I do import settings and then use the constants where needed, like settings.HOST. For unit testing I would like to use different settings. How should I override the settings? I am not using Django.
You could create a new file - say local_settings.py - in which you override the specific settings for your debugging and testing purposes.
Then you add this block at the end of your settings.py
# Override these settings with local settings if such a file exists
try:
    from local_settings import *
except ImportError:
    pass
You should add local_settings.py to your .gitignore file to exclude this file from version control (if you are using git).
This is the standard way Django does this by the way.
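For illustration, a local_settings.py for unit testing might look like this. The setting names below are placeholders; they must match whatever constants your own settings.py defines:

```python
# local_settings.py -- test overrides; keep this file out of version control.
# All names and values here are illustrative examples.
HOST = "localhost"
DB_NAME = "myproject_test"
DB_USER = "test_user"
```

Because the star-import runs after the defaults in settings.py, any name defined here simply replaces the earlier value.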
I suppose the easiest way would be to move your settings.py to another folder for safekeeping and then make a new one, and edit that for debugging.
Related
As the title suggests - I'm looking for best practice for using API_KEYS, CLIENT_SECRETS etc. within the settings.py of my Django project. I can't seem to find exactly what I'm looking for on this - documentation-wise.
To ask an implicit question: what is the best method for storing this information in both development and in production?
The best practice for using API_KEYS and CLIENT_SECRET keys in settings.py would be to not store them there at all!
You will be better off setting environment variables in the OS and retrieving them in the settings.py file as and when needed. In this way, your keys will never touch the codebase and remain safely inside the OS.
You can do something like this in your settings.py:
import os
API_KEY = os.environ.get('API_KEY_ENVIRONMENT_VARIABLE')
CLIENT_SECRET = os.environ.get('CLIENT_SECRET_KEY_ENVIRONMENT_VARIABLE')
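Note that os.environ.get() silently returns None when the variable is unset. If a missing variable should stop the app from starting, a small helper can make that explicit. This is just a sketch; the helper name and the variable names are made up:

```python
import os

def require_env(name, default=None):
    """Return the environment variable's value, or fail loudly.

    Hypothetical helper -- the variable names used with it are
    placeholders, not part of any real project.
    """
    value = os.environ.get(name, default)
    if value is None:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# API_KEY = require_env("API_KEY_ENVIRONMENT_VARIABLE")
```

Failing at import time beats discovering a None key deep inside a request handler.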
I add lines like this to my settings.py (originally written for Python 2; the print call below uses Python 3 syntax):
try:
    from local import *
except ImportError:
    print('The local settings could not be found.')
Then I create a local.py file in the same directory that contains settings that are either specific to the deployment or secret. Then I add it to the ignore file for whatever revision control tool I use to make sure it never gets committed.
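Such a local.py might contain entries like these (every value shown is a placeholder):

```python
# local.py -- deployment-specific and secret values; never committed.
# All names and values are illustrative.
DEBUG = True
SECRET_KEY = "dev-only-placeholder"
DATABASE_PASSWORD = "placeholder"
```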
I have one application in Django 1.9.x, and I want to use multiple databases.
Is there any configuration in the settings where I can enter multiple databases, so that when the project runs on a specific server it uses the correct database? For example:
When I'm programming on localhost it uses default; when I deploy to the test server it automatically switches to testserverdb; and on the production server it uses productiondb. I tried the multi-db documentation, but it's not what I want, because that covers working with legacy databases, which is not my case.
How do I do it?
In your settings file:
try:
    from [app_name].local_settings import *
except ImportError:
    pass
Settings you define in this local_settings file will override the corresponding values in settings.py. So you can have a different local_settings file for your localhost, development, or production environment, each specifying its own database.
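For instance, a local_settings.py on your development machine might point Django at a local database. The engine and file name below are placeholders:

```python
# local_settings.py on the development machine -- overrides DATABASES
# from the main settings module. All values are illustrative.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'dev.sqlite3',
    }
}
```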
It sounds like you want to have environment specific databases, not necessarily a single app that connects to many databases. You can easily accomplish this with a custom settings module for each of these environments.
You might have a structure like the following :
myproject/
- settings/
- __init__.py
- common.py
You'll want to put all your common settings under common.py. This will serve as the basis for all your other environment settings. From here, there are a few setups you can use to do what you want, but I'm going to suggest that you use common.py as a base settings module that can be overridden locally.
To do this, you can set your DJANGO_SETTINGS_MODULE to myproject.settings, and in your __init__.py:
from .common import *
try:
    from .local import *
except ImportError:
    pass
Then on each environment (production/development/etc), you'll want to include a file named local.py in myproject/settings. Any settings you put in that local.py file will override your common.py when your settings module gets loaded up.
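On the production server, for example, local.py might hold the production database settings. The engine, host, and credentials below are placeholders, not a real configuration:

```python
# myproject/settings/local.py on the production server -- these values
# win over common.py because they are imported last. All illustrative.
DEBUG = False
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'productiondb',
        'USER': 'app',
        'PASSWORD': 'placeholder',
        'HOST': 'db.internal',
        'PORT': '5432',
    }
}
```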
I want to store some system constants which do not change so frequently.
I have made a settings table in my database using Django models to store them, but this table will only ever have a single entry, and I change these settings through the Django admin.
Is there an alternate way to store some variables without having to create a database?
You want, I quote, some system constants which do not change so frequently. For this, you can define your own variables in the settings.py file (or in a separate file) and use them by importing them.
The most appropriate way would be to create a new file (say, settings_site.py) and import it into settings.py:
SETTING_1 = "/home/path/to/an/executable"
SETTING_2 = False
and then, in the settings.py:
from settings_site import *
It will make SETTING_* variables (give them useful names though) accessible in the settings of your project and you will be able to change the file even if you are using a VCS (SVN, Git...).
Otherwise, you can still implement a solution based on a configuration file, editable through a custom view, but that requires creating an application to manage it. Coupled with the cache system, though, it can be as efficient as using settings.py, provided you parse the file only when needed (at startup and after every change).
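As a sketch of that "parse only when needed" idea, here is a small file-backed config reader that re-parses only when the file's modification time changes. The JSON format and the class name are assumptions for illustration, not Django APIs:

```python
import json
import os

class FileConfig:
    """Sketch: re-parse a JSON config file only when it changes on disk,
    using the file's mtime as a cheap cache key."""

    def __init__(self, path):
        self.path = path
        self._mtime = None
        self._data = {}

    def get(self, key, default=None):
        mtime = os.path.getmtime(self.path)
        if mtime != self._mtime:  # file changed (or first read): re-parse
            with open(self.path) as f:
                self._data = json.load(f)
            self._mtime = mtime
        return self._data.get(key, default)
```

Every read between changes is then a plain dict lookup, which is close to the cost of a settings.py attribute access.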
I'm making a very very reusable CMS in Django. I'm trying not to hardcode anything, so I want a config.py or something for the app itself.
I want to use it in templates (something like {{ config.SITE_NAME }}) and just regular Python code (views, models), etc.
How would I go about doing that?
Django already has the settings.py file and an interface to use the values from there, is that not good enough? Accessing settings in your code is easy:
from django.conf import settings
do_something_with(settings.MY_CONFIG_VARIABLE)
In your template it requires a bit more work, such as making a context processor. This answer shows how to make a context processor that exposes values from settings to your template.
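A Django-independent sketch of that idea: a small factory that builds a context-processor function. SITE_NAME is a placeholder setting name; in a real project you would pass django.conf.settings and register the resulting function under the TEMPLATES 'context_processors' option:

```python
# Sketch of a context processor exposing selected settings as
# {{ config.* }} in templates. "settings" can be any object with
# attributes, e.g. django.conf.settings; SITE_NAME is illustrative.

def make_config_processor(settings, keys=("SITE_NAME",)):
    def config(request):  # Django calls this once per request
        return {"config": {k: getattr(settings, k, "") for k in keys}}
    return config
```

Whatever dict the processor returns is merged into the context of every template rendered with a RequestContext.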
settings.py serves this purpose already in django. You can add custom settings to it for your application.
https://docs.djangoproject.com/en/dev/topics/settings/#using-settings-in-python-code
When you create a Django project, it automatically creates a settings.py file. Put your settings in that file, and then:
from django.conf import settings
settings.YOUR_SETTING
Also, if you have some settings that vary between machines (e.g. production/test servers), then just add a file such as conf/local_settings.py, put the machine-specific settings there, and at the end of settings.py do:
from conf.local_settings import *
Make sure you exclude local_settings.py from your VCS check-in (e.g. in the case of git, add local_settings.py to .gitignore).
I am relatively new to Django and one thing that has been on my mind is changing the database that will be used when running the project.
By default, the DATABASES 'default' is used to run my test project. But in the future, I want to be able to define a 'production' DATABASES configuration and have it use that instead.
In a production environment, I won't be able to "manage.py runserver" so I can't really set the settings.
I read a little bit about "routing" the database to use another database, but is there an easier way so that I won't need to create a new router every time I have another database I want to use (e.g. I can have test database, production database, and development database)?
You can just use a different settings.py in your production environment.
Or - which is a bit cleaner - you might want to create a file settings_local.py next to settings.py where you define a couple of settings that are specific for the current machine (like DEBUG, DATABASES, MEDIA_ROOT etc.) and do a from settings_local import * at the beginning of your generic settings.py file. Of course settings.py must not overwrite these imported settings.
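One way to make sure settings.py does not overwrite the imported values is to apply defaults only for names settings_local did not define, e.g. via globals().get(). A sketch (DEBUG and MEDIA_ROOT are just example settings):

```python
# settings.py -- sketch: local overrides are imported first, then
# defaults are applied only where settings_local stayed silent.
try:
    from settings_local import *
except ImportError:
    pass

# globals().get() leaves any value already set by settings_local untouched:
DEBUG = globals().get("DEBUG", False)
MEDIA_ROOT = globals().get("MEDIA_ROOT", "/var/www/media/")
```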
Why do you need a test database? Django creates a test database automatically before running unit tests. And database routing does not fit your purpose; it's for routing read/write requests to different databases. If you want to use a development database, set up a new DATABASES config in, say, local_settings.py, and at the end of your settings.py, type
try:
    from local_settings import *
except ImportError:
    pass
There is nothing you can specify in the settings directly. The practice I use is to have additional settings files for different environments which contain just the settings I want to override, like database or cache settings. My project root application, for example, would contain the following files in a development environment (note the leading underscores):
...
settings.py
settings_dev.py
_settings_test.py
_settings_prod.py
...
Then in settings.py I would add the following lines of code to the beginning:
try:
    from settings_prod import *
except ImportError:
    try:
        from settings_test import *
    except ImportError:
        from settings_dev import *
Since I am on the dev environment, it will only import my settings_dev file, because the other two have a leading underscore.
When I deploy then to a production or testing environment I would rename the relevant files. For production: _settings_prod.py -> settings_prod.py, for testing: _settings_test.py -> settings_test.py. settings_dev.py can basically stay as is, since it will be only imported if the other two fail.
The last step you could simply automate with fabric or other deployment tools. An example with fabric would be something like run('mv _settings_prod.py settings_prod.py') for the renaming.