I want to store some system constants which do not change so frequently.
I have made a settings table in my database using Django models to store them, but this table will only ever have a single entry, and I change these settings through the Django admin.
Is there an alternate way to store some variables without having to create a database?
You want, I quote, some system constants which do not change so frequently. For this, you can define your own variables in the settings.py file (or in a separate file) and use them by importing them.
The most appropriate way would be to create a new file (for example settings_site.py) and import it into settings.py:
SETTING_1 = "/home/path/to/an/executable"
SETTING_2 = False
and then, in the settings.py:
from settings_site import *
It will make SETTING_* variables (give them useful names though) accessible in the settings of your project and you will be able to change the file even if you are using a VCS (SVN, Git...).
Otherwise, you can still implement a solution based on a configuration file, editable through a custom view, but that will require creating an application to manage it. Coupled with the cache system, though, it can be as efficient as using settings.py if you parse the file only when needed (at startup and whenever it changes).
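If you go that route, here is a minimal sketch of the idea; the CONFIG_FILE setting, the cache key and the function name are illustrative, not anything Django provides. The parsed file is kept in Django's cache and only re-read when its modification time changes:

import json
import os

from django.conf import settings
from django.core.cache import cache

def get_site_config():
    # Re-parse the file only when it has changed on disk; otherwise serve the cached copy.
    mtime = os.path.getmtime(settings.CONFIG_FILE)
    cached = cache.get('site_config')
    if cached is None or cached['mtime'] != mtime:
        with open(settings.CONFIG_FILE) as f:
            cached = {'mtime': mtime, 'data': json.load(f)}
        cache.set('site_config', cached)
    return cached['data']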
Related
I am writing an app for Django which I am planning to publish. This app requires a Boolean setting variable CONSUMER_REGISTRATION.
The aim of this variable is to decide whether to define the ConsumerRegistration model or not.
This is what I did.
from django.db import models
from django.conf import settings
if getattr(settings, 'CONSUMER_REGISTRATION', False):
    class ConsumerRegistration(models.Model):
        ...
It's working fine. The only issue I am facing is that developers will need to run the makemigrations and migrate commands each time they change the variable in settings.
1- Can this work be automated? So that if they change the variable, some code in Django automatically runs the makemigrations and migrate commands.
2- Or is it perfectly fine to leave this work to developers?
3- Also, I want to ask: is this a good approach to take in Django?
The accepted answer doesn't really provide a way to do what the OP is asking, which is to conditionally declare a model.
People may have various reasons for wanting to do this, from not declaring a model at all to declaring models differently based on settings (the implication being that if you are doing this, you intend to run the same code base in different places using different settings).
One solution is to put the model in its own app, and conditionally include the app based on a setting:
# Set this in your per-project settings:
CONSUMER_REGISTRATION = True
# Set this in the generic settings
INSTALLED_APPS = [...]
if CONSUMER_REGISTRATION:
    INSTALLED_APPS.append('consumer_registration')  # Models will be loaded.
There's nothing wrong with creating an app which just contains a model.
With regards to "automating" it, the table will be created if migrations are run when the setting is true. It will not delete the table if it is changed to false.
You could simply define the model without any conditionals and tweak your app logic so that instances of the ConsumerRegistration model are only interacted with (i.e. created, updated etc.) when the CONSUMER_REGISTRATION flag is set to True.
Running migrations every single time the value of CONSUMER_REGISTRATION changes would make much more of a mess than leaving the ConsumerRegistration table empty.
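For illustration, a hedged sketch of that gating: the model stays unconditionally defined, and only the application code checks the flag (the view name below is made up, not part of the OP's app):

# views.py -- illustrative sketch; the model is defined unconditionally in models.py
from django.conf import settings
from django.http import Http404

from .models import ConsumerRegistration

def register_consumer(request):
    if not getattr(settings, 'CONSUMER_REGISTRATION', False):
        raise Http404  # feature disabled for this deployment
    registration = ConsumerRegistration.objects.create()
    # ... render a response as usual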
As indicated by #dahrens, you could isolate the ConsumerRegistration model along with relevant logic in a separate app, which would only be installed as needed by developers.
I am trying to combine multiple projects into one existing Django project.
The directory should be set up as follows
directory picture
Is it possible to use this directory layout without multiple databases and config files - everything in a single Django instance?
If so, how?
The problem is, I can't reach the modules in my main project's urls.py. They can't be found.
thanks :)
Why would you want to do this? If you need two different django projects, keep them as different django projects.
It is different if you want to use the same database and reuse some of your existing apps.
For the first, you can point the settings.py of each project to a common database; you can even manage to share only some tables in a common database and keep the rest in a separate database for each project (there are some limitations with that approach, though). Check the Django multidb docs for more info.
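As a rough illustration, each project's settings.py could point its default connection at the same database (the backend name is real, the database name and credentials below are placeholders):

# settings.py of both projects -- credentials below are placeholders
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'shared_db',
        'USER': 'shared_user',
        'PASSWORD': 'secret',
        'HOST': 'db.example.com',
        'PORT': '5432',
    },
}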
For the second, you can create a folder containing your Django apps (with their model definitions, views, admin and whatever you need) and add them to settings.py. An example:
import sys

APPS_PATH = "/django/apps/folder/"
sys.path.insert(0, APPS_PATH)

INSTALLED_APPS = (
    ...
    'custom_app1',
    'custom_app2',
)
You may also want to check django sites.
Hope it helps.
I am relatively new to Django and one thing that has been on my mind is changing the database that will be used when running the project.
By default, the DATABASES 'default' is used to run my test project. But in the future, I want to be able to define a 'production' DATABASES configuration and have it use that instead.
In a production environment, I won't be able to "manage.py runserver" so I can't really set the settings.
I read a little bit about "routing" the database to use another database, but is there an easier way so that I won't need to create a new router every time I have another database I want to use (e.g. I can have test database, production database, and development database)?
You can just use a different settings.py in your production environment.
Or - which is a bit cleaner - you might want to create a file settings_local.py next to settings.py where you define a couple of settings that are specific for the current machine (like DEBUG, DATABASES, MEDIA_ROOT etc.) and do a from settings_local import * at the beginning of your generic settings.py file. Of course settings.py must not overwrite these imported settings.
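For example, settings_local.py might contain only the machine-specific values (everything below is a placeholder):

# settings_local.py -- machine-specific values only
DEBUG = True
MEDIA_ROOT = '/var/www/example/media/'
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': '/home/me/dev.sqlite3',
    },
}

# settings.py, first line
from settings_local import *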
Why do you need a test database? Django creates a test database automatically before running unit tests. And database routing does not fit your purpose; it is for routing your read/write requests to different databases. If you want to use a development database, set up a new DATABASES config in, say, local_settings.py, and at the end of your settings.py add:
try:
    from local_settings import *
except ImportError:
    pass
There is nothing you can specify directly in the settings for this. The practice I use is to have additional settings files for different environments, containing just the settings I want to overwrite, such as database or cache settings. My project root application, for example, would contain the following files on a development environment (note the leading underscores):
...
settings.py
settings_dev.py
_settings_test.py
_settings_prod.py
...
Then in settings.py I would add the following lines of code to the beginning:
try:
    from settings_prod import *
except ImportError:
    try:
        from settings_test import *
    except ImportError:
        from settings_dev import *
Since I am in the dev environment, only my settings_dev file will be imported, because the others have a leading underscore.
When I deploy to a production or testing environment, I rename the relevant file. For production: _settings_prod.py -> settings_prod.py; for testing: _settings_test.py -> settings_test.py. settings_dev.py can basically stay as is, since it will only be imported if the other two fail.
The last step you could simply do with automated deployment via Fabric or other tools. An example with Fabric would be something like run('mv _settings_prod.py settings_prod.py') for the renaming.
I'm writing my own CMS and have a situation where some initial settings information should be written to the database. I don't like the idea of writing an XML/JSON file with fixture data to be imported when I run syncdb.
What I'm thinking is that I could create some cms_init.py file and run it before manage.py syncdb. In this file I would need to set up the environment, and after that, using the models, I could write my custom data to the database.
Another way is to have a method on the admin side, for example initialize(), and a URL for it. It would store some variable so it never runs a second time, and in this function I would just call the models I need, and that's it.
The reason I'm looking for a solution is that I need dynamic initial data writing, which will depend on settings.py and on other modules' settings each time, and I don't want to rewrite the database's initial data file every time I start a new project.
Any ideas?
Don't do it automatically/implicitly, make the user do it explicitly with a manage.py command, e.g.
python manage.py your_cms init
When running your CMS functionality, check (or rather try-except) whether the initialization has taken place, and if not, remind the user to run the init command first.
Doing it explicitly is safer and not a big inconvenience for the user since it must only be done once.
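A minimal sketch of such a command, assuming the app is named your_cms and that it keeps its initial data in a hypothetical SiteConfig model (both names are illustrative); registered this way it would run as python manage.py cms_init:

# your_cms/management/commands/cms_init.py -- illustrative names only
from django.conf import settings
from django.core.management.base import BaseCommand

from your_cms.models import SiteConfig

class Command(BaseCommand):
    help = "Write the initial CMS settings to the database (run once)."

    def handle(self, *args, **options):
        if SiteConfig.objects.exists():
            self.stdout.write("Already initialized, nothing to do.")
            return
        # Derive the initial rows from settings.py (or other modules' settings) here.
        SiteConfig.objects.create(name='site_title', value=getattr(settings, 'SITE_TITLE', ''))
        self.stdout.write("Initial CMS data written.")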
In Django, settings are stored in a file, settings.py. This file is part of the code, and goes into the repository. It is only the developers who deal with this file. The admin deals with the models, the data in the database. This is the data that the non-development staff edits, and the site visitors see rendered in templates.
The thing is, our site, and many others, have lots of settings options that should be edited by non-developer staff. We're talking about stand-alone site-wide constants that really have no place in the database. Putting them in the database will result in numerous pointless queries. Caching could alleviate that, but that seems unnecessarily complex to handle what can be done with a single line in the settings.py file.
I did notice this dbsettings app, but it is old and unmaintained. I also noticed that the django e-commerce app, Satchmo, includes a use-case specific fork of this dbsettings app. We could build something similar into our site, an app that stores some settings as key/value pairs in a single database table, but it just really seems like the wrong approach. Why put something in the DB that doesn't belong there just to make it more easily editable by non-developers?
We have a list of site-wide settings on our Django site that we want to be editable by non-developer administrators. What is the best way of going about this?
Something like dbsettings (as you mentioned) seems like the way to go. From that project's stated reasons for existence:
Not all settings belong in settings.py, as it has some particular limitations:
Settings are project-wide. This not only requires apps to clutter up settings.py, but also increases the chances of naming conflicts.
Settings are constant throughout an instance of Django. They cannot be changed without restarting the application.
Settings require a programmer in order to be changed. This is true even if the setting has no functional impact on anything else.
If dbsettings doesn't work for you, then implement your own, or fork it. It doesn't seem like it'd be too arduous.
I'm actually a big fan of dbsettings, and keep meaning to publish my fork that patches it to work with Django 1.1 (not actually a big change). Looks like someone has updated it already.
However, you're probably right that this is overkill for what you need. One thing I've done before is to add a line to the end of settings.py that imports and parses a YAML file. YAML is a simple markup language, which at its most basic is just KEY: VALUE pairs:
CONSTANT1: MyValue
CONSTANT2: Anothervalue
If you put this somewhere editors can access it, then at the end of settings.py you just do:
import yaml
try:
    globals().update(yaml.safe_load(open('/path/to/my/yaml/file.yml')))
except Exception:
    pass
You'll need the Python YAML library (PyYAML) to parse the YAML file.
The downside to this approach is that you'll need to restart Apache to get it to pick up the changes.
Edited to add: It wouldn't be particularly difficult to build a front end which could edit this file, and provide a button which runs a script to restart Apache.
If you must avoid server restarts, then a logical place for the settings is the database, as Dominic and Daniel said, but you'll need to invalidate the cached settings object every time it is updated.
Looks like it's possible to re-set values in the cache with Django's low level cache API. All you want should be achievable with these calls:
from django.core.cache import cache

cache.set('settings', local_settings)
cache.add('settings', local_settings)
local_settings = cache.get('settings')
cache.delete('settings')
How about putting a sitesettings.py (or whatever) somewhere that your admins can access, then in settings.py do
from sitesettings import *
That seems good and simple, but I may have misunderstood or oversimplified your problem :)
models.py
from django.db import models

class Setting(models.Model):
    """Global settings for app"""
    name = models.CharField(max_length=100, null=False, blank=False)
    value = models.CharField(max_length=100, null=False, blank=False)

    def __str__(self):
        return self.name
admin.py
from django.contrib import admin

from YOUR_APP.models import Setting

class SettingAdmin(admin.ModelAdmin):
    list_display = (
        'name',
        'value',
    )

admin.site.register(Setting, SettingAdmin)
extras.py
from django import template
from YOUR_APP.models import Setting
register = template.Library()

@register.filter
def get_setting(name):
    setting = Setting.objects.get(name=name)
    return setting.value
template.html
{% if 'setting_name'|get_setting == 'true' %}
Display your Feature
{% endif %}
Django Packages has a page that lists packages that provide such functionality - most query the database and then use caching to minimize hits on the DB.
I found Django Dynamic Preferences to be of particular interest due to the fine-grained control it gives you over the configuration.
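Most of those packages follow the same query-then-cache pattern. A hedged sketch of the idea, reusing the Setting model from the answer above together with Django's low-level cache API (the helper name and timeout are illustrative):

# Cache-backed lookup to avoid a database query on every template render.
from django.core.cache import cache

from YOUR_APP.models import Setting

def get_setting_cached(name, timeout=300):
    key = 'setting:%s' % name
    value = cache.get(key)
    if value is None:
        value = Setting.objects.get(name=name).value
        cache.set(key, value, timeout)
    return value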