As of now we have a file conf.py which stores most of our configuration variables for the service. We have code similar to this:
environment = 'dev' # could be dev, local, staging, production
configa = 'something'
configb = 'something else'
if environment == 'dev':
    configa = 'something dev'
elif environment == 'local':
    configa = 'something local'
Is this the right way to manage a configuration file in a Python project? Are these configuration values loaded into variables at compile time (while creating the .pyc files), or are the if conditions checked every time the conf module is imported in a Python script, or every time a configuration variable is accessed?
All module-level code runs at import time, and a module is only evaluated once per process (later imports reuse the cached module). Since you are unlikely to import your application again and again while it's running, you can ignore the (minimal) overhead.
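A minimal sketch illustrating this, reusing the names from the question:
# conf.py
print("evaluating conf.py")  # module-level code: runs once per process
environment = 'dev'
configa = 'something dev' if environment == 'dev' else 'something'

# app.py
import conf          # first import: "evaluating conf.py" is printed
import conf          # already cached in sys.modules: nothing runs again
print(conf.configa)  # attribute access just reads the stored value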
This is subjective but there is a good discussion in this post:
What's the best practice using a settings file in Python?
With your method, the file is treated the same way as any other Python script, i.e. the if conditions run once, on import. If you want values updated on access, or without restarting the service, it is best to use an external, non-Python config file (e.g. JSON or .ini) and add functionality to re-read it.
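For example, a small sketch of that approach, assuming a conf.json file next to the code (the file name and keys are illustrative):
# conf.py
import json

CONFIG_PATH = "conf.json"   # e.g. {"environment": "dev", "configa": "something dev"}
_settings = {}

def refresh():
    # re-read the file; call this to pick up changes without restarting the service
    global _settings
    with open(CONFIG_PATH) as f:
        _settings = json.load(f)

def get(key, default=None):
    return _settings.get(key, default)

refresh()  # initial load happens once, at import time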
You can create a file, for example settings.py, and add the directory containing it to the system path.
Example:
import os
import sys

sys.path.append(os.path.dirname(__file__))
After that, you can import the file anywhere and read any setting from it:
import settings
env = settings.environment
Many frameworks work in a similar way.
Related
I have a large codebase where all settings & constants are stored inside a settings.py file, that gets imported in various places. I want to be able to import an arbitrary .yml file instead, with the filename given at run-time to the executable main.py.
It's easy to change settings so it loads a .yml file; but of course I can't pass it an arbitrary filename from main in a way that it remembers everywhere else that settings gets imported.
I tried to think of a few solutions:
Modify main so that the first thing it does is copy the arbitrary yaml file, into the default place the yaml file will be loaded from (ugly)
Import the settings in main.py, and change references from import settings to import main (circular imports)
Use os to set an environment variable SETTINGS_FILE=some_file.yml, that later gets read by the settings submodule (somewhat ugly...)
Refactor the entire codebase to pass around a settings class instead of a module (a lot of work)
I feel like I'm generally thinking of this all in a stupid way, and that there must be a better way (although I couldn't find anything by search)... what am I missing?
EDIT:
I had no idea but apparently this works...
$ cat settings.py
setting = 1
$ cat other_module.py
import settings
print(settings.setting)
$ cat main.py
import settings
print(settings.setting)
settings.setting = 2
import other_module
$ python main.py
1
2
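If the environment-variable route (option 3 in the list above) is acceptable, a rough sketch could look like the following; the SETTINGS_FILE name and the general shape are illustrative, not code from the codebase in question:
# settings.py
import os
import yaml  # pip install pyyaml

_path = os.environ.get("SETTINGS_FILE", "default_settings.yml")
with open(_path) as f:
    _data = yaml.safe_load(f) or {}

# expose the keys as module attributes so existing settings.FOO references keep working
globals().update(_data)

# main.py would set the variable before the first import of settings:
#   os.environ["SETTINGS_FILE"] = sys.argv[1]
#   import settings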
I am writing a python package which is dependent on a number of large data files. These files are not included with the package. Instead, users are required to have these files on disk already (with an arbitrary path). What is the best way to make my package aware of the location of these files?
I have been reading about setup.py and setup.cfg but I am still not sure how to do this. It seems to me that a user-editable option in setup.cfg would be a good way to go, but I don't know if it is, whether it can be done at all, or how I would do it if so...
I did see this almost identical question, Python Packaging: Ask user for variable values when (pip) installing, which focuses on asking for user input during pip install (which is discouraged in the comments). If that actually would be a good solution, I'm interested in how to do that too.
In my private development of the package, I have used module constants, as in
DEFAULT_PATH_FILE1 = "my/path/to/file1.csv"
DEFAULT_PATH_FILE2 = "my/path/to/file2.csv"
etc. and properties initialized with these constants. This doesn't seem viable at all for distribution.
What you want is not a one-time setup during install (which is also impossible with modern .whl installs), but a way for clients to configure your library at any point during runtime. Given that you don't provide a CLI, you can either use environment variables to provide that, or look for a user-defined config file.
Here is a simple recipe that uses appdirs to find out where the config file should be located. It runs on import of your package, and tells clients how bad it is if the config file isn't there. Usually, that'd be one of:
write a log message
use default settings
throw some kind of exception
a combination of the above
from logging import getLogger
from pathlib import Path
from configparser import ConfigParser
# loads .ini format files easily, just to have an example to go with
import appdirs  # needs to be pip-installed

log = getLogger(__name__)
config = ConfigParser(interpolation=None)

# load config, substitute "my_package" with the actual name of your package
config_path = Path(appdirs.user_config_dir("my_package")) / "user.ini"
try:
    with open(config_path) as f:
        config.read_file(f, source="user")
except FileNotFoundError:
    # only do whatever of the following makes sense for your package
    log.info(f"User config expected at '{config_path}', but not found.")
    config.read_string("[paths]\nfile_foo=foo\nfile_bar=bar")  # dubious
    raise ImportError(f"Can't use this module; create a config at '{config_path}'.")


class Foo:
    def __init__(self):
        with open(config["paths"]["file_foo"]) as f:
            self.data = f.read()
This sounds like runtime configuration. It's none of the business of setup.py, which is concerned with installing your package.
For app configuration, it would be common to specify this resource location by command-line argument, environment variable, or configuration file. You will usually want to either hard-code some sensible default path in case the user does not specify any configuration, or raise an exception in case of resources not existing / not found.
Example for environment var:
import os
DEFAULT_PATH_FILE1 = "/default/path/to/file1.csv"
PATH_FILE1 = os.environ.get("PATH_FILE1", DEFAULT_PATH_FILE1)
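For an application entry point (rather than a library), the command-line option mentioned above is an alternative. A rough sketch, with the flag name purely illustrative and the default repeated so the snippet is self-contained:
import argparse

DEFAULT_PATH_FILE1 = "/default/path/to/file1.csv"

parser = argparse.ArgumentParser()
parser.add_argument("--file1", default=DEFAULT_PATH_FILE1, help="path to file1.csv")
args = parser.parse_args()
PATH_FILE1 = args.file1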
I have a collection of scripts written in Python. Each of them can be executed independently. However, most of the time they should be executed one after the other, so there is a MainScript.py which calls them in the appropriate order. Each script has some configurable variables (let's call them Root_Dir, Data_Dir and LinWinFlag). If this collection of scripts is moved to a different computer, or different data needs to be processed, these variable values need to be changed. As there are many scripts this duplication is annoying and error-prone. I would like to group all configuration variables into a single file.
I tried making Config.py which would contain them as per this thread, but import Config produces ImportError: No module named Config because they are not part of a package.
Then I tried relying on variable inheritance: define them once in MainScript.py which calls all the others. This works, but I realized that each script would not be able to run on its own. To solve this, I tried adding useGlobal=True in MainScript.py and in other files:
if (useGlobal is None or useGlobal == False):
    # define all variables
But this fails when scripts are run standalone: NameError: name 'useGlobal' is not defined. The workaround is to define useGlobal and set it to False when running the scripts independently of MainScript.py. Is there a more elegant solution?
The idea is that Python wants to access files - including Config.py - primarily as parts of modules and packages.
The nice thing is that Python makes building packages really easy: you initialize one by creating an
__init__.py
file in each directory you want to act as a package, a subpackage, a sub-subpackage, and so on.
So your import should go through once you have created this file.
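For illustration, a minimal sketch of such a layout; the directory name and variable values are illustrative:
# Layout:
#   scripts/
#       __init__.py     # empty file marking the directory as a package
#       Config.py       # Root_Dir = "...", Data_Dir = "...", LinWinFlag = False
#       MainScript.py
#
# MainScript.py, run from the parent directory with: python -m scripts.MainScript
from scripts import Config

print(Config.Root_Dir)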
If you have further questions, look at the excellent Python documentation.
The best way to do this is to use a configuration file placed in your home directory (~/.config/yourscript/config.json).
You can then load the file on start and provide default values if the file does not exist:
Example (config.py):
import json
import os

default_config = {
    "name": "volnt",
    "mail": "oh#hi.com"
}

def load_settings():
    settings = dict(default_config)  # copy, so the defaults stay untouched
    try:
        with open(os.path.expanduser("~/.config/yourscript/config.json"), "r") as config_file:
            loaded_config = json.loads(config_file.read())
        for key in loaded_config:
            settings[key] = loaded_config[key]
    except IOError:  # file does not exist
        pass
    return settings
For a configuration file it's a good idea to use json and not python, because it makes it easy to edit for people using your scripts.
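Usage would then look something like this, assuming the config.py example above:
from config import load_settings

settings = load_settings()
print(settings["name"])  # "volnt" unless overridden in ~/.config/yourscript/config.json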
As suggested by cleros, the ConfigParser module seems to be the closest thing to what I wanted (a one-line statement in each file that sets up multiple variables).
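For illustration, a rough sketch of that approach; the file name and keys are illustrative:
# settings.ini
#   [paths]
#   Root_Dir = /data/project
#   Data_Dir = /data/project/input

# in each script (the module is called configparser in Python 3, ConfigParser in Python 2):
from configparser import ConfigParser

cfg = ConfigParser()
cfg.read("settings.ini")  # the "one-line statement" that pulls in all the values
Root_Dir = cfg["paths"]["Root_Dir"]
Data_Dir = cfg["paths"]["Data_Dir"]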
I am relatively new to Python. I am looking to create a "settings" module where various application-specific constants will be stored.
Here is how I am wanting to set up my code:
settings.py
CONSTANT = 'value'
script.py
import settings
def func():
    var = CONSTANT
    # do some more coding
    return var
I am getting a Python error stating:
global name 'CONSTANT' is not defined.
I have noticed that Django's source code has a settings.py file with constants named just like mine. I am confused about how they can be imported into a script and referenced throughout the application.
EDIT
Thank you for all your answers! I tried the following:
import settings
print settings.CONSTANT
I get the same error
ImportError: cannot import name CONSTANT
The easiest way to do this is to just have settings be a module.
(settings.py)
CONSTANT1 = "value1"
CONSTANT2 = "value2"
(consumer.py)
import settings
print settings.CONSTANT1
print settings.CONSTANT2
When you import a Python module, you have to prefix the variables that you pull from it with the module name. If you know exactly what values you want to use from it in a given file and you are not worried about them changing during execution, then you can do:
from settings import CONSTANT1, CONSTANT2
print CONSTANT1
print CONSTANT2
but I wouldn't get carried away with that last one. It makes it difficult for people reading your code to tell where values are coming from, and it precludes those values being updated if another client module changes them. One final way to do it is:
import settings as s
print s.CONSTANT1
print s.CONSTANT2
This saves you typing, will propagate updates, and only requires readers to remember that anything prefixed with s. comes from the settings module.
Step 1: create a new file settings.py in the same directory for easier access.
# database configuration settings
database = dict(
    DATABASE = "mysql",
    USER = "Lark",
    PASS = ""
)

# application predefined constants
app = dict(
    VERSION = 1.0,
    GITHUB = "{url}"
)
Step 2: import the settings module into your application file.
import settings as s  # s is an alias for settings; settings is the module file name, you do not have to add .py
print(s.database['DATABASE']) # should output mysql
print(s.app['VERSION']) # should output 1.0
If you do not want to use an alias like s, you can use a different syntax:
from settings import database, app
print(database['DATABASE']) # should output mysql
print(app['VERSION']) # should output 1.0
Notice that with the second import method you can use the dict names directly.
A small tip: you can import everything from the settings file by using *, in case you have a large file and will be using most of its settings in your application.
from settings import * # * represent all the code on the file, it will work like step 2
print(database['USER']) # should output Lark
print(app['VERSION']) # should output 1.0
I hope that helps.
When you import settings, a module object called settings is placed in the global namespace - and this object carries everything that was defined at the top level of settings.py as attributes. I.e. outside of settings.py, you refer to CONSTANT as settings.CONSTANT.
Leave your settings.py exactly as it is, then you can use it just as Django does:
import settings
def func():
    var = settings.CONSTANT
...Or, if you really want all the constants from settings.py to be imported into the global namespace, you can run
from settings import *
...but otherwise using settings.CONSTANT, as everyone else has mentioned here, is quite right.
See the answer I posted to Can I prevent modifying an object in Python? which does what you want (as well as forcing the use of UPPERCASE identifiers). It might actually be a better answer for this question than it was for the other.
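A rough sketch of the read-only part of that idea (not the exact code from the linked answer; the constant names are illustrative):
# settings.py
import sys

class _FrozenSettings:
    CONSTANT = 'value'
    TIMEOUT = 30

    def __setattr__(self, name, value):
        raise AttributeError("settings are read-only")

# replace the module object with a frozen instance, so assignment from client code fails
sys.modules[__name__] = _FrozenSettings()
After this, import settings followed by settings.CONSTANT works as before, while settings.CONSTANT = 2 raises AttributeError.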
This way is more efficient since it loads/evaluates your settings variables only once. It works well for all my Python projects.
pip install python-settings
Docs here: https://github.com/charlsagente/python-settings
You need a settings.py file with all your defined constants like:
# settings.py
DATABASE_HOST = '10.0.0.1'
Then you need to either set an env variable (export SETTINGS_MODULE=settings) or manually call the configure method:
# something_else.py
from python_settings import settings
from . import settings as my_local_settings
settings.configure(my_local_settings) # configure() receives a python module
The utility also supports lazy initialization for objects that are heavy to load, so when you run your Python project it starts faster, since it only evaluates a settings variable when it is needed:
# settings.py
from python_settings import LazySetting
from my_awesome_library import HeavyInitializationClass # Heavy to initialize object
LAZY_INITIALIZATION = LazySetting(HeavyInitializationClass, "127.0.0.1:4222")
# LazySetting(Class, *args, **kwargs)
Just configure once, and then read your variables wherever they are needed:
# my_awesome_file.py
from python_settings import settings
print(settings.DATABASE_HOST) # Will print '10.0.0.1'
I'm new to Python, but we can define a constant like a function.
In settings.py:
def CONST1():
    return "some value"
In main.py:
import settings
print settings.CONST1()  # read the constant value
The value can't be changed this way, but it has to be called like a function.
Try this:
In settings.py:
CONSTANT = 5
In your main file:
from settings import CONSTANT
class A:
    b = CONSTANT

    def printb(self):
        print self.b
I think the error above comes from the settings import happening too late. Make sure it's at the top of the file.
Also worth checking out is the simple-settings project, which lets you feed the settings into your script at runtime and allows for environment-specific settings (think dev, test, prod, ...).
I tend to use SQLite when doing Django development, but on a live server something more robust is often needed (MySQL/PostgreSQL, for example). Invariably, there are other changes to make to the Django settings as well: different logging locations / intensities, media paths, etc.
How do you manage all these changes to make deployment a simple, automated process?
Update: django-configurations has been released which is probably a better option for most people than doing it manually.
If you would prefer to do things manually, my earlier answer still applies:
I have multiple settings files.
settings_local.py - host-specific configuration, such as database name, file paths, etc.
settings_development.py - configuration used for development, e.g. DEBUG = True.
settings_production.py - configuration used for production, e.g. SERVER_EMAIL.
I tie these all together with a settings.py file that firstly imports settings_local.py, and then one of the other two. It decides which to load by two settings inside settings_local.py - DEVELOPMENT_HOSTS and PRODUCTION_HOSTS. settings.py calls platform.node() to find the hostname of the machine it is running on, and then looks for that hostname in the lists, and loads the second settings file depending on which list it finds the hostname in.
That way, the only thing you really need to worry about is keeping the settings_local.py file up to date with the host-specific configuration, and everything else is handled automatically.
Check out an example here.
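A sketch of the tying-together settings.py described above; the module and list names follow the description, everything else is illustrative:
# settings.py
import platform

from settings_local import *   # defines DEVELOPMENT_HOSTS and PRODUCTION_HOSTS, among others

_hostname = platform.node()
if _hostname in DEVELOPMENT_HOSTS:
    from settings_development import *
elif _hostname in PRODUCTION_HOSTS:
    from settings_production import *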
Personally, I use a single settings.py for the project; I just have it look up the hostname it's on (my development machines have hostnames that start with "gabriel"), so I just have this:
import socket
if socket.gethostname().startswith('gabriel'):
    LIVEHOST = False
else:
    LIVEHOST = True
then in other parts I have things like:
if LIVEHOST:
    DEBUG = False
    PREPEND_WWW = True
    MEDIA_URL = 'http://static1.grsites.com/'
else:
    DEBUG = True
    PREPEND_WWW = False
    MEDIA_URL = 'http://localhost:8000/static/'
and so on. A little bit less readable, but it works fine and saves having to juggle multiple settings files.
At the end of settings.py I have the following:
try:
    from settings_local import *
except ImportError:
    pass
This way if I want to override default settings I need to just put settings_local.py right next to settings.py.
I have two files. settings_base.py which contains common/default settings, and which is checked into source control. Each deployment has a separate settings.py, which executes from settings_base import * at the beginning and then overrides as needed.
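A minimal sketch of that arrangement; the setting names and values are illustrative:
# settings_base.py (checked into source control)
DEBUG = False
ALLOWED_HOSTS = []

# settings.py (one per deployment, not in source control)
from settings_base import *

DEBUG = True
ALLOWED_HOSTS = ['dev.example.com']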
The simplest way I found was to: 1) use the default settings.py for local development, and 2) create a production_settings.py starting with:
import os
from settings import *
And then just override the settings that differ in production:
DEBUG = False
TEMPLATE_DEBUG = DEBUG
DATABASES = {
    'default': {
        ....
    }
}
Somewhat related, for the issue of deploying Django itself with multiple databases, you may want to take a look at Djangostack. You can download a completely free installer that allows you to install Apache, Python, Django, etc. As part of the installation process we allow you to select which database you want to use (MySQL, SQLite, PostgreSQL). We use the installers extensively when automating deployments internally (they can be run in unattended mode).
I have my settings.py file in an external directory. That way, it doesn't get checked into source control, or over-written by a deploy. I put this in the settings.py file under my Django project, along with any default settings:
import sys
import os.path
def _load_settings(path):
    print "Loading configuration from %s" % (path)
    if os.path.exists(path):
        settings = {}
        # execfile can't modify globals directly, so we will load them manually
        execfile(path, globals(), settings)
        for setting in settings:
            globals()[setting] = settings[setting]
_load_settings("/usr/local/conf/local_settings.py")
Note: This is very dangerous if you can't trust local_settings.py.
In addition to the multiple settings files mentioned by Jim, I also tend to place two settings at the top of my settings.py file, BASE_DIR and BASE_URL, set to the path of the code and the URL of the base of the site; all other settings are modified to append themselves to these.
BASE_DIR = "/home/sean/myapp/"
e.g. MEDIA_ROOT = "%smedia/" % BASE_DIR
So when moving the project I only have to edit these settings and not search the whole file.
I would also recommend looking at Fabric and Capistrano (a Ruby tool, but it can be used to deploy Django applications), which facilitate automation of remote deployment.
Well, I use this configuration:
At the end of settings.py:
#settings.py
try:
    from locale_settings import *
except ImportError:
    pass
And in locale_settings.py:
# locale_settings.py
class Settings(object):
    def __init__(self):
        import settings
        self.settings = settings

    def __getattr__(self, name):
        return getattr(self.settings, name)

settings = Settings()

INSTALLED_APPS = settings.INSTALLED_APPS + (
    'gunicorn',
)

# Deleting the duplicate settings is maybe not needed, but I prefer to do it.
del settings
del Settings
So many complicated answers!
Every settings.py file comes with:
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
I use that directory to set the DEBUG variable like this (replace the path with the directory where your dev code is):
DEBUG = False
if BASE_DIR == "/path/to/my/dev/dir":
    DEBUG = True
Then, whenever the settings.py file is deployed anywhere else, DEBUG will be False and it's your production environment.
Every time you need different settings than the ones in your dev environment just use:
if DEBUG:
    ...  # debug settings go here
else:
    ...  # release settings go here
Why make things so complicated? I came to Django from a PHP/Laravel background. I use .env and it is easy to configure.
Install this package
django-environ
Now, in the folder where you have settings.py, create a file named .env (make sure to add this file to .gitignore).
In the .env file, put the env variables such as the debug state, the secret key, mail credentials, etc.
A snapshot of an example .env:
SECRET_KEY="django-insecure-zy%)s5$=aql=#ox54lzfjyyx!&uv1-q0kp^54p(^251&_df75i"
DB_NAME=bugfree
DB_USER=postgres
DB_PASSWORD=koushik
DB_PORT=5433
DB_HOST=localhost
APP_DEBUG=True # everything is string here
In settings.py, make sure to initialize it using this:
import environ
env = environ.Env()
environ.Env.read_env()
Now you can import values from the .env file and put them wherever you want. Some examples in settings.py
SECRET_KEY = env('SECRET_KEY')
DEBUG = env.bool('APP_DEBUG', default=False)
You can also provide a default value, like this:
env('DB_NAME', default='default value here')
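Continuing the settings.py above, a sketch of wiring the DB_* values into Django's DATABASES setting; PostgreSQL is assumed here, matching the postgres user and port in the example, so adjust the engine as needed:
# settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': env('DB_NAME'),
        'USER': env('DB_USER'),
        'PASSWORD': env('DB_PASSWORD'),
        'HOST': env('DB_HOST'),
        'PORT': env('DB_PORT'),
    }
}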
TIP
You can create a .env.example file in the same folder as the .env file, as a template of .env, and commit the .example file. It helps future developers see at a glance which env variables are needed.
.env.example would be something like this
SECRET_KEY=VALUE_HERE
DB_NAME=VALUE_HERE
DB_USER=VALUE_HERE
DB_PASSWORD=VALUE_HERE
DB_PORT=VALUE_HERE
DB_HOST=VALUE_HERE
EMAIL_HOST=VALUE_HERE
EMAIL_PORT=VALUE_HERE
EMAIL_HOST_USER=VALUE_HERE
EMAIL_HOST_PASSWORD=VALUE_HERE
DEFAULT_FROM_EMAIL=VALUE_HERE
I think it depends on the size of the site as to whether you need to step up from using SQLite; I've successfully used SQLite on several smaller live sites and it runs great.
I use an environment variable:
import os

if os.environ.get('WEB_MODE', None) == 'production':
    from settings_production import *
else:
    from settings_dev import *
I believe this is a much better approach, because eventually you need special settings for your test environment, and you can easily add it to this condition.
This is an older post, but I think adding this useful library will simplify things.
Use django-configurations.
Quickstart
pip install django-configurations
Then subclass the included configurations.Configuration class in your project's settings.py or any other module you're using to store the settings constants, e.g.:
# mysite/settings.py
from configurations import Configuration
class Dev(Configuration):
    DEBUG = True
Set the DJANGO_CONFIGURATION environment variable to the name of the class you just created, e.g. in ~/.bashrc:
export DJANGO_CONFIGURATION=Dev
and the DJANGO_SETTINGS_MODULE environment variable to the module import path as usual, e.g. in bash:
export DJANGO_SETTINGS_MODULE=mysite.settings
Alternatively supply the --configuration option when using Django management commands along the lines of Django's default --settings command line option, e.g.:
python manage.py runserver --settings=mysite.settings --configuration=Dev
To enable Django to use your configuration you now have to modify your manage.py or wsgi.py script to use django-configurations' versions of the appropriate starter functions, e.g. a typical manage.py using django-configurations would look like this:
#!/usr/bin/env python
import os
import sys
if __name__ == "__main__":
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')
    os.environ.setdefault('DJANGO_CONFIGURATION', 'Dev')

    from configurations.management import execute_from_command_line

    execute_from_command_line(sys.argv)
Notice that we don't use the usual django.core.management.execute_from_command_line but instead configurations.management.execute_from_command_line.
The same applies to your wsgi.py file, e.g.:
import os
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')
os.environ.setdefault('DJANGO_CONFIGURATION', 'Dev')
from configurations.wsgi import get_wsgi_application
application = get_wsgi_application()
Here we don't use the default django.core.wsgi.get_wsgi_application function but instead configurations.wsgi.get_wsgi_application.
That's it! You can now use your project with manage.py and your favorite WSGI enabled server.
In fact you should probably consider having the same (or almost the same) configs for your development and production environment. Otherwise, situations like "Hey, it works on my machine" will happen from time to time.
So in order to automate your deployment and eliminate those WOMM issues, just use Docker.