We're running a pylons app with multiple ini files (production, staging, development, etc). When a new setting is added that can be the same in all environments, it would be great to be able to set it once in some sort of master configuration that gets included with all .ini files. Or included via some other way to load centralized config as well as deploy-specific config.
It doesn't look like there's an "import" syntax for pylons ini files. What's the best way to achieve this type of config compositing for pylons, if any?
You can use the ConfigParser module.
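For example, here is a minimal sketch of the idea (the file names common.ini and development.ini, and the section/option names, are assumptions): ConfigParser can read several files in one call, and values from files listed later override values from earlier ones.
from ConfigParser import ConfigParser   # configparser in Python 3

config = ConfigParser()
# read the shared settings first, then the deployment-specific file;
# anything defined in development.ini overrides the same option in common.ini
config.read(['common.ini', 'development.ini'])

debug = config.get('app:main', 'debug')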
Related
In this answer, I was told that the logger only needs to be declared once, globally.
Along similar lines, how should program configuration settings be loaded and used? Whether I use Dynaconf or simply yaml.safe_load, what is the best practice for storing and loading configuration data?
Example folder structure:
main.py
|
|--diskOperations
|      __init__.py
|      fileAndFolderOps.py
|
|--timer
       __init__.py
       timers.py
       stopwatch.py
Functionality needed:
- Some program settings need to be loaded in main.py.
- The classes in fileAndFolderOps.py of the diskOperations package need to load some settings.
- The classes in timers.py and stopwatch.py of the timer package need to load some settings.
- The settings need to be segregated into settings that are loaded during development, testing, production, etc.
Questions:
1. What are the best practices for creating and storing the config files? Should a separate config file be created and stored in each package's folder, and does each .py file load the configuration settings so that they become accessible to all the classes in that file?
2. If there is a setting to be used globally across the entire program, is it best to load it in main.py, put it into some ConfigurationHandler class, and pass that class instance to all instances of the classes in every package's files?
3. How best to switch between development, test and production modes? Dynaconf at least has an environment parameter. What about other config libraries? I'm asking about the best practice when settings need to be used across multiple packages.
4. To ensure that the variables in the classes match the configurations in the config files, is it recommended to have a "generator" function in each class that auto-generates a configuration file with default values if the config file does not exist?
Perhaps it'd also help to point us to any open source project that uses config files and settings in an ideal way (it helps to know about the 12 factor app guide).
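One common pattern (not the only one) is a single shared config module at the project root that reads an environment-specific file once, and that every package imports; the file names, the APP_ENV variable and the YAML keys below are assumptions for illustration.
# config.py, next to main.py
import os
import yaml

_env = os.environ.get('APP_ENV', 'development')   # development / testing / production

with open(os.path.join(os.path.dirname(__file__), 'settings.%s.yaml' % _env)) as f:
    settings = yaml.safe_load(f)

# diskOperations/fileAndFolderOps.py (or timer/timers.py) then simply does:
#     from config import settings
#     base_folder = settings['diskOperations']['base_folder']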
I have a publicly hosted repository on GitHub which I use to manage updates to scripts on my server. I would like my scripts to pick up some sensitive arguments automatically, but I don't want those arguments to be public.
My thought is to .gitignore a config file holding my sensitive arguments and copy that file over manually when installing. Alternatively, I was thinking of including an encrypted config file in the GitHub repository and manually supplying the hash as an environment variable on my server.
What is the best practice to achieve the outcome? Am I missing something completely? Any info or advice would be appreciated.
Adding a config file to .gitignore and manually copying it over is a commonly used approach and is fine. I'd say the most common approach is to use environment variables, although you'd still have to manually configure them on your server. Here's a short article with some good examples on these approaches.
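As a sketch of the environment-variable approach (the variable names here are made up), the script reads the secrets at runtime so nothing sensitive ever lands in the repository:
import os

# set once on the server, e.g. in ~/.bashrc or a systemd unit:
#   export MYAPP_API_TOKEN=...
#   export MYAPP_DB_PASSWORD=...
api_token = os.environ['MYAPP_API_TOKEN']           # fails loudly if missing
db_password = os.environ.get('MYAPP_DB_PASSWORD')   # or None if unset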
I know this issue has been discussed before, but I am struggling to find a straightforward explanation of how to approach configuration between a local development and a production server.
What I have done so far: I had one my_app_config.py file with machine/scenario (test vs. production) sections I could just comment out. I would develop with my local machine path hardcoded, a test database connection string, my test spreadsheet location, etc. When it came time to deploy the code to the server, I commented out the "test" section and uncommented the "production" section. As you may guess, this is fraught with errors.
I recently adopted the Python ConfigParser library to use .ini files. Now, I have the following lines in my code
import os
import ConfigParser

config = ConfigParser.RawConfigParser()
config.read(os.path.abspath(os.path.join(
    os.path.dirname(__file__), '..', 'settings', 'my_app_config.ini')))
database_connect_string_admin = config.get('Database', 'admin_str')
The problems with this are many...
I need to have the import at the top of every file
The filename my_app_config.ini can't change. So, I rely on comments within the content of the .ini file to know which one I'm dealing with. They are stored in a folder tree so I know which is which.
Notice the path to the config file is hardcoded here, so where a given Python file lives in the tree structure dictates whether I get a copy/paste error.
I tried to set environment variables at the beginning of the program, but all the imports for all modules are performed right away at code launch. I was getting "not found" errors left and right.
What I want: to keep all the configurations stored in one place, so it's hard to lose track of what I am doing. I want an easy way to keep these configuration files (ideally one file or script) under version control (security is a whole other issue, I digress). I want to be able to seamlessly switch contexts (local-test, local-production, serverA-test, serverA-production, serverB-test, serverB-production). My app uses
my_app_config.ini read by my parser
uwsgi.ini read by the uwsgi application server emperor
web_config.py used by the flask application
nginx.conf symlinked to the web server's configuration
celery configuration
not to mention different paths for everything (ideally handled by the magic config-handling genie). I imagine once I figure this out I will be embarrassed it took so long to grasp.
Are environment variables what I need here?
You should try `simple-settings`. It will resolve all your issues. You select which settings modules to load via an environment variable:
in development
$ export SIMPLE_SETTINGS=settings.general,settings.development
$ python app.py
in production
$ export SIMPLE_SETTINGS=settings.general,settings.production
$ python app.py
You can keep `development.py` and `production.py` out of the repository for security reasons.
Example
settings/general.py
SIMPLE_CONF = 'simple'
app.py
from simple_settings import settings
print(settings.SIMPLE_CONF)
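A development override might then look like this (values are just examples; with simple-settings, modules listed later in SIMPLE_SETTINGS override values from earlier ones, and the settings/ directory needs an __init__.py so the modules are importable):
# settings/development.py
SIMPLE_CONF = 'simple-dev'   # overrides the value from settings/general.py
DEBUG = True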
The documentation describes many more features and benefits.
I am following a step-by-step tutorial blog on the Flask micro framework for Python.
I bumped into an issue when they asked me to set up a configuration file in the root of the application folder, so it can easily be accessed if needed.
They called it config.py.
My question is, if the local path to my application is /home/test/application1/, should I create the file inside the ./application1/ directory? What gets me confused in this somewhat obvious question is that a quick search for other config.py files inside /home/test/application1/ turned up 4 other files. They were in the following directories:
/home/test/application1/flask/lib/python2.7/site-packages/flask/testsuite/config.py
/home/test/application1/flask/lib/python2.7/site-packages/flask/config.py
/home/test/application1/flask/local/lib/python2.7/site-packages/flask/testsuite/config.py
/home/test/application1/flask/local/lib/python2.7/site-packages/flask/config.py
So should I create a new config.py file in the directory that I first mentioned, or should I add some lines to one of the previously created config.py files?
Here is the source of the step-by-step tutorial:
It is at the beginning, right after Configuration.
Unlike other frameworks, Flask does not have a lot of rules; in general, you can implement things in whatever way makes sense to you.
But note that the other config.py files that you found are all in the virtual environment, and are all scripts that come with Flask. They have nothing to do with the application configuration.
I wrote the tutorial you are following. In the tutorial I put config.py outside of the application package. This is what I like; I consider the configuration separate from the application. My thinking is that you should be able to run the same application with different configuration files, so that, for example, you can have a production configuration, a testing configuration and a development configuration, all different.
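In your case that would be a new /home/test/application1/config.py, not any of the files inside the virtual environment. A minimal sketch of one way to wire that up (the app package name and the setting names are assumptions; the tutorial's actual code may differ):
# /home/test/application1/config.py  -- project root, outside the application package
SECRET_KEY = 'change-me'   # example values only
DEBUG = True

# /home/test/application1/app/__init__.py
from flask import Flask

app = Flask(__name__)
app.config.from_object('config')   # imports config.py and copies its UPPERCASE settings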
I hope this helps.
I'm fairly new to Python and Django, and I'm working on a webapp that will run on multiple servers. Each server has its own little configuration details (commands, file paths, etc.) that I would like to store in a settings file, with a different copy of the file on each system.
I know that in Django, there's a settings file. However, is that only for Django-related things? Or am I supposed to put this type of stuff in there too?
This page discusses yours and several other situations involving configurations on Django servers: http://code.djangoproject.com/wiki/SplitSettings
It's for any type of settings, but it's better to put local settings in a separate file so that version upgrades don't clobber them. Have the global settings file detect the presence of the local settings file and then either import everything from it or just execfile() it.
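A common way to do that last part is a guarded import at the bottom of settings.py (local_settings.py is a conventional but arbitrary name, kept out of version control):
# at the end of settings.py
try:
    from local_settings import *   # per-server overrides: paths, commands, DB credentials, ...
except ImportError:
    pass   # no local overrides on this machine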
There isn't a standard place to put config files. You can easily create your own config file as needed though. ConfigParser might suit your needs (and is a Python built-in).
Regardless of the format I use for my config file, I usually have the scripts that depend on settings get the config file path from an environment variable (using bash):
export MY_APP_CONFIG_PATH=/path/to/config/file/on/this/system
and then my scripts pick up the settings like so:
import os
path_to_config = os.environ['MY_APP_CONFIG_PATH']
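From there the rest is whatever parser you chose; for example with ConfigParser (the section and option names below are made up):
import ConfigParser   # configparser in Python 3

config = ConfigParser.RawConfigParser()
config.read(path_to_config)
db_host = config.get('database', 'host')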