My app uses Python with cherrypy to provide a web service. I want to set up a different config file, depending on if my application is started on my local machine or on my remote server.
Is there any easy way to check whether my application was started on the server or on my local machine? If not, maybe I could pass a parameter when running python myApp.py, which myApp.py would then use to pick the right config? Of course an automatic solution would be nicer.
Create a local configuration file and put a variable named environment inside it. Assign it dev for your local environment, production for production, or anything else you'd like. You only set it once, then reuse it everywhere:

from local_settings import environment

if environment == 'dev':
    debug = True
    # anything else you'd like
If you're using a VCS like git, and using that to deploy, add local_settings.py to .gitignore. Local settings files are also handy for keeping sensitive data that should not be public in any VCS repo, such as API keys.
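A minimal sketch of the idea (file and variable names are just examples): local_settings.py exists only on your development machine, and the application falls back to production defaults when the file is missing.

# local_settings.py -- lives only on the dev machine and is listed in .gitignore
environment = 'dev'
API_KEY = 'my-local-test-key'

# myApp.py
try:
    from local_settings import environment
except ImportError:
    # no local_settings.py on the server, so assume production
    environment = 'production'

debug = (environment == 'dev')

The try/except keeps deployment simple: the server never needs the file at all.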
I'm trying to upload files to an S3 bucket and I'm using this tutorial. I'm setting config variables in the terminal using heroku config:set and when I enter heroku config in the terminal, my variables appear.
However, if I access S3_BUCKET in code when running locally, the value returned is None:
import os

S3_BUCKET = os.environ.get('S3_BUCKET')
print("value of S3_BUCKET:", S3_BUCKET)
This prints None.
Does anyone know why this doesn't work? The frontend of my application is in React, if that matters, but all of the bucket upload code is done in python.
heroku config:set sets variables on Heroku. As their name suggests, environment variables are specific to a particular environment, and settings set this way have no impact on your local machine.
There are several ways to set environment variables locally. One common method involves putting them into an untracked .env file. Many tools, including heroku local, read that file and add variables found there to the environment when running locally.
Heroku recommends this, though you will have to make sure whatever tooling you use picks up those variables. Aside from heroku local, Visual Studio Code understands .env files for Python projects. So does Pipenv. There are also standalone tools for this like direnv.
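If you want to load the .env file yourself rather than rely on heroku local or your editor, the python-dotenv package can do it. This is just a sketch, assuming the .env file sits next to the script:

import os
from dotenv import load_dotenv  # pip install python-dotenv

# Reads key=value lines from .env and adds them to os.environ
# (existing environment variables are not overwritten).
load_dotenv()

S3_BUCKET = os.environ.get('S3_BUCKET')
print("value of S3_BUCKET:", S3_BUCKET)

With a line like S3_BUCKET=my-bucket-name in .env (and .env listed in .gitignore), the script prints the bucket name locally, while on Heroku the same code picks up the value set with heroku config:set.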
Hi, I am relatively new to programming and am building my first Flask project. I haven't been able to figure out whether I should access environment variables via dotenv / load_dotenv or from a config.py file.
I understand the config route is more flexible, but my question is specifically about environment variables.
Is there a best practice here? [I am building a simple app that will be hosted externally]
Best practices dictate that any value which is secret should not be hard-coded into any files which persist with the project or are checked into source control. Your config file is very likely to be saved in source control, so it should not store secrets, but instead load them from the environment variables set at execution time of the app. For example, let's say you are configuring an SMTP relay:
MAIL_PORT is a value that is not secret and not likely to change so it is a good candidate to be set in your config file.
MAIL_PASSWORD is a secret value that you do not want saved in your project's repository, so it should be loaded from the host's environment variables.
In this example, your config file might contain entries that look something like this:
import os

MAIL_PORT = 465
MAIL_PASSWORD = os.environ.get('MAIL_PASSWORD')
Beyond evaluating whether or not a config value is a secret, also consider how often the value will change and how hard it would be to make that change. Anything hard-coded into your config file will require changing the file and adding a new commit to your source control, possibly even triggering a full CI/CD pipeline process. If the value were instead loaded from environment variables then this value could be changed by simply stopping the application, exporting the new value as an environment variable, and restarting the application.
Dotenv files are simply a convenience for grouping a number of variables together and auto-loading them to be read by your configuration. A .env file is not always used as these values can be manually exported when the app is invoked or handled by another system responsible for starting or scaling your application. Do not check .env or .flaskenv files into your source control.
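Putting the two pieces together, a hypothetical config.py might load a local .env file when one exists and otherwise rely on whatever the host environment provides:

import os
from dotenv import load_dotenv  # optional; only needed if you use a .env file

# No-op in production when no .env file is present -- the host's
# environment variables are used instead.
load_dotenv()

MAIL_PORT = 465                                  # not secret, safe to hard-code
MAIL_PASSWORD = os.environ.get('MAIL_PASSWORD')  # secret, comes from the environment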
I want to use a test db on my test environment, and the production db on production environment in my Python application.
How should I handle routing to two dbs? Should I have an untracked config.yml file that has the test db's connection string on my test server, and the production db's connection string on production server?
I'm using github for version control and travis ci for deployment.
Let's take a Linux environment as an example. Often, the user-level configuration of an application is placed in your home folder as a dot file. So you can do something like this:
In your git repository, track a sample configuration file, e.g., config.sample.yaml, and define the configuration structure there.
When deploying, in either the test or the production environment, copy and rename this file as a dot file, e.g., $HOME/.{app}.config.yaml. Your application then reads that file.
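For example, assuming a YAML config read with PyYAML (the file name and keys below are only illustrative), the application side could look like:

import os
import yaml  # PyYAML

# The deployed copy of config.sample.yaml, renamed to a dot file
config_path = os.path.join(os.path.expanduser('~'), '.myapp.config.yaml')

with open(config_path) as f:
    config = yaml.safe_load(f)

# Structure mirrors config.sample.yaml; 'database' is a hypothetical key
db_connection_string = config['database']['connection_string']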
If you are developing a Python package, you can have the file copy performed in setup.py. This approach has some advantages:
You can always track structural changes to your configuration file.
Configuration is kept separate between the test and production environments.
It is more secure: you do not need to put your db connection information in a publicly tracked file.
Hope this is helpful.
I have a Flask application that I want to deploy to Amazon Elastic Beanstalk, but have the following problem:
Our configuration contains a lot of secret API keys, so we don't have it checked into Git.
Elastic Beanstalk only deploys the files that are committed to Git, so our config doesn't get uploaded as part of the deployment process.
Elastic Beanstalk offers a way to set and read environment variables - but Flask typically expects its configuration to live in Python files that are imported at runtime.
Is there a way we can get our configuration file automatically uploaded to AWS EB alongside the source code when we deploy it? I considered having it in a file that gets created via configuration in .ebextensions/... but that would just need to be in Git anyway, defeating the object.
Or do we have to (a) convert all our code to read configuration from environment variables, and (b) create some sort of pre-startup script that injects the values in the current config scripts into the environment? It wouldn't be ideal to have to maintain 2 totally different ways of reading configuration data into our app depending on whether it's running on AWS or not.
I have seen this question/answer - How to specify sensitive environment variables at deploy time with Elastic Beanstalk - and I don't think it adequately addresses the issue, because it does not scale well to large numbers of config options, nor does it deal with the fact that Flask typically expects its config in a different format.
If you are building a 12-factor application, the simplest way to handle this kind of configuration is to look up environment variables by default and fall back on your hard-coded configuration:
from os import environ
from flask.config import Config
class EnvConfiguration(Config):
    """Configuration sub-class that checks the environment first.
    If no environment variable is available, it falls back to the
    hard-coded value provided."""
    def __getitem__(self, key):
        try:
            return environ[key]
        except KeyError:
            return super(EnvConfiguration, self).__getitem__(key)
You can then override either Flask.config_class (v1+) or Flask.make_config to ensure that the configuration used is the one you want:
from flask import Flask

class CustomApp(Flask):
    # 1.0+ way
    config_class = EnvConfiguration

    # 0.8+ way (you only need one of these)
    def make_config(self, instance_relative=False):
        config = super(CustomApp, self).make_config(instance_relative=instance_relative)
        return EnvConfiguration(config.root_path, defaults=config)
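Usage is then the same as with a plain Flask app; in this sketch, myapp.default_settings and SECRET_KEY are only illustrative names:

app = CustomApp(__name__)

# Hard-coded fallbacks live in an ordinary settings module
app.config.from_object('myapp.default_settings')

# If SECRET_KEY is exported in the environment it wins;
# otherwise the value from default_settings is returned.
secret_key = app.config['SECRET_KEY']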
I have a Flask-based python app that needs a bunch of configuration information (e.g. database connection parameters) on app start.
In my nginx configuration, I can provide parameters using uwsgi_param, as shown in this SO question.
However my problem is that the request.environ object is not available outside of a request handler, so I can't use it at application start. Furthermore, I'm trying to provide a mostly deployment-agnostic configuration mechanism so that I can test using Flask's embedded server during development and use parameters passed by uWSGI in production.
What's the best way to get configuration into my application so that it's accessible via os.environ or something similar?
Look at http://flask.pocoo.org/docs/config/#development-production. You can always have development, test and production configs, and you can also pull some settings from OS environment variables or from a specific file.
For example, I have a config module and import some secret settings (which I don't want to store with the source code) from another module. Then on deploy I just replace the file containing the secret settings. It is probably better to use OS environment variables for this.