Using different DBs on production and test environments - Python

I want to use a test db on my test environment, and the production db on production environment in my Python application.
How should I handle routing to two dbs? Should I have an untracked config.yml file that has the test db's connection string on my test server, and the production db's connection string on production server?
I'm using GitHub for version control and Travis CI for deployment.

Let's take a Linux environment as an example. Often, the user-level configuration of an application is placed under your home folder as a dot file. So you can do something like this:
In your git repository, track a sample configuration file, e.g., config.sample.yaml, and put the configuration structure there.
When deploying, whether to the test environment or the production environment, copy and rename this file as a dot file, e.g., $HOME/.{app}.config.yaml. Then your application reads this file (a sketch follows the list below).
If you are developing a Python package, you can have setup.py perform the file copy. There are some advantages:
You can always track changes to the structure of your configuration file.
Configuration is kept separate between the test and production environments.
It is more secure: you do not need to put your real db connection information in a publicly tracked file.
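Here is a minimal sketch of the read side, assuming the deployed dot file is $HOME/.myapp.config.yaml (the name myapp and the keys below are placeholders) and that PyYAML is installed:
import os
import yaml  # PyYAML

# Per-machine config file, copied from config.sample.yaml at deploy time.
CONFIG_PATH = os.path.join(os.path.expanduser("~"), ".myapp.config.yaml")

with open(CONFIG_PATH) as f:
    config = yaml.safe_load(f)

# Hypothetical keys; use whatever structure config.sample.yaml defines.
db_connection_string = config["database"]["connection_string"]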
Hope this is helpful.

Related

How to set up environment-specific params for each environment in Python

I am new to deploying Python code that connects to databases up through our environments. How and where would I specify the database server and database info (or any environment-specific info) for my application when it runs in our STAGE environment vs our PROD environment?
What is the best practice for doing this in a python application (the application isn't a web app or API app)?
The code is stored in an ADO repo and has a build and release pipeline that pushes the code out to each VM.
If I understand your problem correctly, you want to have several possible configurations depending on the deployment environment, without having to rework the code. If this is the case:
I recommend you look at the configparser module from the standard library. It allows you to read a configuration file with different parameters.
You can have one configuration file per environment and put your parameters in it.
For example:
In your file database.cfg
[DATABASE]
DATABASE_TYPE=mysql
USERNAME=user
PASSWORD=Password
IP_ADDRESS=127.0.0.1
PORT=3306
DATABASE_NAME=python
And in your Python program:
import configparser
config = configparser.ConfigParser()
config.read("database.cfg")
username = config["DATABASE"]["USERNAME"]
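To make this environment-specific, a common pattern is to keep one file per environment (e.g., database.stage.cfg and database.prod.cfg; those names are assumptions here) and pick the file from an environment variable that your release pipeline sets on each VM. A minimal sketch:
import configparser
import os

# APP_ENV is a hypothetical variable set per VM by the release pipeline, e.g. "stage" or "prod".
env = os.environ.get("APP_ENV", "stage")

config = configparser.ConfigParser()
config.read(f"database.{env}.cfg")

username = config["DATABASE"]["USERNAME"]
port = config["DATABASE"].getint("PORT")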

What is the best way to manage client/server specific files with git?

I use python to develop code on my work laptop and then deploy to our server for automation purposes.
I just recently started using git and github through PyCharm in hopes of making the deployment process smoother.
My issue is that I have a config file (YAML) that uses different parameters depending on the environment: my build environment (laptop) and production (server). For example, the file path changes.
Is there a git practice I could implement so that, when either pushing from my laptop or pulling from the server, changes to specific parts of a file are excluded?
I use .gitignore for files such as pyvenv.cfg but is there a way to do this within a file?
Another approach I thought of would be to utilize different branches for local and remote specific parameters...
For Example:
The local branch would contain local parameters and the production branch would contain production parameters. In this case I would push first from my laptop to the local branch. Next I would make the necessary changes to the parameters for production (in my situation it is much easier to work on my laptop than through the server), then push to the production branch. However, I have a feeling this is against good practice or simply misuses branches.
Thank you.
Config files are also a common place to store credentials (e.g., a login/password for the database, an API key for a web service...) and it is generally a good idea not to store those in the repository.
A common practice is to store template files in the repo (e.g., config.yml.sample), to not store the actual config file along with the code (even add it to .gitignore if it is in a versioned directory), and to add steps at deployment time to either set up the initial config file or update the existing one; those steps can be manual or scripted. You can back up and version the config separately if needed.
Another possibility is to take the elements that vary from somewhere else (the environment, for instance) and have entries like user: $APP_DB_USER in your config file. You then provision those values on each machine, e.g., an env.txt file on your local machine and a different one on your prod server.
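A minimal sketch of that second approach, assuming PyYAML is installed and a tracked config.yml containing placeholder entries such as user: $APP_DB_USER (the variable names are illustrative):
import os
import yaml  # PyYAML

def load_config(path="config.yml"):
    with open(path) as f:
        raw = f.read()
    # Replace $APP_DB_USER-style placeholders with values from the environment
    # before parsing the YAML.
    return yaml.safe_load(os.path.expandvars(raw))

config = load_config()
db_user = config["user"]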

Is Appengine dispatch.yaml file affecting a specific module version?

I am starting to use modules in my Python Google App Engine app.
I managed to test my configuration locally (on the dev server) and everything is working fine.
I want to test my changes on a side version online and didn't find a place that states whether the dispatch configuration will affect only my side version or my main serving version also (that's dangerous).
I know that cron.yaml is not a version specific file, how about dispatch.yaml?
Is it safe to deploy a side version with a dispatch file?
Thanks
From Configuration Files:
Optional application-level configuration files (dispatch.yaml, cron.yaml, index.yaml, and queue.yaml) are included in the top level app directory.
So no, you can't test a dispatch.yaml file change without affecting all versions of all your app's services/modules since it's an app-level configuration.
To be able to test app-level config file changes I'm using an entirely separate application as a staging environment.

Hiding settings.py passwords for Heroku Django deployment

I have sensitive data (database passwords) in settings.py, and I was advised to upload my Django project to a GitHub repository before pushing it to Heroku in their "Getting Started with Django on Heroku" guide. If I put settings.py in .gitignore, then presumably it won't get deployed with my project. How can I prevent settings.py from being exposed but still get it deployed with my project?
You can use environment variables (set with heroku config:add SECRET=my-secret) to store sensitive data and retrieve it in your settings with:
SECRET = os.environ.get('SECRET', 'my-default-secret-key')
If you don't want your app to be able to start without that data having been set, use this syntax instead:
SECRET = os.environ['SECRET']
It will raise an exception if you didn't set the SECRET environment variable.
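For database credentials specifically, the same pattern works in settings.py. A minimal sketch, assuming the variables (DB_NAME, DB_USER, DB_PASSWORD, and so on; the names are placeholders) were set with heroku config:add and that you use the PostgreSQL backend:
import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('DB_NAME', ''),
        'USER': os.environ.get('DB_USER', ''),
        'PASSWORD': os.environ['DB_PASSWORD'],  # fail loudly if the secret is missing
        'HOST': os.environ.get('DB_HOST', 'localhost'),
        'PORT': os.environ.get('DB_PORT', '5432'),
    }
}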
You should use a tool designed for factoring out sensitive data. I use YamJam (https://pypi.python.org/pypi/yamjam/). It gives you all the advantages of the os.environ method but is simpler: with environment variables you still have to set them somewhere, usually in a script or rc file. YamJam eliminates that question by storing these config settings in a config store outside of the project. This allows you to have different settings for dev, staging, and production.
from YamJam import yamjam
secret = yamjam()['myproject']['secret']
That is the basic usage. And like the os.environ method, it is not framework specific; you can use it with Django or any other app/framework. I've tried them all: multiple settings.py files, brittle if/then logic, and environment wrangling. In the end I switched to YamJam and haven't regretted it.

Check if application is running on server or local

My app uses Python with cherrypy to provide a web service. I want to set up a different config file, depending on if my application is started on my local machine or on my remote server.
Is there any easy way to check whether my application was started on the server or on my local machine? If not, maybe I could pass some parameters when running python myApp.py that myApp.py could then read? Of course an automatic solution would be nicer.
Create a local configuration file and put a variable named environment inside it. Assign it 'dev' for your local environment, 'production' for production, and anything else you'd like. Just set it once, then reuse it everywhere:
from local_settings import environment

if environment == 'dev':
    debug = True
    # anything else you'd like
If you're using any VCS like git, and using that to deploy, ignore the local_settings.py file. Local settings files are also handy for keeping sensitive data that should not be public in any VCS repo, like API keys.
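Building on that, a minimal sketch of how the flag could drive CherryPy's own configuration; local_settings.py, dev.conf, and production.conf are assumed, hypothetical file names:
import cherrypy
from local_settings import environment  # 'dev' on your laptop, 'production' on the server

# Pick a CherryPy config file based on where the app is running.
config_file = 'dev.conf' if environment == 'dev' else 'production.conf'

class App:
    @cherrypy.expose
    def index(self):
        return "running in %s mode" % environment

cherrypy.quickstart(App(), '/', config_file)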
