I've been exposed to both of these tools (autoenv and dotenv), but they seem to serve the same purpose. My question is: are they different, and if so, how?
In my research it seems that autoenv is global in scope while dotenv is a bit more application-specific. While this seems like an advantage in many cases, I wonder if it could also create unforeseen problems.
Second, what would be the pros and cons of using one over the other (or should I use each in different situations)?
I've read through the documentation for each, but have been unable to find an article comparing the two. I've only recently developed a stronger grasp of environment variables in general, so apologies if I missed something obvious in the documentation.
I'm primarily developing web apps with Flask and deploying on Heroku, if that would influence my choice.
Thanks in advance.
autoenv is meant for the CLI: it activates environment variables when you cd into a directory containing an .env file.
E.g. if you need some environment variables in your local development environment whenever you cd into the directory, you would use autoenv or the more mature alternative, direnv.
dotenv is used in Python to find an .env file in the running directory or its parent directories and load its variables. This is good for services, as they usually don't have a shell running.
So for your Heroku deployments you should use dotenv.
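A minimal sketch of that flow, assuming the python-dotenv package and a made-up variable name:

# Contents of a .env file in the project root, for example:
#   MY_ENVIRONMENT_VARIABLE=some-value
from os import getenv
from dotenv import load_dotenv

load_dotenv()  # searches the current directory and its parents for .env
print(getenv('MY_ENVIRONMENT_VARIABLE'))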
If, however, you are putting config vars straight into the Heroku settings, you don't need either; you would simply use os.getenv:
from os import getenv
print(getenv('MY_ENVIRONMENT_VARIABLE'))
I have already searched the web about this, but the answers don't really seem to apply to my case.
I have 3 different config files - Dev, Staging, Prod (of course)
I want to modularize settings properly without repetition. So, I have made base_settings.py and I am importing it into dev_settings.py, stg_settings.py, etc.
Problem - How to invoke the scripts on each env properly with minimal changes?
Right now, I'm doing this (taking the dev env as an example):
python manage.py runserver --settings=core.dev_settings
This works so far, but I am not convinced it is a good workaround.
Because wsgi.py and a couple of other services have:
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'core.settings')
I am looking to do something without changing the config files of other services. Thank you everyone in advance.
PS - I've tried to be as clear as possible, but please excuse me if anything is unclear.
Just set the DJANGO_SETTINGS_MODULE environment variable to your desired settings module.
That won't require you to change any of the other services' config files, and you don't even need to change the Django settings files.
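A small illustration of why this works, using only the stdlib (the module names are the ones from the question):

import os

# wsgi.py calls setdefault, which leaves DJANGO_SETTINGS_MODULE alone if it
# is already set in the environment. So exporting e.g.
# DJANGO_SETTINGS_MODULE=core.dev_settings before launching the process wins,
# and the wsgi.py line itself needs no changes.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'core.settings')
print(os.environ['DJANGO_SETTINGS_MODULE'])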
Have a look at the Django Configurations package.
I know this issue has been discussed before, but I am struggling to find a straightforward explanation of how to approach configuration between a local development and a production server.
What I have done so far: I had one my_app_config.py file with machine / scenario (test vs. production) sections I could just comment out. I would develop with my local machine path hardcoded, a test database connection string, my test spreadsheet location, etc. When it comes time to deploy the code to the server, I comment out the "test" section and uncomment the "production" section. As you may guess, this is fraught with errors.
I recently adopted the Python ConfigParser library to use .ini files. Now, I have the following lines in my code
import os
import ConfigParser  # configparser in Python 3

config = ConfigParser.RawConfigParser()
config.read(os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'settings', 'my_app_config.ini')))
database_connect_string_admin = config.get('Database', 'admin_str')
The problems with this are many...
I need to have the import at the top of every file
The filename my_app_config.ini can't change. So, I rely on comments within the content of the .ini file to know which one I'm dealing with. They are stored in a folder tree so I know which is which.
Notice the path to the config file is defined here, so where the Python file lives in the tree structure dictates whether I get a copy/paste error.
I tried to set environment variables at the beginning of the program, but all the imports for all modules are performed right away at code launch. I was getting "not found" errors left and right.
What I want: to understand how to keep all the configuration stored in one place, where it is not easy to lose track of what I am doing. I want an easy way to keep these configuration files (ideally one file or script) under version control (security is a whole other issue, I digress). I want to be able to seamlessly switch contexts (local-test, local-production, serverA-test, serverA-production, serverB-test, serverB-production). My app uses:
my_app_config.ini read by my parser
uwsgi.ini read by the uwsgi application server emperor
web_config.py used by the flask application
nginx.conf symlinked to the web server's configuration
celery configuration
not to mention different paths for everything (ideally handled within the magic config handling genie). I imagine once I figure this out I will be embarrassed it took so long to grasp.
Are environment variables the right tool for what I am trying to do here?
You have to try simple-settings. It will resolve all your issues. You select the settings modules with an environment variable:
in development
$ export SIMPLE_SETTINGS=settings.general,settings.development
$ python app.py
in production
$ export SIMPLE_SETTINGS=settings.general,settings.production
$ python app.py
You can keep development.py and production.py out of the repository for security reasons.
Example
settings/general.py
SIMPLE_CONF = 'simple'
app.py
from simple_settings import settings
print(settings.SIMPLE_CONF)
The documentation describes many more features and benefits.
I have a python/django project that I've set up for development and production using git revision control. I have three settings.py files:
-settings.py (which has dummy variables for a potential open source project),
-settings_production.py (for production variables), and
-settings_local.py (to override settings just for my local environment). This file is not tracked by git.
I use this method, which works great:
try:
    from settings_production import *
except ImportError, e:
    print 'Unable to load settings_production.py:', e

try:
    from settings_local import *
except ImportError, e:
    print 'Unable to load settings_local.py:', e
HOWEVER, I want this to be an open source project. I've set up two git remotes, one called 'heroku-production' and one called 'github-opensource'. How can I set it up so that 'heroku-production' includes settings_production.py while 'github-opensource' doesn't, so that I can keep those settings private?
Help! I've looked at most of the resources on the internet, but they don't seem to address this use case. Is this the right way? Is there a better approach?
The dream would be to be able to push my local environment to either heroku-production or github-opensource without having to mess with the settings files.
Note: I've looked at the setup where you use environment variables or don't track the production settings, but that feels overly complicated. I like to see everything in front of me in my local setup. See this method.
I've also looked through all these methods, and they don't quite seem to fit the bill.
There's a very similar question here. One of the answers suggests git submodules
which I would say are the easiest way to go about this. This is a problem for your VCS, not your Python code.
I think using environment variables, as described on the Two Scoops of Django book, is the best way to do this.
I'm following this approach and I have an application running out of a private GitHub repository in production (with an average of half a million page views per month), staging and two development environments and I use a directory structure like this:
MyProject/
    settings/
        __init__.py
        base.py
        production.py
        staging.py
        development_1.py
        development_2.py
I keep everything that's common to all the environments in base.py and then make the appropriate changes in production.py, staging.py, development_1.py or development_2.py.
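As a rough sketch (the override values are made up, not taken from my actual settings):

# production.py: pull in everything common from base.py, then override.
from base import *  # Python 2 implicit relative import; use "from .base import *" on Python 3

DEBUG = False            # illustrative override
TEMPLATE_DEBUG = DEBUG   # illustrative override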
My deployment process for production includes virtualenv, Fabric, upstart, a bash script (used by upstart), gunicorn and Nginx. I have a slightly modified version of the bash script I use with upstart to run the test server; it is something like this:
#!/bin/bash -e
# starts the development server using environment variables and django-admin.py
PROJECTDIR=/home/user/project
PROJECTENV=/home/user/.virtualenvs/project_virtualenv
source $PROJECTENV/bin/activate
cd $PROJECTDIR
export LC_ALL="en_US.UTF-8"
export HOME="/home/user"
export DATABASES_DEFAULT_NAME_DEVELOPMENT="xxxx"
export DATABASES_DEFAULT_USER_DEVELOPMENT="xxxxx"
export DATABASES_DEFAULT_PASSWORD_DEVELOPMENT="xxx"
export DATABASES_DEFAULT_HOST_DEVELOPMENT="127.0.0.1"
export DATABASES_DEFAULT_PORT_DEVELOPMENT="5432"
export REDIS_HOST_DEVELOPMENT="127.0.0.1:6379"
django-admin.py runserver --pythonpath=`pwd` --settings=MyProject.settings.development_1 0.0.0.0:8006
Notice this is not the complete story and I'm simplifying to make my point. I have some extra Python code in base.py that takes the values from these environment variables too.
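That extra code is roughly along these lines (a sketch, not my exact code; the variable names match the exports in the script above, and PostgreSQL is assumed given port 5432):

# In base.py: read required settings from the environment.
import os
from django.core.exceptions import ImproperlyConfigured

def get_env_variable(name):
    try:
        return os.environ[name]
    except KeyError:
        raise ImproperlyConfigured('Set the %s environment variable' % name)

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': get_env_variable('DATABASES_DEFAULT_NAME_DEVELOPMENT'),
        'USER': get_env_variable('DATABASES_DEFAULT_USER_DEVELOPMENT'),
        'PASSWORD': get_env_variable('DATABASES_DEFAULT_PASSWORD_DEVELOPMENT'),
        'HOST': get_env_variable('DATABASES_DEFAULT_HOST_DEVELOPMENT'),
        'PORT': get_env_variable('DATABASES_DEFAULT_PORT_DEVELOPMENT'),
    }
}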
Play with this and make sure to check the relevant chapter in Two Scoops of Django. I was also using the import approach you mentioned, but having settings out of the repository wasn't easy to manage, so I made the switch a few months ago; it's helped me a lot.
I'm currently developing a Python project which is growing and I may implement it as a webapp in GAE in the future.
As the project is growing I'm pruning potentially reusable code into separate packages & modules, which at present are on my PYTHONPATH.
Do you have any advice on how to structure my project, and reusable packages so that it will fit nicely into a GAE project in the future?
The recommendations on GAE project structure in other posts (such as this, this and this) seem fairly flat - is that the best way to go?
What about 3rd party packages/modules? Is it best to bite the bullet and use VirtualEnv from the beginning?
Thanks very much.
Prembo.
Just put your various libraries in packages off the root directory of your app. The root directory is automatically added to your app's sys.path.
If you wish, you can put them in a lib directory off the root, but you'll have to write a module that adds that directory to the end of sys.path, and import it before you import anything from lib.
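A minimal sketch of such a module (the filename fix_path.py is made up):

# fix_path.py: import this module before importing anything from lib/.
import os
import sys

LIB_DIR = os.path.join(os.path.dirname(__file__), 'lib')
if LIB_DIR not in sys.path:
    sys.path.append(LIB_DIR)  # appended to the end, so app-root packages take precedence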
Using virtualenv is an option, but I personally don't think it gains you much, since you can't run a virtualenv in production, and the dev_appserver emulates the production environment.
I can't tell you about GAE in particular, but I can tell you that biting the bullet has nothing to do with it - using VirtualEnv (and virtualenvwrapper) will make your Python development smoother, simpler, and easier all around.
The overhead is low, the benefits are many.
Switch. Now.
My master's thesis as a student was implemented in App Engine. The project is open source and you can use it however you like; I hope you will get the idea and can adapt it to your needs.
The GAE Python SDK creates its own virtual environment when running on your local machine, so virtualenv won't help you much. There are frameworks like web2py and django-nonrel that work well with GAE, if you're up to porting your code, or at least want to take inspiration from their folder structure.
I'm running Mac OS X Leopard and want to know an easy way to set up a web development environment with Python, MySQL and Apache on my machine, which would allow me to develop on my Mac and then easily move it to a host in the future.
I've been trying to get mod_wsgi installed and configured to work with Django and have a headache now. Are there any web hosts that currently use mod_wsgi besides Google, so I could just develop there?
FWIW, we've found virtualenv [http://pypi.python.org/pypi/virtualenv] to be an invaluable part of our dev setup. We typically work on multiple projects that use different versions of Python libraries etc. It's very difficult to do this on one machine without some way to provide a localized, customized Python environment, as virtualenv does.
Most Python applications are moving away from mod_python. It can vary by framework or provider, but most development effort is going into mod_wsgi.
Using the WSGI standard will make your Python application server-agnostic, and allow for other nice additions like WSGI middleware. Other providers may only offer CGI (which won't scale well performance-wise) or FastCGI.
I've worked with Django using only the server included in the manage.py script and have not had any trouble moving to a production environment.
If you put your application in a host that does the environment configuration for you (like WebFaction) you should not have problems moving from development to production.
I run a Linux virtual machine on my Mac laptop. This allows me to keep my development environment and production environments perfectly in sync (and make snapshots for easy experimentation / rollback). I've found VMWare Fusion works the best, but there are free open source alternatives such as VirtualBox if you just want to get your feet wet.
I share the source folders from the guest Linux operating system on my Mac and edit them with the Mac source editor of my choosing (I use Eclipse / PyDev because the otherwise excellent TextMate doesn't deal well with Chinese text yet). I've documented the software setup for the guest Linux operating system here; it's optimized for serving multiple Django applications (including geodjango).
For extra added fun, you can edit your Mac's /etc/hosts file to make yourdomainname.com resolve to your guest Linux box's internal IP address, giving you a simple way to work on / test multiple web projects online or offline without too much hassle.
What you're looking for is mod_python. It's an Apache module that embeds a Python interpreter. Check it out here:
http://www.modpython.org/
Google App Engine has done it for you. Some limitations but it works great, and it gives you a path to hosting free.
Of course Mac OS X, in recent versions, comes with Python and Apache. However you may want to have more flexibility in the versions you use, or you may not like the tweaks Apple has made to the way they are configured. A good way to get a more generic set of tools, including MySQL, is to install them anew. This will help your portability issues. The frameworks can be installed relatively easily with one of these open source package providers.
Fink
MacPorts
MAMP
mod_wsgi is really, really simple.
Pyerweb is a really simple (~90 lines including comments/whitespace) WSGI-compliant routing framework I wrote. Basically, the WSGI API is just a function that gets passed environ and wsgi_start_response, and returns a string.
environ is a dict with the request info; for example, environ['PATH_INFO'] is the request URI.
wsgi_start_response is a callable which you execute to set the status and headers:
wsgi_start_response(output_response, output_headers)
output_response is the string containing the HTTP status you wish to send (200 OK etc), and output_headers is a list-of-tuples containing your headers (for example, [("Content-type", "text/html")] would set the content-type)
Then the function returns a string containing your output. That's all there is to it!
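Putting those pieces together, a minimal application could look like this (a sketch; the function name is made up, and it returns a list of strings, the conventional WSGI body form):

# A tiny WSGI application: environ in, status/headers via the callable, body out.
def my_wsgi_function(environ, start_response):
    path = environ.get('PATH_INFO', '/')  # the request URI
    start_response('200 OK', [('Content-type', 'text/html')])
    return ['<h1>You requested %s</h1>' % path]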
To run it using spawning, you can just do spawn scriptname.my_wsgi_function_name and it will start listening on port 8080.
To use it via mod_wsgi, its documentation is good: http://code.google.com/p/modwsgi/wiki/QuickConfigurationGuide and there is a Django-specific section.
The upside to using mod_wsgi is that it's the standard for serving Python web applications. I recently decided to play with Google App Engine, and was surprised when Pyerweb (which I linked to at the start of this answer) worked perfectly on it, completely unintentionally. I was even more impressed when I noticed Django applications run on it too. Standardisation is a good thing!
You may want to look into web2py. It includes an administration interface to develop via your browser. All you need in one package, including Python.
Check out WebFaction—although I don't use them (nor am I related to / profit from their business in any way). I've read over and over how great their service is and particularly how Django-friendly they are. There's a specific post in their forums about getting up and running with Django and mod_wsgi.
Like others before me in this thread, I highly recommend using Ian Bicking's virtualenv to isolate your development environment; there's a dedicated page in the mod_wsgi documentation for exactly that sort of setup.
I'd also urge you to check out pip, which is basically a smarter easy_install which knows about virtualenv. Pip does two really nice things for virtualenv-style development:
Knows how to install from source control (SVN, Git, etc...)
Knows how to "freeze" an existing development environement's requirements so that you can create that environment somewhere else—very very nice for multiple developers or deployment.