Set environment variables in GAE control panel - python

I deploy my project to GAE via GitHub. There are some third-party API keys that I don't want to store in the repository and make public. Is it possible to set an environment variable for a project in the GAE control panel so that I can read it in my application?

You can store your keys in the Datastore. Later, when you need them in the code, you can fetch them from the Datastore and cache them with Memcache.

I prefer using the Datastore for keys like this. See the code in my answer at Securely storing environment variables in GAE with app.yaml.
That code auto-generates placeholder values that you can then update from the Developers Console. It also uses the ndb library, so reading the keys is fast.
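A minimal sketch of that pattern, assuming a hypothetical Settings model (the names here are illustrative, not taken from the linked answer):

from google.appengine.ext import ndb

class Settings(ndb.Model):
    # One entity per named key; edit the values from the Developers Console.
    value = ndb.StringProperty()

def get_setting(name):
    # ndb automatically caches gets in memcache, so repeated reads are fast.
    entity = ndb.Key(Settings, name).get()
    if entity is None:
        # Auto-generate a placeholder so the entity shows up in the console.
        Settings(id=name, value='NOT SET').put()
        raise ValueError('Setting %s not found; set it in the console.' % name)
    return entity.value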

You can define environment variables in the configuration file for your App Engine application. In the case of Python, that file is app.yaml:
env_variables:
  MY_ENV_VAR: 'some_value'
You can find more details here.
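In your application code you can then read the value back at runtime; a short sketch using the variable name from the snippet above:

import os

my_value = os.environ.get('MY_ENV_VAR')  # 'some_value' once deployed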
At the moment there is no such thing as project parameters that can be defined in the Developers Console.


App Engine local environment shows incorrect data

I just started using the Google Cloud SDK Shell after using the older, GUI-based version. I have multiple projects under development, if that matters.
Here's what I do:
run gcloud SDK shell (click on the icon!)
cd \myproject
dev_appserver.py app.yaml
In the browser (Chrome),
browse to http://localhost:8000/datastore
Under Datastore Viewer, I see 'tables' from a completely different project (say, myotherproject)
Under Datastore Indexes, I see 'indexes' from the correct project (myproject)
Under Task Queues, I see the correct queues listed (I have different queues set up for parts of myproject)
Everything works fine for myotherproject. So, is there something I am missing to get the Datastore Viewer to show the correct 'tables'?
Many thanks, David
Edit: no matter what project I run, Datastore Viewer shows the same data (from myotherproject) but Datastore Indexes show the correct indexes.
Edit: Windows 8.1, Python v2.7.13:a06454b1afa1
Edit: further questions: 1) does the gcloud SDK use a different datastore from the original App Engine SDK? 2) if so, where is it by default, or do I have to define it upfront?
Thanks to everyone for their help with this. It appears the gcloud SDK uses one datastore for all projects, so --datastore_path is not really optional when you have multiple projects. However, I kept getting errors with --datastore_path, so I went with the following...
dev_appserver.py --storage_path=c:\gcdata\projectname app.yaml
Yes, it could have been c:\temp, but this gives me separate 'databases', one for each project.
Note also that the gcloud SDK does not use the same data as the original App Engine SDK, grrrrrr!
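For reference, separate invocations for the two projects mentioned above might look like this (the paths are just examples); each invocation then keeps its own datastore under its own storage path:

dev_appserver.py --storage_path=c:\gcdata\myproject app.yaml
dev_appserver.py --storage_path=c:\gcdata\myotherproject app.yaml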

Pass environment variables to GAE instance

I'm using GAE to deploy my app and I have some environment variables that I would like to pass to my GAE instance. For example, every time I use the DB, the unix socket assignment is currently like this:
unix_socket = ('<my_path_to_my_sockets>/{instance}'
               .format(instance=properties.get_property('data.instance')))
That is fine, but the problem is that this is shared code: every time someone runs a local test, they change the path and push the change to the repository. Then, when someone else pulls the new changes, they have to change the path again in order to make DB requests, because everyone has a different path to their sockets. So I've written the following statement:
unix_socket = (os.environ['CLOUDSQL_INSTANCES_DIR']
               if 'CLOUDSQL_INSTANCES_DIR' in os.environ
               else '/home/cpinam/cloudsql/') + properties.get_property('cloud.instance')
So, if someone has that environment variable set on their system, the code uses it instead of the absolute path. The problem is that this environment variable is read not from my computer but from the GAE instance.
The question is: how can I use my own environment variable instead of the server instance's? Is it possible?
PS: I know that I can pass environment variables through the app.yaml file, but that implies modifying the file.
Thank you
App Engine does not support what you want in the way that you want it.
There are a few alternative approaches that you might want to consider. It sounds like your primary constraint is wanting to enable individual developers to store alternative configuration to what they might find in production. You might want to consider allowing developers to store said configuration in their local Datastore instances.
Your code would look something like:
import os

if os.getenv('SERVER_SOFTWARE', '').startswith('Google App Engine/'):
    unix_socket = os.environ['CLOUDSQL_INSTANCES_DIR']
    ...
else:
    unix_socket = your_configuration_type.get("CLOUDSQL_INSTANCES_DIR")
    ...
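As a hedged sketch, your_configuration_type above could be a small module backed by the local Datastore, for instance with a hypothetical ndb model like this:

from google.appengine.ext import ndb

class DevConfig(ndb.Model):
    # A per-developer setting, stored only in the local dev Datastore.
    value = ndb.StringProperty()

def get(name, default=None):
    entity = ndb.Key(DevConfig, name).get()
    return entity.value if entity else default

Each developer can then create their own CLOUDSQL_INSTANCES_DIR entity via the local admin console without touching the shared code.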
Another alternative would be the approach outlined here, where you'd store the relevant env vars in your own version of client_secrets.json and make sure that file was listed in .gitignore.
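A minimal sketch of that approach, assuming a hypothetical load_secret helper and a client_secrets.json kept next to the module (and listed in .gitignore):

import json
import os

SECRETS_PATH = os.path.join(os.path.dirname(__file__), 'client_secrets.json')

def load_secret(name, default=None):
    # The file is git-ignored, so every developer keeps their own local copy.
    try:
        with open(SECRETS_PATH) as f:
            return json.load(f).get(name, default)
    except IOError:
        return default

unix_socket = load_secret('CLOUDSQL_INSTANCES_DIR', '/home/cpinam/cloudsql/')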
This can currently be done using the original App Engine SDK's appcfg.py command for deployment. As far as I know, it is not possible with gcloud-based App Engine deployment.
You can define a default environment variable in your app.yaml file:
env_variables:
  HOST_NAME: ''
Then pass your environment variable using the -E option of the appcfg.py command:
-E NAME:VALUE. Example: -E HOST_NAME:WOAH
From the option's description: set an environment variable, potentially overriding an env_variables value from the app.yaml file (the flag may be repeated to set multiple variables).
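Put together, a deployment might look like this (assuming app.yaml sits in the current directory):

appcfg.py -E HOST_NAME:WOAH update app.yaml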

How do I run a Django 1.6 project with multiple instances running off the same server, using the same db backend?

I have a Django 1.6 project (stored in a Bitbucket Git repo) that I wish to host on a VPS.
The idea is that when someone purchases a copy of the software I have written, I can type in a few simple commands that will take a designated copy of the code from Git, create a new instance of the project with its own subdomain (e.g. <customer_name>.example.com), and create a new Postgres database (on the same server).
I should hopefully be able to create and remove these 'instances' easily.
What's the best way of doing this?
I've looked into writing scripts using some combination of Supervisor/Gunicorn/Nginx/Fabric etc. Other options could be something more serious like using Docker or Vagrant. I've also looked into various PaaS options.
Thanks in advance.
(EDIT: I have looked at the following services/things: Dokku (can't use Heroku due to data constraints), Vagrant (inc Puppet), Docker, Fabfile, Deis, Cherokee, Flynn (under dev))
If I were doing it (and I did a similar thing with a PHP application I inherited), I'd have a Fabric command that lets me provision a new instance.
This could be broken up into the requisite steps (check out the code, create the database, run syncdb/migrate, create the DNS entry, start the web server).
I'd probably do something sane like using the DNS entry as the database name, or at least a reversible function to derive one from the other.
You could then string these together to easily create a new instance.
You will also need a way to tell the newly created instance which database and domain name it needs to use. You could have the provisioning script write some data to a file in the checked-out repository that is then read by Django during its initialisation phase, as in the sketch below.
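A rough sketch of such a Fabric task (Fabric 1.x; the repo URL, paths, and file name are all hypothetical):

import json

from fabric.api import local

def provision(customer):
    # Derive the domain and database name from the customer name (a reversible mapping).
    domain = '%s.example.com' % customer
    target = '/srv/%s' % customer
    local('git clone git@bitbucket.org:me/myproject.git %s' % target)
    local('createdb %s' % customer)  # Postgres database named after the customer
    # Tell the new instance which database and domain to use; Django reads
    # this file during its initialisation phase.
    with open('%s/instance.json' % target, 'w') as f:
        json.dump({'db': customer, 'domain': domain}, f)
    local('cd %s && python manage.py syncdb --migrate' % target)
    # ...then create the DNS entry, write the nginx/supervisor config, and reload.

You would invoke it as fab provision:somecustomer and string further tasks onto it as needed.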

Hiding settings.py passwords for Heroku Django deployment

I have sensitive data (database passwords) in settings.py, and Heroku's "Getting Started with Django on Heroku" guide advised me to upload my Django project to a GitHub repository before pushing it to Heroku. If I put settings.py in .gitignore, then presumably it won't get deployed with my project. How can I prevent settings.py from being exposed but still get it deployed with my project?
You can use environment variables (set with heroku config:add SECRET=my-secret) to store sensitive data and retrieve them in your settings with:
SECRET = os.environ.get('SECRET', 'my-default-secret-key')
If you don't want to be able to start your app without having set up some data, use this syntax instead:
SECRET = os.environ['SECRET']
It will raise an exception if you didn't set the SECRET environment variable.
You should use a tool designed for factoring out sensitive data. I use YamJam (https://pypi.python.org/pypi/yamjam/). It gives you all the advantages of the os.environ method but is simpler: with os.environ you still have to set those environment variables, which means putting them in a script or rc file somewhere. YamJam eliminates that question by keeping the config settings in a config store outside of the project. This also allows you to have different settings for dev, staging, and production.
from YamJam import yamjam
secret = yamjam()['myproject']['secret']
That's the basic usage. Like the os.environ method, it is not framework-specific; you can use it with Django or any other app/framework. I've tried them all: multiple settings.py files, brittle if/then logic, environment wrangling. In the end I switched to YamJam and haven't regretted it.
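For context, YamJam reads its config store from ~/.yamjam/config.yaml by default, so the lookup above would match a store like this (the value is a placeholder):

myproject:
    secret: my-secret-value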

Is it possible to deploy one GAE application from another GAE application?

In order to redeploy a GAE application, I currently have to install the GAE deployment tools on the system that I am using for deployment. While this process is relatively straightforward, the deployment is a manual process that does not work from behind a firewall, and the deployment tools must be installed on every machine that will be used for updating GAE apps. A more ideal solution would be if I could update a GAE application from another GAE application that I have deployed previously. This would remove the need to have multiple systems configured to deploy apps.
Since the GAE deployment tools are written in Python and App Engine supports Python, is it possible to modify appcfg.py to work from within GAE? The use case would be to pull a project from GitHub or some other online repository and update one GAE application from another. If this is not possible, what is the limiting constraint?
Is it possible? Yes. The protocol appcfg uses to update apps is entirely HTTP-based, so there's absolutely no reason you couldn't write an app that's capable of deploying other apps (or redeploying itself - self-modifying code)! You may even be able to reuse large parts of appcfg.py to do it.
Is it easy? Probably not. It's quite likely you'll need to understand a decent chunk of appcfg's internals and the RPCs it uses to upload new apps - not a trivial undertaking. You'll also need to store your credentials in the app, in all likelihood - though you can use a role account that is an admin only for the apps it's deploying, to minimize the risk there.
One limiting constraint could be the protocol that the python sdk uses to communicate with the GAE servers. If it only uses HTTP, you might be OK. but if it's anything else, you might be out of luck because you can't open a socket directly from within GAE.
What problems did you have when trying to update from behind a firewall?
I ran into some, but I finally managed to work around them.
As for your question, the constraint is that you cannot write files inside a GAE app, so even though you could pull from the VCS, you couldn't write the pulled files anywhere.
So you would have to run the update from outside GAE in the first place.
In any case, every machine that needs to update the GAE app should have the SDK installed anyway, just to check that the changes work.
So, if you really want to do this, you have two alternatives:
Host your own "updater" site and install the SDK there; when you want to update, log into your site (or run a script) and do the remote update.
Although I don't know Amazon EC2 well, I think you can do pretty much the same thing as option 1 from there.
Finally, I think the password for updating always has to be typed in. (You could take the App Engine SDK and modify it, since it is open source.)
