I have an AWS config file that my boto3 session has access to, via the AWS_CONFIG_FILE environment variable.
The config file looks like this (it's a multi-account environment):
[profile profile1]
credential_source = Environment
region = us-east-whatever
role_arn = arn:aws:iam::<ACCOUNT NUMBER 1>:role/all-profiles-same-role-name

[profile profile2]
credential_source = Environment
region = us-east-whatever
role_arn = arn:aws:iam::<ACCOUNT NUMBER 2>:role/all-profiles-same-role-name

[profile profileN]
credential_source = Environment
region = us-east-whatever
role_arn = arn:aws:iam::<ACCOUNT NUMBER N>:role/all-profiles-same-role-name
In my Python code, I am trying to set up RefreshableCredentials (from botocore) using something like this (excluding the full code because I think the issue is mainly about parsing the AWS config file):
def __get_session_credentials(self):
    # hard-code one role_arn for now, but this needs to be made configurable
    session_ttl = 3000
    aws_role_arn = "arn:aws:iam::<ACCOUNT NUM>:role/all-profiles-same-role-name"
    ...
Can I somehow parse the "role_arn" from the config file on a per-profile basis to make that function more extensible? How would I do that?
You can use the configparser module from the standard library:
import configparser
from pathlib import Path


def main():
    path_to_config = Path(Path.home(), ".aws", "config")
    parser = configparser.ConfigParser()
    parser.read(path_to_config)
    for profile in parser.sections():
        if "role_arn" in parser[profile]:
            print(
                "Found profile", profile, "with role_arn", parser[profile]["role_arn"]
            )


if __name__ == "__main__":
    main()
I'm not going to share the output here, though ;-)
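If you also want the ARNs keyed by profile name, so the credentials function can look up the right one, here is a minimal sketch along the same lines (it honors AWS_CONFIG_FILE as in the question; the dict name is just my choice):

import configparser
import os
from pathlib import Path

# Honor AWS_CONFIG_FILE (as in the question), falling back to ~/.aws/config
config_path = os.environ.get("AWS_CONFIG_FILE", Path(Path.home(), ".aws", "config"))

parser = configparser.ConfigParser()
parser.read(config_path)

# Map e.g. "profile1" -> "arn:aws:iam::<ACCOUNT NUMBER 1>:role/all-profiles-same-role-name"
role_arns = {}
for section in parser.sections():
    if "role_arn" in parser[section]:
        name = section[len("profile "):] if section.startswith("profile ") else section
        role_arns[name] = parser[section]["role_arn"]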
I think that your problem could be solved if you configure a new environment variable, AWS_PROFILE.
According to the boto3 docs[1]:
AWS_PROFILE
The default profile to use, if any. If no value is specified, Boto3 attempts to search the shared credentials file and the config file for the default profile.
And the aws-cli docs[2] (just for reference):
AWS_PROFILE
Specifies the name of the AWS CLI profile with the credentials and options to use. This can be the name of a profile stored in a credentials or config file, or the value default to use the default profile.
If defined, this environment variable overrides the behavior of using the profile named [default] in the configuration file. You can override this environment variable by using the --profile command line parameter.
So, just set this environment variable to the desired profile (e.g. profileN).
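For example, a minimal sketch (profile2 is just one of the profiles from the question; boto3.Session(profile_name=...) is the per-session alternative if you'd rather not rely on the environment variable):

import os
import boto3

# Option 1: select the profile via the environment variable
os.environ["AWS_PROFILE"] = "profile2"
session = boto3.Session()

# Option 2: select the profile explicitly for this session
session = boto3.Session(profile_name="profile2")

# Either way, boto3 resolves credential_source/role_arn from that profile
# in the file pointed to by AWS_CONFIG_FILE
print(session.client("sts").get_caller_identity()["Arn"])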
I am trying to store my API Keys in a .env file
I created the file (using the "File containing settings for editor" file type) and stored my API keys:
TWILIO_ACCOUNT_SID=***
TWILIO_AUTH_TOKEN=***
TWIML_APPLICATION_SID=***
TWILIO_API_KEY=***
TWILIO_API_SECRET=***
I installed decouple, then imported and used config to retrieve my API tokens in my settings.py file:
from decouple import config
...
TWILIO_ACCOUNT_SID = config(TWILIO_ACCOUNT_SID)
TWILIO_AUTH_TOKEN = config(TWILIO_AUTH_TOKEN)
TWIML_APPLICATION_SID = config(TWIML_APPLICATION_SID)
TWILIO_API_KEY = config(TWILIO_API_KEY)
TWILIO_API_SECRET = config(TWILIO_API_SECRET)
However, I am getting the error message:
TWILIO_ACCOUNT_SID = config(TWILIO_ACCOUNT_SID)
NameError: name 'TWILIO_ACCOUNT_SID' is not defined
You don't need to use the decouple library to read your environment variables.
First, download the .env plugin support for PyCharm (if that's what you're using):
https://www.codestudyblog.com/cs2112pyc/1224021812.html
This will allow you to set and get variables from your file. Make sure your configuration has the correct .env file set.
My .env file has the variable set to:
TWILIO_ACCOUNT_SID=SUPER SECRET KEY
Then all you need to do is:
import os
twilio_key = os.environ.get('TWILIO_ACCOUNT_SID')
print(twilio_key)
>>>SUPER SECRET KEY
Process finished with exit code 0
The instructions for authentication can be found here: https://gspread.readthedocs.io/en/latest/oauth2.html#for-end-users-using-oauth-client-id
Step 7 in the authentication sequence says:
"Move the downloaded file to ~/.config/gspread/credentials.json. Windows users should put this file to %APPDATA%\gspread\credentials.json"
Is anyone aware of a way to keep the credentials.json file somewhere else and use it to authorize gspread?
Alternatively, I have thought about using shutil.move to grab the JSON file and move it to the desired location, but I need to do that without making assumptions about the whereabouts of the Python library, or even whether it is on a Windows or Unix machine. Are there any environment variables that would reveal the location of certain libraries? I could do something like this:
import gspread, os, shutil

loc = gspread.__location__
cred_path = os.path.join(loc, "credentials.json")
if not os.path.isfile(cred_path):
    shutil.move(input("Enter creds path:"), cred_path)
Found the solution to my own question. This function points all the relevant gspread defaults to your directory of choice, where the credentials.json file (and the authorized_user.json file) should be kept:
import os
import gspread.auth as ga


def gspread_paths(dir):
    ga.DEFAULT_CONFIG_DIR = dir
    ga.DEFAULT_CREDENTIALS_FILENAME = os.path.join(
        ga.DEFAULT_CONFIG_DIR, 'credentials.json')
    ga.DEFAULT_AUTHORIZED_USER_FILENAME = os.path.join(
        ga.DEFAULT_CONFIG_DIR, 'authorized_user.json')
    ga.DEFAULT_SERVICE_ACCOUNT_FILENAME = os.path.join(
        ga.DEFAULT_CONFIG_DIR, 'service_account.json')
    ga.load_credentials.__defaults__ = (ga.DEFAULT_AUTHORIZED_USER_FILENAME,)
    ga.store_credentials.__defaults__ = (ga.DEFAULT_AUTHORIZED_USER_FILENAME, 'token')
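A minimal usage sketch (the directory is just an example path; gspread.oauth() will then look for credentials.json there):

import gspread

# gspread_paths() is the function defined above
gspread_paths('/secure/location/gspread')  # example directory, adjust as needed
gc = gspread.oauth()
sh = gc.open("My spreadsheet")  # hypothetical sheet name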
EDIT: Recently there was a merged pull request that added this functionality to gspread: https://github.com/burnash/gspread/pull/847
[plugin_jira]
maxuser=
finduser=
endpoint="https://nepallink.atlassian.net/rest/api/latest/user/search?startAt=0&maxResults={maxi}&username={manche}%".format(maxi=maxresult,manche=user_find)
This is my config file. In the endpoint item, the reason I am using format is so that I can pass variables to it in my script. The script where it is used is below, my main.py:
maxresult = config.get('plugin_jira', 'maxuser')
user_find = config.get('plugin_jira', 'finduser')
endpoint = config.get('plugin_jira', 'endpoint')
What I am confused about is that when I read the endpoint value in the script, it just fetches what is in the config file literally, without the variable values defined just above it.
How can I get the values of maxresult and user_find inserted into the endpoint that is read from the config?
In your config file, import the variables from main.py as follows.
Config file:
from main import maxresult, user_find
[plugin_jira]
endpoint="https://nepallink.atlassian.net/rest/api/latest/user/search?startAt=0&maxResults={maxi}&username={manche}%".format(maxi=maxresult,manche=user_find)
This will allow you to pass the required variables to your endpoint.
Hope it helps!
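If the config file is read with configparser, the value comes back as a literal string, so another approach (a sketch assuming the section and key names from the question) is to keep the placeholders literal in the file and format them in main.py after reading:

import configparser

# The config file keeps the template literal, e.g.:
# [plugin_jira]
# maxuser = 50
# finduser = jdoe
# endpoint = https://nepallink.atlassian.net/rest/api/latest/user/search?startAt=0&maxResults={maxi}&username={manche}%

config = configparser.ConfigParser(interpolation=None)  # interpolation=None avoids '%' interpolation errors
config.read('config.ini')  # hypothetical file name

maxresult = config.get('plugin_jira', 'maxuser')
user_find = config.get('plugin_jira', 'finduser')
# Fill in the placeholders after reading the raw template
endpoint = config.get('plugin_jira', 'endpoint').format(maxi=maxresult, manche=user_find)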
I have a Python project where I am using a Twilio number (201-282-1111), but I do not want this private number to be in my public Git repository. I could make a new repo and just not include the values, but I would like it to be easy enough for non-techie people to use my script.
My question is, what can I do to mask the fake number provided below? Also, I have Twilio account_sid and auth_token that I would like to mask as well.
client.sms.messages.create(
    body="Your song was not found",
    to=phoneNum,
    from_="+2012821111")
account_sid = "XX1a116061fe67df9ab9eb0bc3c7ed0111"
auth_token = "1x11bc1b6555c52a93762f46d45861xx"
client = TwilioRestClient(account_sid,auth_token)
Use a config file. On one of my projects, this is what I did:
Install ConfigParser.
Create a file named config (or another name, your choice) that looks like this:
[Config]
account_sid=<your sid>
auth_token=<token>
phone_number=<your number>
Edit your .gitignore file and add config to it; this will keep Git from committing your config file to the repository.
[optional] Add a config.sample file, which is a copy of the config file but with your values set to defaults or dummy values. Commit this one to Git. This helps other people set up your app later - all they have to do is copy the file to config and put in their credentials.
To access the config values, use this code:
import ConfigParser

config = ConfigParser.RawConfigParser()
config.read('config')
try:
    your_value = config.get("Config", "your_value")
    print your_value
except ConfigParser.NoOptionError:
    print "'your_value' isn't defined in the config file!"
Another option would be to use an environment variable or to simply ask the user for their credentials when your app starts.
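For the environment-variable option, a minimal sketch (the variable names are just placeholders I chose; the import path matches the older twilio library used in the question):

import os
from twilio.rest import TwilioRestClient  # import path for the older twilio-python used above

account_sid = os.environ["TWILIO_ACCOUNT_SID"]      # raises KeyError if not set
auth_token = os.environ["TWILIO_AUTH_TOKEN"]
phone_number = os.environ.get("TWILIO_NUMBER", "")  # optional, with a default

client = TwilioRestClient(account_sid, auth_token)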
In trying to find a place to store and save settings beyond settings.py and the database, I used an environment.json for environment variables. I import these in settings.py.
My problem is that when I try to change or store new values in my environment, env, settings.py does not notice the change - perhaps because of when and how many times settings.py is read by Django.
Is there a way I would be able to use my environment variables the way I want like attempted below?
# settings.py
import json

with open('/home/dotcloud/environment.json') as f:
    env = json.load(f)

EMAIL_PORT = env.get('EMAIL_PORT', '500')
# views.py
import json
import os

def site_configuration(request):
    with open('/home/dotcloud/environment.json') as f:
        env = json.load(f)
    if request.method == 'POST':
        os.environ['EMAIL_PORT'] = request.POST['email_port']
    return render(request, ...)
# python manage.py shell demo
>>> import json
>>> with open('/home/dotcloud/environment.json') as f:
... env = json.load(f)
...
>>> project_settings.EMAIL_PORT
'500'
>>> env['EMAIL_PORT']
Traceback (most recent call last):
File "<console>", line 1, in <module>
KeyError: 'EMAIL_PORT'
>>> env['EMAIL_PORT'] = "123"
>>> env['EMAIL_PORT']
'123'
>>> project_settings.EMAIL_PORT
'500'
>>> project_settings.EMAIL_PORT == env['EMAIL_PORT']
False
And if not, how else could I store changeable settings that are retrieved by settings.py somewhere in my Django project?
You might want to look into foreman (GitHub) or honcho (GitHub). Both of these look for a .env file in your current directory from which to load local environment variables.
My .env looks like this for most projects (I use dj-database-url for database configuration):
DATABASE_URL=sqlite://localhost/local.db
SECRET_KEY=<a secret key>
DEBUG=True
In your settings.py file, you can load these settings from os.environ like this:
import os
DEBUG = os.environ.get('DEBUG', False)
If there are required settings, you can assert their presence before trying to set them:
assert 'SECRET_KEY' in os.environ, 'Set SECRET_KEY in your .env file!'
SECRET_KEY = os.environ['SECRET_KEY']
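Since the .env above sets DATABASE_URL, the database can be configured from it too; a short sketch using dj-database-url (mentioned above):

import dj_database_url

# Reads the DATABASE_URL environment variable that foreman/honcho loaded from .env
DATABASES = {
    'default': dj_database_url.config(default='sqlite://localhost/local.db'),
}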
I've been using this method of handling local settings for the last few projects I've started and I think it works really well. One caveat is to never commit your .env to source control. These are local settings that exist only for the current configuration and should be recreated for a different environment.
I see the question has changed slightly; the original answer is still below, but the updated question calls for a slightly different answer:
First, make sure you are using the right settings.py (print 'This file is being loaded' should do the trick).
Second, personally I would advise against using JSON files for config since they are less dynamic than Python files, but it should work regardless.
My recommended way of doing something like this:
create a base_settings.py file with your standard settings
create a settings.py which will be your default settings import. This file should have a from base_settings import * at the top to inherit the base settings.
If you want to have a custom settings file, dotcloud_settings.py for example, simply add a from base_settings import * (or from settings import *) at the top of it and set the environment variable DJANGO_SETTINGS_MODULE to dotcloud_settings or your_project.dotcloud_settings, depending on your setup.
Do note that you should be very careful with importing Django modules from these settings files. If any module does a from django.conf import settings it will stop parsing your settings after that point.
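A minimal sketch of that layout (file names from the steps above; the settings values themselves are placeholders):

# base_settings.py
DEBUG = False
EMAIL_PORT = 25

# settings.py -- the default entry point
from base_settings import *
DEBUG = True

# dotcloud_settings.py -- selected with DJANGO_SETTINGS_MODULE=dotcloud_settings
from base_settings import *
EMAIL_PORT = 587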
As for using json files, roughly the same principle of course:
Once again, make sure you don't have anything that imports django.conf.settings here
Make all of the variables within your json file global to your settings file:
import json

with open('/home/dotcloud/environment.json') as f:
    env = json.load(f)

# A little hack to make all variables within our env global
globals().update(env)
Regardless though, I'd recommend turning this around and letting the settings file import this module instead.
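A sketch of that inversion (module and file names here are just examples): keep the JSON loading in one small module and import it from settings.py:

# env_loader.py (example name): loads environment.json exactly once
import json

with open('/home/dotcloud/environment.json') as f:
    env = json.load(f)

# settings.py
from env_loader import env

EMAIL_PORT = env.get('EMAIL_PORT', '500')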
Also, Django doesn't listen to environment variables by default (besides DJANGO_SETTINGS_MODULE), so that might be the problem too.