Cloudwatch API: How to use multiple AWS accounts - python

I'm trying to get the CPU utilization for the EC2 instances in an account. My code looks like the following:
def GetRegions():
    # returns an array of region names
    ...

def getEC2InstanceID(RegionName):
    cloudwatch = boto3.client('cloudwatch', region_name=RegionName)
    response = cloudwatch.get_metric_statistics(
        ...
    )
    # returns an array of EC2 instance IDs
    ...

def EC2_Average_Utilization(InstanceID, RegionName):
    # returns the average CPU usage
    ...

def main():
    regions = GetRegions()
    for i in range(len(regions)):
        print(regions[i])
        instance_id = getEC2InstanceID(regions[i])
        print(instance_id)  # prints all the instances, if there are any
        if type(instance_id) == list:
            for j in range(len(instance_id)):
                print(instance_id[j])
                print("For InstanceID " + instance_id[j] + ":")
                EC2_Average_Utilization(instance_id[j], regions[i])
This code executes perfectly for all the regions under a single account. If I want to do the same thing for multiple AWS accounts, what is the procedure?
N.B. I've seen the approach of configuring .aws/config and creating multiple profiles (one per account) in .aws/credentials, but since I'm generating the regions in the code, I don't want to specify them there.

You will need to use a boto3 Session object, the Security Token Service (STS), and a call to assume_role for each account/region combination. The effect is the same as a named profile: you need a role in each account with adequate permissions to call the API methods (EC2, CloudWatch, etc.), and the target roles need a trust relationship back to the original account credentials.
sts = boto3.client('sts')
# This is called with your default credentials. Target roles need to trust this identity.
creds = sts.assume_role(RoleArn='...', RoleSessionName='...')

# Set up a session with the temporary credentials
session = boto3.Session(
    aws_access_key_id=creds['Credentials']['AccessKeyId'],
    aws_secret_access_key=creds['Credentials']['SecretAccessKey'],
    aws_session_token=creds['Credentials']['SessionToken'],
    region_name='...')

# All subsequent clients/resources should be instantiated from the session object
cloudwatch = session.client('cloudwatch')
Hope this helps.
See https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sts.html#STS.Client.assume_role
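To tie this back to the original loop over regions, a minimal sketch might look like the following. It assumes a list of role ARNs (one per target account, placeholders here) and reuses the question's GetRegions() helper; the other helpers would need to accept the per-account client instead of building their own.
import boto3

# Placeholder role ARNs, one per target account
ACCOUNT_ROLE_ARNS = [
    'arn:aws:iam::111111111111:role/CloudWatchReadRole',
    'arn:aws:iam::222222222222:role/CloudWatchReadRole',
]

sts = boto3.client('sts')

for role_arn in ACCOUNT_ROLE_ARNS:
    creds = sts.assume_role(RoleArn=role_arn,
                            RoleSessionName='cpu-utilization-check')['Credentials']
    for region in GetRegions():
        # A session bound to the target account's temporary credentials and this region
        session = boto3.Session(
            aws_access_key_id=creds['AccessKeyId'],
            aws_secret_access_key=creds['SecretAccessKey'],
            aws_session_token=creds['SessionToken'],
            region_name=region)
        cloudwatch = session.client('cloudwatch')
        # ...call get_metric_statistics / the averaging helper against this client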


AWS Assume Role STS

I am trying to access my S3 bucket daily using Python, but my session expires every so often. Someone on this site advised I use an "Assumed Role" STS script to re-establish the connection. I found a script that uses it, but I am getting the following error. FYI, I have my credentials file in the .aws folder.
"botocore.exceptions.NoCredentialsError: Unable to locate credentials"
Below is my code:
import boto3
# The calls to AWS STS AssumeRole must be signed with the access key ID
# and secret access key of an existing IAM user or by using existing temporary
# credentials such as those from another role. (You cannot call AssumeRole
# with the access key for the root account.) The credentials can be in
# environment variables or in a configuration file and will be discovered
# automatically by the boto3.client() function. For more information, see the
# Python SDK documentation:
# http://boto3.readthedocs.io/en/latest/reference/services/sts.html#client
# create an STS client object that represents a live connection to the
# STS service
sts_client = boto3.client('sts')
# Call the assume_role method of the STSConnection object and pass the role
# ARN and a role session name.
assumed_role_object = sts_client.assume_role(
    RoleArn="ARNGOESHERE",
    RoleSessionName="AssumeRoleSession1"
)
# From the response that contains the assumed role, get the temporary
# credentials that can be used to make subsequent API calls
credentials=assumed_role_object['Credentials']
# Use the temporary credentials that AssumeRole returns to make a
# connection to Amazon S3
s3_resource = boto3.resource(
    's3',
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken'],
)
# Use the Amazon S3 resource object that is now configured with the
# credentials to access your S3 buckets.
for bucket in s3_resource.buckets.all():
    print(bucket.name)
You have two options here:
Create a separate user with programmatic access. This would be permanent and the credentials would not expire. Usually this is not allowed for developers in organizations because of security concerns. Refer to the steps here:
https://aws.amazon.com/premiumsupport/knowledge-center/create-access-key/
If you are not allowed a permanent access key through the above method, you can increase the token expiration from the default (1 hour) to a maximum of 12 hours, so you don't have to re-run the 'saml2aws' PowerShell script every hour or so. For that, you would need to modify the script you run to get credentials:
Add the 'DurationSeconds' argument to the assume_role_with_saml() method. Refer to: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sts.html#STS.Client.assume_role_with_saml
response = client.assume_role_with_saml(
    RoleArn='string',
    PrincipalArn='string',
    SAMLAssertion='string',
    PolicyArns=[
        {
            'arn': 'string'
        },
    ],
    Policy='string',
    DurationSeconds=123
)
The maximum duration you can enter here is bounded by the maximum session duration setting for your role. You can view it in the AWS console under IAM > Roles > {RoleName} > Summary > Maximum session duration.
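If you'd rather check that limit programmatically than in the console, a minimal sketch using the IAM API could look like this (the role name is a placeholder):
import boto3

iam = boto3.client('iam')

# Placeholder name; use the role you assume via SAML
role = iam.get_role(RoleName='MyFederatedRole')['Role']

# MaxSessionDuration is in seconds (3600 by default, up to 43200)
print(role['MaxSessionDuration'])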

Trigger an Azure Function upon VM creation

I'm looking for a way to trigger an Azure Function, which will run some Python code, each time a new virtual machine is created. I have already done the same thing in AWS using CloudWatch + Lambda, but I can't find where/how to achieve the same thing in Azure.
I have tried to use Logic App with Event Grid but there is no trigger to monitor VM state.
Could anyone provide me with some guidance here?
Many thanks in advance.
Azure doesn't have a built-in way to achieve this, but I think you can do it with your own Python code. The main idea is to poll the VM names in your subscription and store them somewhere; if they change, post a request to something like an 'HttpTrigger' endpoint (or just put that logic inside the polling routine).
As for the polling itself, you can design it yourself or simply use a 'TimeTrigger' function.
I notice you added the 'Python' tag, so use code like the below and put it inside the polling routine:
import requests
from azure.identity import ClientSecretCredential
import json

client_id = 'xxx'
tenant_id = 'xxx'
client_secret = 'xxx'
subscription_id = 'xxx'

credential = ClientSecretCredential(tenant_id=tenant_id, client_id=client_id, client_secret=client_secret)
# get_token returns an AccessToken object; use its .token attribute
accesstoken = credential.get_token('https://management.azure.com/.default').token
bearertoken = "Bearer " + accesstoken

r = requests.get(
    "https://management.azure.com/subscriptions/" + subscription_id +
    "/resources?$filter=resourceType eq 'Microsoft.Compute/virtualMachines'&api-version=2020-06-01",
    headers={'Authorization': bearertoken})
items = json.loads(r.text)
print(r.text)

for item in items['value']:
    print(item['name'])  # This just prints; store the names somewhere instead, e.g. a database, Azure Blob Storage or Azure Table Storage.
    # Check the VM names here. If a VM has been added, post a request to the HttpTrigger function.
If you use an Azure Functions 'Time Trigger' instead of a self-designed polling loop, you can store the client id, tenant id, client secret and subscription id in Key Vault and have your function app's configuration settings reference Key Vault; this keeps the secrets safe.
The code above is based on an AAD bearer token, so you need to create an AAD app registration and give it the 'Owner' RBAC role on the subscription.
This acts like a 'custom trigger' fired by VM creation in your subscription. And since you probably don't have many VMs, it won't consume much compute.
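A rough sketch of the 'Time Trigger' variant, assuming the four values above are exposed to the function app as application settings (optionally Key Vault references) with the hypothetical names used below:
import os
import azure.functions as func
from azure.identity import ClientSecretCredential

def main(mytimer: func.TimerRequest) -> None:
    # Hypothetical setting names; configure them on the function app,
    # optionally as Key Vault references.
    credential = ClientSecretCredential(
        tenant_id=os.environ['AAD_TENANT_ID'],
        client_id=os.environ['AAD_CLIENT_ID'],
        client_secret=os.environ['AAD_CLIENT_SECRET'])
    subscription_id = os.environ['SUBSCRIPTION_ID']
    # ...list the VMs as in the snippet above, compare against the stored
    # names, and call the HttpTrigger endpoint for any new ones.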

Switching IAM-user roles with Athena and boto3

I am writing a Python program using boto3 that grabs all of the queries made by a master account and pushes them out to all of the master account's sub accounts.
Grabbing the query IDs from the master instance is done, but I'm having trouble pushing them out to the sub accounts. With my authentication information AWS connects to the master account by default, but I can't figure out how to get it to connect to a sub account. Generally AWS services do this by switching roles, but Athena doesn't have a built-in method for this. I could manually create different profiles, but I'm not sure how to switch between them in the middle of code execution.
Here's Amazon's code example for switching roles with STS, which does support assuming different roles: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-api.html
Here's what my program looks like so far:
#!/usr/bin/env python3
import boto3
dev = boto3.session.Session(profile_name='dev')
#Function for executing athena queries
client = dev.client('athena')
s3_input = 's3://dev/test/'
s3_output = 's3://dev/testOutput'
database = 'ex_athena_db'
table = 'test_data'
response = client.list_named_queries(
    MaxResults=50,
    WorkGroup='primary'
)
print(response)
So I have the "dev" profile, but I'm not sure how to differentiate this profile to indicate to AWS that I'd like to access one of the child accounts. Is it just the name, or do I need some other parameter? I don't think I can (or need to) generate a separate authentication token for this.
I solved this by creating a new named profile for the sub account with the sub account's role ARN.
Sample config:
[default]
region = us-east-1
[profile ecr-dev]
role_arn = arn:aws:iam::76532435:role/AccountRole
source_profile = default
Sample code:
#!/usr/bin/env python3
import boto3
dev = boto3.session.Session(profile_name='name', region_name="us-east-1")
#Function for executing athena queries
client = dev.client('athena')
s3_input = 's3:/test/'
s3_output = 's3:/test'
database = 'ex_athena_db'
response = client.list_named_queries(
    MaxResults=50,
    WorkGroup='primary'
)
print(response)
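To push the master account's named queries into each sub account, a minimal sketch (assuming one named profile per sub account, set up as in the config above; the profile names and region are placeholders) could look like:
import boto3

master = boto3.session.Session(profile_name='dev', region_name='us-east-1')
master_athena = master.client('athena')

# Fetch the master account's named queries
query_ids = master_athena.list_named_queries(MaxResults=50)['NamedQueryIds']
queries = [master_athena.get_named_query(NamedQueryId=qid)['NamedQuery']
           for qid in query_ids]

# Hypothetical sub-account profiles defined with role_arn/source_profile
for profile in ['sub-account-1', 'sub-account-2']:
    sub_athena = boto3.session.Session(profile_name=profile,
                                       region_name='us-east-1').client('athena')
    for q in queries:
        sub_athena.create_named_query(
            Name=q['Name'],
            Database=q['Database'],
            QueryString=q['QueryString'])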

using boto3 and Python how to create AWS MFA authorized session which can be used by other roles

SCENARIO
I have two AWS accounts linked to a third aws account, main account.
The account names, for instance, are aws_acc_main, aws_acc_1 and aws_acc_2.
aws_acc_main is used to maintain the users, groups and roles for the other two accounts; however, apart from IAM, no other services are used in the main account.
Using boto3 and Python, I want to get the EC2 instances of account 1 and account 2.
The AWS configuration looks like this:
.aws/credentials file
[aws_acc_main]
aws_access_key_id=AKJFJHNUCTYUUAPW
aws_secret_access_key = 2uldfr94tuowjuHUKbnBIby8jhfdgjh
.aws/config
[aws_acc_main]
output = json
region = eu-west-2
[profile aws_acc_1]
source_profile = aws_acc_main
output = json
region = eu-west-2
role_arn = arn:aws:iam::111111111111:role/ACC1_ROLE
mfa_serial = arn:aws:iam::111111111110:mfa/ABCD
[profile aws_acc_2]
source_profile = aws_acc_main
output = json
region = eu-west-2
role_arn = arn:aws:iam::111111111112:role/ACC2_ROLE
mfa_serial = arn:aws:iam::111111111110:mfa/ABCD
Now the Python files.
test1.py
sessions = ['aws_acc_1', 'aws_acc_2']

for session_name in sessions:
    session = boto3.Session(profile_name=session_name)
    ec2 = session.client('ec2', region_name='eu-west-2')
    resources = ec2.describe_instances()
test2.py
sessions = ['aws_acc_1', 'aws_acc_2']

mfa_code = raw_input("Enter the MFA code: ")

client = boto3.client('sts')
response = client.get_session_token(
    DurationSeconds=3600,
    SerialNumber='arn:aws:iam::111111111110:mfa/ABCD',
    TokenCode=mfa_code
)
credentials = response['Credentials']

for session_name in sessions:
    session = boto3.Session(profile_name=session_name,
                            aws_access_key_id=credentials['AccessKeyId'],
                            aws_secret_access_key=credentials['SecretAccessKey'],
                            aws_session_token=credentials['SessionToken'],
                            )
    ec2Client = session.client('ec2', region_name='eu-west-2')
    resources = ec2Client.describe_instances()
PROBLEM
The first script (test1.py) works OK, but I have to provide the MFA code for each account in every iteration.
The second script (test2.py) does not give any errors either, but it does not read the EC2 instances of aws_acc_1 and aws_acc_2; instead it only gets the EC2 instances of aws_acc_main, which has nothing in it.
DESIRED OUTPUT
I want to provide the MFA code only once for the entire session and then switch roles without providing it again and again. In other words, I want to fix test2.py.
QUESTION
In the AWS web console, once I log in using my user id, password and MFA, I can switch roles without providing MFA again and again. That is what I want to do.
If I use boto3 to create a session, how can I use the same session to switch roles to different accounts without providing MFA again and again?
Please note that an important point in this scenario is that aws_acc_1 and aws_acc_2 are linked to aws_acc_main and MFA is handled only through aws_acc_main.
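One common approach (a sketch, not from the original thread): call get_session_token once with the MFA code against the main account, then use those MFA-backed temporary credentials to call assume_role into each sub account's role; no further MFA prompts are needed. The role ARNs and MFA serial below are the question's placeholders.
import boto3

mfa_code = input("Enter the MFA code: ")

sts = boto3.client('sts')  # uses the aws_acc_main credentials
mfa_creds = sts.get_session_token(
    DurationSeconds=3600,
    SerialNumber='arn:aws:iam::111111111110:mfa/ABCD',
    TokenCode=mfa_code)['Credentials']

# An STS client backed by the MFA session; assume_role calls made with it
# satisfy the roles' MFA condition without asking for the code again.
mfa_sts = boto3.client(
    'sts',
    aws_access_key_id=mfa_creds['AccessKeyId'],
    aws_secret_access_key=mfa_creds['SecretAccessKey'],
    aws_session_token=mfa_creds['SessionToken'])

role_arns = ['arn:aws:iam::111111111111:role/ACC1_ROLE',
             'arn:aws:iam::111111111112:role/ACC2_ROLE']

for role_arn in role_arns:
    role_creds = mfa_sts.assume_role(RoleArn=role_arn,
                                     RoleSessionName='ec2-listing')['Credentials']
    session = boto3.Session(
        aws_access_key_id=role_creds['AccessKeyId'],
        aws_secret_access_key=role_creds['SecretAccessKey'],
        aws_session_token=role_creds['SessionToken'],
        region_name='eu-west-2')
    resources = session.client('ec2').describe_instances()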

Boto3 uses old credentials

I am using tkinter to create a GUI application that returns the security groups. Currently, if you want to change your credentials (e.g. if you accidentally entered the wrong ones), you have to restart the application, otherwise boto3 carries on using the old credentials.
I'm not sure why it keeps using the old credentials because I am running everything again using the currently entered credentials.
This is a snippet of the code that sets the environment variables and launches boto3. It works perfectly fine if you enter the right credentials the first time.
os.environ['AWS_ACCESS_KEY_ID'] = self.accessKey
os.environ['AWS_SECRET_ACCESS_KEY'] = self.secretKey

self.sts_client = boto3.client('sts')
self.assumedRoleObject = self.sts_client.assume_role(
    RoleArn=self.role,
    RoleSessionName="AssumeRoleSession1"
)
self.credentials = self.assumedRoleObject['Credentials']
self.ec2 = boto3.resource(
    'ec2',
    region_name=self.region,
    aws_access_key_id=self.credentials['AccessKeyId'],
    aws_secret_access_key=self.credentials['SecretAccessKey'],
    aws_session_token=self.credentials['SessionToken'],
)
The credentials variables are set using:
self.accessKey = str(self.AWS_ACCESS_KEY_ID_Form.get())
self.secretKey = str(self.AWS_SECRET_ACCESS_KEY_Form.get())
self.role = str(self.AWS_ROLE_ARN_Form.get())
self.region = str(self.AWS_REGION_Form.get())
self.instanceID = str(self.AWS_INSTANCE_ID_Form.get())
Is there a way to use different credentials in boto3 without restarting the program?
You need a boto3.session.Session to override the access credentials.
Just do this (reference: http://boto3.readthedocs.io/en/latest/reference/core/session.html):
import boto3
# Assign you own access
mysession = boto3.session.Session(aws_access_key_id='foo1', aws_secret_access_key='bar1')
# If you want to use a different profile, e.g. one called foobar inside .aws/credentials
mysession = boto3.session.Session(profile_name="foobar")
# Afterwards, just declare your AWS client/resource services
sqs_resource=mysession.resource("sqs")
# or client
s3_client=mysession.client("s3")
Basically, it's a small change to your code: you just create clients/resources from the session instead of calling boto3.client/boto3.resource directly.
self.sts_client = mysession.client('sts')
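Applied to the snippet in the question, a sketch might look like this (a hypothetical helper built from the question's form fields, not the original answer's code):
import boto3

def build_ec2_resource(access_key, secret_key, role_arn, region):
    # A fresh session from whatever credentials are currently entered in the form
    mysession = boto3.session.Session(
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
        region_name=region)
    creds = mysession.client('sts').assume_role(
        RoleArn=role_arn,
        RoleSessionName="AssumeRoleSession1")['Credentials']
    return mysession.resource(
        'ec2',
        aws_access_key_id=creds['AccessKeyId'],
        aws_secret_access_key=creds['SecretAccessKey'],
        aws_session_token=creds['SessionToken'])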
Sure, just create a separate boto3.session.Session object for each set of credentials:
import boto3
s1 = boto3.session.Session(aws_access_key_id='foo1', aws_secret_access_key='bar1')
s2 = boto3.session.Session(aws_access_key_id='foo2', aws_secret_access_key='bar2')
You can also leverage the set_credentials method to keep one session and change the creds on the fly:
import botocore

session = botocore.session.Session()
session.set_credentials('foo', 'bar')

client = session.create_client('s3')
client._request_signer._credentials.access_key
# u'foo'

session.set_credentials('foo1', 'bar')
client = session.create_client('s3')
client._request_signer._credentials.access_key
# u'foo1'
The answers given by @mootmoot and @Vor clearly state the way of dealing with multiple credentials using a session.
@Vor's answer:
import boto3
s1 = boto3.session.Session(aws_access_key_id='foo1', aws_secret_access_key='bar1')
s2 = boto3.session.Session(aws_access_key_id='foo2', aws_secret_access_key='bar2')
But some of you would be curious about
why does the boto3 client or resource behave in that manner in the first place?
Let's clear up a few points about Session and Client, as they'll lead us to the answer to that question.
Session
A 'Session' stores configuration state and allows you to create service clients and resources
Client
if the credentials are not passed explicitly as arguments to the boto3.client method, then the credentials configured for the session will automatically be used. You only need to provide credentials as arguments if you want to override the credentials used for this specific client
Now let's get to the code and see what actually happens when you call boto3.client()
def client(*args, **kwargs):
    return _get_default_session().client(*args, **kwargs)

def _get_default_session():
    if DEFAULT_SESSION is None:
        setup_default_session()
    return DEFAULT_SESSION

def setup_default_session(**kwargs):
    DEFAULT_SESSION = Session(**kwargs)
Learnings from the above
The function boto3.client() is really just a proxy for the boto3.Session.client() method
Once you use the client, the DEFAULT_SESSION is set up, and every subsequent client creation keeps using that DEFAULT_SESSION
The credentials configured for the DEFAULT_SESSION are used if the credentials are not explicitly passed as arguments while creating the boto3 client.
Answer
The first call to boto3.client() sets up the DEFAULT_SESSION and configures the session with oldCredsAccessKey and oldCredsSecretKey, the values already set for the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY respectively.
So even if you set new credential values in the environment, i.e. do this:
os.environ['AWS_ACCESS_KEY_ID'] = newCredsAccessKey
os.environ['AWS_SECRET_ACCESS_KEY'] = newCredsSecretKey
the subsequent boto3.client() calls still pick up the old credentials configured for the DEFAULT_SESSION.
NOTE
A boto3.client() call in this whole answer means a call with no arguments passed to the client method.
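A short sketch of the practical fix (an illustration, not part of the original answer): after changing the environment variables, either build an explicit Session or call boto3.setup_default_session() so a fresh DEFAULT_SESSION picks up the new values.
import os
import boto3

os.environ['AWS_ACCESS_KEY_ID'] = 'newCredsAccessKey'      # placeholder values
os.environ['AWS_SECRET_ACCESS_KEY'] = 'newCredsSecretKey'

# Option 1: an explicit session reads the current configuration
sts = boto3.session.Session().client('sts')

# Option 2: rebuild the module-level default session so later
# boto3.client() calls stop reusing the old credentials
boto3.setup_default_session()
sts = boto3.client('sts')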
References
https://boto3.amazonaws.com/v1/documentation/api/latest/_modules/boto3.html#client
https://boto3.amazonaws.com/v1/documentation/api/latest/_modules/boto3/session.html#Session
https://ben11kehoe.medium.com/boto3-sessions-and-why-you-should-use-them-9b094eb5ca8e
