Switching IAM-user roles with Athena and boto3 - python

I am writing a Python program using boto3 that grabs all of the queries made by a master account and pushes them out to all of the master account's sub accounts.
Grabbing the query IDs from the master instance is done, but I'm having trouble pushing them out to the sub accounts. With my authentication information, AWS connects to the master account by default, but I can't figure out how to get it to connect to a sub account. Generally AWS services do this by switching roles, but Athena doesn't have a built-in method for this. I could manually create different profiles, but I'm not sure how to switch between them in the middle of code execution.
Here's Amazon's code example for switching roles with STS, which does support assuming different roles: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-api.html
Here's what my program looks like so far
#!/usr/bin/env python3
import boto3
dev = boto3.session.Session(profile_name='dev')
#Function for executing athena queries
client = dev.client('athena')
s3_input = 's3://dev/test/'
s3_output = 's3://dev/testOutput'
database = 'ex_athena_db'
table = 'test_data'
response = client.list_named_queries(
    MaxResults=50,
    WorkGroup='primary'
)
print(response)
So I have the "dev" profile, but I'm not sure how to differentiate this profile to indicate to AWS that I'd like to access one of the child accounts. Is it just the name, or do I need some other parameter? I don't think I can (or need to) generate a separate authentication token for this.

I solved this by creating a new profile for the sub account whose role_arn points at the sub account's role; with source_profile set, boto3 assumes that role automatically whenever the profile is used.
Sample config:
[default]
region = us-east-1
[profile ecr-dev]
role_arn = arn:aws:iam::76532435:role/AccountRole
source_profile = default
Sample code:
#!/usr/bin/env python3
import boto3
dev = boto3.session.Session(profile_name='name', region_name="us-east-1")
#Function for executing athena queries
client = dev.client('athena')
s3_input = 's3://test/'
s3_output = 's3://test'
database = 'ex_athena_db'
response = client.list_named_queries(
    MaxResults=50,
    WorkGroup='primary'
)
print(response)
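An alternative that avoids a separate profile is to assume the role directly with STS in code. This is only a sketch, reusing the role ARN from the sample config above and the default profile as the source credentials; the session name is a placeholder:

import boto3

# Assume the sub account role directly (ARN taken from the sample config above).
sts = boto3.Session(profile_name='default', region_name='us-east-1').client('sts')
assumed = sts.assume_role(
    RoleArn='arn:aws:iam::76532435:role/AccountRole',
    RoleSessionName='athena-query-sync'
)['Credentials']

# Athena client in the sub account, built from the temporary credentials.
athena = boto3.client(
    'athena',
    region_name='us-east-1',
    aws_access_key_id=assumed['AccessKeyId'],
    aws_secret_access_key=assumed['SecretAccessKey'],
    aws_session_token=assumed['SessionToken'],
)
response = athena.list_named_queries(MaxResults=50, WorkGroup='primary')
print(response)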

Related

Python: AWS Aurora Serverless Data API: password authentication failed for user

I am running out of ideas.
I have created an Aurora Serverless RDS cluster (Version 1) with the Data API enabled. I now wish to execute SQL statements against it using the Data API (https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/data-api.html)
I have made a small test script using the provided guidelines (https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/data-api.html#data-api.calling:~:text=Calling%20the%20Data%20API%20from%20a%20Python%20application)
import boto3
session = boto3.Session(region_name="eu-central-1")
rds = session.client("rds-data")
secret = session.client("secretsmanager")
cluster_arn = "arn:aws:rds:eu-central-1:<accountID>:cluster:aurorapostgres"
secret_arn = "arn:aws:secretsmanager:eu-central-1:<accountID>:secret:dbsecret-xNMeQc"
secretvalue = secret.get_secret_value(
    SecretId=secret_arn
)
print(secretvalue)

SQL = "SELECT * FROM pipelinedb.dataset"
res = rds.execute_statement(
    resourceArn=cluster_arn,
    secretArn=secret_arn,
    database="pipelinedb",
    sql=SQL
)
print(res)
However I get the error message:
BadRequestException: An error occurred (BadRequestException) when calling the ExecuteStatement operation: FATAL: password authentication failed for user "bjarki"; SQLState: 28P01
I have verified the following:
Secret value is correct (see the check after this list)
Secret JSON structure is correctly following recommended structure (https://docs.aws.amazon.com/secretsmanager/latest/userguide/reference_secret_json_structure.html)
IAM user running the python script has Admin access to the account, and thus is privileged enough
Cluster is running in Public Subnets (internet gateways attached to route tables) and ACL and security groups are fully open.
The user "bjarki" is the master user and thus should have the required DB privileges to run the query
I am out of ideas on why this error is appearing - any good ideas?
Try this AWS tutorial, which is located in the AWS Examples Code Library. It shows how to use the AWS SDK for Python (Boto3) to create a web application that tracks work items in an Amazon Aurora database and emails reports by using Amazon Simple Email Service (Amazon SES). This example uses a front end built with React.js to interact with a Flask-RESTful Python backend. The tutorial covers how to:
Integrate a React.js web application with AWS services.
List, add, and update items in an Aurora table.
Send an email report of filtered work items by using Amazon SES.
Deploy and manage example resources with the included AWS CloudFormation script.
https://docs.aws.amazon.com/code-library/latest/ug/cross_RDSDataTracker_python_3_topic.html
Try running the CDK to properly set up the database too.
Once you have successfully implemented this example, you will get a working React front end backed by a Python backend.

BigQuery - Google auth does not redirect to URL

I'm trying to run a query on BigQuery in a Django project and get the results. While it works successfully on localhost, it does not redirect to the verification link at all when I take it to the live server.
From what I've read, I think I need to change the redirect_uri value. I added it to the appflow variable below, but the URL doesn't change. I am using the example query from Google's documentation below in place of my own, because mine contains private information, but it is otherwise exactly the same.
I have added my URL to the Authorized redirect URIs, and I have put the API in production mode.
The resulting auth URL has redirect_uri set to localhost, like this:
https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=123-nml31ekr2n0didomei5.apps.googleusercontent.com&redirect_uri=http%3A%2F%2Flocalhost%3A8080%2F&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery&state=XF1WdtCoR4HaICwzSKk9A1giBrSzBv&access_type=offline
def query_stackoverflow():
    launch_browser = True
    project = 'xx-prod'
    appflow = flow.InstalledAppFlow.from_client_secrets_file(
        "static/client_secret_518684-nmpoqtgo5flvcgnl31ekr2ni5.apps.googleusercontent.com.json",
        scopes=["https://www.googleapis.com/auth/bigquery"],
        redirect_uri=["https://xx.com/"])
    if launch_browser:
        appflow.run_local_server()
    else:
        appflow.run_console()
    credentials = appflow.credentials

    client = bigquery.Client(project=project, credentials=credentials)
    client = bigquery.Client()
    query_job = client.query(
        """
        SELECT
          CONCAT(
            'https://stackoverflow.com/questions/',
            CAST(id as STRING)) as url,
          view_count
        FROM `bigquery-public-data.stackoverflow.posts_questions`
        WHERE tags like '%google-bigquery%'
        ORDER BY view_count DESC
        LIMIT 10"""
    )
    results = query_job.result()  # Waits for job to complete.
    for row in results:
        print("{} : {} views".format(row.url, row.view_count))
On the live server, Google returns the auth URL like this:
Please visit this URL to authorize this application: https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=51864584-nmpoqtgo5flvcgnln0didomei5.apps.googleusercontent.com&redirect_uri=http%3A%2F%2Flocalhost%3A8080%2F&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery&state=W2uMZwzaYMEpFzExodRCf2wA4&access_type=offline
The first problem is that it does not automatically redirect to the link as it does on localhost; the second problem is that when I open this link manually, the redirect cannot be reached after the mail verification.
From what I can see, your code is using the installed app flow (flow.InstalledAppFlow). This means that the consent screen is going to open up on the machine it's running on. If you have this running on a server, are you logging into the server and running it there, or are you in fact creating a web application?
If you are making a web application then you should be following this sample.
API access on behalf of your clients (web flow)
You will need to convert it to work with BigQuery.
import google.oauth2.credentials
import google_auth_oauthlib.flow

# Initialize the flow using the client ID and secret downloaded earlier.
# Note: You can use the GetAPIScope helper function to retrieve the
# appropriate scope for AdWords or Ad Manager.
flow = google_auth_oauthlib.flow.Flow.from_client_secrets_file(
    'client_secret.json',
    scopes=[oauth2.GetAPIScope('adwords')])

# Indicate where the API server will redirect the user after the user completes
# the authorization flow. The redirect URI is required.
flow.redirect_uri = 'https://www.example.com/oauth2callback'
The code for a web application is slightly different from that of an installed application.
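To adapt that to BigQuery in a Django project, a rough sketch might look like the following; the client secret path, redirect URI, and project ID are placeholders, and the second step belongs in the view that handles the OAuth callback:

import google_auth_oauthlib.flow
from google.cloud import bigquery

flow = google_auth_oauthlib.flow.Flow.from_client_secrets_file(
    'client_secret.json',
    scopes=['https://www.googleapis.com/auth/bigquery'])

# Must match one of the "Authorized redirect URIs" configured for the OAuth client.
flow.redirect_uri = 'https://xx.com/oauth2callback'

# Step 1: redirect the user to auth_url instead of calling run_local_server().
auth_url, state = flow.authorization_url(access_type='offline', prompt='consent')

# Step 2 (in the callback view): exchange the returned authorization code for credentials.
# flow.fetch_token(authorization_response=full_callback_url)
# credentials = flow.credentials
# client = bigquery.Client(project='xx-prod', credentials=credentials)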

Python Generating an IAM authentication token boto3.session

I am trying to use the documentation on https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/UsingWithRDS.IAMDBAuth.Connecting.Python.html. Right now I am stuck at session = boto3.session(profile_name='RDSCreds'). What is profile_name and how do I find that in my RDS?
import sys
import os
import boto3

ENDPOINT="mysqldb.123456789012.us-east-1.rds.amazonaws.com"
PORT="3306"
USR="jane_doe"
REGION="us-east-1"
os.environ['LIBMYSQL_ENABLE_CLEARTEXT_PLUGIN'] = '1'

#gets the credentials from .aws/credentials
session = boto3.Session(profile_name='RDSCreds')
client = session.client('rds')

token = client.generate_db_auth_token(DBHostname=ENDPOINT, Port=PORT, DBUsername=USR, Region=REGION)
session = boto3.Session(profile_name='RDSCreds')
profile_name here is the name of the profile you have configured for the AWS CLI.
Usually, when you run aws configure, it creates a default profile. But sometimes users want to manage the AWS CLI with another account's credentials, or send requests to another region, so they configure a separate profile. See the docs for creating and configuring multiple profiles.
aws configure --profile RDSCreds  #enter your access keys for this profile
In case you think you have already created the RDSCreds profile, you can check it with less ~/.aws/config
The documentation you mentioned for RDS with boto3 also says: "The code examples use profiles for shared credentials. For information about specifying credentials, see Credentials in the AWS SDK for Python (Boto3) documentation."
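For illustration, a profile named RDSCreds might look like this in the shared config files; the access keys shown are placeholders, not real values:

# ~/.aws/credentials
[RDSCreds]
aws_access_key_id = AKIAXXXXXXXXEXAMPLE
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxEXAMPLE

# ~/.aws/config
[profile RDSCreds]
region = us-east-1

boto3.Session(profile_name='RDSCreds') then picks up those keys and that region automatically.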

Trigger an Azure Function upon VM creation

I'm looking for a way to trigger an Azure Function which will run some Python code, each time a new virtual machine is created. I have already done the same thing in AWS using CloudWatch + Lambda, but I can't find where/how achieve the same thing in Azure.
I have tried to use Logic App with Event Grid but there is no trigger to monitor VM state.
Could anyone provide me with some guidance here?
Many thanks in advance.
Azure doesn't have a built-in method to achieve your requirement, but I think you can achieve it with your own Python code. The main logic is to poll the VM names in your subscription and store them somewhere; if they change, post a request to something like an 'HttpTrigger' endpoint (or just put the logic in the polling algorithm).
As for the polling algorithm, you can design it yourself or just use an Azure Functions timer trigger.
I notice you added the 'Python' tag, so just use code like the below and put it inside the polling algorithm:
import requests
from azure.identity import ClientSecretCredential
import json
client_id = 'xxx'
tenant_id = 'xxx'
client_secret = 'xxx'
subscription_id = 'xxx'
credential = ClientSecretCredential(tenant_id=tenant_id, client_id=client_id, client_secret=client_secret)
accesstoken = credential.get_token('https://management.azure.com/.default').token
bearertoken = "Bearer " + accesstoken

r = requests.get(
    "https://management.azure.com/subscriptions/" + subscription_id +
    "/resources?$filter=resourceType eq 'Microsoft.Compute/virtualMachines'&api-version=2020-06-01",
    headers={'Authorization': bearertoken})
items = json.loads(r.text)
print(r.text)

for item in items['value']:
    # This line just prints the name; you need to store it somewhere such as a database,
    # Azure Blob Storage, Azure Table Storage, etc.
    print(item['name'])
    # Check the VM names here. If a VM has been added, post a request to the HttpTrigger function.
If you use an Azure Functions timer trigger instead of a self-designed polling loop, you can store the client ID, tenant ID, client secret and subscription ID in Key Vault, and let your function app's configuration settings reference Key Vault; this keeps the secrets safe.
The above code is based on an AAD bearer token: you need to create an AAD app registration and give it the 'Owner' RBAC role on the subscription.
This works like a 'custom trigger' that fires when a VM is created in your subscription. And I think you will not have many VMs, so it will not consume much computing resource.
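To make that concrete, here is a rough, hypothetical sketch of the timer-trigger variant: the schedule lives in function.json, the storage helpers are stubs you would implement against Blob or Table storage, and the notification URL is a placeholder.

import azure.functions as func
import requests
from azure.identity import ClientSecretCredential

def get_current_vm_names(subscription_id, credential):
    # Same REST call as above, using the token object's .token attribute.
    token = credential.get_token('https://management.azure.com/.default').token
    r = requests.get(
        "https://management.azure.com/subscriptions/" + subscription_id +
        "/resources?$filter=resourceType eq 'Microsoft.Compute/virtualMachines'&api-version=2020-06-01",
        headers={'Authorization': 'Bearer ' + token})
    return [item['name'] for item in r.json()['value']]

def main(mytimer: func.TimerRequest) -> None:
    # In a real function app these values would come from app settings backed by Key Vault.
    credential = ClientSecretCredential(tenant_id='xxx', client_id='xxx', client_secret='xxx')
    current = get_current_vm_names('xxx', credential)

    previous = load_previous_vm_names()   # stub: read the last run's names from storage
    for name in set(current) - set(previous):
        # Placeholder URL of the HttpTrigger function to notify.
        requests.post('https://myfuncapp.azurewebsites.net/api/OnVmCreated', json={'vm': name})
    save_vm_names(current)                # stub: persist the current names for the next run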

using boto3 and Python how to create AWS MFA authorized session which can be used by other roles

SCENARIO
I have two AWS accounts linked to a third AWS account, the main account.
The account names, for instance, are aws_acc_main, aws_acc_1 and aws_acc_2.
aws_acc_main is used to maintain the users, groups and roles for the other two accounts; however, apart from IAM, no other services are used in the main account.
Using boto3 and Python, I want to get the EC2 instances of account 1 and account 2.
The AWS configuration looks like:
.aws/credentials file
[aws_acc_main]
aws_access_key_id=AKJFJHNUCTYUUAPW
aws_secret_access_key = 2uldfr94tuowjuHUKbnBIby8jhfdgjh
.aws/config
[aws_acc_main]
output = json
region = eu-west-2
[profile aws_acc_1]
source_profile = aws_acc_main
output = json
region = eu-west-2
role_arn = arn:aws:iam::111111111111:role/ACC1_ROLE
mfa_serial = arn:aws:iam::111111111110:mfa/ABCD
[profile aws_acc_2]
source_profile = aws_acc_main
output = json
region = eu-west-2
role_arn = arn:aws:iam::111111111112:role/ACC2_ROLE
mfa_serial = arn:aws:iam::111111111110:mfa/ABCD
Now the python files.
test1.py
import boto3

sessions = ['aws_acc_1', 'aws_acc_2']

for session_name in sessions:
    session = boto3.Session(profile_name=session_name)
    ec2 = session.client('ec2', region_name='eu-west-2')
    resources = ec2.describe_instances()
test2.py
import boto3

sessions = ['aws_acc_1', 'aws_acc_2']

mfa_code = raw_input("Enter the MFA code: ")

client = boto3.client('sts')
response = client.get_session_token(
    DurationSeconds=3600,
    SerialNumber='arn:aws:iam::111111111110:mfa/ABCD',
    TokenCode=mfa_code
)
credentials = response['Credentials']

for session_name in sessions:
    session = boto3.Session(profile_name=session_name,
                            aws_access_key_id=credentials['AccessKeyId'],
                            aws_secret_access_key=credentials['SecretAccessKey'],
                            aws_session_token=credentials['SessionToken'],
                            )
    ec2Client = session.client('ec2', region_name='eu-west-2')
    resources = ec2Client.describe_instances()
PROBLEM
The first (test1.py) works OK but I have to provide MFA for each account in every iteration.
The second file (test2.py) does not give any errors either, but it does not read the EC2 instances of aws_acc_1 and aws_acc_2; instead it only gets the EC2 instances of aws_acc_main, which has nothing in it.
DESIRED OUTPUT
I want to provide MFA only once for the entire session. Then, I want to switch roles without providing MFA again and again. In other words, I want to fix test2.py.
QUESTION
In the AWS web console, once I log in using my user ID, password and MFA, I can switch roles without providing MFA again and again. That is what I want to do.
If I use boto3 to create a session, how can I use the same session to switch roles to different accounts without providing MFA again and again?
Please note that an important thing in this scenario is that aws_acc_1 and aws_acc_2 are sub-accounts of aws_acc_main, and the MFA is handled only through aws_acc_main.
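For reference, one possible way to get the behaviour described above (this is an illustrative sketch, not from the original post): call get_session_token once with the MFA code, then explicitly assume each account's role with the resulting temporary credentials. Passing explicit credentials to boto3.Session, as test2.py does, generally takes precedence over the profile's role_arn, which would explain why only aws_acc_main is reached. The role ARNs and MFA serial below are the placeholder values from the config above.

import boto3

role_arns = {
    'aws_acc_1': 'arn:aws:iam::111111111111:role/ACC1_ROLE',
    'aws_acc_2': 'arn:aws:iam::111111111112:role/ACC2_ROLE',
}

mfa_code = input("Enter the MFA code: ")
base_sts = boto3.Session(profile_name='aws_acc_main').client('sts')
mfa_creds = base_sts.get_session_token(
    DurationSeconds=3600,
    SerialNumber='arn:aws:iam::111111111110:mfa/ABCD',
    TokenCode=mfa_code
)['Credentials']

# STS client backed by the MFA-authenticated temporary credentials.
mfa_sts = boto3.client(
    'sts',
    aws_access_key_id=mfa_creds['AccessKeyId'],
    aws_secret_access_key=mfa_creds['SecretAccessKey'],
    aws_session_token=mfa_creds['SessionToken'],
)

for name, role_arn in role_arns.items():
    assumed = mfa_sts.assume_role(RoleArn=role_arn, RoleSessionName=name)['Credentials']
    session = boto3.Session(
        aws_access_key_id=assumed['AccessKeyId'],
        aws_secret_access_key=assumed['SecretAccessKey'],
        aws_session_token=assumed['SessionToken'],
    )
    ec2 = session.client('ec2', region_name='eu-west-2')
    resources = ec2.describe_instances()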
