AWS Assume Role STS - Python

I am trying to access my S3 bucket daily using Python, but my session expires every so often. Someone on this site advised me to use an "Assumed Role" STS script to re-establish the connection. I found a script that uses it, but I am getting the following error. FYI, I have my credentials file in the .aws folder.
"botocore.exceptions.NoCredentialsError: Unable to locate credentials"
Below is my code:
import boto3

# The calls to AWS STS AssumeRole must be signed with the access key ID
# and secret access key of an existing IAM user or by using existing temporary
# credentials such as those from another role. (You cannot call AssumeRole
# with the access key for the root account.) The credentials can be in
# environment variables or in a configuration file and will be discovered
# automatically by the boto3.client() function. For more information, see the
# Python SDK documentation:
# http://boto3.readthedocs.io/en/latest/reference/services/sts.html#client

# Create an STS client object that represents a live connection to the
# STS service.
sts_client = boto3.client('sts')

# Call the assume_role method of the STSConnection object and pass the role
# ARN and a role session name.
assumed_role_object = sts_client.assume_role(
    RoleArn="ARNGOESHERE",
    RoleSessionName="AssumeRoleSession1"
)

# From the response that contains the assumed role, get the temporary
# credentials that can be used to make subsequent API calls.
credentials = assumed_role_object['Credentials']

# Use the temporary credentials that AssumeRole returns to make a
# connection to Amazon S3.
s3_resource = boto3.resource(
    's3',
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken'],
)

# Use the Amazon S3 resource object that is now configured with the
# credentials to access your S3 buckets.
for bucket in s3_resource.buckets.all():
    print(bucket.name)

You have two options here:
1. Create a separate user with programmatic access. This would be permanent and the credentials would not expire. Usually this is not allowed for developers in organizations, for security reasons. Refer to these steps:
https://aws.amazon.com/premiumsupport/knowledge-center/create-access-key/
2. If you are not allowed a permanent access token through the above method, you can increase the token expiration from the default (1 hour) to a maximum of 12 hours, so you can skip re-running the 'saml2aws' PowerShell script every hour or so. For that, you would need to modify the PowerShell script you run to get credentials: add the 'DurationSeconds' argument to the assume_role_with_saml() method (for the non-SAML assume_role call in the question, a sketch follows after this example). Refer: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sts.html#STS.Client.assume_role_with_saml
response = client.assume_role_with_saml(
    RoleArn='string',
    PrincipalArn='string',
    SAMLAssertion='string',
    PolicyArns=[
        {
            'arn': 'string'
        },
    ],
    Policy='string',
    DurationSeconds=123
)
The maximum duration you can enter here is governed by the maximum session duration setting for your role. You can view it in the AWS console at IAM > Roles > {RoleName} > Summary > Maximum session duration.
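The script in the question uses plain assume_role rather than SAML, and it accepts the same DurationSeconds argument. A minimal sketch, assuming the role's maximum session duration has been raised to 12 hours (the ARN placeholder is from the question):

import boto3

sts_client = boto3.client('sts')

# Request temporary credentials that last 12 hours (43200 seconds).
# This only works if the role's maximum session duration allows it.
assumed_role_object = sts_client.assume_role(
    RoleArn="ARNGOESHERE",
    RoleSessionName="AssumeRoleSession1",
    DurationSeconds=43200
)
credentials = assumed_role_object['Credentials']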

Related

Correct way to connect to AWS Secrets Manager

I am trying to connect to AWS Secrets Manager. It works fine locally, but on the Python server it gives the error "botocore.exceptions.NoCredentialsError: Unable to locate credentials".
session = boto3.session.Session()
client = session.client(
    service_name='secretsmanager',
    region_name=region_name
)
So I had two ways to correct this:
First Method:
session = boto3.session.Session()
client = session.client(
    service_name='secretsmanager',
    region_name=region_name,
    aws_access_key_id=Xxxxxx,
    aws_secret_access_key=xxxxxx
)
Second Method: to have the keys in a config file (which will again expose them):
session = boto3.session.Session()
client = session.client(
    service_name='secretsmanager',
    region_name=region_name,
    aws_access_key_id=confg.access,
    aws_secret_access_key=confg.key
)
Aren't we exposing our access keys and secret keys in GitHub if we specify them here?
What is the correct way to access Secrets Manager without specifying them here?
You are correct: you shouldn't pass your access key and secret key to any running server or service in AWS, to avoid exposing them. On your local machine it worked because your environment picks up your user's permissions via the AWS CLI configuration.
What you need to do for a server is add a policy to the service role that allows it to access Secrets Manager; then you won't face permissions issues anymore (see the sketch below for what the code looks like once the role is attached).
On Permissions policy examples - AWS Secrets Manager you can find examples of what those policies should look like.
And on Assign an IAM Role to an EC2 Instance you can see how to attach a role with a specific policy to an EC2 instance.
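Once the role is attached to the instance, the code needs no explicit keys at all; boto3 picks the credentials up from the instance metadata. A minimal sketch (the region and secret name here are hypothetical):

import boto3

# No access keys in code: on an EC2 instance with an attached role,
# boto3 resolves credentials from the instance metadata service.
session = boto3.session.Session()
client = session.client(
    service_name='secretsmanager',
    region_name='us-east-1'  # hypothetical region
)

# 'my-app/db-password' is a hypothetical secret name.
response = client.get_secret_value(SecretId='my-app/db-password')
print(response['SecretString'])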

Python - Create AWS Signature with temporary security credentials

I have read the AWS documentation, but I couldn't find an example of using temporary security credentials to authenticate to AWS with Python.
I would like an example of using temporary security credentials provided by the AWS Security Token Service (AWS STS) to sign a request.
There are several ways you can use STS to get temporary credentials. The two most common ones are:
get_session_token - used to get temporary credentials for an existing IAM user or account
assume_role - used to get credentials when assuming an IAM role
In both cases, the call to these functions will give you temporary credentials, e.g.:
{
    "Credentials": {
        "AccessKeyId": "AddsdfsdfsdxxxxxxKJ",
        "SecretAccessKey": "TEdsfsdfSfdsfsdfsdfsdclkb/",
        "SessionToken": "FwoGZXIvYXdzEFkaDGgIUSvDdfgsdfgsdfgsMaVYgsSxO8OqRfjHc4se90WbaspOwCtdgZNgeasdfasdfasdf5wrtChz2QCTnR643exObm/zOJzXe9TUkcdODajHtxcgR8r+unzMo+7WxgQYyKGN9kfbCqv3kywk0EvOBCapusYo81fpv8S7j4JQxEwOGC9JZQL6umJ8=",
        "Expiration": "2021-02-17T11:53:31Z"
    }
}
With these credentials, you create a new boto3 session, e.g.:
new_session = boto3.session.Session(<temp credentials>)
The new_session lets you create new boto3 clients or resources, e.g.:
ec2 = new_session.client('ec2')
s3r = new_session.resource('s3')
And then you can use these new clients/resources as you normally would; a full end-to-end sketch follows below.
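Putting it together, a minimal sketch using get_session_token (the duration value is an assumption; any call made through the new session is signed with the temporary access key and session token):

import boto3

# Get temporary credentials for the current IAM user.
sts = boto3.client('sts')
response = sts.get_session_token(DurationSeconds=3600)
creds = response['Credentials']

# Build a new session from the temporary credentials; all requests
# made through it are signed with them.
new_session = boto3.session.Session(
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken']
)

s3r = new_session.resource('s3')
for bucket in s3r.buckets.all():
    print(bucket.name)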

AWS S3 bucket access issue with switching role

My login to the AWS console uses MFA, and for that I am using Google Authenticator.
I have an S3 DEV bucket, and to access that DEV bucket I have to switch roles; after switching I can access the DEV bucket.
I need help achieving the same in Python with boto3.
There are many CSV files that I need to open in a dataframe, and without resolving that access I cannot proceed.
I tried configuring AWS credentials & config and using that in my Python code, but it didn't help.
The AWS documentation is not clear about how to do the role switching in Python.
import boto3
import s3fs
import pandas as pd
import boto.s3.connection

access_key = 'XXXXXXXXXXX'
secret_key = 'XXXXXXXXXXXXXXXXX'
# bucketName = 'XXXXXXXXXXXXXXXXX'

s3 = boto3.resource('s3')
for bucket in s3.buckets.all():
    print(bucket.name)
The expected result is to access that bucket from Python code after switching roles, with MFA.
In general, it is bad for security to put credentials in your program code. It is better to store them in a configuration file. You can do this by using the AWS Command-Line Interface (CLI) aws configure command.
Once the credentials are stored this way, any AWS SDK (eg boto3) will automatically retrieve the credentials without having to reference them in code.
See: Configuring the AWS CLI - AWS Command Line Interface
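Once aws configure has been run, no keys appear in the code at all. A minimal sketch:

import boto3

# No credentials in code: boto3 finds them in ~/.aws/credentials,
# which `aws configure` wrote.
s3 = boto3.resource('s3')
for bucket in s3.buckets.all():
    print(bucket.name)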
There is an additional capability with the configuration file, that allows you to store a role that you wish to assume. This can be done by specifying a profile with the Role ARN:
# In ~/.aws/credentials:
[development]
aws_access_key_id=foo
aws_secret_access_key=bar

# In ~/.aws/config
[profile crossaccount]
role_arn=arn:aws:iam:...
source_profile=development
The source_profile points to the profile that contains credentials that will be used to make the AssumeRole() call, and role_arn specifies the Role to assume.
See: Assume Role Provider
Finally, you can tell boto3 to use that particular profile for credentials:
session = boto3.Session(profile_name='crossaccount')
# Any clients created from this session will use the assumed-role
# credentials from the [profile crossaccount] section of ~/.aws/config.
dev_s3_client = session.client('s3')
An alternative to all of the above (which boto3 does for you) is to call assume_role() in your own code, then use the temporary credentials that are returned to create a new session for connecting to a service. However, the profile-based method above is a lot easier. Since your login requires MFA, see the sketch below for passing the MFA device and token code.
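Because the question involves MFA, note that assume_role() accepts the MFA device serial and the current token code. A minimal sketch; the role ARN and MFA device ARN are placeholders:

import boto3

sts = boto3.client('sts')

# SerialNumber is the ARN of the (virtual) MFA device; TokenCode is the
# current 6-digit code from Google Authenticator. Both ARNs here are
# placeholders.
response = sts.assume_role(
    RoleArn='arn:aws:iam::123456789012:role/DevRole',       # placeholder
    RoleSessionName='dev-session',
    SerialNumber='arn:aws:iam::123456789012:mfa/my-user',   # placeholder
    TokenCode='123456'  # code from Google Authenticator
)
creds = response['Credentials']

session = boto3.Session(
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken']
)
s3 = session.resource('s3')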

Cannot read a key from S3 with boto, but can with aws cli

Using this aws cli command (with access keys configured), I'm able to copy a key from S3 locally:
aws s3 cp s3://<bucketname>/test.txt test.txt
Using the following code in boto, I get S3ResponseError: 403 Forbidden, whether I let boto use the configured credentials or explicitly pass it keys.
import boto
c = boto.connect_s3()
b = c.get_bucket('<bucketname>')
k = b.get_key('test.txt')
d = k.get_contents_as_string() # exception thrown here
I've seen the other SO posts about not validating the bucket with validate=False etc., but none of those are my issue. I get similar results when copying the key to another location in the same bucket: it succeeds with the CLI, but not with boto.
I've looked at the boto source to see if it's doing anything that requires extra permissions, but nothing stands out to me.
Does anyone have any suggestions? How does boto resolve its credentials?
Explicitly set your credentials so that they are the same as the CLI's, using the environment variables:
echo $ACCESS_KEY
echo $SECRET_KEY
import boto3

client = boto3.client(
    's3',
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY
)
# boto3's client API differs from boto2: fetch the object directly
# instead of going through bucket/key objects.
response = client.get_object(Bucket='<bucketname>', Key='test.txt')
d = response['Body'].read()
How boto3 resolves its credentials:
The mechanism by which boto3 looks for credentials is to search through a list of possible locations and stop as soon as it finds credentials. The order in which boto3 searches for credentials is:
1. Passing credentials as parameters in the boto3.client() method
2. Passing credentials as parameters when creating a Session object
3. Environment variables
4. Shared credentials file (~/.aws/credentials)
5. AWS config file (~/.aws/config)
6. Assume role provider
7. Boto2 config file (/etc/boto.cfg and ~/.boto)
8. Instance metadata service on an Amazon EC2 instance that has an IAM role configured
http://boto3.readthedocs.io/en/latest/guide/configuration.html#guide-configuration
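To see which of these sources boto3 actually picked, you can inspect the resolved credentials on a session. A small debugging sketch:

import boto3

session = boto3.Session()
creds = session.get_credentials()
if creds is None:
    print("No credentials found")
else:
    # .method names the provider that supplied the credentials,
    # e.g. 'env', 'shared-credentials-file', 'iam-role'.
    print(creds.method)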

boto3 and connecting to a custom URL

I have a test environment that mimics the S3 environment, and I want to write some test scripts using boto3. How can I connect to that service?
I tried:
client = boto3.client('s3', region_name="us-east-1", endpoint_url="http://mymachine")
client = boto3.client('iam', region_name="us-east-1", endpoint_url="http://mymachine")
Both fail to work.
The service is set up to use IAM authentication.
My error:
botocore.exceptions.NoCredentialsError: Unable to locate credentials
Any ideas?
Thanks
Please use as below:
import boto3

client = boto3.client(
    's3',
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
    endpoint_url="http://mymachine"  # keep the test endpoint from your question
)
Please check this link for more ways to configure AWS credentials.
http://boto3.readthedocs.io/en/latest/guide/configuration.html
1. The boto API always looks for credentials to pass on to the services it connects to; there is no way to access AWS resources with boto without an access key and secret key.
If you intended to use some other method, e.g. temporary security credentials, your AWS admin must set up roles etc. to allow the VM instance to connect to AWS using the AWS Security Token Service.
Otherwise, you must request a restricted credential key from your AWS account admin.
2. On the other hand, if you want to mimic S3 and rapidly test uploading/downloading huge amounts of data during development, then you should set up FakeS3 (a connection sketch follows after this list). It will take any dummy access key. However, FakeS3 has a few drawbacks: for example, you can't set up and test S3 bucket policies.
3. Even if you configure your S3 bucket to allow anyone to fetch files, that applies only to access through the URL; it is a file access permission, not a bucket access permission.
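If you go the FakeS3 route, connecting boto3 to it looks like the original attempt, except any dummy keys will do. A minimal sketch (the endpoint URL and key values are placeholders for your test machine):

import boto3

# FakeS3 accepts any dummy key pair; the endpoint is your test machine.
client = boto3.client(
    's3',
    region_name='us-east-1',
    endpoint_url='http://mymachine',   # placeholder test endpoint
    aws_access_key_id='dummy-key',     # FakeS3 ignores the values
    aws_secret_access_key='dummy-secret'
)
print(client.list_buckets()['Buckets'])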
