boto3 and connecting to custom url - python

I have a test environment that mimics the S3 environment, and I want to write some test scripts using boto3. How can I connect to that service?
I tried:
client = boto3.client('s3', region_name="us-east-1", endpoint_url="http://mymachine")
client = boto3.client('iam', region_name="us-east-1", endpoint_url="http://mymachine")
Neither works.
The service is set up to use IAM authentication.
My error:
botocore.exceptions.NoCredentialsError: Unable to locate credentials
Any ideas?
Thanks

Please pass the credentials explicitly, as below:
import boto3
client = boto3.client(
    's3',
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
)
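Since the question is about a custom endpoint, these credential arguments can be combined with the endpoint_url from the question. A minimal sketch (ACCESS_KEY and SECRET_KEY are placeholders for whatever your test service accepts):
import boto3
# Explicit credentials plus the custom S3-compatible endpoint from the question.
client = boto3.client(
    's3',
    region_name='us-east-1',
    endpoint_url='http://mymachine',
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
)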
Please check this link for more ways to configure AWS credentials.
http://boto3.readthedocs.io/en/latest/guide/configuration.html

1. The boto API always looks for credentials to pass on to the services it connects to; there is no way to access AWS resources with boto3 without an access key and secret key.
If you intend to use some other method, e.g. Temporary Security Credentials, your AWS admin must set up roles etc. to allow the VM instance to connect to AWS using the AWS Security Token Service.
Otherwise, you must request a restricted credential key from your AWS account admin.
2. On the other hand, if you want to mimic S3 and test rapid upload/download of large amounts of data for development, then you should set up FakeS3. It will accept any dummy access key. However, FakeS3 has a few drawbacks: you can't set up and test S3 bucket policies.
3. Even if you configure your S3 bucket to allow anyone to fetch a file, that only works through the URL; it is a file access permission, not a bucket access permission.
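For point 2, a minimal sketch of talking to FakeS3 with dummy credentials (the endpoint assumes FakeS3 is listening on its conventional port 4567; adjust to your setup):
import boto3
# FakeS3 does not validate credentials, so any dummy key pair works.
client = boto3.client(
    's3',
    region_name='us-east-1',
    endpoint_url='http://localhost:4567',  # assumed FakeS3 address
    aws_access_key_id='dummy',
    aws_secret_access_key='dummy',
)
client.create_bucket(Bucket='test-bucket')
client.put_object(Bucket='test-bucket', Key='hello.txt', Body=b'hello')
print(client.get_object(Bucket='test-bucket', Key='hello.txt')['Body'].read())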

Related

Python Generating an IAM authentication token boto3.session

I am trying to use the documentation on https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/UsingWithRDS.IAMDBAuth.Connecting.Python.html. Right now I am stuck at session = boto3.session(profile_name='RDSCreds'). What is profile_name and how do I find that in my RDS?
import sys
import os
import boto3
ENDPOINT="mysqldb.123456789012.us-east-1.rds.amazonaws.com"
PORT="3306"
USR="jane_doe"
REGION="us-east-1"
os.environ['LIBMYSQL_ENABLE_CLEARTEXT_PLUGIN'] = '1'
#gets the credentials from .aws/credentials
session = boto3.Session(profile_name='RDSCreds')
client = session.client('rds')
token = client.generate_db_auth_token(DBHostname=ENDPOINT, Port=PORT, DBUsername=USR, Region=REGION)
session = boto3.Session(profile_name='RDSCreds')
profile_name here is the name of the profile you have configured for your aws cli.
Usually when you run aws configure it creates a default profile. But sometimes users want to manage the aws cli with another account's credentials, or direct requests to another region, so they configure a separate profile. See the docs for creating and configuring multiple profiles.
aws configure --profile RDSCreds #enter your access keys for this profile
In case you think you have already created the RDSCreds profile, you can check for it with less ~/.aws/config.
The documentation you mentioned for RDS with boto3 also says: "The code examples use profiles for shared credentials. For information about the specifying credentials, see Credentials in the AWS SDK for Python (Boto3) documentation."
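Once the token is generated, it is used as the database password when connecting over SSL. A minimal sketch of the connection step, assuming pymysql is installed and the RDS CA bundle has been downloaded to rds-ca.pem (both are assumptions, not part of the original post):
import pymysql
# The IAM auth token acts as a short-lived password; IAM auth requires SSL.
conn = pymysql.connect(
    host=ENDPOINT,
    user=USR,
    password=token,
    port=int(PORT),
    ssl={'ca': 'rds-ca.pem'},  # assumed path to the downloaded RDS CA bundle
)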

AWS S3 bucket access issue with switching role

My login to the AWS console uses MFA, for which I am using Google Authenticator.
I have an S3 DEV bucket, and to access that DEV bucket I have to switch roles; after switching I can access the DEV bucket.
I need help achieving the same in Python with boto3.
There are many CSV files that I need to open in a dataframe, and without resolving this access issue I cannot proceed.
I tried configuring AWS credentials & config and using them in my Python code, but that didn't help.
The AWS documentation is not clear about how to switch roles when working in Python.
import boto3
import s3fs
import pandas as pd
import boto.s3.connection
access_key = 'XXXXXXXXXXX'
secret_key = 'XXXXXXXXXXXXXXXXX'
# bucketName = 'XXXXXXXXXXXXXXXXX'
s3 = boto3.resource('s3')
for bucket in s3.buckets.all():
    print(bucket.name)
Expected result should be to access that bucket after switching role in python code along with MFA.
In general, it is bad for security to put credentials in your program code. It is better to store them in a configuration file. You can do this by using the AWS Command-Line Interface (CLI) aws configure command.
Once the credentials are stored this way, any AWS SDK (eg boto3) will automatically retrieve the credentials without having to reference them in code.
See: Configuring the AWS CLI - AWS Command Line Interface
There is an additional capability with the configuration file, that allows you to store a role that you wish to assume. This can be done by specifying a profile with the Role ARN:
# In ~/.aws/credentials:
[development]
aws_access_key_id=foo
aws_secret_access_key=bar
# In ~/.aws/config
[profile crossaccount]
role_arn=arn:aws:iam:...
source_profile=development
The source_profile points to the profile that contains credentials that will be used to make the AssumeRole() call, and role_arn specifies the Role to assume.
See: Assume Role Provider
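Since the question involves MFA: the same profile can also carry an mfa_serial entry, in which case boto3 prompts for the MFA code before assuming the role (the device ARN below is a placeholder):
# In ~/.aws/config
[profile crossaccount]
role_arn=arn:aws:iam:...
source_profile=development
# the device ARN below is a placeholder
mfa_serial=arn:aws:iam::123456789012:mfa/my-user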
Finally, you can tell boto3 to use that particular profile for credentials:
session = boto3.Session(profile_name='crossaccount')
# Any clients created from this session will use credentials
# from the [crossaccount] section of ~/.aws/credentials.
dev_s3_client = session.client('s3')
An alternative to all of the above (which boto3 otherwise does for you) is to call assume_role() in your code, then use the temporary credentials that are returned to create a new session that you can use to connect to a service. However, the profile-based method above is a lot easier.
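A minimal sketch of that manual approach, including the MFA parameters since the question mentions MFA (the role ARN, device ARN, and token code are placeholders):
import boto3
sts = boto3.client('sts')
response = sts.assume_role(
    RoleArn='arn:aws:iam::123456789012:role/dev-role',     # placeholder role
    RoleSessionName='dev-session',
    SerialNumber='arn:aws:iam::123456789012:mfa/my-user',  # placeholder MFA device
    TokenCode='123456',  # current code from Google Authenticator
)
creds = response['Credentials']
session = boto3.Session(
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken'],
)
dev_s3_client = session.client('s3')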

boto3 ec2 & django [duplicate]

On boto I used to specify my credentials when connecting to S3 in such a way:
import boto
from boto.s3.connection import Key, S3Connection
S3 = S3Connection( settings.AWS_SERVER_PUBLIC_KEY, settings.AWS_SERVER_SECRET_KEY )
I could then use S3 to perform my operations (in my case deleting an object from a bucket).
With boto3 all the examples I found are such:
import boto3
S3 = boto3.resource( 's3' )
S3.Object( bucket_name, key_name ).delete()
I couldn't specify my credentials and thus all attempts fail with InvalidAccessKeyId error.
How can I specify credentials with boto3?
You can create a session:
import boto3
session = boto3.Session(
aws_access_key_id=settings.AWS_SERVER_PUBLIC_KEY,
aws_secret_access_key=settings.AWS_SERVER_SECRET_KEY,
)
Then use that session to get an S3 resource:
s3 = session.resource('s3')
You can also get a client with a new session directly, like below.
s3_client = boto3.client(
    's3',
    aws_access_key_id=settings.AWS_SERVER_PUBLIC_KEY,
    aws_secret_access_key=settings.AWS_SERVER_SECRET_KEY,
    region_name=REGION_NAME,
)
This is older, but placing it here for my reference too. boto3.resource just uses the default session; you can pass session details through to boto3.resource.
Help on function resource in module boto3:
resource(*args, **kwargs)
Create a resource service client by name using the default session.
See :py:meth:`boto3.session.Session.resource`.
https://github.com/boto/boto3/blob/86392b5ca26da57ce6a776365a52d3cab8487d60/boto3/session.py#L265
You can see that it just takes the same arguments as boto3.Session.
import boto3
S3 = boto3.resource(
    's3',
    region_name='us-west-2',
    aws_access_key_id=settings.AWS_SERVER_PUBLIC_KEY,
    aws_secret_access_key=settings.AWS_SERVER_SECRET_KEY,
)
S3.Object( bucket_name, key_name ).delete()
I'd like to expand on @JustAGuy's answer. The method I prefer is to use the AWS CLI to create a config file. The reason is that with the config file, the CLI or the SDK will automatically look for credentials in the ~/.aws folder. And the good thing is that the AWS CLI is written in Python.
You can get the CLI from PyPI if you don't have it already. Here are the steps to get the CLI set up from the terminal:
$> pip install awscli  # can add the --user flag
$> aws configure
AWS Access Key ID [****************ABCD]:[enter your key here]
AWS Secret Access Key [****************xyz]:[enter your secret key here]
Default region name [us-west-2]:[enter your region here]
Default output format [None]:
After this you can access boto and any of the APIs without having to specify keys (unless you want to use different credentials).
If you rely on your .aws/credentials to store id and key for a user, it will be picked up automatically.
For instance
session = boto3.Session(profile_name='dev')
s3 = session.resource('s3')
This will pick up the dev profile (user) if your credentials file contains the following:
[dev]
aws_access_key_id = AAABBBCCCDDDEEEFFFGG
aws_secret_access_key = FooFooFoo
region=ap-southeast-2
There are numerous ways to store credentials while still using boto3.resource().
I'm using the AWS CLI method myself. It works perfectly.
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html
You can also set the default AWS environment variables (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY) for the access and secret keys; that way you don't need to change the default client creation code, though it is better to pass them as parameters if you have non-default creds.
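A minimal sketch of that approach (the variables are set in code only for illustration; normally you would export them in your shell before starting Python):
import os
import boto3
# AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY are the names boto3 reads; values are placeholders.
os.environ['AWS_ACCESS_KEY_ID'] = 'AAABBBCCCDDDEEEFFFGG'
os.environ['AWS_SECRET_ACCESS_KEY'] = 'FooFooFoo'
s3 = boto3.client('s3')  # picks up the environment variables; no keyword arguments needed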

AWS Boto / Warrant library: SRP authentication and credentials error

I have been stuck on the following issue for quite some time now. Within Python I want users to retrieve a token based upon their username and password from the AWS cognito-identity-pool making use of srp authentication. With this token I want the users to upload data to s3.
This is part of the code I use (from the warrant library): https://github.com/capless/warrant
self.client = boto3.client('cognito-idp', region_name="us-east-1")
response = self.client.initiate_auth(
    AuthFlow='USER_SRP_AUTH',
    AuthParameters=auth_params,
    ClientId=self.client_id
)
def get_auth_params(self):
    auth_params = {'USERNAME': self.username,
                   'SRP_A': long_to_hex(self.large_a_value)}
    if self.client_secret is not None:
        auth_params.update({
            "SECRET_HASH":
                self.get_secret_hash(self.username, self.client_id, self.client_secret)})
    return auth_params
However, I keep on getting:
botocore\auth.py", line 352, in add_auth raise NoCredentialsError
botocore.exceptions.NoCredentialsError: Unable to locate credentials
I was able to get rid of this error by adding credentials to the .aws/credentials file. But this is not in line with the purpose of this program. It seems like there is a mistake in the warrant or botocore library, and it keeps attempting to use the AWS Access Key ID and AWS Secret Access Key from the credentials file, rather than using the given credentials (username and password).
Any help is appreciated
I am on the Cognito team. InitiateAuth is an unauthenticated call, so it shouldn't require you to provide AWS credentials. The service endpoint will not validate the SigV4 signature for these calls.
That being said, some client libraries have certain peculiarities, in the sense that you need to provide some dummy credentials or the client library will throw an exception. However, you can provide anything for the credentials.
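For example, a minimal sketch of satisfying such a library with dummy values (the literal strings are arbitrary placeholders):
import boto3
# The values are never checked for unauthenticated calls such as InitiateAuth.
client = boto3.client(
    'cognito-idp',
    region_name='us-east-1',
    aws_access_key_id='dummy',
    aws_secret_access_key='dummy',
)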
I too ran into this, using warrant.
The problem is that the boto3 libraries are trying to sign the request to aws, but this request is not supposed to be signed. To prevent that, create the identity pool client with a config that specifies no signing.
import boto3
from botocore import UNSIGNED
from botocore.config import Config
client = boto3.client('cognito-idp', region_name='us-east-1', config=Config(signature_version=UNSIGNED))
AWS Access Key ID and AWS Secret Access Key are totally different from username and password.
The Boto3 client has to connect to the AWS service endpoint (in your case: cognito-idp.us-east-1.amazonaws.com) to execute any API. Before executing an API, the API credentials (key + secret) have to be provided to authenticate your AWS account. Without authenticating your account, you cannot call cognito-idp APIs.
There is one AWS account (key/secret) but there can be multiple users (username/password).

Cannot read a key from S3 with boto, but can with aws cli

Using this aws cli command (with access keys configured), I'm able to copy a key from S3 locally:
aws s3 cp s3://<bucketname>/test.txt test.txt
Using the following code in boto, I get S3ResponseError: 403 Forbidden, whether I allow boto to use configured credentials, or explicitly pass it keys.
import boto
c = boto.connect_s3()
b = c.get_bucket('<bucketname>')
k = b.get_key('test.txt')
d = k.get_contents_as_string() # exception thrown here
I've seen the other SO posts about not validating the key with validate=False etc, but none of these are my issue. I get similar results when copying the key to another location in the same bucket. Succeeds with the cli, but not with boto.
I've looked at the boto source to see if it's doing anything that requires extra permissions, but nothing stands out to me.
Does anyone have any suggestions? How does boto resolve its credentials?
Explicitly set your credentials so that they are the same as the CLI's, using the env variables:
echo $ACCESS_KEY
echo $SECRET_KEY
import boto3
client = boto3.client(
    's3',
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY
)
# boto3 clients have no get_bucket()/get_key(); fetch the object directly
response = client.get_object(Bucket='<bucketname>', Key='test.txt')
d = response['Body'].read()
How boto3 resolves its credentials:
The mechanism by which boto3 looks for credentials is to search through a list of possible locations and stop as soon as it finds credentials. The order in which boto3 searches for credentials is:
Passing credentials as parameters in the boto3.client() method
Passing credentials as parameters when creating a Session object
Environment variables
Shared credential file (~/.aws/credentials)
AWS config file (~/.aws/config)
Assume Role provider
Boto2 config file (/etc/boto.cfg and ~/.boto)
Instance metadata service on an Amazon EC2 instance that has an IAM role configured.
http://boto3.readthedocs.io/en/latest/guide/configuration.html#guide-configuration
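If it is unclear which of these sources boto3 ended up using, you can inspect the resolved credentials. A small sketch (get_credentials() and its method attribute come from botocore's credential resolver):
import boto3
creds = boto3.Session().get_credentials()
if creds is None:
    print('no credentials found')
else:
    # method reports the source, e.g. 'env', 'shared-credentials-file', 'iam-role'
    print(creds.method, creds.access_key)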
