boto3 generate_presigned_url SignatureDoesNotMatch - python

I want to download files from S3 in a web application, so I create a URL using boto3 generate_presigned_url:
import boto3

s3Client = boto3.client(
    's3',
    region_name='eu-central-1',
    config=boto3.session.Config(signature_version='v4'),
    aws_access_key_id='xxxxxxxxxxxxxxx',
    aws_secret_access_key='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
)
url = s3Client.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'bucketname', 'Key': 'test.txt'},
    ExpiresIn=100)
but I always get this error message back:
<Error>
<Code>SignatureDoesNotMatch</Code>
<Message>The request signature we calculated does not match the
signature you provided. Check your key and signing method.</Message>
Any ideas what to do?

signature_version='s3v4' works for me. I got the same error when I used signature_version='v4':
s3_client = boto3.client('s3', region_name=REGION,
                         config=Config(signature_version='s3v4'),
                         aws_access_key_id=AWS_ACCESS_KEY_ID,
                         aws_secret_access_key=AWS_SECRET_ACCESS_KEY)
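For reference, a minimal sketch of the full flow with the s3v4 fix applied to the original snippet; credentials are assumed to come from the default credential chain rather than hardcoded keys, and the bucket/key names are the question's placeholders.

import boto3
from botocore.client import Config

# Client signed with Signature Version 4 ('s3v4'), pinned to the bucket's region
s3Client = boto3.client(
    's3',
    region_name='eu-central-1',
    config=Config(signature_version='s3v4'),
)

# Presigned GET URL for the object from the question, valid for 100 seconds
url = s3Client.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'bucketname', 'Key': 'test.txt'},
    ExpiresIn=100,
)
print(url)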

Related

Error when generating presigned url from aws s3 using boto3

I am trying to return a presigned URL generated with boto3 for an object in an AWS S3 bucket. The code:
import logging

import boto3
from botocore.exceptions import ClientError
from fastapi import FastAPI

s3 = boto3.client("s3",
                  aws_access_key_id="...",
                  aws_secret_access_key="...")
BUCKET_NAME = "tayibat-files"
app = FastAPI()

@app.get('/{file_name}')
async def method_name(file_name: str):
    try:
        url = s3.generate_presigned_url(
            'get_object',
            Params={'Bucket': BUCKET_NAME,
                    'Key': f"products/{file_name}"},
            ExpiresIn=3600
        )
    except ClientError as e:
        logging.error(e)
    return url
The GET request returns a URL, but when I try to open it in a browser, it generates:
This XML file does not appear to have any style information associated with it. The document
tree is shown below.
<Error>
<Code>InvalidRequest</Code>
<Message>The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-
SHA256.</Message>
<RequestId>ZW269CV1TAYC7CWC</RequestId>
<HostId>1yozjolBbu4difnOjjopLeOk79i34WDOFwp1VQA4Nqd0RBdLNkaOkb/uJVjFtyNu78fx06JfCbI=</HostId>
</Error>
The issue is not your code but your method of authentication or region.
I ran your code sample successfully:
import boto3
session = boto3.session.Session(profile_name="<my-profile>")
client = session.client('s3')
BUCKET_NAME = "<bucket>"
file_name = "<file>"
url = client.generate_presigned_url(
'get_object',
Params={'Bucket': BUCKET_NAME,
'Key': f"products/{file_name}"},
ExpiresIn=3600
)
print(url)
It worked fine because the region of my bucket matched the region of my credentials. When I tried to generate a presigned URL from another region, I got the same error as you:
<Error>
<Code>InvalidRequest</Code>
<Message>The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.</Message>
<RequestId>JJPZYSMZTC7Z8H</RequestId>
<HostId>TsgdZIibKxZ4GVL3h28OJYIvh59yfgeZwVf+eGPXVEzIJsAxdp1VQL67vw20LR/r9uIBxro=</HostId>
</Error>
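A minimal sketch of the fix, assuming the bucket lives in eu-central-1 (replace with your bucket's actual region): create the client in the bucket's region and force SigV4 signing so the presigned URL uses AWS4-HMAC-SHA256, the mechanism the error asks for. BUCKET_NAME and file_name are the names from the question's snippet.

import boto3
from botocore.client import Config

# Pin the client to the bucket's region (assumption: eu-central-1) and force SigV4
s3 = boto3.client(
    's3',
    region_name='eu-central-1',
    config=Config(signature_version='s3v4'),
)

url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': BUCKET_NAME, 'Key': f"products/{file_name}"},
    ExpiresIn=3600,
)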

Simple PUT operation via presigned URL keeps getting error "The request signature we calculated does not match the signature you provided"

I am using
boto3==1.17.57
I have read AWS S3 - How to fix 'The request signature we calculated does not match the signature' error?
But I am not sure why I am still getting
The request signature we calculated does not match the signature you
provided. Check your key and signing method.
when I try to perform a PUT operation.
My code snippet is pretty straightforward.
import requests
import boto3
from botocore.exceptions import ClientError

s3 = boto3.resource('s3')

bucket_name = 'cloud-storage-1'
file_name = 'car.jpg'

########
# BUCKET
########
try:
    s3.meta.client.head_bucket(Bucket=bucket_name)
except ClientError as e:
    print(str(e))
    print("Try to create bucket for the first time")
    s3.create_bucket(
        Bucket=bucket_name,
        CreateBucketConfiguration={
            'LocationConstraint': 'eu-central-1'
        }
    )

##################
# GET FILE CONTENT
##################
with open(file_name, 'rb') as object_file:
    body = object_file.read()

###############
# PRESIGNED URL
###############
s3_client = boto3.client('s3')
url = s3_client.generate_presigned_url(
    ClientMethod='put_object',
    Params={
        'Bucket': bucket_name,
        'Key': file_name
    },
    ExpiresIn=3600
)
print(url)

#####
# PUT
#####
response = requests.put(url, data=body)
print(response.content)
I can confirm the ~/.aws/credentials is correct, as I can see the bucket being created with no issue.
Bucket created without issue
~/.aws/credentials
[default]
aws_access_key_id=XXX
aws_secret_access_key=YYY
region=eu-central-1
Does anyone have any idea why this is so?
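One thing worth trying, based on the s3v4 answer at the top of this page (a sketch against the question's snippet, not a confirmed fix): sign the PUT URL with SigV4 in the bucket's region, eu-central-1 here.

from botocore.client import Config

# Sketch: generate the PUT URL with SigV4 in the bucket's region, mirroring the
# s3v4 fix from the accepted answer above; bucket_name, file_name and body come
# from the snippet in the question.
s3_client = boto3.client(
    's3',
    region_name='eu-central-1',
    config=Config(signature_version='s3v4'),
)
url = s3_client.generate_presigned_url(
    ClientMethod='put_object',
    Params={'Bucket': bucket_name, 'Key': file_name},
    ExpiresIn=3600,
)
response = requests.put(url, data=body)
print(response.status_code)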

Python Boto3: there is a difference in the AWS S3 presigned URL response for a bucket in us-east-1 vs us-east-2

With Python Boto3 I create a presigned POST URL; below is sample code.
client = boto3.client('s3', region_name="us-east-1")
response = client.generate_presigned_post(Bucket="tes_bucket", Key=filename, ExpiresIn=300)
There is a difference in the response fields for a bucket in us-east-1 versus us-east-2.
With the same code, if I try a bucket in us-east-1 I get the response fields
AWSAccessKeyId, key, policy, signature, and x-amz-security-token
whereas with a bucket in the us-east-2 region I get the response fields
key, policy, x-amz-algorithm, x-amz-credential, x-amz-date, x-amz-security-token, x-amz-signature
There is no difference in bucket configuration other than the region, so why is there such a difference in the response fields?
What do we change to get the same response across all regions?
I checked these two scenarios.
Lambda code:
import boto3

def lambda_handler(event, context):
    filename = "example.pdf"

    client = boto3.client('s3', region_name="us-east-1")
    response = client.generate_presigned_post(Bucket="bucket1", Key=filename, ExpiresIn=300)
    print(response)

    client1 = boto3.client('s3', region_name="ap-south-1")
    response1 = client1.generate_presigned_post(Bucket="bucket2", Key=filename, ExpiresIn=300)
    print(response1)
In the response, only the ap-south-1 region bucket got extra params:
'x-amz-algorithm': 'AWS4-HMAC-SHA256',
'x-amz-credential': 'xxxxxxxxxxxxxxx/xxxxxxx/ap-south-1/s3/aws4_request',
'x-amz-date': '20200928T183454Z',
The reason for this is that you are using the boto3 generate_presigned_post S3 function, which is used for API calls, form actions, or cURL requests. When both sides are in the same region and the resources are handshaking internally within that region, these extra checks are not required to validate the resource access policy; when two AWS resources in different regions or different AWS accounts handshake with each other, the extra params are required to access the resources.
All of these params are part of the AWS signature, which validates that the resources have the proper access control to handshake.
To get the same params, here is one approach:
import boto3
import datetime

def lambda_handler(event, context):
    filename = "example.pdf"
    date_short = datetime.datetime.utcnow().strftime('%Y%m%d')
    date_long = datetime.datetime.utcnow().strftime('%Y%m%dT000000Z')

    client = boto3.client('s3', region_name="us-east-1")
    fields = {
        'acl': 'private',
        'date': date_short,
        'region': "us-east-1",
        'x-amz-algorithm': 'AWS4-HMAC-SHA256',
        'x-amz-date': date_long
    }
    response = client.generate_presigned_post(Bucket="bucket1", Fields=fields, Key=filename, ExpiresIn=300)
    print(response)

    client1 = boto3.client('s3', region_name="ap-south-1")
    fields = {
        'acl': 'private',
        'date': date_short,
        'region': "ap-south-1",
        'x-amz-algorithm': 'AWS4-HMAC-SHA256',
        'x-amz-date': date_long
    }
    response1 = client1.generate_presigned_post(Bucket="bucket2", Fields=fields, Key=filename, ExpiresIn=300)
    print(response1)
Botocore uses SigV2 when generating a presigned POST for the us-east-1 region and SigV4 (s3v4) for other regions. That's why you are not getting some of the parameters in the fields. If you explicitly set the signature version to s3v4, you get the same fields. Something like this:
https://github.com/boto/boto3/issues/2606#issuecomment-701587119
import boto3
from botocore.client import Config

s3 = boto3.client('s3', 'us-east-1', config=Config(signature_version='s3v4'))
response = s3.generate_presigned_post(Bucket="bucket2", Key=filename, ExpiresIn=300)
I tried this and got the same fields in both requests.
Reference: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.generate_presigned_post
Amazon AWS S3 browser-based upload using POST
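To make the "form action or cURL request" usage concrete, here is a sketch of consuming the generate_presigned_post response with the requests library (the bucket, key, and region are placeholders): every returned field is sent as form data, and the file itself goes last under the 'file' key.

import boto3
import requests

client = boto3.client('s3', region_name='ap-south-1')
presigned = client.generate_presigned_post(Bucket='bucket2', Key='example.pdf', ExpiresIn=300)

# POST the returned fields as form data, plus the file under 'file'
with open('example.pdf', 'rb') as f:
    upload = requests.post(
        presigned['url'],
        data=presigned['fields'],
        files={'file': ('example.pdf', f)},
    )
print(upload.status_code)  # S3 returns 204 by default on a successful POST upload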

Download a file from S3 Bucket using boto3 by passing credentials as parameters

As per the Boto3 official documentation, we can connect to an S3 bucket by passing credentials as parameters, but I am facing issues.
Working scenario: hardcoding the Key ID & Secret Key
s3r = boto3.resource('s3', aws_access_key_id='XXXXXXXXXXXXXXXXXXXX',
                     aws_secret_access_key='XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX')
buck = s3r.Bucket('bucket name')
buck.download_file(filename, filename)
Non-working scenario: passing as parameters
AccessKey = 'XXXXXXXXXXXXXXXXXXXX'
SecretKey = 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX'
s3r = boto3.resource('s3', aws_access_key_id=AccessKey, aws_secret_access_key=SecretKey)
buck = s3r.Bucket('bucket name')
buck.download_file(filename,filename)
I am facing below error for non-working scenario.
botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden
http://boto3.readthedocs.io/en/latest/guide/configuration.html
session = boto3.Session(aws_access_key_id=AccessKey, aws_secret_access_key=SecretKey)
s3 = session.resource('s3')
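A short sketch completing the download with that session, reusing the AccessKey, SecretKey, and filename variables from the question:

import boto3

# Credentials passed as variables through a Session, then used via the resource API
session = boto3.Session(aws_access_key_id=AccessKey, aws_secret_access_key=SecretKey)
s3 = session.resource('s3')
bucket = s3.Bucket('bucket name')
bucket.download_file(filename, filename)  # download the object to a local file of the same name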

How to generate url from boto3 in amazon web services

I have a bucket in S3 and I am trying to pull the URL of an image that is in there.
I am using boto3, and boto3 doesn't seem to have an implemented generate-url method.
There is a core method that generates a URL like this:
import botocore.session

session = botocore.session.get_session()
client = session.create_client('s3')

presigned_url = client.generate_presigned_url(
    'get_object', Params={'Bucket': self.bucket_name, 'Key': self.key})
One thing I am forced to do is send the parameters along with each request using a session object, and the above method does not allow me to set the session variables (i.e. AWS credentials).
The closest I can get is this:
from boto3 import Session

session = Session(aws_access_key_id='342342342342', aws_secret_access_key='3434234322', region_name='us-east-1')
s3 = session.resource('s3')
object = s3.Object('my-dev-bucket', 'amazonKeyString')
print(object.get()["Body"])
This gets me an Amazon S3 object, which prints as
botocore.response.StreamingBody object at 0x7ffaff8cef50
Can I get a URL of the image this way?
I was able to get results and did not face any issues in getting the signed URL.
I used the default session, since my AWS creds are stored locally in the ~/.aws/credentials file and my default region is set as needed in ~/.aws/config.
import boto3
s3Client = boto3.client('s3')
s3Client.generate_presigned_url('get_object', Params = {'Bucket': 'www.mybucket.com', 'Key': 'hello.txt'}, ExpiresIn = 100)
If you need to pass params for the Session, import boto3.session and create a custom session:
import boto3.session
session = boto3.session.Session(region_name='eu-central-1')
s3Client = session.client('s3')
If you don't want to use the aws configure command, you can pass the credentials directly like this and generate the presigned URL.
import boto3
from botocore.client import Config

def generate_public_url(bucket_name, file_name, aws_region, aws_key_id, aws_secret, timeout=300):
    # if not object_exists(bucket_name, file_name):
    #     raise Exception(f"0 or many items found in bucket '{bucket_name}' with key '{file_name}')")
    s3_client = boto3.client('s3', config=Config(signature_version='s3v4'),
                             region_name=aws_region, aws_access_key_id=aws_key_id,
                             aws_secret_access_key=aws_secret)
    url = s3_client.generate_presigned_url(
        ClientMethod='get_object',
        Params={
            'Bucket': bucket_name,
            'Key': file_name
        },
        ExpiresIn=timeout  # seconds
    )
    return url
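A short usage example for the helper above; the bucket, key, region, and credential values are placeholders:

# Prints a time-limited GET URL for the given object
url = generate_public_url(
    bucket_name='my-dev-bucket',
    file_name='images/photo.jpg',
    aws_region='us-east-1',
    aws_key_id='AKIAXXXXXXXXXXXXXXXX',
    aws_secret='XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX',
    timeout=300,
)
print(url)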
