How can I remove expiry from the S3 bucket image object URL? - python

I am uploading multiple images to an S3 bucket, but once the images are attached and I receive the image URLs, they have a certain expiry date. I don't want them to expire at all. What should I do?
Python code:
from werkzeug.utils import secure_filename

url_attach = []
image_file = request.files.getlist('files')
for item in image_file:
    filename = secure_filename(item.filename)
    url = upload2s3(item, filename)
    url_attach.append(url)
upload function:
def upload2s3(img_content, key_name):
    try:
        s3_conn = boto3.client(
            "s3",
            aws_access_key_id=AWS_ACCESS_KEY_ID,
            aws_secret_access_key=AWS_SECRET_KEY,
        )
        x = s3_conn.put_object(Bucket=BUCKET_NAME, Body=img_content.read(), Key=key_name)
        url = create_url(BUCKET_NAME, key_name)
        return url
    except Exception as ex:
        return {"status": False, "message": ex}
url function:
def create_url(bucket, object):
    client = boto3.client(
        "s3", aws_access_key_id=AWS_ACCESS_KEY_ID, aws_secret_access_key=AWS_SECRET_KEY
    )
    return client.generate_presigned_url(
        "get_object", Params={"Bucket": bucket, "Key": object}
    )
sample url: https://test_dir.s3.amazonaws.com/XYZ.png?AWSAccessKeyId=###########&Signature=##############&Expires=#########
Somewhere, I read that generate_presigned_url has a maximum expiry of 7 days. Is there any alternative to that?

I don't want them to expire at all. What should I do?
Nothing, because it's not possible. You have to develop a custom solution to regenerate new links when they are about to expire. Otherwise, do not use S3 presigned URLs; instead, serve your files through CloudFront.
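If you do stay with presigned URLs, the practical pattern is to store the bucket and key rather than the signed URL, and sign again on demand. A minimal sketch of that idea (the helper name is hypothetical, and it assumes credentials come from the environment or an instance/role profile):

import boto3

def refresh_presigned_url(bucket, key, expires_in=604800):
    # Re-sign on every request instead of persisting a URL that will expire.
    # 604800 seconds (7 days) is the maximum ExpiresIn allowed for SigV4 URLs.
    client = boto3.client("s3")
    return client.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires_in,
    )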

Related

Boto3 S3 list_objects_v2 Not Returning Any Objects

I'm using Boto3 to try to get a list of keys from an S3 bucket via an AWS Lambda Python script. No matter what I try, the bucket returns no objects.
import json, boto3, os

def getConfig():
    cfg = {
        "aws_key_id": os.getenv("AWS_KEY_ID", ""),
        "aws_secret": os.getenv("AWS_SECRET", ""),
    }
    return cfg

def lambda_handler(event, context):
    cfg = getConfig()
    bucket_name = "zachs-taxi"
    session = boto3.Session(
        aws_access_key_id=cfg.get('aws_key_id'),
        aws_secret_access_key=cfg.get('aws_secret')
    )
    s3 = session.client('s3')
I've tried both of the following but both return empty:
response = s3.list_objects_v2(Bucket=bucket_name)
for content in response.get('Contents', []):
    print(content['Key'])
And
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket_name):
    for content in page.get('Contents', ()):
        print(content['Key'])
The S3 bucket is public and I can access it. Inside there is a folder called content and within that folder is a .png file.
Any help would be appreciated. Thanks!
Your code ran perfectly fine for me (with a different bucket name) when I ran it on my own computer:
import boto3

bucket_name = "my-bucketname"
s3 = boto3.client('s3')

response = s3.list_objects_v2(Bucket=bucket_name)
for content in response.get('Contents', []):
    print(content['Key'])
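If the same code only returns nothing when it runs inside Lambda, one thing worth checking (an assumption on my part, not something verified in this thread) is whether the AWS_KEY_ID / AWS_SECRET environment variables are actually set; empty strings give you a session with blank credentials. A small sketch that falls back to the Lambda execution role when the variables are missing:

import os
import boto3

def make_s3_client():
    key_id = os.getenv("AWS_KEY_ID", "")
    secret = os.getenv("AWS_SECRET", "")
    if key_id and secret:
        # Explicit credentials were provided via environment variables.
        session = boto3.Session(aws_access_key_id=key_id, aws_secret_access_key=secret)
        return session.client("s3")
    # Fall back to the default credential chain (in Lambda, the execution role).
    return boto3.client("s3")

Either way, whatever credentials end up being used still need s3:ListBucket on the bucket for list_objects_v2 to return anything.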

Error when generating presigned url from aws s3 using boto3

When I try to return a generated presigned URL using boto3 from a bucket in AWS S3, with this code:
import logging

import boto3
from botocore.exceptions import ClientError
from fastapi import FastAPI

s3 = boto3.client("s3",
                  aws_access_key_id="...",
                  aws_secret_access_key="...")
BUCKET_NAME = "tayibat-files"
app = FastAPI()

@app.get('/{file_name}')
async def method_name(file_name: str):
    try:
        url = s3.generate_presigned_url(
            'get_object',
            Params={'Bucket': BUCKET_NAME,
                    'Key': f"products/{file_name}"},
            ExpiresIn=3600
        )
    except ClientError as e:
        logging.error(e)
    return url
The GET request returns a URL, but when I try to open it in a browser, it generates:
This XML file does not appear to have any style information associated with it. The document tree is shown below.
<Error>
<Code>InvalidRequest</Code>
<Message>The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.</Message>
<RequestId>ZW269CV1TAYC7CWC</RequestId>
<HostId>1yozjolBbu4difnOjjopLeOk79i34WDOFwp1VQA4Nqd0RBdLNkaOkb/uJVjFtyNu78fx06JfCbI=</HostId>
</Error>
The issue is not your code but your method of authentication or region.
I ran your code sample successfully:
import boto3

session = boto3.session.Session(profile_name="<my-profile>")
client = session.client('s3')

BUCKET_NAME = "<bucket>"
file_name = "<file>"

url = client.generate_presigned_url(
    'get_object',
    Params={'Bucket': BUCKET_NAME,
            'Key': f"products/{file_name}"},
    ExpiresIn=3600
)
print(url)
It worked fine because the region of my bucket aligned with the region of my credentials. When I tried to generate a presigned URL from another region, I got the same error:
<Error>
<Code>InvalidRequest</Code>
<Message>The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.</Message>
<RequestId>JJPZYSMZTC7Z8H</RequestId>
<HostId>TsgdZIibKxZ4GVL3h28OJYIvh59yfgeZwVf+eGPXVEzIJsAxdp1VQL67vw20LR/r9uIBxro=</HostId>
</Error>
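One way to make the regions line up is to pin the presigning client to the bucket's own region and force SigV4 explicitly. A sketch under that assumption (the region name below is a placeholder; BUCKET_NAME and file_name are the variables from the question):

import boto3
from botocore.config import Config

# Sign with the bucket's region and SigV4 so the URL matches what the
# regional S3 endpoint expects.
s3 = boto3.client(
    "s3",
    region_name="eu-west-1",  # placeholder: use your bucket's actual region
    config=Config(signature_version="s3v4"),
)
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": BUCKET_NAME, "Key": f"products/{file_name}"},
    ExpiresIn=3600,
)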

Image file type is not supported when downloading using urllib3

Hi, I am downloading an image like this:
import urllib3
http = urllib3.PoolManager()
r = http.request('GET', 'https://i.picsum.photos/id/192/536/354.png?hmac=a22QkdSZ7zXUHpV4-gnB48PPYaLlcvaTMeDXxcPRxs8')
print(r.data)
Then I am uploading it to S3 using this:
import io
import boto3

s3 = boto3.resource('s3')
key = file_name + '.png'
bucket = s3.Bucket(bucket_name)
bucket.upload_fileobj(io.BytesIO(r.data), key)
But when I open the image with a photo viewer, I get the error "File type is not supported".
**EDIT:** I did as suggested and passed
ContentType='text/png'
and when opening the image by URL (a presigned URL) I am still getting this error on AWS.
You did not actually mention how you add the ContentType. This is a working example, using boto3.resource():
import boto3

bucketname = '<my-bucket>'
filename = '<filename>'

s3 = boto3.resource('s3')
key = 'image.png'
bucket = s3.Bucket(bucketname)
with open(filename, "rb") as fd:
    bucket.upload_fileobj(
        fd,
        Key=key,
        ExtraArgs={
            "ContentType": "image/png",
        }
    )
I then generate a pre-signed URL (which you didn't include, but as long as you get the permissions right, I'm sure you're fine)
expiration = 3600
s3 = boto3.client('s3',
                  region_name='us-east-2',
                  config=boto3.session.Config(signature_version='s3v4'))
response = s3.generate_presigned_url(
    'get_object',
    Params={
        'Bucket': bucketname,
        'Key': key
    },
    ExpiresIn=expiration
)
print(response)
Opening the resulting URL in my browser, it loads the PNG image that I uploaded.
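If you upload many different file types and don't want to hard-code the ContentType, one option (a sketch, not part of the answer above) is to let Python's mimetypes module guess it from the file name:

import mimetypes
import boto3

def upload_with_guessed_type(bucket_name, filename, key):
    # Guess the MIME type from the extension; fall back to a generic binary type.
    content_type = mimetypes.guess_type(filename)[0] or "application/octet-stream"
    s3 = boto3.resource("s3")
    with open(filename, "rb") as fd:
        s3.Bucket(bucket_name).upload_fileobj(
            fd, Key=key, ExtraArgs={"ContentType": content_type}
        )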

Simple PUT operation via presigned URL keeps getting error "The request signature we calculated does not match the signature you provided"

I am using
boto3==1.17.57
I have read AWS S3 - How to fix 'The request signature we calculated does not match the signature' error?
But I am not sure why I am still getting
The request signature we calculated does not match the signature you provided. Check your key and signing method.
when I try to perform a PUT operation.
My code snippet is pretty straightforward.
import requests
import boto3
from botocore.client import ClientError

s3 = boto3.resource('s3')

bucket_name = 'cloud-storage-1'
file_name = 'car.jpg'

########
# BUCKET
########
try:
    s3.meta.client.head_bucket(Bucket=bucket_name)
except ClientError as e:
    print(str(e))
    print("Try to create bucket for the first time")
    s3.create_bucket(
        Bucket=bucket_name,
        CreateBucketConfiguration={
            'LocationConstraint': 'eu-central-1'
        }
    )

##################
# GET FILE CONTENT
##################
with open(file_name, 'rb') as object_file:
    body = object_file.read()

###############
# PRESIGNED URL
###############
s3_client = boto3.client('s3')
url = s3_client.generate_presigned_url(
    ClientMethod='put_object',
    Params={
        'Bucket': bucket_name,
        'Key': file_name
    },
    ExpiresIn=3600
)
print(url)

#####
# PUT
#####
response = requests.put(url, data=body)
print(response.content)
I can confirm the ~/.aws/credentials is correct, as I can see the bucket being created with no issue.
~/.aws/credentials
[default]
aws_access_key_id=XXX
aws_secret_access_key=YYY
region=eu-central-1
Does anyone have any idea why this is?
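There is no accepted fix in this thread, but a commonly suggested check, mirroring the answers above, is to presign with a client that is explicitly configured for the bucket's region and SigV4, and then PUT the same bytes without adding extra headers. Offered only as a sketch, reusing bucket_name, file_name, and body from the snippet above:

import boto3
import requests
from botocore.config import Config

# Presign with a client pinned to eu-central-1 and SigV4 so the signature
# matches what the regional endpoint expects.
s3_client = boto3.client(
    's3',
    region_name='eu-central-1',
    config=Config(signature_version='s3v4'),
)
url = s3_client.generate_presigned_url(
    ClientMethod='put_object',
    Params={'Bucket': bucket_name, 'Key': file_name},
    ExpiresIn=3600,
)
response = requests.put(url, data=body)
print(response.status_code)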

How to generate url from boto3 in amazon web services

I have a bucket in S3 and I am trying to pull the URL of an image that is in there.
I am using boto3, and boto3 doesn't seem to have an implemented generate-URL method.
There is a core botocore method that generates a URL like this:
import botocore.session

session = botocore.session.get_session()
client = session.create_client('s3')
presigned_url = client.generate_presigned_url(
    'get_object', Params={'Bucket': self.bucket_name, 'Key': self.key})
One thing I am forced to do is send the parameters along with each request using a session object, and the above method does not allow me to set the session variables (i.e. AWS credentials).
The closest I can get is this:
from boto3.session import Session

session = Session(aws_access_key_id='342342342342', aws_secret_access_key='3434234322', region_name='us-east-1')
s3 = session.resource('s3')
object = s3.Object('my-dev-bucket', 'amazonKeyString')
print(object.get()["Body"])
This gets me an Amazon S3 object, whose body is a
botocore.response.StreamingBody object at 0x7ffaff8cef50
Can I get a URL of the image this way?
I was able to get results and did not face any issues in getting the signed URL.
I used the default session since my AWS creds were stored locally in the ~/.aws/credentials file, and my default region is set as needed in ~/.aws/config.
import boto3
s3Client = boto3.client('s3')
s3Client.generate_presigned_url('get_object', Params = {'Bucket': 'www.mybucket.com', 'Key': 'hello.txt'}, ExpiresIn = 100)
If you need to pass params to the Session, import boto3.session and create a custom session:
import boto3.session
session = boto3.session.Session(region_name='eu-central-1')
s3Client = session.client('s3')
If you don't want to use the aws configure command, you can pass the credentials directly like this and generate the public URL:
import boto3
from botocore.client import Config

def generate_public_url(bucket_name, file_name, aws_region, aws_key_id, aws_secret, timeout=300):
    # if not object_exists(bucket_name, file_name):
    #     raise Exception(f"0 or many items found in bucket '{bucket_name}' with key '{file_name}')")
    s3_client = boto3.client('s3', config=Config(signature_version='s3v4'),
                             region_name=aws_region, aws_access_key_id=aws_key_id,
                             aws_secret_access_key=aws_secret)
    url = s3_client.generate_presigned_url(
        ClientMethod='get_object',
        Params={
            'Bucket': bucket_name,
            'Key': file_name
        },
        ExpiresIn=timeout  # seconds
    )
    return url
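A quick usage example of the helper above (all argument values are placeholders):

# Placeholder values; substitute your own bucket, key, region, and credentials.
url = generate_public_url(
    bucket_name='my-bucket',
    file_name='images/hello.png',
    aws_region='us-east-1',
    aws_key_id='AKIA...',
    aws_secret='...',
    timeout=300,
)
print(url)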
