How to handle PutObject operation: Access Denied for Lambda - python

I am trying to use a Lambda function to write some text to a file in S3.
Below is the function. When it runs, I get this error:
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
The bucket is 'tessstinggbucccket':
import boto3

def lambda_handler(event, context):
    bucket = 'tessstinggbucccket'
    key = 'june/22/testlog1.txt'
    some_binary_data = b'Here we have some data'
    s3 = boto3.resource("s3")
    obj = s3.Object(bucket, key)
    obj.put(Body=some_binary_data)

You need to grant your Lambda function write/put permissions on the S3 bucket before it can write to it. With AWS SAM you can do that like this:
# Add this property to your Lambda function definition in the SAM template
Policies:
  - S3WritePolicy:
      BucketName: "YourBucketNameHere"
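For reference, S3WritePolicy is one of the AWS SAM policy templates; it expands into IAM permissions for writing to the named bucket. If you are not using SAM, a hand-written statement attached to the Lambda's execution role would look roughly like this sketch (the bucket name is a placeholder):

```json
{
  "Effect": "Allow",
  "Action": ["s3:PutObject"],
  "Resource": "arn:aws:s3:::YourBucketNameHere/*"
}
```

Note the /* on the resource ARN: PutObject applies to object keys inside the bucket, not to the bucket ARN itself.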

Related

Using boto3 to upload a file to S3 - boto3.exceptions.S3UploadFailedError: An error occurred (QuotaExceeded) when calling the PutObject operation: Unknown

params = dict(service_name="s3",
              endpoint_url="****",
              aws_access_key_id="****",
              aws_secret_access_key="****")
s3 = boto3.client(**params)
response = s3.upload_file(localfile, bucket, key=key)
boto3.exceptions.S3UploadFailedError: An error occurred (QuotaExceeded) when calling the PutObject operation: Unknown
There are quota limits on the operations you can perform when using Boto3 with AWS; some are adjustable, some are not. Your code doesn't show the file size, so exceeding a size limit might be the cause as well. See the AWS documentation page on S3 service quotas for the full list.

Pulling data from S3 using an address and access key from Python

I've been given a path to an S3 bucket and a key to access it. How can I access the bucket, and how can I do it from Python?
The name looks like this: solutions/accounts/services, and the key is some string.
I tried doing this:
import boto3

client = boto3.client('s3')
response = client.get_object(
    Bucket='solutions',
    Key='accounts/services'
)
print(response)
This yields:
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the GetObject operation: Access Denied
I was provided a key to this S3 bucket in the form of OKIA4RBSCI236N869IJG.
Where does this key need to be inserted?
The Key should include all of the folders, e.g. accounts/services/file.txt.
The Bucket should be only the bucket name, e.g. solutions.
Together these address s3://solutions/accounts/services/file.txt.
As for the access key itself: it is not part of the Bucket or Key. boto3 picks up credentials from the standard sources (environment variables, ~/.aws/credentials, an IAM role), or you can pass them explicitly as aws_access_key_id / aws_secret_access_key when creating the client.
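To make the split concrete, here is a small stdlib-only sketch (the URI is just the example from this answer) that separates an s3:// URI into the Bucket and Key arguments that get_object expects:

```python
def split_s3_uri(uri: str) -> tuple[str, str]:
    """Split 's3://bucket/some/key' into ('bucket', 'some/key')."""
    if not uri.startswith("s3://"):
        raise ValueError(f"not an s3 URI: {uri!r}")
    # Everything before the first '/' is the bucket; the rest is the key.
    bucket, _, key = uri[len("s3://"):].partition("/")
    return bucket, key

bucket, key = split_s3_uri("s3://solutions/accounts/services/file.txt")
# bucket == 'solutions', key == 'accounts/services/file.txt'
```

These map directly onto client.get_object(Bucket=bucket, Key=key).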

Python boto 3 - download a file from S3 and reupload to a new folder

I want to download a file from S3 inside a Lambda and then re-upload it to a new folder.
s3_resource = boto3.resource('s3')
s3_resource.Bucket('mybucket').download_file('hello.txt', '/tmp/hello.txt')
I have given the Lambda full S3 access but am still getting an access denied error:
"errorMessage": "An error occurred (AccessDenied) when calling the GetObject operation: Access Denied",
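One way to sketch the "new folder" part: in S3, folders are just key prefixes, so re-uploading into a folder only means writing the object under a new key. A minimal stdlib helper (the bucket and folder names below are placeholders), with the boto3 calls shown as comments since they need real credentials:

```python
import posixpath

def key_in_folder(key: str, folder: str) -> str:
    """Re-root an object key under `folder` (S3 'folders' are just key prefixes)."""
    filename = posixpath.basename(key)  # 'hello.txt' from any prefix
    return f"{folder.rstrip('/')}/{filename}"

new_key = key_in_folder("hello.txt", "newfolder")

# With boto3 (the Lambda role needs s3:GetObject and s3:PutObject on the bucket):
#   s3 = boto3.resource('s3')
#   s3.Bucket('mybucket').download_file('hello.txt', '/tmp/hello.txt')
#   s3.Bucket('mybucket').upload_file('/tmp/hello.txt', new_key)
```

If the file needs no local processing, a server-side copy (s3.Object('mybucket', new_key).copy_from(CopySource={'Bucket': 'mybucket', 'Key': 'hello.txt'})) avoids the download entirely.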

I cannot delete an S3 object using one method, but I can delete it using another (obj.delete vs delete_object)

What is the difference between deleting S3 objects with the client's delete_object method and with obj.delete?
When I call
import boto3

session = boto3.Session(aws_access_key_id=aws_access_key_id,
                        aws_secret_access_key=aws_secret_access_key_emr,
                        region_name=region_name)
s3_client = session.client('s3')
s3_client.delete_object(Bucket=bucket_name, Key='input/df.parquet')
the code runs without any error.
But the following code
s3 = boto3.resource('s3')
obj = s3.Object(bucket_name, "input/df.parquet")
obj.delete()
leads to a ClientError: An error occurred (AccessDenied) when calling the DeleteObject operation: Access Denied
A similar thing happens with the AWS Data Wrangler library. Running
import awswrangler as wr
wr.s3.delete_objects(f"s3://{bucket_name}/input/df.parquet")
doesn't delete anything and doesn't show any error.
OK, it's solved: I had defined s3 as boto3.resource('s3') instead of session.resource('s3'), so it was not using the session's credentials but the default credential chain.

Getting the exception: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied in AWS EB

I am new to Flask and Python. I am trying to upload a file to my AWS S3 bucket. While this works fine locally, I get the following exception after deploying to Elastic Beanstalk.
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
app.py
import boto3
from flask import Flask, request

app = Flask(__name__)

@app.route('/snap/ingredient', methods=['POST'])
def findIngredient():
    s3 = boto3.resource('s3')
    response = s3.Bucket('<bucket-name>').put_object(Key="image.jpeg", Body=request.files['myFile'], ACL='public-read')
    print(response.key)
    return response
I am not sure if I am missing something. My Bucket access is public.
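A likely cause (not confirmed in the question) is that on Elastic Beanstalk the code runs under the EC2 instance profile role, and a bucket being "public" for reads does not give that role permission to write. A sketch of the policy statement the instance profile role would need, with a placeholder bucket name (s3:PutObjectAcl is included because the code sets ACL='public-read'):

```json
{
  "Effect": "Allow",
  "Action": ["s3:PutObject", "s3:PutObjectAcl"],
  "Resource": "arn:aws:s3:::your-bucket-name/*"
}
```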
