IAM Policy not allowing me to upload to S3 - python

I am uploading a file to s3 using the following code:
s3.meta.client.upload_file(file_location, bucket_name, key, ExtraArgs={'ACL': 'public-read'})
When I set the ACL to public-read, the call fails with the following error saying I do not have permission to do this:
"errorMessage": "Failed to upload test.xlsx: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied"
"errorType": "S3UploadFailedError"
Below is an IAM policy attached to my user.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": "*"
        }
    ]
}

Amazon S3 Block Public Access prevents the application of any settings that allow public access to data within S3 buckets, which is why the ACL operation is being denied. To allow the public-read ACL, turn off the
"Block public access to buckets and objects granted through new access control lists (ACLs)" setting under Permissions >> Block public access.

Related

AWS CLI Can List S3 Bucket But Access Denied For Python Lambda

I've used Terraform to set up the infrastructure for an S3 bucket and my containerised Lambda. I want to trigger the Lambda to list the items in my S3 bucket. When I run the AWS CLI it's fine:
aws s3 ls
returns
2022-11-08 23:04:19 bucket-name
This is my lambda:
import logging

import boto3

LOGGER = logging.getLogger(__name__)
LOGGER.setLevel(logging.DEBUG)

s3 = boto3.resource('s3')

def lambda_handler(event, context):
    LOGGER.info('Executing function...')
    bucket = s3.Bucket('bucket-name')
    total_objects = 0
    for i in bucket.objects.all():
        total_objects = total_objects + 1
    return {'total_objects': total_objects}
When I run the test in the AWS console, I'm getting this:
[ERROR] ClientError: An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
No idea why this is happening. These are my Terraform Lambda policies, roles, and the S3 setup:
resource "aws_s3_bucket" "statements_bucket" {
bucket = "bucket-name"
acl = "private"
}
resource "aws_s3_object" "object" {
bucket = aws_s3_bucket.statements_bucket.id
key = "excel/"
}
resource "aws_iam_role" "lambda" {
name = "${local.prefix}-lambda-role"
path = "/"
assume_role_policy = <<EOF
{
"Version": "2012-10-17",
"Statement": [
{
"Action": "sts:AssumeRole",
"Principal": {
"Service": "lambda.amazonaws.com"
},
"Effect": "Allow"
}
]
}
EOF
}
resource "aws_iam_policy" "lambda" {
name = "${local.prefix}-lambda-policy"
description = "S3 specified access"
policy = <<EOF
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:ListBucket"
],
"Resource": [
"arn:aws:s3:::bucket-name"
]
},
{
"Effect": "Allow",
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
],
"Resource": [
"arn:aws:s3:::bucket-name/*"
]
}
]
}
EOF
}
As far as I can tell, your Lambda function has the correct IAM role (the one indicated in your Terraform template), but that IAM role has no attached policies.
You need to attach the S3 policy, and any other IAM policies needed, to the IAM role. With the resource names from your template, for example:
resource "aws_iam_role_policy_attachment" "lambda-attach" {
  role       = aws_iam_role.lambda.name
  policy_arn = aws_iam_policy.lambda.arn
}
In order to run aws s3 ls, you would need to authorize the action s3:ListAllMyBuckets, because aws s3 ls lists all of your buckets. You should, however, already be able to list the contents of your bucket using aws s3 ls s3://bucket-name.
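For reference, the identity-based statement for the plain aws s3 ls would look something like this (a sketch; s3:ListAllMyBuckets does not support resource-level permissions, so it has to be granted on "*"):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "*"
        }
    ]
}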

AWS S3 PutObject Access denied problem in Python

I'm trying to upload an image to AWS S3. This code previously worked fine (and is still working for another project). This is a brand-new project with a new AWS S3 bucket. I noticed AWS has again changed a lot, and maybe that's the problem.
This is the code:
s3_client.upload_fileobj(
    uploaded_file,
    files_bucket_name,
    key_name,
    ExtraArgs={
        'ContentType': uploaded_file.content_type
    }
)
This is the permission policy for the bucket:
{
    "Version": "2012-10-17",
    "Id": "Policy1204",
    "Statement": [
        {
            "Sid": "Stmt15612",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::bucket-name/*"
        }
    ]
}
The upload did not work until I added "PutObject" here, but it was working for another project without it. What I don't like about this policy is that PutObject is now publicly available.
How do I make it so that:
all images are publicly available,
but only the owner can upload files?
The problem went away as soon as I created an IAM user and granted it full access to S3. Not sure if this solution is good or not, but at least it's working now.
It appears that your requirement is:
Allow everyone to see the files
Only allow an owner to upload them
There is a difference between "seeing the files": ListObjects allows listing the objects in a bucket, while GetObject allows downloading an object.
If you want to make all objects available for download assuming that the user knows the name of the object, then you could use a policy like this:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::bucket-name/*"
        }
    ]
}
Note that this policy will not permit viewing the contents of the bucket.
If you wish to allow a specific IAM User permission to upload files to the bucket, then put this policy on the IAM User (not on the Bucket):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::bucket-name/*"
        }
    ]
}
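With that split, uploads are made with the IAM user's credentials instead of relying on a public bucket policy. A minimal sketch of the upload side, assuming the user's access keys live in a local profile named uploader (the profile name is just for illustration) and reusing the variables from the question:

import boto3

# 'uploader' is a hypothetical local profile holding the IAM user's keys.
session = boto3.Session(profile_name='uploader')
s3_client = session.client('s3')

s3_client.upload_fileobj(
    uploaded_file,
    files_bucket_name,
    key_name,
    ExtraArgs={'ContentType': uploaded_file.content_type},
)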

Getting AccessDeniedException from Lambda function when calling AWS SSO Permission set

Following is my Python code to add/update an inline policy for an AWS SSO permission set:
# The policy is passed as a JSON string (the original used a hand-escaped
# string literal; json.dumps builds the same string more safely)
import json

import boto3

client = boto3.client('sso-admin')

inline_policy = json.dumps({
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:Get*",
                "s3:List*"
            ],
            "Effect": "Allow",
            "Resource": "*"
        }
    ]
})

response = client.put_inline_policy_to_permission_set(
    InstanceArn='arn:aws:sso:::instance/ssoins-sssss',
    PermissionSetArn='arn:aws:sso:::permissionSet/ssoins-sssss/ps-sssss',
    InlinePolicy=inline_policy)
I am getting the error:
"errorMessage": "An error occurred (AccessDeniedException) when calling the PutInlinePolicyToPermissionSet operation: User: arn:aws:sts::ddddddd:assumed-role/Modify_Permission_Set-role-ssss/Modify_Permission_Set is not authorized to perform: sso:PutInlinePolicyToPermissionSet on resource: arn:aws:sso:::permissionSet/ssoins-sssss/ps-sssss"
I tried attaching the Admin policy to the Lambda role executing the function, and I still get permission denied.
Is there a different way to handle SSO permission sets than regular IAM permissions?
Admin Policy attached to Lambda
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "*",
            "Resource": "*"
        }
    ]
}
Have you checked whether there is a Service Control Policy (SCP) denying access to SSO that applies to your account or Organizational Unit (OU)? https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_scps.html
If you have ensured that the policy and permissions are correct, it is likely due to your region.
Make sure you are creating the sso-admin client in the region where your SSO or Identity Center instance is activated,
e.g. for Python: sso = boto3.client('sso-admin', region_name='deployed_sso_region')
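A quick way to sanity-check the region from Python; this is a sketch, with 'us-east-1' standing in for wherever your Identity Center instance actually lives:

import boto3

# Placeholder region: use the one where IAM Identity Center is enabled.
sso = boto3.client('sso-admin', region_name='us-east-1')

# In the right region this returns your instance ARN(s); in the wrong
# region the list comes back empty.
print(sso.list_instances()['Instances'])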

Boto3: aws credentials with limited permissions

I was provisioned some AWS keys. These keys give me access to certain directories in an S3 bucket. I want to use boto3 to interact with the directories that were exposed to me, but it seems that I can't actually do anything with the bucket at all, since I don't have access to the entire bucket.
This works for me from my terminal:
aws s3 ls s3://the_bucket/and/this/specific/path/
but if I do:
aws s3 ls s3://the_bucket/
I get:
An error occurred (AccessDenied) when calling the ListObjects
operation: Access Denied
which also happens when I try to access the directory via boto3.
session = boto3.Session(profile_name=my_creds)
client = session.client('s3')
list_of_objects = client.list_objects(Bucket='the_bucket', Prefix='and/this/specific/path', Delimiter='/')
Do I need to request access to the entire bucket for boto3 to be usable?
You need to set this Bucket Policy:
{
    "Sid": "<SID>",
    "Effect": "Allow",
    "Principal": {
        "AWS": "arn:aws:iam::<account>:user/<user_name>"
    },
    "Action": [
        "s3:GetBucketLocation",
        "s3:ListBucket"
    ],
    "Resource": "arn:aws:s3:::<bucket_name>"
}
For more information, see Specifying Permissions in a Policy in the Amazon S3 documentation.
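Separately, you do not need access to the entire bucket for boto3 to be usable; you just have to keep every list call inside the prefix your keys cover, exactly as you did on the CLI. A sketch reusing the names from your snippet (list_objects_v2 is the current-generation form of the same call):

import boto3

session = boto3.Session(profile_name=my_creds)
client = session.client('s3')

# Listing scoped to the permitted prefix; listing the bucket root is
# what gets denied, because the keys only cover this path.
response = client.list_objects_v2(
    Bucket='the_bucket',
    Prefix='and/this/specific/path/',
    Delimiter='/',
)
for obj in response.get('Contents', []):
    print(obj['Key'])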

AWS S3 Lambda Access Denied

Trying to follow this tutorial, and I keep getting "Access Denied" when running my Lambda. The Lambda is the default s3-python-get-object.
The role for the Lambda is:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::*"
            ]
        }
    ]
}
The user has admin privileges. I just don't get why it's going wrong.
From the docs:
If the object you request does not exist, the error Amazon S3 returns depends on whether you also have the s3:ListBucket permission.
If you have the s3:ListBucket permission on the bucket, Amazon S3 returns an HTTP status code 404 ("no such key") error.
If you don’t have the s3:ListBucket permission, Amazon S3 returns an HTTP status code 403 ("access denied") error.
The policy above looks right for the operations you are performing.
Make sure the key you are requesting actually exists, or add the s3:ListBucket permission so you can tell which kind of error you are getting.
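As a sketch of what that looks like from boto3 (bucket and key names here are placeholders): with s3:ListBucket granted, a missing object surfaces as NoSuchKey instead of a blanket AccessDenied:

import boto3
from botocore.exceptions import ClientError

s3_client = boto3.client('s3')

try:
    # Placeholder names, for illustration only.
    s3_client.get_object(Bucket='my-bucket', Key='missing-key')
except ClientError as err:
    code = err.response['Error']['Code']
    if code == 'NoSuchKey':
        # 404: you have s3:ListBucket and the object simply is not there.
        print('Object does not exist')
    elif code == 'AccessDenied':
        # 403: missing permissions, or the object does not exist and
        # the role lacks s3:ListBucket on the bucket.
        print('Access denied')
    else:
        raise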
