Trying to follow this tutorial and I keep getting "Access Denied" when running my Lambda. The Lambda is the default s3-python-get-object.
The role policy for the Lambda is:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::*"
            ]
        }
    ]
}
The user has admin privileges. I just don't get why it's going wrong.
From the docs:
If the object you request does not exist, the error Amazon S3 returns depends on whether you also have the s3:ListBucket permission.
If you have the s3:ListBucket permission on the bucket, Amazon S3 returns an HTTP status code 404 ("no such key") error.
If you don’t have the s3:ListBucket permission, Amazon S3 returns an HTTP status code 403 ("access denied") error.
The policy above looks right for the operation you are performing.
Make sure the key you are requesting actually exists, or add the s3:ListBucket permission so you can tell which kind of error you are getting.
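If you want to check from code which case you are hitting, here is a minimal sketch (the bucket and key names are placeholders, not from the question) that inspects the error code boto3 returns:

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')

try:
    s3.get_object(Bucket='my-bucket', Key='some/key.txt')  # placeholder names
except ClientError as e:
    code = e.response['Error']['Code']
    if code == 'NoSuchKey':
        # 404: the key is missing, and you do have s3:ListBucket
        print('Object does not exist')
    elif code == 'AccessDenied':
        # 403: either no permission, or the key is missing and you lack s3:ListBucket
        print('Access denied (or missing key without s3:ListBucket)')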
I'm trying to upload an image to AWS S3. This code previously worked fine (and still works for another project). This is a brand-new project with a new AWS S3 bucket. I noticed AWS has changed a lot of things again, and maybe that's the problem.
This is the code:
s3_client.upload_fileobj(
    uploaded_file,
    files_bucket_name,
    key_name,
    ExtraArgs={
        'ContentType': uploaded_file.content_type
    }
)
This is the permission policy for the bucket:
{
    "Version": "2012-10-17",
    "Id": "Policy1204",
    "Statement": [
        {
            "Sid": "Stmt15612",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::bucket-name/*"
        }
    ]
}
The upload did not work until I added "PutObject" here, but it was working for another project without it. What I don't like about this policy is that PutObject is now publicly available.
How can I make it so that:
all images are publicly available
but only the owner can upload files?
(Screenshots of the AWS permissions for this bucket omitted.)
The problem went away as soon as I created an IAM user and granted it full access to S3. I'm not sure if this solution is good or not, but at least it's working now.
It appears that your requirement is:
Allow everyone to see the files
Only allow an owner to upload them
There is a difference in "seeing the files": the ListBucket permission (used by the ListObjects API) allows listing the objects in a bucket, while GetObject allows downloading an object.
If you want to make all objects available for download assuming that the user knows the name of the object, then you could use a policy like this:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::bucket-name/*"
        }
    ]
}
Note that this policy will not permit viewing the contents of the bucket.
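If you apply it programmatically, here is a minimal sketch with boto3 (the bucket name is a placeholder, and Block Public Access must permit public bucket policies for this to take effect):

import json
import boto3

s3 = boto3.client('s3')

public_read_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::bucket-name/*"
        }
    ]
}

s3.put_bucket_policy(Bucket='bucket-name', Policy=json.dumps(public_read_policy))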
If you wish to allow a specific IAM User permission to upload files to the bucket, then put this policy on the IAM User (not on the Bucket):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::bucket-name/*"
        }
    ]
}
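If you prefer to attach it programmatically, here is a minimal sketch (the user name and policy name are assumptions) using put_user_policy:

import json
import boto3

iam = boto3.client('iam')

upload_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": "arn:aws:s3:::bucket-name/*"
        }
    ]
}

iam.put_user_policy(
    UserName='upload-user',      # hypothetical user name
    PolicyName='AllowS3Upload',  # hypothetical policy name
    PolicyDocument=json.dumps(upload_policy),
)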
Following is my Python code to add/update an inline policy for an AWS SSO permission set:
# The policy was originally a hand-escaped string; building it as a dict
# and serializing with json.dumps is less error-prone.
import json
import boto3

client = boto3.client('sso-admin')

inline_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:Get*",
                "s3:List*"
            ],
            "Effect": "Allow",
            "Resource": "*"
        }
    ]
}

response = client.put_inline_policy_to_permission_set(
    InstanceArn='arn:aws:sso:::instance/ssoins-sssss',
    PermissionSetArn='arn:aws:sso:::permissionSet/ssoins-sssss/ps-sssss',
    InlinePolicy=json.dumps(inline_policy))
I am getting the error:
"errorMessage": "An error occurred (AccessDeniedException) when calling the PutInlinePolicyToPermissionSet operation: User: arn:aws:sts::ddddddd:assumed-role/Modify_Permission_Set-role-ssss/Modify_Permission_Set is not authorized to perform: sso:PutInlinePolicyToPermissionSet on resource: arn:aws:sso:::permissionSet/ssoins-sssss/ps-sssss"
I tried adding the Admin policy to the Lambda role executing the function, and I still get permission denied.
Is there a different way to handle SSO permission sets than regular IAM permissions?
Admin policy attached to the Lambda role:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "*",
            "Resource": "*"
        }
    ]
}
Have you checked whether there is a Service Control Policy (SCP) denying access to SSO that applies to your account or Organizational Unit (OU)? https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_scps.html
If you have verified that the policy and permissions are correct, the problem is likely the region.
Make sure you create the sso-admin client in the region where your SSO / IAM Identity Center instance is activated, e.g. for Python:
sso = boto3.client('sso-admin', region_name='deployed_sso_region')
I am uploading a file to s3 using the following code:
s3.meta.client.upload_file(file_location, bucket_name, key, ExtraArgs={'ACL': 'public-read'})
When I use the public-read ACL, my code fails with the following error saying that I do not have permission to do this:
"errorMessage": "Failed to upload test.xlsx: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied"
"errorType": "S3UploadFailedError"
Below is an IAM policy attached to my user.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": "*"
        }
    ]
}
Amazon S3 Block Public Access prevents the application of any settings that allow public access to data within S3 buckets, and right now it is denying your ACL operation.
Turn off the "Block public access to buckets and objects granted through new access control lists (ACLs)" setting under Permissions >> Block public access.
I was provisioned some AWS keys. These keys give me access to certain directories in an S3 bucket. I want to use boto3 to interact with the directories that were exposed to me, but it seems that I can't actually do anything with the bucket at all, since I don't have access to the entire bucket.
This works for me from my terminal:
aws s3 ls s3://the_bucket/and/this/specific/path/
but if I do:
aws s3 ls s3://the_bucket/
I get:
An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
which also happens when I try to access the directory via boto3.
session = boto3.Session(profile_name=my_creds)
client = session.client('s3')
list_of_objects = client.list_objects(Bucket='the_bucket', Prefix='and/this/specific/path', Delimiter='/')
Do I need to request access to the entire bucket for boto3 to be usable?
You need to set this Bucket Policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "<SID>",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::<account>:user/<user_name>"
            },
            "Action": [
                "s3:GetBucketLocation",
                "s3:ListBucket"
            ],
            "Resource": "arn:aws:s3:::<bucket_name>"
        }
    ]
}
For more information, see Specifying Permissions in a Policy in the Amazon S3 documentation.
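Once a statement like that is in place, the prefix-scoped listing from the question should succeed; here is a minimal sketch reusing the question's names (list_objects_v2 is the newer variant of list_objects):

import boto3

session = boto3.Session(profile_name='my_creds')  # profile name from the question
client = session.client('s3')

resp = client.list_objects_v2(
    Bucket='the_bucket',
    Prefix='and/this/specific/path/',
    Delimiter='/',
)
for obj in resp.get('Contents', []):
    print(obj['Key'])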
I am using Python and Boto to upload images to S3. I can get it to work if I add a grantee of "Any Authenticated AWS User" and give this grantee permission to upload/delete. However, my impression from the documentation and several different posts on this site is that this would allow literally any authenticated AWS user, not just those authenticated to my account, to access the bucket, which I do not want. However, I am unable to upload files (403) if I only give upload/delete permission to the owner of the account, even though I authenticate like this:
# boto 2 (Python 2) code; imports added for completeness
import urllib
import cStringIO
from io import BytesIO

import boto
from PIL import Image

s3 = boto.connect_s3(aws_access_key_id=AWS_ACCESS_KEY_ID,
                     aws_secret_access_key=AWS_SECRET_ACCESS_KEY)
bucket = s3.get_bucket(BUCKET_NAME)  # BUCKET_NAME is a placeholder; the bucket lookup was elided in the original post

im = Image.open(BytesIO(urllib.urlopen(self.url).read()))
filename = self.url.split('/')[-1].split('.')[0]
extension = self.url.split('.')[-1]
out_im2 = cStringIO.StringIO()
im.save(out_im2, im.format)
key = bucket.new_key(filename + "." + extension)
key.set_contents_from_string(out_im2.getvalue(), headers={
    "Content-Type": extension_contenttype_mapping[extension],
})
key.set_acl('public-read')
self.file = bucket_url + filename + "." + extension
What am I doing wrong in this situation?
I found an answer, at least, if not the one I was looking for. I created a user specific to this bucket and added that user to a group with AmazonS3FullAccess permissions, which I also had to create. Then I modified my boto requests so that they use this user instead of the account owner, and I added this bucket policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::111111111111:root"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::media.example.com",
                "arn:aws:s3:::media.example.com/*"
            ]
        }
    ]
}
This worked for me, although I don't know whether the bucket policy was part of the solution, and I still don't know why it did not work when I was trying as the owner user. This is, however, the more proper and secure way to do things anyway.