Amazon S3 - connecting with Python Boto under specific permissions - python

I am trying to connect to Amazon S3 via Boto 2.38.0 and Python 3.4.3.
The S3 account is owned by another company, and they grant only these permissions:
"Statement":
[
    {
        "Effect": "Allow",
        "Action": "s3:ListBucket",
        "Resource": "arn:aws:s3:::GA-Exports",
        "Condition": {
            "StringLike": {
                "s3:prefix": "Events_3112/*"
            }
        }
    },
    {
        "Effect": "Allow",
        "Action": [
            "s3:GetObject",
            "s3:GetObjectAcl",
            "s3:GetBucketAcl"
        ],
        "Resource": "arn:aws:s3:::GA-Exports/Events_3112/*",
        "Condition": {}
    }
]
I can connect and retrieve a specific file if I set its name. But I need to retrieve the full listing from S3 (for example, to determine through a script which files I have not yet downloaded).
from boto.s3.connection import S3Connection
from boto.s3.connection import OrdinaryCallingFormat
s3_connection = S3Connection(access_key, secret_key,calling_format=OrdinaryCallingFormat())
bucket = s3_connection.get_bucket(__bucket_name, validate=False)
key = bucket.get_key(file_name)
works, but
all_buckets = s3_connection.get_all_buckets()
raises an error:
S3ResponseError: S3ResponseError: 403 Forbidden
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>19D20ADCFFC899ED</RequestId><HostId>eI4CzQqAvOnjcXJNZyyk+drFHjO9+yj0EtP+vJ5f/D7D4Dh2HFL3UvCacy9nP/wT</HostId></Error>
With the S3 Browser software, I can right click > "export file list" and get what I need. But how can I do this in Python?
EDIT:
Finally found the answer:
bucket_name = 'GA-Exports'
s3_connection = S3Connection(access_key, secret_key, calling_format=OrdinaryCallingFormat())
bucket = s3_connection.get_bucket(bucket_name, validate=False)
for key in bucket.list(prefix='Events_3112/DEV/'):
    print(key.name, key.size, key.last_modified)
Thanks for your help! :)
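As a follow-up to the original goal (finding files not yet downloaded), the listing can be compared against a local download folder with plain Python. A minimal sketch, where the function name and inputs are illustrative:

```python
import os

def missing_keys(remote_keys, downloaded_names):
    """Return the remote keys whose basename has not been downloaded yet.

    remote_keys: key names from bucket.list(prefix=...), e.g. key.name
    downloaded_names: file names already on disk, e.g. os.listdir(folder)
    """
    local = set(downloaded_names)
    return [k for k in remote_keys if os.path.basename(k) not in local]
```

Feeding it `key.name` values from `bucket.list()` and the contents of the local download directory gives the list of keys still to fetch.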

You won't be allowed to get all buckets; the permissions say you may list bucket contents only for "GA-Exports":
from boto.s3.connection import S3Connection
from boto.s3.connection import OrdinaryCallingFormat
# avoid a 301 Moved Permanently error when the bucket name contains dots
# and OrdinaryCallingFormat is needed
if '.' in __bucket_name:
    conn = S3Connection(access_key, secret_key, calling_format=OrdinaryCallingFormat())
else:
    conn = S3Connection(access_key, secret_key)
bucket = conn.get_bucket(__bucket_name, validate=False)
keys = bucket.list(prefix='Events_3112/')  # an iterable of the objects under that prefix
# another option is to use bucket.get_all_keys()
for key in keys:
    print(key.name)  # or whatever you want to do with each file name
    # recall this is only the key name, not the file contents per se :-D
See the complete bucket object reference at http://boto.readthedocs.org/en/latest/ref/s3.html#module-boto.s3.bucket
Edit: added a fix for the 301 Moved Permanently error received when accessing S3 via OrdinaryCallingFormat. Also added #garnaat's comment on prefix (thx!)

Related

cross-account file upload in S3 bucket using boto3 and python

I have an S3 bucket with a given access_key and secret_access_key. I use the following code to upload files into my S3 bucket successfully.
import boto3
import os
client = boto3.client('s3',
                      aws_access_key_id=access_key,
                      aws_secret_access_key=secret_access_key)
upload_file_bucket = 'my-bucket'
upload_file_key = 'my_folder/' + str(my_file)
client.upload_file(file, upload_file_bucket, upload_file_key)
Now, I want to upload my_file into another bucket that is owned by a new team. Therefore, I do not have access to access_key and secret_access_key. What is the best practice to do cross-account file upload using boto3 and Python?
You can actually use the same code, but the owner of the other AWS Account would need to add a Bucket Policy to the destination bucket that permits access from your IAM User. It would look something like this:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:PutObject",
                "s3:PutObjectAcl"
            ],
            "Effect": "Allow",
            "Resource": "arn:aws:s3:::their-bucket/*",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::YOUR-ACCOUNT-ID:user/username"
                ]
            }
        }
    ]
}
When uploading objects to a bucket owned by another AWS account, I recommend adding ACL='bucket-owner-full-control', like this:
client.upload_file(file, upload_file_bucket, upload_file_key, ExtraArgs={'ACL':'bucket-owner-full-control'})
This grants ownership of the object to the bucket owner, rather than leaving it with the account that performed the upload.

Downloading Files on a Public S3 Bucket Without Authentication Using Python

I'm trying to download a file from an S3 bucket that is public and requires no authentication (meaning there is no need to hard-code access and secret keys, nor store them in the AWS CLI), yet I still cannot access it via boto3.
Python code
import boto3
import botocore
from botocore import UNSIGNED
from botocore.config import Config
BUCKET_NAME = 'converted-parquet-bucket'
PATH = 'json-to-parquet/names.snappy.parquet'
s3 = boto3.client('s3', config=Config(signature_version=UNSIGNED))
try:
    s3.Bucket(BUCKET_NAME).download_file(PATH, 'names.snappy.parquet')
except botocore.exceptions.ClientError as e:
    if e.response['Error']['Code'] == "404":
        print("The object does not exist.")
    else:
        raise
I get this error when I execute the code:
AttributeError: 'S3' object has no attribute 'Bucket'
If it helps, here is my bucket public policy
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::converted-parquet-bucket/*"
        }
    ]
}
If your suggestion is to store keys, please don't; that is not what I'm trying to do.
Try the resource interface, s3 = boto3.resource('s3'), instead of s3 = boto3.client('s3'); the low-level client has no Bucket attribute.

Copy from S3 bucket in one account to S3 bucket in another account using Boto3 in AWS Lambda

I have created an S3 bucket and a file under my AWS account. My account has a trust relationship established with another account, and I am able to put objects into a bucket in that account using Boto3. How can I copy objects from a bucket in my account to a bucket in the other account using Boto3?
I see "access denied" when I use the code below:
source_session = boto3.Session(region_name='us-east-1')
source_conn = source_session.resource('s3')
src_conn = source_session.client('s3')
dest_session = __aws_session(role_arn=assumed_role_arn, session_name='dest_session')
dest_conn = dest_session.client('s3')
copy_source = {'Bucket': bucket_name, 'Key': key_value}
dest_conn.copy(copy_source, dest_bucket_name, dest_key,
               ExtraArgs={'ServerSideEncryption': 'AES256'},
               SourceClient=src_conn)
In my case, src_conn has access to the source bucket and dest_conn to the destination bucket.
I believe the only way to achieve this is by downloading and uploading the files.
AWS Session
client = boto3.client('sts')
response = client.assume_role(RoleArn=role_arn, RoleSessionName=session_name)
session = boto3.Session(
    aws_access_key_id=response['Credentials']['AccessKeyId'],
    aws_secret_access_key=response['Credentials']['SecretAccessKey'],
    aws_session_token=response['Credentials']['SessionToken'])
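The download-then-upload approach can be sketched as a small helper that takes one S3 client per account (the function name is illustrative; the clients would come from sessions built with each account's credentials, such as the assume_role session above):

```python
import os
import tempfile

def copy_via_local(src_client, dst_client, src_bucket, key, dst_bucket):
    """Copy one object between accounts by downloading it locally,
    then re-uploading it with the destination account's credentials."""
    tmp = os.path.join(tempfile.mkdtemp(), os.path.basename(key))
    src_client.download_file(src_bucket, key, tmp)
    try:
        dst_client.upload_file(tmp, dst_bucket, key,
                               ExtraArgs={'ServerSideEncryption': 'AES256'})
    finally:
        os.remove(tmp)  # clean up the local copy either way
```

This avoids the cross-account permission problem of `copy()` entirely, at the cost of moving the bytes through the machine running the code.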
Another approach is to attach a policy to the destination bucket permitting access from the account hosting the source bucket, e.g. something like the following should work (although you may want to tighten up the permissions as appropriate):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::<source account ID>:root"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::dst_bucket",
                "arn:aws:s3:::dst_bucket/*"
            ]
        }
    ]
}
Then your Lambda hosted in your source AWS account should have no problems writing to the bucket(s) in the destination AWS account.

S3 Boto 403 Forbidden Unless Access Given to "Any Authenticated AWS User"

I am using Python and Boto to upload images to S3. I can get it to work if I add a grantee of "Any Authenticated AWS User" and give this grantee permission to upload/delete. However, my impression from the documentation and several different posts on this site is that this would allow literally any authenticated AWS user, not just those authenticated to my account, to access the bucket, which I do not want. However, I am unable to upload files (403) if I only give upload/delete permission to the owner of the account, even though I authenticate like this:
s3 = boto.connect_s3(aws_access_key_id=AWS_ACCESS_KEY_ID, aws_secret_access_key=AWS_SECRET_ACCESS_KEY)
im = Image.open(BytesIO(urllib.urlopen(self.url).read()))
filename = self.url.split('/')[-1].split('.')[0]
extension = self.url.split('.')[-1]
out_im2 = cStringIO.StringIO()
im.save(out_im2, im.format)
key = bucket.new_key(filename + "." + extension)
key.set_contents_from_string(out_im2.getvalue(), headers={
    "Content-Type": extension_contenttype_mapping[extension],
})
key.set_acl('public-read')
self.file = bucket_url + filename + "." + extension
What am I doing wrong in this situation?
I found an answer at least, if not the one that I was looking for. I created a user specific to this bucket and added that user to a group with AmazonS3FullAccess permissions, which I also had to create. Then I modified my boto requests so that they use this user instead of the owner of the account, and I added this bucket policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::111111111111:root"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::media.example.com",
                "arn:aws:s3:::media.example.com/*"
            ]
        }
    ]
}
This worked for me, although I don't know whether the bucket policy was part of the solution, and I still don't know why it did not work when I was trying as the owner user. It is, however, the more proper and secure way to do things anyway.

AWS S3 policies confusions

I would like to give read (download) right to a single user.
I am confused about what I should use:
Should I use
The Bucket Policy Editor from the S3 interface
The inline policies for the user and specify read permissions (from IAM interface)
Activate "Any Authenticated AWS User" has the right to read (from the S3 interface), and then use inline permissions for more granularity?
I used the inline policies and it doesn't work:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowUserToReadObject",
            "Action": [
                "s3:GetObject",
                "s3:GetObjectVersion",
                "s3:GetObjectTorrent"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::staging/*",
                "arn:aws:s3:::prod/*"
            ]
        }
    ]
}
When I use Boto:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from boto.s3.connection import S3Connection
from boto.s3.key import Key
import sys, os

AWS_KEY = ''
AWS_SECRET = ''

conn = S3Connection(AWS_KEY, AWS_SECRET)
bucket = conn.get_bucket('staging')
for key in bucket.list():
    print key.name.encode('utf-8')
I got the following error:
Traceback (most recent call last):
  File "listing_bucket_files.py", line 20, in <module>
    bucket = conn.get_bucket('staging')
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 503, in get_bucket
    return self.head_bucket(bucket_name, headers=headers)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 536, in head_bucket
    raise err
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden
You didn't assign the "s3:ListBucket" permission, so the account is denied access to the staging and prod buckets themselves and cannot list the files/folders inside them.
Remember that you have to separate the statements as below, and don't add /* after the bucket name.
"Statement": [
    {
        "Effect": "Allow",
        "Action": [
            "s3:ListBucket"
        ],
        "Resource": [
            "arn:aws:s3:::staging",
            "arn:aws:s3:::prod"
        ]
    },
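Putting it together, a complete inline policy needs both the bucket-level and the object-level statements; something like the following should work (bucket names taken from the question, object actions trimmed to what the asker used):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::staging",
                "arn:aws:s3:::prod"
            ]
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:GetObjectVersion"],
            "Resource": [
                "arn:aws:s3:::staging/*",
                "arn:aws:s3:::prod/*"
            ]
        }
    ]
}
```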
