Pulling data from S3 using an address and access key from Python - python

I've been given a path to an S3 bucket and a key to access it. How can I access the bucket, and how can I do it from Python?
The name looks like this: solutions/accounts/services, and the key is some string.
I tried doing this:
import boto3

client = boto3.client('s3')
response = client.get_object(
    Bucket='solutions',
    Key='accounts/services'
)
print(response)
This yields:
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the GetObject operation: Access Denied
I was provided a key to this S3 bucket in the form OKIA4RBSCI236N869IJG.
Where does this key need to be inserted?

The Key should include all the folders. Example: accounts/services/file.txt.
The Bucket name should only be the bucket name. Example: solutions.
This will get you s3://solutions/accounts/services/file.txt.
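As for where the access key goes: it is passed when the client is created, not to get_object. A rough sketch, assuming the key you were given is an access key ID and that you also have its matching secret access key (the file name here is just a placeholder):
import boto3

# Credentials are supplied when the client is created.
client = boto3.client(
    's3',
    aws_access_key_id='OKIA4RBSCI236N869IJG',           # the key you were given
    aws_secret_access_key='<your secret access key>',   # its matching secret (assumed you have one)
)

response = client.get_object(
    Bucket='solutions',               # bucket name only
    Key='accounts/services/file.txt'  # full path within the bucket (placeholder file name)
)
print(response['Body'].read())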

Related

Cannot Access Subfolder of S3 bucket – Python, Boto3

I have been given access to a subfolder of an S3 bucket, and want to access all files inside using Python and boto3. I am new to S3 and have read the docs to death, but haven't been able to figure out how to successfully access just one subfolder. I understand that S3 does not use a Unix-like directory structure, but I don't have access to the root bucket.
How can I configure boto3 to just connect to this subfolder?
I have successfully used this AWS CLI command to download the entire subfolder to my machine:
aws s3 cp --recursive s3://s3-bucket-name/SUB_FOLDER/ /Local/Path/Where/Files/Download/To --profile my-profile
This code:
import boto3

AWS_BUCKET = 's3-bucket-name'
s3 = boto3.client("s3", region_name='us-east-1', aws_access_key_id=AWS_KEY_ID, aws_secret_access_key=AWS_SECRET)
response = s3.list_objects(Bucket=AWS_BUCKET)
Returns this error:
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
I have also tried specifying the 'prefix' option in the call to list_objects, but this produces the same error.
You want to run aws configure and save your credentials and region; after that, using boto3 is simple and easy.
Use boto3.resource and get the client like this:
s3_resource = boto3.resource('s3')
s3_client = s3_resource.meta.client
s3_client.list_objects(Bucket=AWS_BUCKET)
You should be good to go.
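If listing the whole bucket is still denied because your credentials only cover the subfolder, it is worth restricting the call with the Prefix parameter. A sketch, with the bucket and subfolder names taken from your CLI command (treat them as placeholders):
import boto3

AWS_BUCKET = 's3-bucket-name'
PREFIX = 'SUB_FOLDER/'  # the subfolder you were granted access to

s3_client = boto3.client('s3', region_name='us-east-1')

# Prefix limits the listing (and, with a prefix-scoped bucket policy, the permission check)
# to keys under the subfolder.
response = s3_client.list_objects_v2(Bucket=AWS_BUCKET, Prefix=PREFIX)
for obj in response.get('Contents', []):
    print(obj['Key'])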

How to get the URL of all the objects in our aws s3 bucket programmatically using python?

I am posting this here because I found it really hard to find the function to get all objects from our S3 bucket using Python. When I tried to find a get_object_data function, I was directed to the function for downloading an object.
So, how do we get the data of all the objects in our AWS S3 bucket using boto3 (the AWS SDK for Python)?
Import boto3 into your Python shell.
Make a connection to your AWS account and specify the resource (an S3 bucket here) you want to access (make sure that the IAM credentials you are using have access to that resource).
Get the data required.
The code looks something like this:
import boto3

s3_resource = boto3.resource(service_name='s3',
                             region_name='<your bucket region>',
                             aws_access_key_id='<your access key id>',
                             aws_secret_access_key='<your secret access key>')
a = s3_resource.Bucket('<your bucket name>')
for obj in a.objects.all():
    # object URL
    print("https://<your bucket name>.s3.<your bucket region>.amazonaws.com/" + obj.key)
    # if you want to print all the data of the object, just print obj

How to handle PutObject operation: Access Denied for Lambda

I am trying to use a Lambda function to write some text to a file in S3.
When it runs, I get this error:
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
The function, with the bucket and key hardcoded, is below:
import boto3

def lambda_handler(event, context):
    bucket = 'tessstinggbucccket'
    key = 'june/22/testlog1.txt'
    some_binary_data = b'Here we have some data'

    s3 = boto3.resource("s3")
    object = s3.Object(bucket, key)
    object.put(Body=some_binary_data)
You need to grant your Lambda function put/write permissions on the S3 bucket before it can write to it. With SAM you can do that as shown below:
# Add this property to your Lambda function definition inside the SAM template
Policies:
  - S3WritePolicy:
      BucketName: "YourBucketNameHere"

How to read the content of a file in boto3 from a bucket at specific key

I need to read the content of an audio file which is stored in AWS using boto3. To do that right now I am doing something like this:
client = boto3.client('s3')
client.download_file(obj.bucket, obj.key, "temp.mp3")
However, it does not download the file and gives me a ClientError like this:
An error occurred (404) when calling the HeadObject operation: Not Found
I am not sure what HeadObject is. Is there any alternate way to read the content of a file stored in a specific bucket at a specific key?
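For what it's worth, download_file first issues a HeadObject request to look up the object, so a 404 there usually means the bucket/key pair doesn't point at an existing object. One alternate way to read the content without writing a local file is get_object; a minimal sketch with placeholder names standing in for obj.bucket and obj.key:
import boto3

client = boto3.client('s3')

# Placeholders standing in for obj.bucket and obj.key from the question.
bucket = '<your bucket name>'
key = '<path/to/audio.mp3>'

# get_object returns the object in the response; read() pulls the whole body into memory.
response = client.get_object(Bucket=bucket, Key=key)
audio_bytes = response['Body'].read()
print(len(audio_bytes), "bytes")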

s3 connection to 'folder' via boto

I'm trying to upload a file to a specific location using boto and Python. I'm connecting with something to this effect:
from boto.s3.connection import S3Connection
from boto.s3.key import Key

conn = S3Connection(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
bucket = conn.get_bucket('the_bucket_name')
for key in bucket:
    print key.name
Here's the trick: I have been provisioned credentials to a 'folder' within a bucket. Per this - Amazon S3 boto - how to create a folder? - I understand that there actually aren't folders in S3, rather keys like "foo/bar/my_key.txt". When I try to execute get_bucket() I get
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden
because I don't actually have credentials to the base bucket, only to a subset of the bucket's keys: my_bucket/the_area_I_have_permission/*.
Does anyone know how I could pass the specific 'area' of the bucket I have access to in the connection step, or an alternative method I can use to access my_bucket/the_area_I_have_permission/*?
Does this help:
bucket = conn.get_bucket('the_bucket_name', validate=False)
key = bucket.get_key("ur_key_name")
if key is not None:
    print key.get_contents_as_string()

keys = bucket.list(prefix='the_area_I_have_permission')
for key in keys:
    print key.name
Found the problem: RequestTimeTooSkewed Error using PHP S3 Class.
The issue was that my VM's date was off, and Amazon uses the date to validate the request. +1 to @bdonlan.
