boto3 S3: get_object error handling - python

What is the best way to do error handling when getting an object from S3 using Python boto3?
My approach:
from botocore.exceptions import ClientError
import boto3
s3_client = boto3.client('s3')
try:
    s3_object = s3_client.get_object(Bucket="MY_BUCKET", Key="MY_KEY")
except ClientError as e:
    error_code = e.response["Error"]["Code"]
    # do error code checks here
I am not sure if ClientError is the best exception to use here. I know there is a Boto3Error class, but I do not think you can do error code checks with it the way you can with ClientError.

I think your approach is sufficient. If you can narrow your errors to a few, you can break it down into if blocks, and handle accordingly.
except ClientError as e:
    error_code = e.response["Error"]["Code"]
    if error_code == "AccessDenied":
        pass  # do code
    elif error_code == "InvalidLocationConstraint":
        pass  # do more code
This is just an experimental approach. Because most error responses are API-driven, I don't think you'll find them anywhere directly in the code (i.e. doing except AccessDenied:). You can find all error responses for Amazon S3 in the S3 API error responses documentation.


how to import boto3 athena client exceptions

I am working with athena from within my python code, using boto3, as follows:
def query_athena(query, output_path):
    client = boto3.client('athena')
    client.start_query_execution(
        ResultConfiguration={'OutputLocation': output_path},
        QueryString=query
    )
As stated in the docs, start_query_execution may raise InternalServerException, InvalidRequestException or TooManyRequestsException. I'd like to treat this as follows:
def query_athena(query, output_path):
    client = boto3.client('athena')
    try:
        client.start_query_execution(
            ResultConfiguration={'OutputLocation': output_path},
            QueryString=query
        )
    except <AthenaException> as e:
        deal with e
where <AthenaException> being one of the three exceptions I mentioned or, better yet, their superclass.
My question is how do I import these exceptions? The docs show them as Athena.Client.exceptions.InternalServerException, but I can't seem to find this Athena.Client in any boto3 module.
I ran into the same confusion, but figured it out. The exceptions listed in the docs aren't internal to boto3, but rather contained in the response when boto3 throws a client error.
My first shot at a solution looks like this. It assumes you've handled s3 output location, a boto3 session, etc already:
import boto3
from botocore.exceptions import ClientError
try:
    client = session.client('athena')
    response = client.start_query_execution(
        QueryString=q,
        QueryExecutionContext={
            'Database': database
        },
        ResultConfiguration={
            'OutputLocation': s3_output,
        }
    )
    filename = response['QueryExecutionId']
    print('Execution ID: ' + response['QueryExecutionId'])
except ClientError as e:
    response = e.response
    code = response['Error']['Code']
    message = response['Error']['Message']
    if code == 'InvalidRequestException':
        print(f'Error in query, {code}:\n{message}')
        raise e
    elif code == 'InternalServerException':
        print(f'AWS {code}:\n{message}')
        raise e
    elif code == 'TooManyRequestsException':
        # Handle a wait, retry, etc here
        pass

How to handle try and exception in python for bucket list empty

I am trying to handle a None or empty bucket list while using boto3. I want to know if there is a good, Pythonic way to write this.
buckets_list = None
try:
    my_region = os.environ['AWS_REGION']
    if my_region == 'us-east-1':
        try:
            s3 = boto3.client('s3')
            buckets_list = s3.list_buckets()
        except Exception as err:
            logging.error('Exception was thrown in connection %s' % err)
            print("Error in connecting and listing bucket {}".format(err))
        if buckets_list['Buckets']:
            # Search for all buckets.
            for bucket in buckets_list['Buckets']:
                # my code follow to get other things...
        else:
            print("Buckets are empty in this region")
    else:
        print("Region not available")
        raise Exception("Exception was thrown in Region")
except Exception as err:
    logging.error('Exception was thrown %s' % err)
    print("Error is {}".format(err))
    raise err
Is this the right way, or are there any suggestions to improve it?
You can use the else block of the try/except suite. It is useful for code that must be executed only if the try clause does not raise an exception.
try:
    s3 = boto3.client('s3')
    buckets_list = s3.list_buckets()
except Exception as err:
    logging.error('Exception was thrown in connection %s' % err)
    print("Error is {}".format(err))
else:
    # This means the try block succeeded, hence the `buckets_list` variable is set.
    for bucket in buckets_list['Buckets']:
        pass  # Do something with the bucket
One issue I am seeing in your code: if the list_buckets call fails, the line if buckets_list['Buckets']: will raise, because buckets_list is still None at that point (or undefined, if it was never initialized). To understand the undefined-name case, try running the following snippet :)
try:
    a = (1/0)
except Exception as e:
    print(e)
print(a)
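The try/except/else flow itself can be seen with nothing but the standard library: the else block runs only when the try suite raised nothing, so names bound in try are guaranteed to exist there. A small illustrative sketch (the function name is made up):

```python
def first_line(path):
    try:
        f = open(path)
    except OSError:
        return None
    else:
        # Only reached when open() succeeded, so f is certainly bound here.
        with f:
            return f.readline()

print(first_line("definitely-missing-file.txt"))  # None
```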
UPDATE
This is how I would implement this:
Use the .get method to avoid a KeyError
Use the else block of the try/except suite for code that must be executed only if the try clause does not raise an exception.
my_region = os.environ.get('AWS_REGION')
if my_region == 'us-east-1':
    try:
        s3 = boto3.client('s3')
        buckets_list = s3.list_buckets()
    except Exception as err:
        logging.error('Exception was thrown in connection %s' % err)
        print("Error in connecting and listing bucket {}".format(err))
    else:
        buckets_list = buckets_list['Buckets']
        if not buckets_list:
            print("Buckets are empty in this region")
        else:
            # Search for all buckets.
            for bucket in buckets_list:
                pass
                # Do something with the bucket
else:
    print("Region not available")
Your code is not detailed enough; at least the beginning of the function should have been shared. Anyway, did you consider:
try:
    s3 = boto3.client('s3')
    buckets_list = s3.list_buckets()
    print("buckets_list", buckets_list['Buckets'])
    if buckets_list['Buckets']:  # Here I am trying to check if buckets are empty
        # Search for all buckets.
        for bucket in buckets_list['Buckets']:
            (your other codes)
except Exception as err:
    logging.error('Exception was thrown in connection %s' % err)

DynamoDB Python API: Way to check result of conditional expression?

Using DynamoDB, I know I can specify a conditional expression to control updates, for example, using attribute_not_exists() to prevent overwriting an existing item. However, is there any way to check the result of this? I.e. if there was indeed an existing item and the create failed, I'd like to know that so I can return an error code in my HTTP response. However, looking at the documentation in Boto3, by default put_item returns nothing, so I'm unsure of how I'd be able to monitor the success of the operation. Anyone found a way to do this?
To provide a code example:
import boto3
from botocore.exceptions import ClientError

dynamodb = boto3.client('dynamodb')
try:
    response = dynamodb.put_item(
        TableName='my-table',
        Item={
            "MyPartitionKey": {
                "S": 'some-unique-value'
            }
        },
        ConditionExpression="attribute_not_exists(MyPartitionKey)",
    )
    print('this worked!')
except ClientError as e:
    if e.response['Error']['Code'] == 'ConditionalCheckFailedException':
        print('This was not a unique key')
    else:
        print('some other error')
Found it: ConditionalCheckFailedException is thrown. Disappointingly, the docs don't mention this; other kinds of exceptions are detailed in the boto3 docs, but not this one!

How do I use python to download an S3 file from a link with a signature and expiration?

I have a s3 link provided to me by a third-party with the following structure: http://s3.amazonaws.com/bucket_name_possibly/path/to/file_possibly/filename?AWSAccessKeyId=SomeKey&Expires=888888&Signature=SomeCharactersPossiblyHTMLencoded
Clicking on the link downloads the file for me. However, in python when I try to use urllib.request.urlretrieve(link_string) on the link I get the error HTTP Error 403: Forbidden
I have also tried using boto3 and manually parsing out the bucket_name, key, AWSAccessKeyID as well as the signature(treating it as the AWSSecretAccessKey - I know that this is probably wrong). I setup a client with the credentials and try to run a get_object method. Something similar to below:
client = boto3.client(
    's3',
    aws_access_key_id='AWSACCESSKEY',
    aws_secret_access_key='SomeCharactersPossiblyHTMLencoded',
    config=Config(signature_version='s3v4')  # tried with/without this option
)
client.get_object(
    Bucket='bucket_name_possibly',
    Key='path/to/file_possibly/filename'
)
The resulting error is An error occurred (SignatureDoesNotMatch) when calling the GetObject operation: The request signature we calculated does not match the signature you provided. Check your key and signing method.
I am stuck, how can I get python to programmatically download the link?
You can use boto3 to download the file as follows.
import boto3
import botocore

BUCKET_NAME = 'my-bucket'  # replace with your bucket name
KEY = 'my_image_in_s3.jpg'  # replace with your object key

s3 = boto3.resource('s3')
try:
    s3.Bucket(BUCKET_NAME).download_file(KEY, 'my_local_image.jpg')
except botocore.exceptions.ClientError as e:
    if e.response['Error']['Code'] == "404":
        print("The object does not exist.")
    else:
        raise
For more info, you can refer to the boto3 S3 documentation.
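Note that the link in the question is a presigned URL: the Signature parameter is not a secret access key, and the whole URL must be sent exactly as issued. A common cause of the 403/SignatureDoesNotMatch errors described is that the link was copied out of HTML, so the & separators arrive as &amp;. A standard-library-only sketch of undoing that (the URL below is a fake placeholder):

```python
import html
from urllib.parse import urlsplit, parse_qs

link = ("http://s3.amazonaws.com/bucket_name_possibly/path/to/file_possibly/filename"
        "?AWSAccessKeyId=SomeKey&amp;Expires=888888&amp;Signature=SomeCharacters")

url = html.unescape(link)  # turns &amp; back into &
params = parse_qs(urlsplit(url).query)
print(sorted(params))  # ['AWSAccessKeyId', 'Expires', 'Signature']

# With the cleaned URL, a plain download should work while the link is unexpired:
# urllib.request.urlretrieve(url, "filename")
```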

check ftplib response code

I have a Python application that's accessing an FTP server. There are several error cases I'd like to catch, in a fashion similar to urllib2:
try:
    urllib2.urlopen("http://google.com")
except urllib2.HTTPError as e:
    if e.code == 304:
        pass  # do 304 stuff
    elif e.code == 404:
        pass  # do 404 stuff
    else:
        pass
Does a construct like this exist for ftplib.error_perm? I know it can return a code of 500-599 according to the docs, but I don't see anything in the docs about how to access that value. Did I miss something?
You can access the error response string using <exception_obj>.args[0]. It contains strings like '550 /no-such-dir: No such file or directory'.
To get the error code (only the leading three characters), use <exception_obj>.args[0][:3].
For example:
import ftplib

ftp = ftplib.FTP('ftp.hq.nasa.gov')
ftp.login('anonymous', 'user@example.com')
try:
    ftp.cwd('/no-such-dir')
except ftplib.error_perm as e:
    print('Error {}'.format(e.args[0][:3]))
finally:
    ftp.quit()
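If you want an int rather than a string slice, the reply code can be parsed out of the exception's first argument; all ftplib errors carry the server reply there. A small helper sketch (the function name is made up), exercised offline with a hand-built exception:

```python
import ftplib

def ftp_error_code(exc):
    """Return the 3-digit FTP reply code from an ftplib error, or None."""
    msg = str(exc.args[0]) if exc.args else ""
    head = msg[:3]
    return int(head) if head.isdigit() else None

# ftplib exceptions are plain Exception subclasses, so they can be
# constructed directly for testing without a live FTP server.
e = ftplib.error_perm("550 /no-such-dir: No such file or directory")
print(ftp_error_code(e))  # 550
```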
