Boto3 AWS Lambda not triggering - Python

I am using code similar to the snippet below to trigger an AWS Lambda function from my AWS Educate account. When I run it, nothing is triggered on the Lambda side, even though the function works with the same payload through the test configuration. My session and permissions are also correct, as I can use boto3 to access S3 resources with the same credentials. What can I try in order to fix or troubleshoot this issue?
Apologies if this is vague (I know it is), but I am very confused about why this is happening.
import boto3
import json

AWS_ACCESS_KEY_ID = "XXXXXXXXXXXXXXXXXX"
AWS_SECRET_ACCESS_KEY = "XXXXXXXXXXXXXXXXXXXXXXXXXX"
REGION = 'us-east-1'
session = "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"

payload = json.dumps({"function": "tweets", "amount": 10, "time": 10})

client = boto3.client('lambda',
                      region_name=REGION,
                      aws_access_key_id=AWS_ACCESS_KEY_ID,
                      aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
                      aws_session_token=session)

response = client.invoke(
    FunctionName="MyFunctionARN",
    InvocationType="RequestResponse",
    Payload=payload
)

Every Lambda function has a handler function, which is the entry point for the code. By default it is lambda_handler; you can change the default handler under Runtime settings. The following code should solve your problem.
import boto3
import json

AWS_ACCESS_KEY_ID = "XXXXXXXXXXXXXXXXXX"
AWS_SECRET_ACCESS_KEY = "XXXXXXXXXXXXXXXXXXXXXXXXXX"
REGION = 'us-east-1'
session = "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"

def lambda_handler(event, context):
    payload = json.dumps({"function": "tweets", "amount": 10, "time": 10})
    client = boto3.client('lambda',
                          region_name=REGION,
                          aws_access_key_id=AWS_ACCESS_KEY_ID,
                          aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
                          aws_session_token=session)
    response = client.invoke(
        FunctionName="MyFunctionARN",
        InvocationType="RequestResponse",
        Payload=payload
    )
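If the invocation still appears to do nothing, it can help to confirm whether the call ever reached the function by inspecting the invoke response. This is a minimal troubleshooting sketch (not part of the original answer) that reuses the placeholder credentials and function ARN from the question and asks Lambda to return the execution log tail:

import base64
import json
import boto3

# Sketch: same placeholder credentials, region and ARN as in the question.
client = boto3.client('lambda',
                      region_name='us-east-1',
                      aws_access_key_id='XXXXXXXXXXXXXXXXXX',
                      aws_secret_access_key='XXXXXXXXXXXXXXXXXXXXXXXXXX',
                      aws_session_token='XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX')

response = client.invoke(
    FunctionName="MyFunctionARN",
    InvocationType="RequestResponse",
    LogType="Tail",  # return the last 4 KB of the execution log with the response
    Payload=json.dumps({"function": "tweets", "amount": 10, "time": 10})
)

print(response["StatusCode"])                             # 200 means Lambda accepted and ran the request
print(response.get("FunctionError"))                      # present if the handler raised an exception
print(base64.b64decode(response["LogResult"]).decode())   # tail of the CloudWatch log for this run
print(response["Payload"].read().decode())                # whatever the handler returned

If StatusCode is 200 and FunctionError is empty, the function did run and the problem is on the function side rather than in the boto3 call.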

Related

Managing AWS authentication in Python requests using boto3

I'm using the following method in order to auto-refresh AWS tokens using an IAM role and boto3 (based on these resources):
import boto3
from botocore.credentials import RefreshableCredentials
from botocore.session import get_session

def get_aws_credentials():
    aws_role_arn = 'AWS_ROLE_ARN'
    sts_client = boto3.client('sts')
    assumed_role_object = sts_client.assume_role(
        RoleArn=aws_role_arn,
        RoleSessionName="SessioName",
        DurationSeconds=900
    )
    return {
        'access_key': assumed_role_object['Credentials']['AccessKeyId'],
        'secret_key': assumed_role_object['Credentials']['SecretAccessKey'],
        'token': assumed_role_object['Credentials']['SessionToken'],
        'expiry_time': assumed_role_object['Credentials']['Expiration'].isoformat()
    }

session_credentials = RefreshableCredentials.create_from_metadata(
    metadata=get_aws_credentials(),
    refresh_using=get_aws_credentials,
    method='sts-assume-role'
)

session = get_session()
session._credentials = session_credentials
autorefresh_session = boto3.Session(botocore_session=session)
This is fine for managing AWS resources such as an S3 bucket, for which I'm using:
autorefresh_session.resource('s3')
and RefreshableCredentials takes care of refreshing the credentials.
However, I also need to make requests like this one:
import requests

r = requests.post(
    "https://myawsgatewayurl.com/endpoint",
    json=json_message,
    auth=aws_auth
)
Where I would be using:
from aws_requests_auth.aws_auth import AWSRequestsAuth

generated_credentials = get_aws_credentials()
aws_auth = AWSRequestsAuth(
    aws_access_key=generated_credentials['access_key'],
    aws_secret_access_key=generated_credentials['secret_key'],
    aws_token=generated_credentials['token'],
    aws_host=my_aws_host,
    aws_region=my_aws_region,
    aws_service=my_aws_service
)
However, this approach does not seem efficient, because I would have to call get_aws_credentials() for each request to be sure the credentials are not expired (or check the expiration myself), whereas I would like to rely on RefreshableCredentials for this.
Is there a better approach?
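One possibility (a sketch, not a verified answer) is to keep using the autorefresh_session built above and take a snapshot of its current credentials just before each request with get_frozen_credentials(); RefreshableCredentials then re-assumes the role only when the credentials are close to expiry, and the per-request call is cheap. The host, region, service and json_message names below are the placeholders from the question:

import requests
from aws_requests_auth.aws_auth import AWSRequestsAuth

def build_auth(boto3_session, host, region, service):
    # get_credentials() returns the RefreshableCredentials object attached to the
    # botocore session; get_frozen_credentials() returns a snapshot that botocore
    # refreshes automatically when it is about to expire.
    frozen = boto3_session.get_credentials().get_frozen_credentials()
    return AWSRequestsAuth(
        aws_access_key=frozen.access_key,
        aws_secret_access_key=frozen.secret_key,
        aws_token=frozen.token,
        aws_host=host,
        aws_region=region,
        aws_service=service,
    )

aws_auth = build_auth(autorefresh_session, my_aws_host, my_aws_region, my_aws_service)
r = requests.post(
    "https://myawsgatewayurl.com/endpoint",
    json=json_message,
    auth=aws_auth
)

This keeps a single source of truth for credential refresh while still building the signing object per request.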

AWS SES - Listing email identities with AWS Lambda

I'm sending emails to verified identities in AWS SES with AWS Lambda without problems.
Now I'm just trying to list verified identities and getting no response.
Here is my code:
import boto3
from botocore.exceptions import ClientError

def list_identities():
    ses = boto3.client('ses')
    response = ses.list_identities(
        IdentityType='EmailAddress',
        MaxItems=10
    )

def lambda_handler(event, context):
    # TODO implement
    print("Listing EMAILS:")
    list_identities()
In the function log I see Listing EMAILS: printed and nothing else.
The Lambda function is invoked in the same region as AWS SES.
You don't return anything from your function.
Try this:
import boto3

def list_identities():
    ses = boto3.client('ses')
    response = ses.list_identities(
        IdentityType='EmailAddress',
        MaxItems=10
    )
    return response

def lambda_handler(event, context):
    # TODO implement
    print("Listing EMAILS:")
    print(list_identities())
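If you also want the identities in the invocation result (not only in the CloudWatch log), a small follow-up sketch, assuming the same list_identities function as above, is to return them from the handler:

def lambda_handler(event, context):
    response = list_identities()
    # list_identities returns a dict; its 'Identities' key holds the verified
    # email addresses, and Lambda serializes the returned value as the result.
    return response['Identities']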

"error message" with lambda handler in aws lambda

Hi, I am learning how to use AWS Lambda functions and I don't understand how to use the lambda handler. When I run the following code, I get the expected output in the function logs.
import boto3

session = boto3.Session(
    aws_access_key_id='XXXXXXXXXXXXX',
    aws_secret_access_key='XXXXXXXXXXXXXXX')

# Then use the session to get the resource
s3 = session.resource('s3')
my_bucket = s3.Bucket('XXXXX')

for my_bucket_object in my_bucket.objects.all():
    print(my_bucket_object.key)
But when I add the lambda_handler it doesn't work:
import boto3

def lambda_handler(event, context):
    session = boto3.Session(
        aws_access_key_id='XXXXXXXXXXXXX',
        aws_secret_access_key='XXXXXXXXXXXX')

    # Then use the session to get the resource
    s3 = session.resource('s3')
    my_bucket = s3.Bucket('XXXXXX')

    for my_bucket_object in my_bucket.objects.all():
        print(my_bucket_object.key)
In this case I receive the following response:
{
    "errorMessage": "2022-05-10T14:50:10.023Z a840a005-9af0-4827-919a-7e2bd7eb0aae Task timed out after 3.02 seconds"
}
If anyone has knowledge of what I am doing wrong I would appreciate it.
Try this without any security settings, using an S3 bucket that you make public for testing purposes. You can then run/test it from your local machine to help with debugging before you deploy it.
import boto3

def list_file_in_bucket(event, context):
    bucket_name = event['bucket_name']
    s3 = boto3.resource('s3')
    my_bucket = s3.Bucket(bucket_name)
    for my_bucket_object in my_bucket.objects.all():
        print(my_bucket_object.key)
Tested with
def test_s3():
    test_bucket = 'public-test-bucket'
    event = {}
    event['bucket_name'] = test_bucket
    function.list_file_in_bucket(event, {})
    assert True == False
Obviously you want to change the assert.
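Separately, the "Task timed out after 3.02 seconds" message matches Lambda's default 3-second timeout, so the listing may simply need more time once it is deployed. A hedged sketch for raising the timeout with boto3 (the function name below is a placeholder):

import boto3

lambda_client = boto3.client('lambda')
# Give the function more time to list the bucket; 'list-bucket-function' is a placeholder name.
lambda_client.update_function_configuration(
    FunctionName='list-bucket-function',
    Timeout=30  # seconds
)

The same setting is available in the console under Configuration > General configuration.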

boto3 generate_presigned_url SignatureDoesNotMatch

I want to download files from S3 in a web application, so I create a URL using boto3 generate_presigned_url:
import boto3

s3Client = boto3.client(
    's3',
    region_name='eu-central-1',
    config=boto3.session.Config(signature_version='v4'),
    aws_access_key_id='xxxxxxxxxxxxxxx',
    aws_secret_access_key='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
)

url = s3Client.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'bucketname', 'Key': 'test.txt'},
    ExpiresIn=100)
but I always get this error message back:
<Error>
  <Code>SignatureDoesNotMatch</Code>
  <Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message>
Any ideas what to do?
signature_version='s3v4' works for me. I got the same error if I use signature_version='v4'.
import boto3
from botocore.client import Config

s3_client = boto3.client('s3', region_name=REGION,
                         config=Config(signature_version='s3v4'),
                         aws_access_key_id=AWS_ACCESS_KEY_ID,
                         aws_secret_access_key=AWS_SECRET_ACCESS_KEY)
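For completeness, a presigned URL from that client (bucket and key below are just the placeholders from the question) would then be generated the same way as before:

url = s3_client.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'bucketname', 'Key': 'test.txt'},
    ExpiresIn=100)
print(url)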

How to generate a URL from boto3 in Amazon Web Services

I have a bucket in S3 and I am trying to get the URL of an image stored there.
I am using boto3, which doesn't seem to have a generate-URL method implemented.
There is a core (botocore) method that generates a URL like this:
import botocore.session

session = botocore.session.get_session()
client = session.create_client('s3')
presigned_url = client.generate_presigned_url(
    'get_object', Params={'Bucket': self.bucket_name, 'Key': self.key})
One thing I am forced to do is send the parameters along with each request using a session object, and the above method does not allow me to set the session variables (i.e. AWS credentials).
The closest I can get is this:
from boto3.session import Session

session = Session(aws_access_key_id='342342342342',
                  aws_secret_access_key='3434234322',
                  region_name='us-east-1')
s3 = session.resource('s3')
object = s3.Object('my-dev-bucket', 'amazonKeyString')
print(object.get()["Body"])
This gets me an Amazon S3 object, which is a
botocore.response.StreamingBody object at 0x7ffaff8cef50
Can I get a URL of the image this way?
I was able to get results and did not face any issues in getting the signed URL.
I used the default session, since my AWS credentials are stored locally in the ~/.aws/credentials file and my default region is set as needed in ~/.aws/config.
import boto3

s3Client = boto3.client('s3')
s3Client.generate_presigned_url('get_object',
                                Params={'Bucket': 'www.mybucket.com', 'Key': 'hello.txt'},
                                ExpiresIn=100)
If you need to pass parameters for the Session, import boto3.session and create a custom session:
import boto3.session
session = boto3.session.Session(region_name='eu-central-1')
s3Client = session.client('s3')
If you don't want to use the aws configure command, you can pass the credentials directly like this and generate the presigned URL.
import boto3
from botocore.client import Config

def generate_public_url(bucket_name, file_name, aws_region, aws_key_id, aws_secret, timeout=300):
    # if not object_exists(bucket_name, file_name):
    #     raise Exception(f"0 or many items found in bucket '{bucket_name}' with key '{file_name}')")
    s3_client = boto3.client('s3', config=Config(signature_version='s3v4'),
                             region_name=aws_region, aws_access_key_id=aws_key_id,
                             aws_secret_access_key=aws_secret)
    url = s3_client.generate_presigned_url(
        ClientMethod='get_object',
        Params={
            'Bucket': bucket_name,
            'Key': file_name
        },
        ExpiresIn=timeout  # seconds
    )
    return url
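For example (all argument values below are placeholders), the helper can then be called like this:

url = generate_public_url(
    bucket_name='my-dev-bucket',
    file_name='images/hello.png',
    aws_region='eu-central-1',
    aws_key_id='XXXXXXXXXXXXX',
    aws_secret='XXXXXXXXXXXXXXXXXXXXXXXX'
)
print(url)  # presigned GET URL, valid for 300 seconds by default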
