SQS to AWS Lambda Function with AWS Chalice and BOTO3 - python

I am using AWS SQS to store information coming in from an external server, and I then send it to a Lambda function that processes it and dequeues the information.
The information I am sending in is JSON and is used as a Python dictionary.
import json

def lambda_handler(event, context):
    for record in event['Records']:
        messageHandler(record)
    return {
        'statusCode': 200,
        'body': json.dumps('Batch Processed')
    }
Assuming that the code for the messageHandler is working and properly implemented, how do I catch the messages from the queue in their batches? This is all being deployed by AWS Chalice without the use of the CLI.
I am well out of my depth right now and have no idea why this is not working when I deploy it, but it works when I trigger a normal Lambda function in the AWS Console through the SQS Send/Receive Message feature. As far as I know the triggers are set up correctly and they should have no issues.
If you have any questions please let me know.

The event that you are processing will look something like this:
{
    "Records": [
        {
            "messageId": "11d6ee51-4cc7-4302-9e22-7cd8afdaadf5",
            "receiptHandle": "AQEBBX8nesZEXmkhsmZeyIE8iQAMig7qw...",
            "body": "Test message.",
            "attributes": {
                "ApproximateReceiveCount": "1",
                "SentTimestamp": "1573251510774",
                "SequenceNumber": "18849496460467696128",
                "MessageGroupId": "1",
                "SenderId": "AIDAIO23YVJENQZJOL4VO",
                "MessageDeduplicationId": "1",
                "ApproximateFirstReceiveTimestamp": "1573251510774"
            },
            "messageAttributes": {},
            "md5OfBody": "e4e68fb7bd0e697a0ae8f1bb342846b3",
            "eventSource": "aws:sqs",
            "eventSourceARN": "arn:aws:sqs:us-east-2:123456789012:fifo.fifo",
            "awsRegion": "us-east-2"
        }
    ]
}
where the "body" is your json encoded message. You'll want your message handler function to do something like this:
def message_handler(event):
    message = json.loads(event["body"])
    # do stuff...
The return value from the Lambda is pretty much ignored when the function is used as the event target from SQS.
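Since the deployment is done with Chalice, it's worth noting that Chalice ships a decorator that creates the SQS event source mapping for you at deploy time, so you don't have to wire the trigger by hand. A minimal sketch, assuming your queue is named my-queue (substitute your own name):
import json
from chalice import Chalice

app = Chalice(app_name='sqs-consumer')

# Chalice sets up the SQS trigger on `chalice deploy`;
# batch_size controls how many messages arrive per invocation.
@app.on_sqs_message(queue='my-queue', batch_size=10)
def handle_sqs_message(event):
    for record in event:  # each record wraps one queue message
        message = json.loads(record.body)
        # do stuff with the decoded dict...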

How to retrieve a URL query string parameter from inside an AWS Lambda Python function?

How do you access URL querystring parameters from inside an AWS Lambda function served though an API Gateway?
I have both the API Gateway + Lambda function set up so I can call it from a public URL. My Python function is simply:
def lambda_handler(event, context):
    print('event:', event)
    print('context:', context)
I've configured the API's GET "Method Request" handler to pass through the "abc" querystring parameter.
I've also configured the API's GET "Integration Request" handler to map "abc" from "method.request.querystring.abc".
However, when I access my URL, e.g. https://myapp.execute-api.us-east-1.amazonaws.com/prod/myfunc?abc=123, the only thing logged is:
event: {}
context: <bootstrap.LambdaContext object at 0x7fc7a6cb0850>
What am I doing wrong? Why isn't "abc" being passed through in the event dictionary?
Check "Use Lambda Proxy integration" in the Integration Request to have API Gateway pass all request details through in the event.
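With proxy integration enabled, the query string arrives under the queryStringParameters key. A minimal sketch of reading it (note the key is None, not an empty dict, when no query string is sent):
import json

def lambda_handler(event, context):
    # with Lambda proxy integration, query parameters arrive here
    params = event.get('queryStringParameters') or {}
    abc = params.get('abc')
    # proxy integration also expects this response shape back
    return {
        'statusCode': 200,
        'body': json.dumps({'abc': abc})
    }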
I had a similar problem and I know how frustrating it is. Use this mapping template:
{
    "method": "$context.httpMethod",
    "body": $input.json('$'),
    "headers": {
        #foreach($param in $input.params().header.keySet())
        "$param": "$util.escapeJavaScript($input.params().header.get($param))" #if($foreach.hasNext),#end
        #end
    },
    "queryStringParameters": {
        #foreach($param in $input.params().querystring.keySet())
        "$param": "$util.escapeJavaScript($input.params().querystring.get($param))" #if($foreach.hasNext),#end
        #end
    },
    "pathParameters": {
        #foreach($param in $input.params().path.keySet())
        "$param": "$util.escapeJavaScript($input.params().path.get($param))" #if($foreach.hasNext),#end
        #end
    }
}
Then you should find your event looks like this (the keys match the names defined in the template):
{
    "method": "GET",
    "body": {},
    "headers": {},
    "queryStringParameters": {
        "id": "459463732",
        "command": "join_session"
    },
    "pathParameters": {}
}
The context is used for other information, such as IP addresses and timeout settings.

How can I make multiple HTTP RequestResponse requests to the same long running AWS Lambda function from Python?

I understand I can invoke an AWS Lambda function synchronously and read the results:
data = invoke_response['Payload'].read()
How do I keep connecting to the same long running Lambda function invocation to get partial results until it finishes?
In short, you can't with the boto3 Lambda client as in your example. What you can do, however, is monitor your CloudWatch logs.
1) Post the partial results you would like to see to the logger inside your Lambda function:
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def my_logging_handler(event, context):
    results = "something half done"
    logger.info('MY PARTIAL RESULTS: {}'.format(results))
    logger.error('something went wrong')
    return 'Hello from Lambda!'
2) Start your Lambda asynchronously:
response = client.invoke(
    FunctionName='string',
    InvocationType='Event',  # to invoke asynchronously, InvocationType must be 'Event'
    LogType='None'|'Tail',
    ClientContext='string',
    Payload=b'bytes'|file,
    Qualifier='string'
)
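Concretely, an asynchronous invocation might look like this (the function name and payload here are placeholders):
import json
import boto3

client = boto3.client('lambda')

# InvocationType='Event' returns immediately with a 202 status;
# the function keeps running in the background.
response = client.invoke(
    FunctionName='my-long-running-function',
    InvocationType='Event',
    Payload=json.dumps({'job_id': 42}).encode('utf-8'),
)
print(response['StatusCode'])  # 202 for async invokes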
3) Create a loop that calls GetLogEvents over a specified time interval for the log stream associated with your lambda.
Request Syntax:
{
    "endTime": number,
    "limit": number,
    "logGroupName": "string",
    "logStreamName": "string",
    "nextToken": "string",
    "startFromHead": boolean,
    "startTime": number
}
4) Extract the partial results from the log stream response
Response Syntax:
{
    "events": [
        {
            "ingestionTime": number,
            "message": "string",  # the partial results you posted will appear here
            "timestamp": number
        }
    ],
    "nextBackwardToken": "string",
    "nextForwardToken": "string"
}
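Putting steps 3 and 4 together, a polling loop could look like the sketch below. It assumes the default log group name /aws/lambda/<function-name> and the 'MY PARTIAL RESULTS' marker from step 1:
import time
import boto3

logs = boto3.client('logs')
log_group = '/aws/lambda/my-long-running-function'  # Lambda's default group name

# find the most recent log stream for the function
streams = logs.describe_log_streams(
    logGroupName=log_group,
    orderBy='LastEventTime',
    descending=True,
    limit=1,
)
stream_name = streams['logStreams'][0]['logStreamName']

next_token = None
while True:  # break out once you log and see your own "finished" marker
    kwargs = {
        'logGroupName': log_group,
        'logStreamName': stream_name,
        'startFromHead': True,
    }
    if next_token:
        kwargs['nextToken'] = next_token
    resp = logs.get_log_events(**kwargs)
    for log_event in resp['events']:
        if 'MY PARTIAL RESULTS' in log_event['message']:
            print(log_event['message'])
    if resp['nextForwardToken'] == next_token:
        time.sleep(5)  # the token repeats when no new events have arrived yet
    next_token = resp['nextForwardToken']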

Execute Lambda from AWS SSM Automation with Parameters

I currently have a lambda function that updates a DynamoDB table with a value passed as a parameter. I am able to run the following within the Lambda console with a test parameter set to "TEST":
import boto3
import json

def lambda_handler(event, context):
    # TODO implement
    update_ami(event)

def update_ami(ami_id):
    # DO STUFF
I am attempting to call this from an SSM Automation built from the following JSON document:
{
    "description": "Test Execute Lambda Function.",
    "schemaVersion": "0.3",
    "assumeRole": "MYARN",
    "parameters": {},
    "mainSteps": [
        {
            "name": "invokeMyLambdaFunction",
            "action": "aws:invokeLambdaFunction",
            "maxAttempts": 3,
            "timeoutSeconds": 120,
            "onFailure": "Abort",
            "inputs": {
                "FunctionName": "MyLambdaFunction",
                "Payload": "TESTER"
            }
        }
    ]
}
Executing this automation results in the following error:
Automation Step Execution fails when it is invoking the lambda function. Get Exception from Invoke API of lambda Service. Exception Message from Invoke API: [Could not parse request body into json: Unrecognized token 'TESTER': was expecting ('true', 'false' or 'null')
I have also tried passing the Payload input as a JSON object instead of a string, and adjusted my lambda method accordingly:
JSON Automation:
...
"inputs": {
    "FunctionName": "MyLambdaFunction",
    "Payload": {
        "ami_id": "AMI-TESTER"
    }
}
...
Lambda Python:
def lambda_handler(event, context):
    # TODO implement
    update_ami(event['ami_id'])
This results in the following error coming from the Automation Document editor within the SSM console:
Input {ami_id=TESTER} is of type class java.util.LinkedHashMap, but expected type is String.
So in a nutshell... How do I pass a single string from an Automation document to a Lambda Function?
The error shown below indicates that the payload is not being passed as a string:
Input {ami_id=TESTER} is of type class java.util.LinkedHashMap, but expected type is String
Try escaping the double quotes inside a single-line Payload string:
"inputs": {
"FunctionName": "MyLambdaFunction",
"Payload": "{
\"ami_id\": \"AMI-TESTER\"
}"
}
AWS's documentation provides the proper syntax:
{
    "name": "updateSsmParam",
    "action": "aws:invokeLambdaFunction",
    "timeoutSeconds": 1200,
    "maxAttempts": 1,
    "onFailure": "Abort",
    "inputs": {
        "FunctionName": "Automation-UpdateSsmParam",
        "Payload": "{\"parameterName\":\"latestAmi\", \"parameterValue\":\"{{createImage.ImageId}}\"}"
    }
}
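If you generate the automation document programmatically, letting json.dumps produce the escaped Payload string is less error-prone than hand-escaping the quotes. A sketch, reusing the step from the question:
import json

payload = {'ami_id': 'AMI-TESTER'}

step = {
    'name': 'invokeMyLambdaFunction',
    'action': 'aws:invokeLambdaFunction',
    'maxAttempts': 3,
    'timeoutSeconds': 120,
    'onFailure': 'Abort',
    'inputs': {
        'FunctionName': 'MyLambdaFunction',
        # json.dumps produces the escaped string form SSM expects
        'Payload': json.dumps(payload),
    },
}
print(json.dumps(step, indent=2))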

Azure Function - Python - ServiceBus Output Binding - Setting Custom Properties

I have an Azure Function written in Python that has a Service Bus (Topic) output binding. The function is triggered by another queue; we process some files from blob storage and then put another message in a queue.
My function.json file looks like this:
{
    "bindings": [
        {
            "type": "serviceBus",
            "connection": "Omnibus_Input_Send_Servicebus",
            "name": "outputMessage",
            "queueName": "validation-output-queue",
            "accessRights": "send",
            "direction": "out"
        }
    ],
    "disabled": false
}
In my function, I can send a message to another queue like this:
with open(os.environ['outputMessage'], 'w') as output_message:
    output_message.write('This is my output test message!')
It is working fine. Now I'd like to send a message to a topic. I've created a subscription with an SQLFilter and I need to set some custom properties on the BrokeredMessage.
From the Azure SDK for Python, I've found that I can add custom properties like this (I've installed the azure module using pip):
from azure.servicebus import Message

sent_msg = Message(b'This is the third message',
                   broker_properties={'Label': 'M3'},
                   custom_properties={'Priority': 'Medium',
                                      'Customer': 'ABC'})
My new function.json file looks like this:
{
    "bindings": [
        {
            "type": "serviceBus",
            "connection": "Omnibus_Input_Send_Servicebus",
            "name": "outputMessage",
            "topicName": "validation-output-topic",
            "accessRights": "send",
            "direction": "out"
        }
    ],
    "disabled": false
}
And I've modified my function like this:
from azure.servicebus import Message

sent_msg = Message(b'This is the third message',
                   broker_properties={'Label': 'M3'},
                   custom_properties={'Priority': 'Medium',
                                      'Customer': 'ABC'})

with open(os.environ['outputMessage'], 'w') as output_message:
    output_message.write(sent_msg)
When I run the function, I get this exception:
TypeError: expected a string or other character buffer object
I tried to use the buffer and the memoryview function but still get another exception:
TypeError: cannot make memory view because object does not have the buffer interface
I am wondering whether the binding actually supports BrokeredMessage, and how to deal with it?
The ServiceBus output binding for Python (and other script languages) only supports a simple string mapping, where the string you specify becomes the content of the BrokeredMessage created behind the scenes. To set any extended properties or do anything more sophisticated, you'll have to drop down to using the Azure Python SDK yourself in your function.
In the same situation, where I needed to add user properties to the output Service Bus queue/topic message, I used azure.servicebus.ServiceBusClient directly.
The sb.Message class has a user_properties setter:
import os
import azure.functions as func
import azure.servicebus as sb

def main(httpreq: func.HttpRequest, context: func.Context):
    sbClient: sb.ServiceBusClient = sb.ServiceBusClient.from_connection_string(
        os.getenv("AzureWebJobsServiceBus"))
    topicClient: sb.TopicClient = sbClient.get_topic('scoring-testtopic')
    message = sb.Message(httpreq.get_body().decode('UTF-8'))
    message.user_properties = {
        '#AzureWebJobsParentId': context.invocation_id,
        'Prom': '31000001'
    }
    topicClient.send(message)

Handling S3 Bucket Trigger Event in Lambda Using Python

The AWS Lambda handler has a signature of
def lambda_handler(event, context):
However, I cannot find any documentation on the event's structure when the trigger is an S3 bucket receiving a put.
I thought it might be defined in the S3 console, but couldn't find it there.
Anyone have any leads?
The event S3 sends to the Lambda function is JSON, shaped as shown below (the descriptive placeholders come from the AWS documentation):
{
    "Records": [
        {
            "eventVersion": "2.0",
            "eventSource": "aws:s3",
            "awsRegion": "us-east-1",
            "eventTime": "The time, in ISO-8601 format, for example 1970-01-01T00:00:00.000Z, when S3 finished processing the request",
            "eventName": "event-type",
            "userIdentity": {
                "principalId": "Amazon-customer-ID-of-the-user-who-caused-the-event"
            },
            "requestParameters": {
                "sourceIPAddress": "ip-address-where-request-came-from"
            },
            "responseElements": {
                "x-amz-request-id": "Amazon S3 generated request ID",
                "x-amz-id-2": "Amazon S3 host that processed the request"
            },
            "s3": {
                "s3SchemaVersion": "1.0",
                "configurationId": "ID found in the bucket notification configuration",
                "bucket": {
                    "name": "bucket-name",
                    "ownerIdentity": {
                        "principalId": "Amazon-customer-ID-of-the-bucket-owner"
                    },
                    "arn": "bucket-ARN"
                },
                "object": {
                    "key": "object-key",
                    "size": object-size,
                    "eTag": "object eTag",
                    "versionId": "object version if bucket is versioning-enabled, otherwise null",
                    "sequencer": "a string representation of a hexadecimal value used to determine event sequence, only used with PUTs and DELETEs"
                }
            }
        },
        {
            // Additional events
        }
    ]
}
Here is the link to the AWS documentation, which can guide you: http://docs.aws.amazon.com/lambda/latest/dg/with-s3-example.html
I think your easiest route is just to experiment quickly:
Create a bucket using the console
Create a lambda that is triggered by puts to the bucket using the console
Ensure you choose the default execution role, so you create cloudwatch logs
The lambda function just needs to "print(event)" when called, which is then logged
Save an object to the bucket
You'll then see the event structure in the log; it's pretty self-explanatory.
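Once you've seen the shape, pulling out the bucket and key is straightforward. A minimal sketch (note that S3 URL-encodes the object key, hence unquote_plus):
import urllib.parse

def lambda_handler(event, context):
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        # object keys arrive URL-encoded (e.g. spaces become '+')
        key = urllib.parse.unquote_plus(record['s3']['object']['key'])
        print('put: s3://{}/{}'.format(bucket, key))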
Please refer to this URL for the event message structure: http://docs.aws.amazon.com/AmazonS3/latest/dev/notification-content-structure.html
