# Lab: Building a Resilient Order Processor with AWS SDK and SQS

*Develop code for applications hosted on AWS*
In this lab, you will develop a decoupled, event-driven application component. You will write code that uses the AWS SDK for JavaScript (v3) to process messages from an Amazon SQS queue and persist them into an Amazon DynamoDB table. This pattern is foundational for the AWS Certified Developer Associate (DVA-C02) exam, focusing on Skill 1.1.8 (Messaging Services) and Skill 1.1.9 (AWS SDKs).
> [!WARNING]
> This lab involves creating resources that may incur costs if left running. Always perform the Teardown steps at the end.
## Prerequisites

- AWS Account: An active AWS account with Administrator access.
- AWS CLI: Installed and configured with your credentials (`aws configure`).
- Node.js: Version 18.x or higher installed locally.
- IAM Permissions: Ability to create IAM Roles, Lambda functions, SQS queues, and DynamoDB tables.
## Learning Objectives
- Provision decoupled infrastructure using the AWS CLI.
- Write Node.js code using the AWS SDK v3 to interact with DynamoDB.
- Configure an SQS trigger for an AWS Lambda function.
- Implement basic error handling to ensure application resilience.
## Architecture Overview

A producer sends order messages to an SQS queue; the queue triggers a Lambda function, which writes each order to a DynamoDB table (producer → SQS → Lambda → DynamoDB).
## Step-by-Step Instructions
### Step 1: Create the DynamoDB Table

We need a persistent data store to hold our processed orders. We will use a simple table with `orderId` as the partition key.
```shell
aws dynamodb create-table \
  --table-name brainybee-orders \
  --attribute-definitions AttributeName=orderId,AttributeType=S \
  --key-schema AttributeName=orderId,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST \
  --region us-east-1
```

Console alternative:

- Navigate to DynamoDB > Tables > Create table.
- Table name: `brainybee-orders`.
- Partition key: `orderId` (String).
- Default settings: On-demand capacity mode.
- Click Create table.
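A note on the partition key: `PutItem` behaves as an upsert, so writing the same `orderId` twice overwrites the first item rather than adding a duplicate. A minimal local sketch of that semantic, using a `Map` as a stand-in for the table:

```javascript
// Local stand-in for the table: a Map keyed by orderId mirrors the
// partition-key semantics of PutItem, where a repeated key overwrites.
const table = new Map();

function putItem(item) {
  table.set(item.orderId, item); // same orderId → item replaced, never duplicated
}

putItem({ orderId: "ABC-123", amount: 10 });
putItem({ orderId: "ABC-123", amount: 20 }); // overwrites the first write
```

This is why idempotent producers are a natural fit for this design: redelivering the same order message simply rewrites the same item.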
### Step 2: Create the SQS Queue
This queue will act as our asynchronous buffer, decoupling the producer from our processing logic.
```shell
aws sqs create-queue --queue-name brainybee-order-queue --region us-east-1
```

### Step 3: Create the IAM Role for Lambda
Your Lambda function needs permissions to read from SQS and write to DynamoDB.
- Create a trust policy file named `trust-policy.json`:

  ```json
  {
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Principal": { "Service": "lambda.amazonaws.com" },
        "Action": "sts:AssumeRole"
      }
    ]
  }
  ```

- Create the role:

  ```shell
  aws iam create-role --role-name brainybee-lambda-role --assume-role-policy-document file://trust-policy.json
  ```

- Attach the managed policies for basic execution, SQS, and DynamoDB access (the broad FullAccess policies keep the lab simple; in production, scope these down to least privilege):

  ```shell
  aws iam attach-role-policy --role-name brainybee-lambda-role --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
  aws iam attach-role-policy --role-name brainybee-lambda-role --policy-arn arn:aws:iam::aws:policy/AmazonSQSFullAccess
  aws iam attach-role-policy --role-name brainybee-lambda-role --policy-arn arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess
  ```

### Step 4: Develop the Lambda Function
Create a file named `index.mjs`. This code uses the AWS SDK for JavaScript (v3) to parse each SQS message and put it into DynamoDB.

```javascript
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, PutCommand } from "@aws-sdk/lib-dynamodb";

const client = new DynamoDBClient({});
const docClient = DynamoDBDocumentClient.from(client);

export const handler = async (event) => {
  for (const record of event.Records) {
    const order = JSON.parse(record.body);
    console.log("Processing order:", order.orderId);

    const command = new PutCommand({
      TableName: "brainybee-orders",
      Item: {
        orderId: order.orderId,
        customer: order.customer,
        amount: order.amount,
        processedAt: new Date().toISOString()
      }
    });

    try {
      await docClient.send(command);
      console.log("Order successfully saved.");
    } catch (err) {
      console.error("Error saving to DynamoDB:", err);
      throw err; // Throwing error triggers SQS retry
    }
  }
};
```

- Zip the file:

  ```shell
  zip function.zip index.mjs
  ```

- Deploy the function:
```shell
aws lambda create-function \
  --function-name brainybee-order-processor \
  --runtime nodejs18.x \
  --role arn:aws:iam::<YOUR_ACCOUNT_ID>:role/brainybee-lambda-role \
  --handler index.handler \
  --zip-file fileb://function.zip \
  --region us-east-1
```

### Step 5: Configure SQS Trigger
Map the queue to the Lambda function so that messages automatically trigger the code.
```shell
# Get the SQS Queue ARN first
QUEUE_ARN=$(aws sqs get-queue-attributes \
  --queue-url https://sqs.us-east-1.amazonaws.com/<YOUR_ACCOUNT_ID>/brainybee-order-queue \
  --attribute-names QueueArn \
  --query 'Attributes.QueueArn' --output text)

aws lambda create-event-source-mapping \
  --function-name brainybee-order-processor \
  --batch-size 10 \
  --event-source-arn $QUEUE_ARN
```

## Checkpoints
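Before sending real messages through AWS, you can dry-run the handler's batch loop locally by mocking the SQS event and injecting a save function in place of the DynamoDB call. This is a sketch: `processBatch` is a hypothetical local analogue of the handler in `index.mjs`, not part of the deployed code.

```javascript
// Hypothetical local dry-run: same parse-and-save shape as the handler,
// but the DynamoDB write is replaced by an injected async save function,
// so it runs without AWS credentials.
async function processBatch(event, save) {
  for (const record of event.Records) {
    const order = JSON.parse(record.body);
    await save({ ...order, processedAt: new Date().toISOString() });
  }
}

const saved = [];
const mockEvent = {
  Records: [{ body: '{"orderId": "ABC-123", "customer": "Alice", "amount": 49.99}' }]
};
processBatch(mockEvent, async (item) => saved.push(item));
```

Injecting the save function also makes the loop easy to unit-test against failure cases (a save that throws) without touching a live table.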
- Test Message: Send a test JSON to the queue.

  ```shell
  aws sqs send-message \
    --queue-url https://sqs.us-east-1.amazonaws.com/<YOUR_ACCOUNT_ID>/brainybee-order-queue \
    --message-body '{"orderId": "ABC-123", "customer": "Alice", "amount": 49.99}'
  ```

- Verify Lambda Logs: Check CloudWatch Logs for the "Processing order" message.
- Verify DynamoDB: Scan the table to ensure the record exists.

  ```shell
  aws dynamodb scan --table-name brainybee-orders
  ```
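When wiring a real producer, it helps to pin down the message contract the handler assumes. The sketch below is a hypothetical validator (not part of the lab's deployed code); the field names match the test message used in the checkpoint.

```javascript
// Hypothetical validator for the order-message contract:
// a JSON body with orderId (string), customer (string), amount (number).
function parseOrderMessage(body) {
  const order = JSON.parse(body); // throws on invalid JSON, as the handler would
  if (typeof order.orderId !== "string") throw new Error("orderId must be a string");
  if (typeof order.customer !== "string") throw new Error("customer must be a string");
  if (typeof order.amount !== "number") throw new Error("amount must be a number");
  return order;
}

const order = parseOrderMessage(
  '{"orderId": "ABC-123", "customer": "Alice", "amount": 49.99}'
);
```

Validating at the producer side surfaces the `Unexpected token in JSON` class of errors (see Troubleshooting) before a message ever reaches the queue.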
## Troubleshooting
| Error | Cause | Fix |
|---|---|---|
| `AccessDeniedException` | IAM Role lacks permissions. | Ensure `AmazonDynamoDBFullAccess` is attached to the Lambda role. |
| `Unexpected token in JSON` | SQS message body is not valid JSON. | Ensure the `--message-body` string is correctly escaped in your CLI command. |
| Lambda not triggering | Event source mapping failed. | Check the status using `aws lambda list-event-source-mappings`. |
## Concept Review
This lab demonstrated several core architectural patterns:
- Asynchronous Pattern: The producer doesn't wait for the database write; it just drops the message in SQS.
- Statelessness: The Lambda function doesn't store data locally; it persists it to DynamoDB.
- Loose Coupling: You could swap DynamoDB for RDS without changing the producer logic.
The error-handling flow inside the function can be summarized as:

Message Received → Processing Loop → SDK: PutItem → (OK) → Success. On Error, PutItem hands off to the Retry Logic, which feeds the message back into the Processing Loop.
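As written, the handler rethrows on any failure, which makes SQS redeliver the entire batch, including records that already succeeded. Lambda's SQS integration also supports partial batch responses: with `ReportBatchItemFailures` enabled on the event source mapping, the handler returns only the failed message IDs. Below is a sketch of building that response shape from per-record outcomes; the `results` list is a hypothetical structure you would collect inside the processing loop.

```javascript
// Sketch of the partial-batch-response shape Lambda expects when
// ReportBatchItemFailures is enabled on the event source mapping:
//   { batchItemFailures: [{ itemIdentifier: "<messageId>" }, ...] }
function buildBatchResponse(results) {
  // results: [{ messageId, ok }] — one entry per processed SQS record
  return {
    batchItemFailures: results
      .filter((r) => !r.ok)
      .map((r) => ({ itemIdentifier: r.messageId }))
  };
}

const response = buildBatchResponse([
  { messageId: "msg-1", ok: true },
  { messageId: "msg-2", ok: false }
]);
```

With this shape, SQS redelivers only `msg-2`; `msg-1` is deleted from the queue as successfully processed.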
## Stretch Challenge
Implement a Circuit Breaker Pattern: Modify the Lambda code to check a "system-status" flag in DynamoDB before processing. If the flag is set to "DOWN" (e.g., simulating a downstream third-party failure), immediately throw an error to keep messages in SQS without wasting compute cycles on the database call.
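One possible shape for that check, sketched synchronously with the DynamoDB status lookup replaced by an injected function. Both `getStatus` and `save` here are hypothetical stand-ins (for a `GetCommand` against a status item and the `PutCommand` write, respectively), not part of the lab's deployed code.

```javascript
// Hypothetical circuit-breaker gate: if the stored flag reads "DOWN",
// fail fast so SQS retains the message for a later retry, without
// spending time on the downstream write.
function processWithBreaker(order, getStatus, save) {
  if (getStatus() === "DOWN") {
    // Throwing here returns the message to SQS, same as the handler's retry path
    throw new Error("Circuit open: downstream unavailable, message stays in SQS");
  }
  save(order);
  return "processed";
}
```

In the real Lambda you would make both calls async; the key design point is that the cheap status read happens before the expensive (or failing) downstream call.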
## Cost Estimate
| Service | Usage | Estimated Cost |
|---|---|---|
| AWS Lambda | 1M Requests/month Free | $0.00 (Free Tier) |
| SQS | 1M Requests/month Free | $0.00 (Free Tier) |
| DynamoDB | 25GB/month Free | $0.00 (Free Tier) |
| Total | | $0.00 for this lab |
## Clean-Up / Teardown
Execute these commands to avoid charges:

```shell
# Remove the event source mapping first (look up its UUID, then delete it)
aws lambda list-event-source-mappings --function-name brainybee-order-processor --query 'EventSourceMappings[0].UUID' --output text
# aws lambda delete-event-source-mapping --uuid <UUID>

aws lambda delete-function --function-name brainybee-order-processor
aws sqs delete-queue --queue-url https://sqs.us-east-1.amazonaws.com/<YOUR_ACCOUNT_ID>/brainybee-order-queue
aws dynamodb delete-table --table-name brainybee-orders
aws iam detach-role-policy --role-name brainybee-lambda-role --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
aws iam detach-role-policy --role-name brainybee-lambda-role --policy-arn arn:aws:iam::aws:policy/AmazonSQSFullAccess
aws iam detach-role-policy --role-name brainybee-lambda-role --policy-arn arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess
aws iam delete-role --role-name brainybee-lambda-role
```