Hands-On Lab: Designing Cost-Optimized Storage Solutions
Design cost-optimized storage solutions
Welcome to this guided lab on designing cost-optimized storage solutions in AWS. In this 30-minute lab, you will deploy an Amazon S3 bucket, populate it with data, and implement automated S3 Lifecycle configurations to tier data to cheaper storage classes over time. This aligns directly with the AWS Certified Solutions Architect - Associate (SAA-C03) objectives for Domain 4: Design Cost-Optimized Architectures.
Prerequisites
Before beginning this lab, ensure you have the following:
- AWS Account: An active AWS account with billing enabled (this lab uses mostly free-tier eligible resources, but some minimal charges may apply if free tier limits are exceeded).
- IAM Permissions: An IAM user or role with full permissions for Amazon S3 (`AmazonS3FullAccess`).
- AWS CLI: Installed and configured locally with your credentials (run `aws configure` if you haven't already).
- Basic Terminal Knowledge: Familiarity with running basic commands in a bash or PowerShell terminal.
Learning Objectives
By completing this lab, you will be able to:
- Create and configure an Amazon S3 bucket for optimal storage.
- Manually interact with S3 objects using both the AWS CLI and AWS Management Console.
- Design and implement S3 Lifecycle rules to automatically transition data to cost-effective storage tiers (Standard-IA and Glacier).
- Verify storage configurations to ensure cost-optimization policies are actively applied.
Architecture Overview
The following diagram illustrates the lifecycle of an object as it moves through cost-optimized storage tiers based on our policy:

📸 Diagram: S3 Standard → (after 30 days) → S3 Standard-IA → (after 90 days) → S3 Glacier → (after 365 days) → Expired.
To determine which tier to transition your objects to, AWS Solutions Architects typically follow a decision matrix similar to this:
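As a rough sketch of that decision logic, the helper below maps expected access frequency to a storage class. The thresholds mirror this lab's policy and are assumptions for illustration, not official AWS guidance:

```shell
# Hypothetical helper: map expected days between accesses to a storage class
pick_storage_class() {
  if [ "$1" -lt 30 ]; then
    echo "STANDARD"       # frequently accessed
  elif [ "$1" -lt 90 ]; then
    echo "STANDARD_IA"    # infrequent access, still needs millisecond retrieval
  else
    echo "GLACIER"        # archival; minutes-to-hours retrieval is acceptable
  fi
}

pick_storage_class 7     # STANDARD
pick_storage_class 45    # STANDARD_IA
pick_storage_class 200   # GLACIER
```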
Step-by-Step Instructions
Step 1: Create the Target S3 Bucket
First, we need to create an S3 bucket to store our data. S3 bucket names must be globally unique.
```bash
# Set a unique bucket name by appending a random number or your initials
BUCKET_NAME="brainybee-cost-opt-lab-$RANDOM"
REGION="us-east-1"

# Create the S3 bucket
aws s3api create-bucket \
  --bucket $BUCKET_NAME \
  --region $REGION
```

> [!TIP]
> If you are using a region other than `us-east-1`, you must specify a `LocationConstraint` in the `create-bucket` command (via `--create-bucket-configuration LocationConstraint=$REGION`).
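The tip above can be handled with a small shell conditional; this is a sketch, and the non-default region value is just an example (the `echo` only prints the command it would run):

```shell
# Sketch: add LocationConstraint only when the region is not us-east-1
# (us-east-1 does not accept a LocationConstraint of its own name)
REGION="eu-west-1"   # example region; substitute your own
if [ "$REGION" = "us-east-1" ]; then
  EXTRA_ARGS=""
else
  EXTRA_ARGS="--create-bucket-configuration LocationConstraint=$REGION"
fi
echo "aws s3api create-bucket --bucket \$BUCKET_NAME --region $REGION $EXTRA_ARGS"
```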
▶Console alternative
- Log into the AWS Management Console and navigate to S3.
- Click Create bucket.
- Enter your unique bucket name (e.g., `brainybee-cost-opt-lab-12345`).
- Select your preferred AWS Region.
- Leave all other settings as default (Block all public access should remain checked).
- Click Create bucket.
📸 Screenshot: The S3 Create Bucket configuration screen showing the unique bucket name.
Step 2: Upload Sample Data
Next, let's create a few dummy files and upload them to our new bucket to simulate an active workload.
```bash
# Create a dummy file
echo "This is sample data for cost optimization testing." > sample-data-01.txt

# Upload the file to the S3 bucket
aws s3 cp sample-data-01.txt s3://$BUCKET_NAME/reports/sample-data-01.txt
```

▶Console alternative
- In the S3 console, click on your newly created bucket.
- Click Create folder, name it `reports`, and click Create folder.
- Navigate inside the `reports/` folder and click Upload.
- Click Add files, select any small text file from your computer, and click Upload.
📸 Screenshot: The S3 console showing the successful upload of `sample-data-01.txt` inside the `reports/` folder.
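Optionally, to simulate a larger workload, you can generate several dummy files locally before uploading; the filenames here are arbitrary, and the `aws s3 sync` line is left commented so nothing is uploaded until you choose to run it:

```shell
# Generate a handful of dummy report files locally
for i in 01 02 03; do
  echo "Sample report data $i" > "sample-data-$i.txt"
done

ls sample-data-*.txt

# Upload them all at once (mirrors the single-file cp above):
# aws s3 sync . s3://$BUCKET_NAME/reports/ --exclude "*" --include "sample-data-*.txt"
```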
Step 3: Implement S3 Lifecycle Rules for Cost Optimization
To optimize costs, we will instruct AWS to automatically move files in the reports/ folder to S3 Standard-IA after 30 days, and then to S3 Glacier after 90 days.
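To see why these transitions matter, here is a rough monthly cost comparison for 100 GB. The per-GB prices are approximate us-east-1 list prices and change over time, so treat them strictly as illustrative assumptions and check current AWS pricing:

```shell
# Illustrative monthly storage cost for 100 GB (prices are assumptions)
GB=100
awk -v gb="$GB" 'BEGIN {
  printf "S3 Standard:              $%.2f\n", gb * 0.023    # ~$0.023/GB-month
  printf "S3 Standard-IA:           $%.2f\n", gb * 0.0125   # ~$0.0125/GB-month
  printf "S3 Glacier Flexible:      $%.2f\n", gb * 0.0036   # ~$0.0036/GB-month
}'
```

Note that Standard-IA and Glacier also charge per-GB retrieval fees, so they only save money for data that is rarely read.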
Create a file named `lifecycle.json` on your local machine with the following content:

```json
{
  "Rules": [
    {
      "ID": "CostOptimizationRule",
      "Filter": {
        "Prefix": "reports/"
      },
      "Status": "Enabled",
      "Transitions": [
        {
          "Days": 30,
          "StorageClass": "STANDARD_IA"
        },
        {
          "Days": 90,
          "StorageClass": "GLACIER"
        }
      ],
      "Expiration": {
        "Days": 365
      }
    }
  ]
}
```

Now, apply this lifecycle configuration to your bucket using the AWS CLI:

```bash
aws s3api put-bucket-lifecycle-configuration \
  --bucket $BUCKET_NAME \
  --lifecycle-configuration file://lifecycle.json
```

▶Console alternative
- In the S3 console, navigate to your bucket.
- Click on the Management tab.
- Under Lifecycle rules, click Create lifecycle rule.
- Rule name:
CostOptimizationRule. - Choose a rule scope: Select Limit the scope of this rule using one or more filters.
- Prefix: Enter
reports/. - Under Lifecycle rule actions, check:
- Move current versions of objects between storage classes
- Expire current versions of objects
- Under Transition current versions of objects between storage classes:
- Choose Standard-IA, Days after object creation:
30. - Click Add transition, Choose Glacier Flexible Retrieval, Days after object creation:
90.
- Choose Standard-IA, Days after object creation:
- Under Expire current versions of objects, set Days after object creation to
365. - Click Create rule.
📸 Screenshot: The Lifecycle rule creation summary showing the waterfall transition of storage tiers over time.
Checkpoints
Let's verify that our cost optimization configuration was successfully applied to the bucket.
Task: Verify the lifecycle configuration.
```bash
aws s3api get-bucket-lifecycle-configuration --bucket $BUCKET_NAME
```

Expected Result: You should receive a JSON output detailing the `CostOptimizationRule` matching the rules we set in Step 3. If you see the output with `STANDARD_IA` at 30 days and `GLACIER` at 90 days, you have successfully completed the core setup!
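If you want to check specific fields programmatically, you can parse the response. The snippet below runs against a hard-coded copy of the expected response shape so it works offline; in practice you would pipe the real `get-bucket-lifecycle-configuration` output instead:

```shell
# Offline illustration: extract the first transition's storage class with python3
RESPONSE='{"Rules":[{"ID":"CostOptimizationRule","Transitions":[{"Days":30,"StorageClass":"STANDARD_IA"},{"Days":90,"StorageClass":"GLACIER"}]}]}'
FIRST_CLASS=$(echo "$RESPONSE" | python3 -c 'import json,sys; print(json.load(sys.stdin)["Rules"][0]["Transitions"][0]["StorageClass"])')
echo "$FIRST_CLASS"

# In practice, replace the hard-coded RESPONSE with:
# aws s3api get-bucket-lifecycle-configuration --bucket $BUCKET_NAME
```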
Troubleshooting
If you run into issues, check this matrix of common errors:
| Error Message | Cause | Solution |
|---|---|---|
| `BucketAlreadyExists` | Someone else in AWS is already using this bucket name. | Change your `$BUCKET_NAME` variable to include more random numbers or your full initials. |
| `AccessDenied` | Your IAM user lacks permissions to perform S3 operations. | Verify you are using the correct credentials with `aws sts get-caller-identity`. Ensure the user has the `AmazonS3FullAccess` policy attached. |
| `MalformedXML` when putting lifecycle | The JSON file is incorrectly formatted or has a syntax error. | Validate `lifecycle.json` using a JSON linter. Ensure there are no trailing commas. |
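For the `MalformedXML` case, you can lint the file locally before calling the API. This sketch recreates the Step 3 file and checks it with Python's stdlib JSON parser (`jq` would work equally well):

```shell
# Recreate lifecycle.json (same content as Step 3) and lint it locally
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "CostOptimizationRule",
      "Filter": { "Prefix": "reports/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 30, "StorageClass": "STANDARD_IA" },
        { "Days": 90, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
EOF

if python3 -m json.tool lifecycle.json > /dev/null 2>&1; then
  RESULT="OK"
else
  RESULT="syntax error"
fi
echo "lifecycle.json: $RESULT"
```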
Clean-Up / Teardown
> [!WARNING]
> Remember to run the teardown commands to avoid ongoing charges. Even though S3 storage is cheap, leaving unused resources running is bad practice and against the principles of cost-optimized architectures.
Execute the following commands to permanently delete the objects and the bucket.
```bash
# 1. Empty all objects from the bucket
aws s3 rm s3://$BUCKET_NAME --recursive

# 2. Delete the empty bucket
aws s3api delete-bucket --bucket $BUCKET_NAME

# 3. Clean up the local files
rm sample-data-01.txt lifecycle.json
```

▶Console alternative
- In the S3 console, go to Buckets.
- Select the radio button next to your lab bucket.
- Click Empty, type `permanently delete` in the confirmation box, and click Empty.
- Return to the Buckets list, ensure your bucket is selected, and click Delete.
- Type the name of the bucket to confirm and click Delete bucket.