
Hands-On Lab: Designing Cost-Optimized Storage Solutions


Welcome to this guided lab on designing cost-optimized storage solutions in AWS. In this 30-minute lab, you will deploy an Amazon S3 bucket, populate it with data, and implement automated S3 Lifecycle configurations to tier data to cheaper storage classes over time. This aligns directly with the AWS Certified Solutions Architect - Associate (SAA-C03) objectives for Domain 4: Design Cost-Optimized Architectures.

Prerequisites

Before beginning this lab, ensure you have the following:

  • AWS Account: An active AWS account with billing enabled (this lab uses mostly free-tier eligible resources, but some minimal charges may apply if free tier limits are exceeded).
  • IAM Permissions: An IAM user or role with full permissions for Amazon S3 (AmazonS3FullAccess).
  • AWS CLI: Installed and configured locally with your credentials (run aws configure if you haven't already).
  • Basic Terminal Knowledge: Familiarity with running basic commands in a bash or PowerShell terminal.

Learning Objectives

By completing this lab, you will be able to:

  1. Create and configure an Amazon S3 bucket for optimal storage.
  2. Manually interact with S3 objects using both the AWS CLI and AWS Management Console.
  3. Design and implement S3 Lifecycle rules to automatically transition data to cost-effective storage tiers (Standard-IA and Glacier).
  4. Verify storage configurations to ensure cost-optimization policies are actively applied.

Architecture Overview

The following diagram illustrates the lifecycle of an object as it moves through cost-optimized storage tiers based on our policy.

Diagram: An object starts in S3 Standard, transitions to S3 Standard-IA after 30 days, moves to S3 Glacier after 90 days, and expires after 365 days.

To determine which tier to transition your objects to, AWS Solutions Architects typically follow a decision matrix similar to this:

Diagram: A decision matrix for choosing a storage class based on access frequency and retrieval-time requirements.
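To make the cost difference behind this decision concrete, here is a rough comparison of monthly storage cost for 100 GB in each tier. The per-GB prices are illustrative us-east-1 figures and change over time, so treat them as assumptions and check the current S3 pricing page for real numbers.

```bash
# Approximate monthly storage cost for 100 GB per tier.
# Prices (USD per GB-month) are illustrative us-east-1 figures, not current quotes.
awk 'BEGIN {
  gb = 100
  printf "S3 Standard:                 $%.2f\n", gb * 0.023
  printf "S3 Standard-IA:              $%.2f\n", gb * 0.0125
  printf "S3 Glacier Flexible Retrieval: $%.2f\n", gb * 0.0036
}'
```

Retrieval fees and minimum storage durations (30 days for Standard-IA, 90 days for Glacier) also factor into the decision, which is why lifecycle rules typically wait before transitioning objects.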

Step-by-Step Instructions

Step 1: Create the Target S3 Bucket

First, we need to create an S3 bucket to store our data. S3 bucket names must be globally unique.

```bash
# Set a unique bucket name by appending a random number or your initials
BUCKET_NAME="brainybee-cost-opt-lab-$RANDOM"
REGION="us-east-1"

# Create the S3 bucket
aws s3api create-bucket \
  --bucket $BUCKET_NAME \
  --region $REGION
```

[!TIP] If you are using a region other than us-east-1, you must add --create-bucket-configuration LocationConstraint=&lt;region&gt; to the create-bucket command.
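Because us-east-1 is the only region that must not receive a LocationConstraint, it can help to wrap the logic in a small helper. This is just a sketch: build_create_bucket_cmd is a hypothetical name, and it only prints the command (rather than running it), so you can try it without AWS credentials.

```bash
# Print (rather than run) a region-aware create-bucket command.
# us-east-1 must NOT receive a LocationConstraint; every other region must.
build_create_bucket_cmd() {
  local bucket="$1" region="$2"
  if [ "$region" = "us-east-1" ]; then
    echo "aws s3api create-bucket --bucket $bucket --region $region"
  else
    echo "aws s3api create-bucket --bucket $bucket --region $region" \
         "--create-bucket-configuration LocationConstraint=$region"
  fi
}

build_create_bucket_cmd "brainybee-cost-opt-lab-12345" "eu-west-1"
```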

Console alternative
  1. Log into the AWS Management Console and navigate to S3.
  2. Click Create bucket.
  3. Enter your unique bucket name (e.g., brainybee-cost-opt-lab-12345).
  4. Select your preferred AWS Region.
  5. Leave all other settings as default (Block all public access should remain checked).
  6. Click Create bucket.

📸 Screenshot: The S3 Create Bucket configuration screen showing the unique bucket name.

Step 2: Upload Sample Data

Next, let's create a dummy file and upload it to our new bucket to simulate an active workload.

```bash
# Create a dummy file
echo "This is sample data for cost optimization testing." > sample-data-01.txt

# Upload the file to the S3 bucket
aws s3 cp sample-data-01.txt s3://$BUCKET_NAME/reports/sample-data-01.txt
```
Console alternative
  1. In the S3 console, click on your newly created bucket.
  2. Click Create folder, name it reports/, and click Create folder.
  3. Navigate inside the reports/ folder and click Upload.
  4. Click Add files, select any small text file from your computer, and click Upload.

📸 Screenshot: The S3 console showing the successful upload of sample-data-01.txt inside the reports/ folder.
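If you want more than one object for the lifecycle rule to act on, the single-file upload extends naturally to a loop. Only the commented aws command at the end needs credentials; everything else runs locally.

```bash
# Generate several dummy report files locally.
for i in 01 02 03; do
  echo "Sample report $i for cost optimization testing." > "sample-data-$i.txt"
done

ls sample-data-*.txt

# Upload them all under the reports/ prefix in one call (requires credentials):
# aws s3 cp . "s3://$BUCKET_NAME/reports/" --recursive \
#   --exclude "*" --include "sample-data-*.txt"
```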

Step 3: Implement S3 Lifecycle Rules for Cost Optimization

To optimize costs, we will instruct AWS to automatically move files in the reports/ folder to S3 Standard-IA after 30 days, and then to S3 Glacier after 90 days.

Create a file named lifecycle.json on your local machine with the following content:

```json
{
  "Rules": [
    {
      "ID": "CostOptimizationRule",
      "Filter": { "Prefix": "reports/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 30, "StorageClass": "STANDARD_IA" },
        { "Days": 90, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
```

Now, apply this lifecycle configuration to your bucket using the AWS CLI:

```bash
aws s3api put-bucket-lifecycle-configuration \
  --bucket $BUCKET_NAME \
  --lifecycle-configuration file://lifecycle.json
```
Console alternative
  1. In the S3 console, navigate to your bucket.
  2. Click on the Management tab.
  3. Under Lifecycle rules, click Create lifecycle rule.
  4. Rule name: CostOptimizationRule.
  5. Choose a rule scope: Select Limit the scope of this rule using one or more filters.
  6. Prefix: Enter reports/.
  7. Under Lifecycle rule actions, check:
    • Move current versions of objects between storage classes
    • Expire current versions of objects
  8. Under Transition current versions of objects between storage classes:
    • Choose Standard-IA, Days after object creation: 30.
    • Click Add transition, Choose Glacier Flexible Retrieval, Days after object creation: 90.
  9. Under Expire current versions of objects, set Days after object creation to 365.
  10. Click Create rule.

📸 Screenshot: The Lifecycle rule creation summary showing the waterfall transition of storage tiers over time.

Checkpoints

Let's verify that our cost optimization configuration was successfully applied to the bucket.

Task: Verify the lifecycle configuration.

```bash
aws s3api get-bucket-lifecycle-configuration --bucket $BUCKET_NAME
```

Expected Result: You should receive a JSON output detailing the CostOptimizationRule matching the rules we set in Step 3. If you see the output with STANDARD_IA at 30 days and GLACIER at 90 days, you have successfully completed the core setup!
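Rather than eyeballing the JSON, you can assert on it. The snippet below fakes the CLI response in response.json so it runs without AWS; in a real run you would redirect the actual get-bucket-lifecycle-configuration output into the same check.

```bash
# Simulated response; in a real run, redirect the CLI output here instead:
# aws s3api get-bucket-lifecycle-configuration --bucket $BUCKET_NAME > response.json
cat > response.json <<'EOF'
{"Rules": [{"ID": "CostOptimizationRule", "Status": "Enabled",
  "Transitions": [{"Days": 30, "StorageClass": "STANDARD_IA"},
                  {"Days": 90, "StorageClass": "GLACIER"}]}]}
EOF

python3 - <<'EOF'
import json

rule = json.load(open("response.json"))["Rules"][0]
tiers = {t["StorageClass"]: t["Days"] for t in rule["Transitions"]}
assert tiers == {"STANDARD_IA": 30, "GLACIER": 90}, tiers
print("Lifecycle transitions verified")
EOF
```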

Troubleshooting

If you run into issues, check this matrix of common errors:

| Error Message | Cause | Solution |
| --- | --- | --- |
| BucketAlreadyExists | Someone else in AWS is already using this bucket name. | Change your $BUCKET_NAME variable to include more random numbers or your full initials. |
| AccessDenied | Your IAM user lacks permissions to perform S3 operations. | Verify you are using the correct credentials with aws sts get-caller-identity. Ensure the user has the AmazonS3FullAccess policy attached. |
| MalformedXML when putting lifecycle | The JSON file is incorrectly formatted or has a syntax error. | Validate lifecycle.json with a JSON linter. Ensure there are no trailing commas. |
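For the MalformedXML case, you can lint the file before ever calling the API. This sketch recreates the Step 3 file so it is self-contained; python3 -m json.tool exits non-zero (and prints the error location) on any syntax error, such as a trailing comma.

```bash
# Recreate the Step 3 lifecycle policy (self-contained for this check).
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "CostOptimizationRule",
      "Filter": { "Prefix": "reports/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 30, "StorageClass": "STANDARD_IA" },
        { "Days": 90, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
EOF

# Lint before applying; exits non-zero on bad JSON.
if python3 -m json.tool lifecycle.json > /dev/null; then
  echo "lifecycle.json is valid JSON"
fi
```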

Clean-Up / Teardown

[!WARNING] Remember to run the teardown commands to avoid ongoing charges. Even though S3 storage is cheap, leaving unused resources running is bad practice and against the principles of Cost-Optimized Architectures.

Execute the following commands to permanently delete the objects and the bucket.

```bash
# 1. Empty all objects from the bucket
aws s3 rm s3://$BUCKET_NAME --recursive

# 2. Delete the empty bucket
aws s3api delete-bucket --bucket $BUCKET_NAME

# 3. Clean up the local files
rm sample-data-01.txt lifecycle.json
```
Console alternative
  1. In the S3 console, go to Buckets.
  2. Select the radio button next to your lab bucket.
  3. Click Empty, type permanently delete in the confirmation box, and click Empty.
  4. Return to the Buckets list, ensure your bucket is selected, and click Delete.
  5. Type the name of the bucket to confirm and click Delete bucket.
