Curriculum Overview: Log Ingestion and Storage Strategies
Identify sources for log ingestion and storage based on requirements
This curriculum covers the critical skills required to identify, ingest, and store logs across an AWS environment. It is aligned with the AWS Certified Security - Specialty (SCS-C02) exam, specifically Domain 2: Security Logging and Monitoring, with supporting coverage of Domain 1: Threat Detection and Incident Response.
Prerequisites
Before starting this module, students should possess the following foundational knowledge:
- AWS Fundamentals: Basic understanding of AWS global infrastructure (Regions, Availability Zones).
- IAM Proficiency: Knowledge of IAM roles, policies, and the principle of least privilege for granting log access.
- Core Services: Familiarity with Amazon S3 (buckets/storage classes) and Amazon CloudWatch (basic metrics).
- Security Concepts: Basic understanding of encryption at rest (KMS) and data integrity (checksums/hashing).
Module Breakdown
| Module | Topic | Difficulty | Primary Focus |
|---|---|---|---|
| 1 | Core Management Logging | Beginner | CloudTrail, Config, and Account-level auditing |
| 2 | Network & Traffic Logging | Intermediate | VPC Flow Logs, Route 53, WAF, and CloudFront |
| 3 | Compute & Application Logs | Intermediate | CloudWatch Agent, Lambda logs, and API Gateway |
| 4 | Centralized Storage & Lakes | Advanced | Amazon Security Lake, S3 Tiers, and Cross-account aggregation |
| 5 | Ingestion & Transformation | Advanced | Kinesis Data Firehose, OCSF Schema, and Log Normalization |
Learning Objectives per Module
Upon completion of this curriculum, learners will be able to perform the following:
Module 1: Core Management Logging
- Identify management events vs. data events in AWS CloudTrail.
- Configure organization-wide trails for centralized auditing.
- Validate log file integrity using the CLI to ensure non-repudiation.
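The management/data distinction above can be sketched using the `eventCategory` field that CloudTrail includes in each record. A minimal sketch, with fabricated sample events:

```python
# Sketch: classify CloudTrail records as management vs. data events.
# Field names follow the CloudTrail record format; the sample records
# below are simplified, hypothetical events.

def classify_event(record: dict) -> str:
    """Return 'management' or 'data' based on the record's eventCategory."""
    category = record.get("eventCategory", "Management")
    return "data" if category == "Data" else "management"

sample_events = [
    {"eventName": "CreateTrail", "eventCategory": "Management"},
    {"eventName": "GetObject", "eventCategory": "Data"},   # S3 object-level
    {"eventName": "Invoke", "eventCategory": "Data"},      # Lambda data event
]

counts = {"management": 0, "data": 0}
for ev in sample_events:
    counts[classify_event(ev)] += 1

print(counts)  # {'management': 1, 'data': 2}
```

Remember that data events (such as S3 `GetObject`) are not recorded by default and must be enabled explicitly on the trail.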
Module 2: Network & Traffic Logging
- Analyze network design to determine necessary log sources (e.g., VPC Flow Logs for east-west traffic, WAF for north-south traffic).
- Differentiate between Transit Gateway flow logs and standard VPC flow logs.
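To make the flow-log material concrete, the default (version 2) VPC Flow Log record format can be parsed field by field. A minimal sketch, using a fabricated record:

```python
# Sketch: parse a VPC Flow Log record in the default (version 2) format.
# The record below is a fabricated example; the field order matches the
# default format documented for VPC Flow Logs.

FIELDS = [
    "version", "account_id", "interface_id", "srcaddr", "dstaddr",
    "srcport", "dstport", "protocol", "packets", "bytes",
    "start", "end", "action", "log_status",
]

def parse_flow_record(line: str) -> dict:
    """Split a space-delimited flow log line into named fields."""
    return dict(zip(FIELDS, line.split()))

record = parse_flow_record(
    "2 123456789012 eni-0a1b2c3d 10.0.1.5 10.0.2.9 "
    "443 49152 6 10 8400 1700000000 1700000060 ACCEPT OK"
)
print(record["action"], record["dstaddr"])  # ACCEPT 10.0.2.9
```

The `action` field (ACCEPT/REJECT) is what makes flow logs useful for spotting blocked east-west traffic.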
Module 3: Compute & Application Logs
- Deploy the Unified CloudWatch Agent to capture OS-level logs and custom application metrics.
- Troubleshoot permission issues preventing logs from reaching CloudWatch Logs groups.
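Most "logs never arrive" issues trace back to the instance role lacking CloudWatch Logs permissions. A minimal sketch of a least-privilege policy for the agent, with a hypothetical log-group ARN:

```python
import json

# Sketch: a least-privilege IAM policy allowing the CloudWatch Agent to
# deliver logs. The log-group name and account ID are hypothetical
# examples; scope the Resource ARN to your own log groups.

LOG_GROUP_ARN = "arn:aws:logs:us-east-1:123456789012:log-group:/app/web:*"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents",
                "logs:DescribeLogStreams",
            ],
            "Resource": LOG_GROUP_ARN,
        }
    ],
}

print(json.dumps(policy, indent=2))
```

When troubleshooting, check these actions first before digging into agent configuration files.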
Module 4: Centralized Storage & Lakes
- Architect a multi-account logging strategy using a dedicated Logging Account.
- Map log types to S3 storage classes (Standard, IA, Glacier) based on retention requirements and cost constraints.
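The retention-to-storage-class mapping can be expressed as a simple decision function. A minimal sketch; the day thresholds are illustrative policy choices, not AWS defaults:

```python
# Sketch: map a log retention requirement to an S3 storage class.
# The 90-day threshold is a hypothetical policy choice for illustration.

def storage_class_for(retention_days: int, access_pattern: str) -> str:
    """Pick a storage class from retention length and access frequency."""
    if access_pattern == "frequent":
        return "STANDARD"
    if retention_days <= 90:
        return "STANDARD_IA"  # infrequently accessed, still fast to retrieve
    return "GLACIER"          # compliance-only retention

print(storage_class_for(30, "frequent"))     # STANDARD
print(storage_class_for(60, "infrequent"))   # STANDARD_IA
print(storage_class_for(365, "infrequent"))  # GLACIER
```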
Module 5: Ingestion & Transformation
- Utilize Amazon Kinesis Data Firehose to stream logs into OpenSearch or third-party SIEMs.
- Explain the benefits of the Open Cybersecurity Schema Framework (OCSF) used by Amazon Security Lake.
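The value of OCSF is that heterogeneous records map onto one schema. A minimal sketch of normalizing a raw VPC Flow Log record into a simplified OCSF-style Network Activity event; the field subset illustrates the idea and is not the full OCSF schema:

```python
# Sketch: normalize a raw VPC Flow Log record into a simplified,
# OCSF-style Network Activity event. Field names are a simplified
# subset chosen for illustration, not the complete OCSF schema.

def to_ocsf_network_activity(raw: dict) -> dict:
    return {
        "class_uid": 4001,  # OCSF Network Activity class
        "src_endpoint": {"ip": raw["srcaddr"], "port": int(raw["srcport"])},
        "dst_endpoint": {"ip": raw["dstaddr"], "port": int(raw["dstport"])},
        "disposition": "Allowed" if raw["action"] == "ACCEPT" else "Blocked",
        "metadata": {"product": {"name": "VPC Flow Logs"}},
    }

event = to_ocsf_network_activity({
    "srcaddr": "10.0.1.5", "srcport": "443",
    "dstaddr": "10.0.2.9", "dstport": "49152",
    "action": "REJECT",
})
print(event["disposition"])  # Blocked
```

Once firewall, DNS, and flow records all share this shape, a single query covers every source.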
Visual Overview of Log Flow
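At a high level, logs flow from their sources through an ingestion layer into centralized storage. A simplified sketch in the same TikZ style used later in this document (service choices are one common arrangement, not the only one):

\begin{tikzpicture}[node distance=2cm, auto]
  \draw[thick, rounded corners, fill=blue!10] (0,0) rectangle (4,1.5) node[pos=0.5] {Sources (CloudTrail, VPC, Apps)};
  \draw[->, thick] (4,0.75) -- (6,0.75) node[midway, above] {Ingest};
  \draw[thick, rounded corners, fill=yellow!10] (6,0) rectangle (10,1.5) node[pos=0.5] {Kinesis Data Firehose};
  \draw[->, thick] (10,0.75) -- (12,0.75) node[midway, above] {Store};
  \draw[thick, rounded corners, fill=green!10] (12,0) rectangle (16,1.5) node[pos=0.5] {S3 / Security Lake};
\end{tikzpicture}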
Success Metrics
Learners can measure their mastery through the following performance indicators:
- Requirement Mapping: Ability to correctly select a log source for a given threat scenario (e.g., using DNS query logs to detect command-and-control communication).
- Configuration Accuracy: Successfully configuring a cross-account S3 bucket policy that allows multiple accounts to deliver CloudTrail logs without granting excessive permissions.
- Cost Optimization: Correctly identifying when to use S3 Lifecycle policies to move CloudWatch logs to Glacier for compliance-only retention.
- Integrity Verification: Demonstrating the use of `aws cloudtrail validate-logs` to prove that logs have not been tampered with since delivery.
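The cross-account bucket-policy indicator above can be made concrete. A minimal sketch that builds the policy for a central log bucket; the bucket name and account IDs are hypothetical, and the two statements mirror the standard CloudTrail delivery permissions (ACL check plus a write conditioned on `bucket-owner-full-control`):

```python
import json

# Sketch: a bucket policy allowing CloudTrail from several member
# accounts to deliver logs to a central bucket. Bucket name and account
# IDs below are hypothetical placeholders.

BUCKET = "org-central-trail-logs"
ACCOUNT_IDS = ["111111111111", "222222222222"]

def cloudtrail_bucket_policy(bucket: str, account_ids: list) -> dict:
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AWSCloudTrailAclCheck",
                "Effect": "Allow",
                "Principal": {"Service": "cloudtrail.amazonaws.com"},
                "Action": "s3:GetBucketAcl",
                "Resource": f"arn:aws:s3:::{bucket}",
            },
            {
                "Sid": "AWSCloudTrailWrite",
                "Effect": "Allow",
                "Principal": {"Service": "cloudtrail.amazonaws.com"},
                "Action": "s3:PutObject",
                # One prefix per delivering account, nothing broader.
                "Resource": [
                    f"arn:aws:s3:::{bucket}/AWSLogs/{acct}/*"
                    for acct in account_ids
                ],
                "Condition": {
                    "StringEquals": {"s3:x-amz-acl": "bucket-owner-full-control"}
                },
            },
        ],
    }

print(json.dumps(cloudtrail_bucket_policy(BUCKET, ACCOUNT_IDS), indent=2))
```

Scoping `s3:PutObject` to per-account `AWSLogs/` prefixes is what keeps the policy from granting excessive permissions.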
Real-World Application
[!IMPORTANT] In a production environment, logs are more than just data; they are forensic evidence.
- Incident Response: During a security breach, the ability to correlate VPC Flow Logs with CloudTrail events allows responders to see not just who made an API call, but what data left the network as a result.
- Compliance Audits: Automated logging and central storage are mandatory for frameworks like PCI-DSS, HIPAA, and SOC2. This curriculum provides the technical foundation to pass these audits.
- Forensics: By using Amazon Security Lake and OCSF, organizations can normalize logs from hybrid sources (AWS and On-prem), making it easier to run complex queries during a post-mortem analysis.
Technical Selection Logic
\begin{tikzpicture}[node distance=2cm, auto]
  \draw[thick, rounded corners, fill=blue!10] (0,0) rectangle (4,1.5) node[pos=0.5] {Requirement: Security Audit};
  \draw[->, thick] (4,0.75) -- (6,0.75) node[midway, above] {Source};
  \draw[thick, rounded corners, fill=green!10] (6,0) rectangle (10,1.5) node[pos=0.5] {AWS CloudTrail};

  \draw[thick, rounded corners, fill=blue!10] (0,-2.5) rectangle (4,-1) node[pos=0.5] {Requirement: Traffic Analysis};
  \draw[->, thick] (4,-1.75) -- (6,-1.75) node[midway, above] {Source};
  \draw[thick, rounded corners, fill=green!10] (6,-2.5) rectangle (10,-1) node[pos=0.5] {VPC Flow Logs};

  \draw[thick, rounded corners, fill=blue!10] (0,-5) rectangle (4,-3.5) node[pos=0.5] {Requirement: App Errors};
  \draw[->, thick] (4,-4.25) -- (6,-4.25) node[midway, above] {Source};
  \draw[thick, rounded corners, fill=green!10] (6,-5) rectangle (10,-3.5) node[pos=0.5] {CloudWatch Logs};
\end{tikzpicture}
[!TIP] Always prioritize automated ingestion. Manual log collection is prone to errors and creates gaps in your security visibility during critical incidents.