Event Driven Architecture · Chapter 14 of 42

Event Stream Patterns

Akhil Sharma
10 min

📊 Event Stream Patterns

Event streams enable powerful architectural patterns that solve complex distributed system challenges. Let's explore four fundamental patterns and understand when to use each one.


Pattern 1: Event Sourcing

Overview

Event sourcing stores events as the source of truth, then derives the current state by replaying those events. Instead of storing the current state directly, you store all the changes that led to that state.

Example: Bank Account

Event Stream (illustrative):

  • AccountOpened (balance: $0)
  • MoneyDeposited ($100)
  • MoneyWithdrawn ($30)
  • MoneyDeposited ($50)

Current State (computed): balance = $120

Benefits

  • Complete audit trail - Every change is recorded permanently
  • Time travel - Reconstruct state at any point in history
  • No data loss - Events are never deleted or modified
  • Can add new views anytime - Project events into new models as needed

Code Example

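A minimal sketch of the idea, using the bank-account events from the example above (the event names and tuple shape are illustrative assumptions, not a fixed API):

```python
# Minimal event-sourcing sketch: the event list is the source of truth;
# the current state (the balance) is derived by replaying the events.

def apply_event(balance, event):
    """Apply one event to the current state and return the new state."""
    kind, amount = event
    if kind == "AccountOpened":
        return amount                 # initial balance
    if kind == "MoneyDeposited":
        return balance + amount
    if kind == "MoneyWithdrawn":
        return balance - amount
    raise ValueError(f"unknown event: {kind}")

def current_balance(events):
    """Replay the full event stream to compute the current balance."""
    balance = 0
    for event in events:
        balance = apply_event(balance, event)
    return balance

events = [
    ("AccountOpened", 0),
    ("MoneyDeposited", 100),
    ("MoneyWithdrawn", 30),
    ("MoneyDeposited", 50),
]
print(current_balance(events))  # replaying all events yields 120
```

Because events are immutable and append-only, "time travel" is just replaying a prefix of the stream: `current_balance(events[:3])` gives the balance as it was after the withdrawal.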

When to Use Event Sourcing

  • Financial systems - Where audit trails are mandatory
  • Regulatory compliance - When you need to prove what happened
  • Complex business logic - Where understanding the sequence of events matters
  • Temporal queries - When you need to answer "what was the state at time X?"

Pattern 2: Change Data Capture (CDC)

Overview

Change Data Capture monitors database changes and publishes them as events to a stream. This allows other systems to react to database changes in real-time without polling.

How It Works

Database Operation: for example, an UPDATE statement that changes a customer's email address.

↓ CDC Captures the Change ↓

Event Stream Receives: a change event describing the affected row, typically with both the old and new values of each changed column.
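As a sketch, a consumer might see something like the change event below. The envelope is loosely modeled on Debezium's `before`/`after`/`op` structure, but the exact field names here are illustrative, not a guaranteed wire format:

```python
# A CDC change event (Debezium-style envelope, field names illustrative):
# "op" codes: c = insert, u = update, d = delete.
change_event = {
    "op": "u",
    "source": {"db": "shop", "table": "customers"},
    "before": {"id": 42, "email": "old@example.com"},
    "after":  {"id": 42, "email": "new@example.com"},
}

def handle_change(event, search_index):
    """Example consumer: keep a search index in sync with the table."""
    row = event["after"] if event["op"] != "d" else None
    row_id = event["before"]["id"] if row is None else row["id"]
    key = (event["source"]["table"], row_id)
    if row is None:
        search_index.pop(key, None)   # row deleted: drop it from the index
    else:
        search_index[key] = row       # insert/update: upsert the new row image

index = {}
handle_change(change_event, index)
print(index[("customers", 42)]["email"])  # the index now holds the new email
```

The database issuing the UPDATE never calls this code; the CDC pipeline delivers the change, which is exactly the decoupling described above.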

Consumers React to Database Changes

Each consumer processes the change independently and at their own pace.

Benefits

  • Real-time data propagation - Changes flow immediately to dependent systems
  • Decoupled systems - Database doesn't need to know about consumers
  • Guaranteed delivery - Events are not lost if consumers are temporarily down
  • Historical replay - Can reprocess changes if needed

CDC Technologies

  • Debezium - Open-source CDC connector for various databases
  • AWS DMS - Database Migration Service with CDC capabilities
  • Maxwell - Reads MySQL binlogs and writes to Kafka
  • Oracle GoldenGate - Enterprise CDC solution

When to Use CDC

  • Search index synchronization - Keep Elasticsearch in sync with your database
  • Cache invalidation - Automatically invalidate cache entries when data changes
  • Data replication - Sync data across multiple databases or data centers
  • Event-driven microservices - Trigger actions based on database changes

Pattern 3: CQRS (Command Query Responsibility Segregation)

Overview

CQRS separates the write model (commands) from the read model (queries). The write side optimizes for consistency and validation, while the read side optimizes for query performance.

Architecture Flow

Write Side (Commands): the client sends a command such as PlaceOrder; the write model validates it, persists the change, and publishes an event.

Read Side (Queries): a projector consumes those events and updates one or more denormalized read models, which queries are served from directly.

Example: E-commerce Order

Write Model (Normalized): separate orders, order_items, and customers tables, with each fact stored exactly once.

Read Model (Denormalized): a single order document with the customer name, items, and totals embedded, so a query is one lookup with no joins.
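A sketch of the split, with a command handler that validates and writes the normalized model, and a projector that folds the resulting events into a denormalized view (all names here are illustrative):

```python
# CQRS sketch: commands mutate the normalized write model and emit events;
# a projector folds those events into a denormalized, query-friendly view.

orders = {}        # write model: normalized order rows keyed by id
order_view = {}    # read model: denormalized documents ready to serve

def place_order(order_id, customer, items):
    """Command handler: validate, persist the normalized row, emit an event."""
    if order_id in orders:
        raise ValueError("duplicate order")
    orders[order_id] = {"customer": customer, "items": items}
    return {"type": "OrderPlaced", "order_id": order_id,
            "customer": customer, "items": items}

def project(event):
    """Projector: update the read model from an event (eventually consistent)."""
    if event["type"] == "OrderPlaced":
        order_view[event["order_id"]] = {
            "customer": event["customer"],
            "item_count": len(event["items"]),
            "total": sum(price for _, price in event["items"]),
        }

event = place_order("o-1", "alice", [("book", 12.0), ("pen", 3.0)])
project(event)
print(order_view["o-1"])  # {'customer': 'alice', 'item_count': 2, 'total': 15.0}
```

The gap between `place_order` returning and `project` running is the eventual-consistency window discussed in the trade-offs below.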

Benefits

  • Optimize reads separately from writes - Each side uses appropriate data structures
  • Multiple read models - Different views for different use cases
  • Scale reads independently - Add more read replicas without affecting writes
  • Better performance - Denormalized reads are faster, normalized writes maintain integrity

Trade-offs

Eventual consistency: Read models may be slightly behind write models (typically milliseconds to seconds).

Complexity: Maintaining separate models adds architectural complexity.

When to Use CQRS

  • High read/write ratio - When reads vastly outnumber writes
  • Complex queries - When queries require joins across many tables
  • Multiple views - When different users need different data representations
  • Performance requirements - When you need to scale reads independently

Pattern 4: Stream Processing

Overview

Stream processing analyzes events in real-time as they flow through the stream, enabling immediate reactions to patterns, anomalies, or thresholds.

Example: User Activity Monitoring

Input Event Stream: raw user activity events, for example a run of login_failed events for one account.

↓ Stream Processor ↓

Processing Logic: count failed logins per user and flag any account that crosses a threshold within a short time window.

↓ Output Stream ↓

Processed Events: higher-level events, such as suspicious_activity alerts, that downstream systems can act on.
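A toy version of this pipeline (the event shapes and the three-failures rule are assumptions for illustration; a real processor would also expire counts over time):

```python
# Toy stream processor: consume login events one at a time and emit an
# alert once a user accumulates 3 consecutive failed logins.
from collections import defaultdict

def process(events, threshold=3):
    failures = defaultdict(int)
    for event in events:
        if event["type"] == "login_failed":
            failures[event["user"]] += 1
            if failures[event["user"]] == threshold:
                yield {"type": "suspicious_activity",
                       "user": event["user"],
                       "failed_attempts": threshold}
        elif event["type"] == "login_ok":
            failures[event["user"]] = 0   # a success resets the counter

stream = [
    {"type": "login_failed", "user": "bob"},
    {"type": "login_failed", "user": "bob"},
    {"type": "login_ok",     "user": "alice"},
    {"type": "login_failed", "user": "bob"},
]
print(list(process(stream)))  # one suspicious_activity alert for bob
```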

Stream Processing Operations

Common operations include:

  • Windowing - grouping events into time buckets (tumbling, sliding, or session windows)
  • Aggregations - counts, sums, and averages computed per key within a window
  • Joins - enriching one stream with another stream or with a table
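In plain Python (engines like Kafka Streams or Flink provide these as built-in operators; the event shapes below are illustrative), the operations might be sketched as:

```python
# Plain-Python sketches of core stream operations.
from collections import defaultdict

# Windowing + aggregation: count events per user in 60-second tumbling windows.
def windowed_counts(events, window_secs=60):
    counts = defaultdict(int)
    for event in events:
        window_start = (event["ts"] // window_secs) * window_secs
        counts[(event["user"], window_start)] += 1
    return dict(counts)

# Join: enrich a stream of clicks with a table of user profiles.
def enrich(clicks, profiles):
    for click in clicks:
        profile = profiles.get(click["user"], {})
        yield {**click, "country": profile.get("country", "unknown")}

events = [{"user": "a", "ts": 5}, {"user": "a", "ts": 59}, {"user": "a", "ts": 61}]
print(windowed_counts(events))  # {('a', 0): 2, ('a', 60): 1}
print(list(enrich([{"user": "a", "ts": 5}], {"a": {"country": "IN"}})))
```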

Stream Processing Technologies

  • Kafka Streams - Java library for building stream processors directly on Kafka
  • Apache Flink - Distributed engine for stateful, low-latency stream processing
  • Spark Structured Streaming - Micro-batch stream processing on Apache Spark
  • ksqlDB - SQL interface for streaming queries over Kafka topics

Use Cases

Real-time Analytics:

  • Dashboard metrics
  • Live leaderboards
  • Traffic monitoring

Fraud Detection:

  • Suspicious transaction patterns
  • Unusual user behavior
  • Account takeover detection

IoT Data Processing:

  • Sensor data aggregation
  • Anomaly detection
  • Predictive maintenance

Personalization:

  • Real-time recommendations
  • Dynamic content
  • A/B test results

When to Use Stream Processing

  • Real-time insights - When decisions need to be made immediately
  • Pattern detection - When you need to identify trends across multiple events
  • Aggregations - When you need to compute metrics over time windows
  • Complex event processing - When one event triggers multiple downstream actions

🎯 Real-World Parallels

Understanding these patterns through everyday analogies:

Event Sourcing

Like: An accounting ledger
Why: Every transaction is recorded; you can calculate the balance at any point

Change Data Capture (CDC)

Like: Security camera footage
Why: Captures all changes automatically; you can review what happened

CQRS

Like: A restaurant operation
Why: Separate kitchen (write) from dining area (read); each optimized for its purpose

Stream Processing

Like: A real-time stock ticker
Why: Continuously processes new data and shows live results


🤔 Choosing the Right Pattern

Decision Matrix

Requirement → Best Pattern

  • Need complete audit trail → Event Sourcing
  • Sync multiple systems → CDC
  • Very high read load → CQRS
  • Real-time analytics → Stream Processing
  • Time travel queries → Event Sourcing
  • React to database changes → CDC
  • Multiple read views → CQRS
  • Pattern detection → Stream Processing

Can You Combine Patterns?

Absolutely! Many systems use multiple patterns together. For example, CDC can feed database changes into an event stream, a stream processor can compute real-time metrics from those events, and a CQRS projector can consume the same stream to build read models.


🎓 Key Takeaways

  1. Event Sourcing preserves complete history by storing all state changes as events
  2. CDC automatically captures and publishes database changes without application code changes
  3. CQRS separates reads and writes for independent optimization and scaling
  4. Stream Processing enables real-time analysis and reactions to event patterns
  5. These patterns work together and can be combined based on your requirements
  6. Choose patterns based on your specific needs: audit requirements, performance characteristics, and consistency requirements

📚 Further Reading

  • Event Sourcing: Martin Fowler's article on Event Sourcing patterns
  • CDC: Debezium documentation and use cases
  • CQRS: Greg Young's CQRS documents
  • Stream Processing: Kafka Streams documentation, Flink architecture guide