Message Queues & Event Systems - 1/2
The Nightmare That Keeps Senior Developers Awake
Picture this disaster waiting to happen: Your e-commerce platform is experiencing Black Friday traffic. A user places an order. Your system needs to:
- Process the payment
- Update inventory
- Send confirmation email
- Trigger fulfillment workflow
- Update analytics dashboard
- Send push notification
- Log the transaction
Your monolithic approach tries to do all of this synchronously. Payment processing takes 3 seconds. Email service is down. Analytics service is overwhelmed. The user stares at a loading spinner for 15 seconds before giving up and buying from your competitor instead.
Meanwhile, your server is choking on a backlog of 10,000 similar requests, each blocking the next, creating a cascading failure that brings down your entire platform.
Sound familiar? Welcome to the hell of tightly coupled, synchronous systems.
The Uncomfortable Truth About Modern Applications
Here’s what every senior developer eventually learns the hard way: In distributed systems, services will fail, networks will be slow, and dependencies will become unavailable. The question isn’t if this will happen, but when.
The difference between systems that survive and systems that collapse lies in one fundamental principle: decoupling through asynchronous communication.
This is where message queues and event systems become your lifeline. They’re not just nice-to-have architectural components; they’re essential infrastructure that separates robust production systems from fragile toys.
Ready to build systems that can handle real-world chaos? Let’s dive into the backbone of every scalable application.
Message Queues: The Foundation of Resilient Systems
Understanding the Core Problem
In synchronous systems, Service A directly calls Service B and waits for a response. This creates tight coupling and single points of failure. If Service B is slow, busy, or down, Service A suffers.
Message queues solve this by introducing an intermediary that decouples services through asynchronous communication.
// Synchronous approach - tight coupling, blocking operations
class OrderService {
async processOrder(orderData: OrderData): Promise<{ success: boolean }> {
try {
// This blocks until payment completes (3+ seconds)
await this.paymentService.processPayment(orderData.payment);
// This blocks until inventory updates
await this.inventoryService.updateStock(orderData.items);
// This blocks until email sends
await this.emailService.sendConfirmation(orderData.customerEmail);
// This blocks until analytics updates
await this.analyticsService.trackOrder(orderData);
// User waits 10+ seconds for all of this to complete
return { success: true };
} catch (error) {
// If ANY service fails, the entire operation fails
throw new Error(`Order processing failed: ${error.message}`);
}
}
}
// Asynchronous approach using message queues
class OrderService {
constructor(
private messageQueue: MessageQueue,
private orderRepository: OrderRepository
) {}
async processOrder(
orderData: OrderData
): Promise<{ orderId: string; status: string }> {
// Validate critical data synchronously (fast)
this.validateOrderData(orderData);
// Store order in database (fast)
const order = await this.orderRepository.create(orderData);
// Publish events asynchronously (immediate response)
await this.messageQueue.publish("order.created", {
orderId: order.id,
customerId: order.customerId,
items: order.items,
totalAmount: order.total,
timestamp: new Date().toISOString(),
});
// Return immediately - user gets instant feedback
return { orderId: order.id, status: "processing" };
}
}
// Separate services handle events independently
class PaymentProcessor {
constructor(private messageQueue: MessageQueue) {
this.messageQueue.subscribe(
"order.created",
this.handleOrderCreated.bind(this)
);
}
private async handleOrderCreated(event: OrderCreatedEvent): Promise<void> {
try {
const payment = await this.processPayment(event.orderId);
// Publish success event
await this.messageQueue.publish("payment.processed", {
orderId: event.orderId,
paymentId: payment.id,
status: "success",
});
} catch (error) {
// Publish failure event with retry information
await this.messageQueue.publish("payment.failed", {
orderId: event.orderId,
error: error.message,
retryCount: 0,
});
}
}
}
The Benefits Are Immediate
With message queues, your order processing becomes:
- Fast: Users get immediate feedback instead of waiting for all operations
- Resilient: If the email service is down, payment processing continues
- Scalable: Each service can scale independently based on its workload
- Maintainable: Services are loosely coupled and can be developed independently
Message Queue Fundamentals: Beyond the Basics
Queue vs. Topic vs. Stream: The Architecture Decision That Matters
Understanding the difference between these patterns is crucial for choosing the right tool.
Queues: Point-to-Point Communication
// Queue pattern - one message, one consumer
interface MessageQueue {
// Producer adds message to queue
send(queueName: string, message: any): Promise<void>;
// Only ONE consumer receives each message
receive(queueName: string, handler: (message: any) => Promise<void>): void;
}
// Example: Order processing queue
class OrderProcessingQueue {
constructor(private queue: MessageQueue) {}
async addOrder(order: Order): Promise<void> {
// Message goes to exactly one order processor
await this.queue.send("order-processing", {
type: "PROCESS_ORDER",
orderId: order.id,
priority: order.isPriority ? "high" : "normal",
data: order,
});
}
startProcessing(): void {
// Multiple workers can consume from the same queue
// Each message is processed by only ONE worker
this.queue.receive("order-processing", async (message) => {
const { orderId, data } = message;
await this.processOrder(data);
console.log(`Order ${orderId} processed by worker ${process.pid}`);
});
}
}
Use queues when: You need work distribution among multiple workers, task processing, or load balancing.
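The queue semantics above can be sketched with a minimal in-memory implementation. `InMemoryQueue` here is a hypothetical illustration, not a real broker — production queues add persistence, acknowledgments, and buffering for offline consumers — but it captures the defining behavior: each message is delivered to exactly one consumer, rotating round-robin for load balancing.

```typescript
// Hypothetical in-memory sketch of the queue pattern
type QueueHandler = (message: any) => Promise<void>;

class InMemoryQueue {
  private handlers = new Map<string, QueueHandler[]>();
  private cursor = new Map<string, number>(); // round-robin position per queue

  async send(queueName: string, message: any): Promise<void> {
    const consumers = this.handlers.get(queueName) ?? [];
    if (consumers.length === 0) return; // a real broker would buffer the message
    const i = (this.cursor.get(queueName) ?? 0) % consumers.length;
    this.cursor.set(queueName, i + 1);
    await consumers[i](message); // exactly ONE consumer handles each message
  }

  receive(queueName: string, handler: QueueHandler): void {
    const consumers = this.handlers.get(queueName) ?? [];
    consumers.push(handler);
    this.handlers.set(queueName, consumers);
  }
}
```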
Topics: Publish-Subscribe Pattern
// Topic pattern - one message, multiple subscribers
interface MessageTopic {
// Publisher sends message to topic
publish(topicName: string, message: any): Promise<void>;
// MULTIPLE subscribers can receive the same message
subscribe(topicName: string, handler: (message: any) => Promise<void>): void;
}
// Example: User registration events
class UserEventSystem {
constructor(private topic: MessageTopic) {}
async publishUserRegistered(user: User): Promise<void> {
// All subscribers receive this message
await this.topic.publish("user-events", {
type: "USER_REGISTERED",
userId: user.id,
email: user.email,
registrationDate: new Date().toISOString(),
});
}
setupSubscribers(): void {
// Email service subscribes
this.topic.subscribe("user-events", async (event) => {
if (event.type === "USER_REGISTERED") {
await this.emailService.sendWelcomeEmail(event.email);
}
});
// Analytics service subscribes
this.topic.subscribe("user-events", async (event) => {
if (event.type === "USER_REGISTERED") {
await this.analyticsService.trackNewUser(event.userId);
}
});
// Marketing service subscribes
this.topic.subscribe("user-events", async (event) => {
if (event.type === "USER_REGISTERED") {
await this.marketingService.addToNewUserCampaign(event.email);
}
});
}
}
Use topics when: Multiple services need to react to the same event, broadcasting notifications, or event-driven architectures.
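The fan-out behavior of a topic can be sketched the same minimal way — `InMemoryTopic` is a hypothetical in-process illustration. The contrast with a queue is the delivery rule: every subscriber receives every message, rather than exactly one consumer per message.

```typescript
// Hypothetical in-memory sketch of the pub-sub pattern
type TopicHandler = (message: any) => Promise<void>;

class InMemoryTopic {
  private subscribers = new Map<string, TopicHandler[]>();

  async publish(topicName: string, message: any): Promise<void> {
    // EVERY subscriber receives its own copy of the message
    for (const handler of this.subscribers.get(topicName) ?? []) {
      await handler(message);
    }
  }

  subscribe(topicName: string, handler: TopicHandler): void {
    const handlers = this.subscribers.get(topicName) ?? [];
    handlers.push(handler);
    this.subscribers.set(topicName, handlers);
  }
}
```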
Streams: Event Sourcing and Replay
// Stream pattern - ordered, replayable sequence of events
interface EventStream {
// Append events to stream (immutable)
append(streamName: string, events: Event[]): Promise<void>;
// Read events from any point in history
read(streamName: string, fromPosition?: number): AsyncIterator<Event>;
// Subscribe to new events
subscribe(streamName: string, handler: (event: Event) => Promise<void>): void;
}
// Example: Order lifecycle stream
class OrderLifecycleStream {
constructor(private stream: EventStream) {}
async recordOrderEvent(orderId: string, event: OrderEvent): Promise<void> {
await this.stream.append(`order-${orderId}`, [
{
id: uuidv4(),
type: event.type,
data: event.data,
timestamp: new Date().toISOString(),
version: event.version,
},
]);
}
async getOrderHistory(orderId: string): Promise<OrderEvent[]> {
const events = [];
for await (const event of this.stream.read(`order-${orderId}`)) {
events.push(event);
}
return events;
}
async replayOrdersFromDate(date: Date): Promise<void> {
// Replay events from a specific point in time
for await (const event of this.stream.read(
"all-orders",
this.getPositionFromDate(date)
)) {
await this.processHistoricalEvent(event);
}
}
}
Use streams when: You need event sourcing, audit trails, replay capability, or complex event processing.
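An in-memory implementation of the EventStream interface shows the two properties that distinguish streams: the log is append-only, and any consumer can replay it from an arbitrary offset. `InMemoryEventStream` and its event shape are hypothetical sketches; a real store persists events durably.

```typescript
// Hypothetical append-only stream with replay from an offset
interface StoredEvent { id: string; type: string; data: any; }
type StreamHandler = (event: StoredEvent) => Promise<void>;

class InMemoryEventStream {
  private streams = new Map<string, StoredEvent[]>();
  private subs = new Map<string, StreamHandler[]>();

  async append(streamName: string, events: StoredEvent[]): Promise<void> {
    const log = this.streams.get(streamName) ?? [];
    log.push(...events); // append-only: existing events are never modified
    this.streams.set(streamName, log);
    // Notify live subscribers of each new event
    for (const e of events) {
      for (const handler of this.subs.get(streamName) ?? []) await handler(e);
    }
  }

  // Replay history from any position — the key difference from a queue
  async *read(streamName: string, fromPosition = 0): AsyncGenerator<StoredEvent> {
    for (const e of (this.streams.get(streamName) ?? []).slice(fromPosition)) yield e;
  }

  subscribe(streamName: string, handler: StreamHandler): void {
    const handlers = this.subs.get(streamName) ?? [];
    handlers.push(handler);
    this.subs.set(streamName, handlers);
  }
}
```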
Publisher-Subscriber Pattern: The Architecture That Scales
Understanding Event-Driven Architecture
The pub-sub pattern is the foundation of scalable, loosely coupled systems. Publishers emit events without knowing who (if anyone) is listening. Subscribers react to events without knowing who published them.
// Event-driven e-commerce system
class EventDrivenEcommerce {
private eventBus: EventBus;
constructor() {
this.eventBus = new EventBus();
this.setupEventHandlers();
}
private setupEventHandlers(): void {
// Order events
this.eventBus.on("order.placed", this.handleOrderPlaced.bind(this));
this.eventBus.on("order.cancelled", this.handleOrderCancelled.bind(this));
// Payment events
this.eventBus.on(
"payment.completed",
this.handlePaymentCompleted.bind(this)
);
this.eventBus.on("payment.failed", this.handlePaymentFailed.bind(this));
// Inventory events
this.eventBus.on(
"inventory.reserved",
this.handleInventoryReserved.bind(this)
);
}
// Order service publishes events
async placeOrder(orderData: OrderData): Promise<Order> {
const order = await this.createOrder(orderData);
// Publish event - multiple services can react
await this.eventBus.publish("order.placed", {
orderId: order.id,
customerId: order.customerId,
items: order.items,
total: order.total,
shippingAddress: order.shippingAddress,
});
return order;
}
// Inventory service reacts to order events
private async handleOrderPlaced(event: OrderPlacedEvent): Promise<void> {
try {
// Reserve inventory
await this.inventoryService.reserve(event.items);
// Publish inventory reserved event
await this.eventBus.publish("inventory.reserved", {
orderId: event.orderId,
items: event.items,
reservationId: uuidv4(),
});
} catch (error) {
// Publish inventory failed event
await this.eventBus.publish("inventory.reservation.failed", {
orderId: event.orderId,
reason: error.message,
});
}
}
// Payment service reacts to inventory events
private async handleInventoryReserved(
event: InventoryReservedEvent
): Promise<void> {
const order = await this.orderService.getOrder(event.orderId);
try {
const payment = await this.paymentService.charge(
order.paymentMethod,
order.total
);
await this.eventBus.publish("payment.completed", {
orderId: event.orderId,
paymentId: payment.id,
amount: order.total,
});
} catch (error) {
// Release reserved inventory
await this.eventBus.publish("inventory.release", {
reservationId: event.reservationId,
});
}
}
// Shipping service reacts to payment events
private async handlePaymentCompleted(
event: PaymentCompletedEvent
): Promise<void> {
const order = await this.orderService.getOrder(event.orderId);
const shipment = await this.shippingService.createShipment({
orderId: order.id,
items: order.items,
address: order.shippingAddress,
});
await this.eventBus.publish("shipment.created", {
orderId: event.orderId,
shipmentId: shipment.id,
trackingNumber: shipment.trackingNumber,
});
}
}
The Power of Loose Coupling
Notice how each service only cares about the events relevant to its domain. The order service doesn’t know about shipping details, the payment service doesn’t know about inventory management, and the shipping service doesn’t know about payment processing.
This architecture enables:
- Independent scaling: Scale shipping service during peak season without affecting payments
- Independent deployment: Update the inventory service without touching order processing
- Independent development: Teams can work on different services without coordination
- Fault isolation: If shipping goes down, orders can still be processed
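The `EventBus` used throughout these examples can itself be sketched in a few lines. This `SimpleEventBus` is a hypothetical in-process version — production systems back it with a broker — but it demonstrates the property behind the fault-isolation bullet above: one failing subscriber must not prevent the others from running.

```typescript
// Hypothetical in-process event bus with isolated handler failures
type EventHandler = (event: any) => Promise<void>;

class SimpleEventBus {
  private listeners = new Map<string, EventHandler[]>();

  on(eventName: string, handler: EventHandler): void {
    const handlers = this.listeners.get(eventName) ?? [];
    handlers.push(handler);
    this.listeners.set(eventName, handlers);
  }

  async publish(eventName: string, payload: any): Promise<void> {
    // allSettled runs every handler even if some of them reject,
    // so a broken subscriber can't take down its neighbors
    const results = await Promise.allSettled(
      (this.listeners.get(eventName) ?? []).map((handler) => handler(payload))
    );
    for (const r of results) {
      if (r.status === "rejected") console.error("handler failed:", r.reason);
    }
  }
}
```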
Message Brokers: RabbitMQ vs. Apache Kafka
RabbitMQ: The Swiss Army Knife of Messaging
RabbitMQ excels at traditional message queue patterns with rich routing capabilities.
// RabbitMQ implementation
import * as amqp from "amqplib";
class RabbitMQService {
private connection!: amqp.Connection; // assigned in connect()
private channel!: amqp.Channel;
async connect(): Promise<void> {
this.connection = await amqp.connect("amqp://localhost");
this.channel = await this.connection.createChannel();
}
// Direct exchange - route to specific queue
async sendToQueue(queueName: string, message: any): Promise<void> {
await this.channel.assertQueue(queueName, { durable: true });
await this.channel.sendToQueue(
queueName,
Buffer.from(JSON.stringify(message)),
{ persistent: true }
);
}
// Topic exchange - pattern-based routing
async publishToTopic(
exchangeName: string,
routingKey: string,
message: any
): Promise<void> {
await this.channel.assertExchange(exchangeName, "topic", { durable: true });
await this.channel.publish(
exchangeName,
routingKey,
Buffer.from(JSON.stringify(message)),
{ persistent: true }
);
}
// Subscribe with automatic acknowledgment
async subscribe(
queueName: string,
handler: (message: any) => Promise<void>
): Promise<void> {
await this.channel.assertQueue(queueName, { durable: true });
await this.channel.consume(queueName, async (msg) => {
if (msg) {
try {
const content = JSON.parse(msg.content.toString());
await handler(content);
this.channel.ack(msg);
} catch (error) {
console.error("Message processing failed:", error);
this.channel.nack(msg, false, true); // Requeue for retry
}
}
});
}
// Advanced routing example
async setupAdvancedRouting(): Promise<void> {
const exchange = "order-events";
await this.channel.assertExchange(exchange, "topic");
// Different services subscribe to different patterns
const paymentQueue = "payment-processing";
const shippingQueue = "shipping-processing";
const analyticsQueue = "analytics-processing";
await this.channel.assertQueue(paymentQueue);
await this.channel.assertQueue(shippingQueue);
await this.channel.assertQueue(analyticsQueue);
// Route payment events to payment service
await this.channel.bindQueue(paymentQueue, exchange, "order.payment.*");
// Route shipping events to shipping service
await this.channel.bindQueue(shippingQueue, exchange, "order.shipped.*");
// Route all order events to analytics
await this.channel.bindQueue(analyticsQueue, exchange, "order.*");
}
}
RabbitMQ strengths:
- Flexible routing (direct, topic, fanout, headers)
- Built-in clustering and high availability
- Rich management UI
- Message acknowledgments and guarantees
- Lower latency for smaller message volumes
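The wildcard bindings in setupAdvancedRouting follow AMQP topic-matching rules: `*` matches exactly one dot-separated word, `#` matches zero or more. A sketch of that matching logic (an illustration of the semantics, not RabbitMQ's actual implementation):

```typescript
// AMQP-style topic matching: "*" = one word, "#" = zero or more words
function matchesBinding(binding: string, routing: string): boolean {
  const b = binding.split(".");
  const r = routing.split(".");
  const match = (i: number, j: number): boolean => {
    if (i === b.length) return j === r.length; // pattern consumed: key must be too
    if (b[i] === "#") {
      // "#" may absorb zero or more routing-key words
      for (let k = j; k <= r.length; k++) if (match(i + 1, k)) return true;
      return false;
    }
    if (j === r.length) return false;
    return (b[i] === "*" || b[i] === r[j]) && match(i + 1, j + 1);
  };
  return match(0, 0);
}
```

This is why the analytics queue bound to `order.*` above sees `order.created` but not a deeper key like `order.payment.completed` — a `#` binding would be needed for that.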
Apache Kafka: The Event Streaming Powerhouse
Kafka is designed for high-throughput event streaming and log aggregation.
// Kafka implementation
import { Kafka, Producer, Consumer } from "kafkajs";
class KafkaService {
private kafka: Kafka;
private producer: Producer;
private consumers: Map<string, Consumer> = new Map();
constructor() {
this.kafka = new Kafka({
clientId: "my-app",
brokers: ["localhost:9092"],
retry: {
initialRetryTime: 100,
retries: 8,
},
});
this.producer = this.kafka.producer({
maxInFlightRequests: 1,
idempotent: true, // Prevent duplicate messages
transactionTimeout: 30000,
});
}
async connect(): Promise<void> {
await this.producer.connect();
}
// Publish event to topic with partitioning
async publishEvent(
topic: string,
event: any,
partitionKey?: string
): Promise<void> {
await this.producer.send({
topic,
messages: [
{
key: partitionKey || null,
value: JSON.stringify(event),
timestamp: Date.now().toString(),
headers: {
"event-type": event.type,
"source-service": "order-service",
},
},
],
});
}
// Batch publishing for high throughput
async publishBatch(topic: string, events: any[]): Promise<void> {
const messages = events.map((event) => ({
key: event.partitionKey || null,
value: JSON.stringify(event),
timestamp: Date.now().toString(),
}));
await this.producer.send({
topic,
messages,
});
}
// Consumer with offset management
async startConsumer(
groupId: string,
topics: string[],
handler: (message: any) => Promise<void>
): Promise<void> {
const consumer = this.kafka.consumer({
groupId,
sessionTimeout: 30000,
rebalanceTimeout: 60000,
heartbeatInterval: 3000,
});
await consumer.connect();
await consumer.subscribe({ topics, fromBeginning: false });
await consumer.run({
eachMessage: async ({ topic, partition, message }) => {
try {
const event = JSON.parse(message.value?.toString() || "{}");
await handler(event);
} catch (error) {
console.error(
`Failed to process message from ${topic}:${partition}`,
error
);
// In production, you'd implement dead letter queues here
}
},
});
this.consumers.set(groupId, consumer);
}
// Stream processing example
async setupOrderStreamProcessor(): Promise<void> {
await this.startConsumer(
"order-analytics-processor",
["order-events", "payment-events", "shipping-events"],
async (event) => {
switch (event.type) {
case "ORDER_PLACED":
await this.updateOrderMetrics(event);
break;
case "PAYMENT_COMPLETED":
await this.updateRevenueMetrics(event);
break;
case "ORDER_SHIPPED":
await this.updateFulfillmentMetrics(event);
break;
}
}
);
}
}
Kafka strengths:
- Extremely high throughput (millions of messages/second)
- Persistent message storage with configurable retention
- Built-in partitioning for parallel processing
- Exactly-once semantics with transactions
- Stream processing capabilities
- Excellent for event sourcing and log aggregation
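Kafka's per-key ordering comes from its partitioner: messages with the same key always land on the same partition, so the `partitionKey` passed in `publishEvent` above keeps all events for one order in order. A simplified sketch of the idea — real Kafka uses a murmur2 hash, not this toy hash:

```typescript
// Simplified key-based partitioning (illustrative only; Kafka uses murmur2)
function pickPartition(key: string, numPartitions: number): number {
  let hash = 0;
  for (const ch of key) {
    hash = (hash * 31 + ch.charCodeAt(0)) | 0; // keep within 32 bits
  }
  // Same key -> same hash -> same partition -> per-key ordering preserved
  return Math.abs(hash) % numPartitions;
}
```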
Event Sourcing: Beyond Traditional State Management
The Paradigm Shift
Instead of storing current state, event sourcing stores a sequence of events that led to that state. The current state is derived by replaying events.
// Traditional approach - store current state
class TraditionalUserAccount {
constructor(
public id: string,
public balance: number,
public status: "active" | "suspended" | "closed"
) {}
async debit(amount: number): Promise<void> {
if (this.balance < amount) {
throw new Error("Insufficient funds");
}
// Update current state
this.balance -= amount;
// Lose historical information
await this.saveToDatabase();
}
}
// Event sourcing approach - store events
interface AccountEvent {
id: string;
accountId: string;
type: string;
data: any;
timestamp: Date;
version: number;
}
class EventSourcedAccount {
private events: AccountEvent[] = [];
private version = 0;
constructor(private accountId: string) {}
// Commands change state by creating events
debit(amount: number, description: string): void {
const currentBalance = this.getBalance();
if (currentBalance < amount) {
throw new Error("Insufficient funds");
}
this.applyEvent({
id: uuidv4(),
accountId: this.accountId,
type: "ACCOUNT_DEBITED",
data: { amount, description, previousBalance: currentBalance },
timestamp: new Date(),
version: this.version + 1,
});
}
credit(amount: number, description: string): void {
this.applyEvent({
id: uuidv4(),
accountId: this.accountId,
type: "ACCOUNT_CREDITED",
data: { amount, description, previousBalance: this.getBalance() },
timestamp: new Date(),
version: this.version + 1,
});
}
// State is derived from events
getBalance(): number {
return (
this.events
.filter((e) => e.type === "ACCOUNT_CREDITED")
.reduce((sum, e) => sum + e.data.amount, 0) -
this.events
.filter((e) => e.type === "ACCOUNT_DEBITED")
.reduce((sum, e) => sum + e.data.amount, 0)
);
}
getStatus(): "active" | "suspended" | "closed" {
const lastStatusEvent = this.events
.filter((e) =>
["ACCOUNT_SUSPENDED", "ACCOUNT_CLOSED", "ACCOUNT_ACTIVATED"].includes(
e.type
)
)
.sort((a, b) => b.timestamp.getTime() - a.timestamp.getTime())[0];
if (!lastStatusEvent) return "active";
switch (lastStatusEvent.type) {
case "ACCOUNT_SUSPENDED":
return "suspended";
case "ACCOUNT_CLOSED":
return "closed";
default:
return "active";
}
}
// Get account state at any point in time
getBalanceAt(date: Date): number {
return this.events
.filter((e) => e.timestamp <= date)
.reduce((balance, event) => {
switch (event.type) {
case "ACCOUNT_CREDITED":
return balance + event.data.amount;
case "ACCOUNT_DEBITED":
return balance - event.data.amount;
default:
return balance;
}
}, 0);
}
// Apply event and update version
private applyEvent(event: AccountEvent): void {
this.events.push(event);
this.version = event.version;
}
// Load events from storage
async loadFromHistory(events: AccountEvent[]): Promise<void> {
this.events = events.sort((a, b) => a.version - b.version);
this.version = events.length > 0 ? events[events.length - 1].version : 0;
}
// Get uncommitted events for persistence (simplified: returns all events;
// a real aggregate would track only events added since the last save)
getUncommittedEvents(): AccountEvent[] {
return [...this.events]; // Return a defensive copy
}
}
// Event store implementation
class EventStore {
private events: Map<string, AccountEvent[]> = new Map();
constructor(private eventBus: EventBus) {}
async saveEvents(
streamId: string,
events: AccountEvent[],
expectedVersion: number
): Promise<void> {
const existingEvents = this.events.get(streamId) || [];
if (existingEvents.length !== expectedVersion) {
throw new Error("Concurrency conflict");
}
this.events.set(streamId, [...existingEvents, ...events]);
// Publish events to message broker
for (const event of events) {
await this.eventBus.publish(`account.${event.type}`, event);
}
}
async getEvents(streamId: string, fromVersion = 0): Promise<AccountEvent[]> {
const events = this.events.get(streamId) || [];
return events.filter((e) => e.version > fromVersion);
}
async replayEvents(streamId: string, toDate: Date): Promise<AccountEvent[]> {
const events = this.events.get(streamId) || [];
return events.filter((e) => e.timestamp <= toDate);
}
}
Why Event Sourcing Matters
Benefits:
- Complete audit trail
- Time travel debugging
- Easy feature rollbacks
- Natural event publishing
- Conflict-free concurrent updates
Challenges:
- Complexity in event schema evolution
- Query performance for complex aggregations
- Storage requirements for event history
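A common mitigation for the replay-cost and query-performance challenges is snapshotting: persist the aggregate's state periodically, then rebuild from the latest snapshot plus only the events recorded after it. A minimal sketch of the fold (the types here are hypothetical, reduced to a bare balance):

```typescript
// Snapshot + incremental replay: skip events already baked into the snapshot
interface BalanceEvent {
  type: "CREDITED" | "DEBITED";
  amount: number;
  version: number;
}
interface Snapshot {
  balance: number;
  version: number; // version of the last event included in the snapshot
}

function applyEvents(snapshot: Snapshot, events: BalanceEvent[]): Snapshot {
  return events
    .filter((e) => e.version > snapshot.version) // only post-snapshot events
    .reduce(
      (s, e) => ({
        balance: e.type === "CREDITED" ? s.balance + e.amount : s.balance - e.amount,
        version: e.version,
      }),
      snapshot
    );
}
```

With snapshots every N events, rebuilding an aggregate costs at most N event applications regardless of how long its history grows.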
Building Your First Message Queue System
Let’s implement a practical order processing system that demonstrates these concepts:
// Complete order processing system with events
class OrderProcessingSystem {
private eventStore: EventStore;
private messageQueue: MessageQueue;
private eventBus: EventBus;
constructor() {
this.eventStore = new EventStore();
this.messageQueue = new MessageQueue();
this.eventBus = new EventBus();
this.setupEventHandlers();
}
private setupEventHandlers(): void {
// Order lifecycle handlers
this.eventBus.on("order.created", this.handleOrderCreated.bind(this));
this.eventBus.on(
"payment.completed",
this.handlePaymentCompleted.bind(this)
);
this.eventBus.on("payment.failed", this.handlePaymentFailed.bind(this));
this.eventBus.on(
"inventory.reserved",
this.handleInventoryReserved.bind(this)
);
this.eventBus.on("inventory.failed", this.handleInventoryFailed.bind(this));
}
// Main order processing entry point
async processOrder(
orderData: OrderData
): Promise<{ orderId: string; status: string }> {
const orderId = uuidv4();
// Create order event
const orderEvent = {
id: uuidv4(),
type: "ORDER_CREATED",
data: {
orderId,
customerId: orderData.customerId,
items: orderData.items,
total: orderData.total,
timestamp: new Date().toISOString(),
},
};
// Store event
await this.eventStore.saveEvents(`order-${orderId}`, [orderEvent], 0);
// Publish for async processing
await this.eventBus.publish("order.created", orderEvent.data);
return { orderId, status: "processing" };
}
// Handle order creation
private async handleOrderCreated(event: any): Promise<void> {
// Start inventory reservation
await this.messageQueue.send("inventory-processing", {
type: "RESERVE_INVENTORY",
orderId: event.orderId,
items: event.items,
});
// Start payment processing
await this.messageQueue.send("payment-processing", {
type: "PROCESS_PAYMENT",
orderId: event.orderId,
amount: event.total,
customerId: event.customerId,
});
}
// Handle successful payment
private async handlePaymentCompleted(event: any): Promise<void> {
const paymentEvent = {
id: uuidv4(),
type: "PAYMENT_COMPLETED",
data: {
orderId: event.orderId,
paymentId: event.paymentId,
amount: event.amount,
timestamp: new Date().toISOString(),
},
};
await this.eventStore.saveEvents(
`order-${event.orderId}`,
[paymentEvent],
event.expectedVersion
);
// Trigger shipping if inventory was already reserved
const orderEvents = await this.eventStore.getEvents(
`order-${event.orderId}`
);
const hasInventoryReserved = orderEvents.some(
(e) => e.type === "INVENTORY_RESERVED"
);
if (hasInventoryReserved) {
await this.triggerShipping(event.orderId);
}
}
// Handle successful inventory reservation
private async handleInventoryReserved(event: any): Promise<void> {
const inventoryEvent = {
id: uuidv4(),
type: "INVENTORY_RESERVED",
data: {
orderId: event.orderId,
items: event.items,
reservationId: event.reservationId,
timestamp: new Date().toISOString(),
},
};
await this.eventStore.saveEvents(
`order-${event.orderId}`,
[inventoryEvent],
event.expectedVersion
);
// Trigger shipping if payment was already completed
const orderEvents = await this.eventStore.getEvents(
`order-${event.orderId}`
);
const hasPaymentCompleted = orderEvents.some(
(e) => e.type === "PAYMENT_COMPLETED"
);
if (hasPaymentCompleted) {
await this.triggerShipping(event.orderId);
}
}
private async triggerShipping(orderId: string): Promise<void> {
const orderEvents = await this.eventStore.getEvents(`order-${orderId}`);
const orderData = this.reconstructOrderFromEvents(orderEvents);
await this.messageQueue.send("shipping-processing", {
type: "CREATE_SHIPMENT",
orderId,
items: orderData.items,
address: orderData.shippingAddress,
});
}
// Get order status by replaying events
async getOrderStatus(orderId: string): Promise<OrderStatus> {
const events = await this.eventStore.getEvents(`order-${orderId}`);
return this.reconstructOrderFromEvents(events);
}
private reconstructOrderFromEvents(events: any[]): OrderStatus {
return events.reduce((order, event) => {
switch (event.type) {
case "ORDER_CREATED":
return {
...order,
id: event.data.orderId,
status: "created",
items: event.data.items,
total: event.data.total,
};
case "PAYMENT_COMPLETED":
return {
...order,
paymentStatus: "completed",
paymentId: event.data.paymentId,
};
case "INVENTORY_RESERVED":
return { ...order, inventoryStatus: "reserved" };
case "SHIPMENT_CREATED":
return {
...order,
status: "shipped",
trackingNumber: event.data.trackingNumber,
};
default:
return order;
}
}, {} as OrderStatus);
}
}
Key Takeaways
Message queues and event systems aren’t just architectural patterns—they’re the foundation of scalable, resilient applications that can handle real-world complexity.
What we’ve covered:
- Decoupling: Breaking free from synchronous, tightly coupled systems
- Patterns: Understanding when to use queues, topics, or streams
- Tools: RabbitMQ for traditional messaging, Kafka for event streaming
- Event Sourcing: Storing events instead of state for audit trails and time travel
The architecture decision framework:
- Use queues for work distribution and load balancing
- Use topics for broadcasting events to multiple consumers
- Use streams for event sourcing and replay capabilities
- Choose RabbitMQ for complex routing and lower latency
- Choose Kafka for high throughput and stream processing
What’s Next?
In the next blog, we’ll dive deeper into the operational aspects of message systems—handling failures gracefully, implementing retry strategies, building monitoring systems, and ensuring message delivery guarantees.
We’ll also cover advanced patterns like saga orchestration, distributed transactions, and building resilient async workflows that can handle the chaos of production environments.
The goal isn’t just to understand these concepts—it’s to build systems that won’t wake you up at 3 AM with production outages.