Caching is one of the most impactful performance optimizations available. Understanding where to cache, what to cache, and when to invalidate separates high-performance applications from sluggish ones.
The Caching Hierarchy#
User Request
│
▼
┌───────────────┐
│ Browser Cache │ ← Fastest, most limited
└───────┬───────┘
│
┌───────▼───────┐
│ CDN Cache │ ← Fast, geographically distributed
└───────┬───────┘
│
┌───────▼───────┐
│ Application │ ← In-memory (Redis, Memcached)
│ Cache │
└───────┬───────┘
│
┌───────▼───────┐
│ Database │ ← Query cache, materialized views
│ Cache │
└───────────────┘
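A read through this hierarchy has the same shape at every layer: check the fastest tier first and, on a hit, backfill the tiers above the one that answered. A minimal sketch of that lookup path, with an illustrative `Tier` interface and Map-backed tiers standing in for real clients:

```typescript
// A cache tier exposes get/set; get returns null on a miss.
// In practice a tier would wrap an in-process LRU, Redis, etc.
interface Tier<T> {
  name: string;
  get(key: string): T | null;
  set(key: string, value: T): void;
}

// Walk tiers fastest-first; backfill faster tiers on a hit,
// and populate every tier after an origin (database) load.
function layeredGet<T>(
  tiers: Tier<T>[],
  key: string,
  loadOrigin: (key: string) => T,
): T {
  for (let i = 0; i < tiers.length; i++) {
    const hit = tiers[i].get(key);
    if (hit !== null) {
      for (let j = 0; j < i; j++) tiers[j].set(key, hit); // backfill
      return hit;
    }
  }
  const value = loadOrigin(key); // full miss: go to the source
  for (const tier of tiers) tier.set(key, value);
  return value;
}

// Map-backed stand-in for a tier, for demonstration only
function mapTier<T>(name: string): Tier<T> {
  const store = new Map<string, T>();
  return {
    name,
    get: key => store.get(key) ?? null,
    set: (key, value) => void store.set(key, value),
  };
}
```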
Browser Caching#
Cache-Control Headers#
// Express middleware
app.use('/static', express.static('public', {
  maxAge: '1y',
  immutable: true,
}));

// Dynamic content
app.get('/api/profile', (req, res) => {
  res.set('Cache-Control', 'private, max-age=300');
  res.json(profile);
});

// No caching
app.get('/api/transactions', (req, res) => {
  res.set('Cache-Control', 'no-store');
  res.json(transactions);
});
Cache-Control Directives#
Directive │ Meaning
─────────────────┼─────────────────────────────────
public │ Can be cached by CDN/proxies
private │ Only browser can cache
no-cache │ Must revalidate before using
no-store │ Never cache
max-age=N │ Fresh for N seconds
immutable │ Never changes, skip revalidation
stale-while- │ Serve stale while fetching fresh
revalidate=N │
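These directives compose: a response can be fresh for a minute, then served stale for up to five minutes while revalidation happens in the background. A small helper for building the header value (the function name and option names are illustrative, not a library API):

```typescript
// Compose a Cache-Control header value from the directives above.
function cacheControl(opts: {
  visibility: 'public' | 'private';
  maxAge: number;   // seconds the response is fresh
  swr?: number;     // seconds stale responses may be served while revalidating
}): string {
  const parts: string[] = [opts.visibility, `max-age=${opts.maxAge}`];
  if (opts.swr !== undefined) parts.push(`stale-while-revalidate=${opts.swr}`);
  return parts.join(', ');
}

// e.g. res.set('Cache-Control', cacheControl({ visibility: 'public', maxAge: 60, swr: 300 }));
```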
ETags and Conditional Requests#
import etag from 'etag';

app.get('/api/resource/:id', async (req, res) => {
  const resource = await getResource(req.params.id);
  const resourceEtag = etag(JSON.stringify(resource));

  // Check if client has current version
  if (req.headers['if-none-match'] === resourceEtag) {
    return res.status(304).end();
  }

  res.set('ETag', resourceEtag);
  res.json(resource);
});
CDN Caching#
Cache Keys#
// Different cache entries based on:
// - URL path
// - Query parameters
// - Headers (Accept-Language, Accept-Encoding)
// - Cookies (with care)

// Cloudflare example
addEventListener('fetch', event => {
  const cacheKey = new Request(event.request.url, {
    method: 'GET',
    headers: {
      'Accept-Language': event.request.headers.get('Accept-Language') ?? '',
    },
  });

  event.respondWith(
    caches.default.match(cacheKey).then(cached => {
      if (cached) return cached;
      return fetch(event.request).then(response => {
        // Populate the cache so subsequent requests can hit
        event.waitUntil(caches.default.put(cacheKey, response.clone()));
        return response;
      });
    })
  );
});
Vary Header#
// Tell CDN which headers affect the response
app.get('/api/content', (req, res) => {
  const language = req.headers['accept-language'];

  res.set('Vary', 'Accept-Language');
  res.set('Cache-Control', 'public, max-age=3600');
  res.json(getContent(language));
});
Cache Invalidation#
// Purge specific URL
await fetch('https://api.cloudflare.com/client/v4/zones/ZONE_ID/purge_cache', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${API_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    files: ['https://example.com/api/products/123'],
  }),
});

// Purge by tag
await fetch('...', {
  body: JSON.stringify({
    tags: ['products', 'featured'],
  }),
});
Application Caching#
Redis Caching Patterns#
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL);

// Cache-aside pattern
async function getUser(userId: string): Promise<User> {
  // Check cache
  const cached = await redis.get(`user:${userId}`);
  if (cached) {
    return JSON.parse(cached);
  }

  // Fetch from database
  const user = await db.users.findById(userId);

  // Store in cache (only real results; caching a miss poisons the key)
  if (user) {
    await redis.setex(`user:${userId}`, 3600, JSON.stringify(user));
  }

  return user;
}

// Write-through pattern
async function updateUser(userId: string, data: Partial<User>): Promise<User> {
  // Update database
  const user = await db.users.update(userId, data);

  // Update cache
  await redis.setex(`user:${userId}`, 3600, JSON.stringify(user));

  return user;
}
Cache Stampede Prevention#
// Problem: Many requests miss cache simultaneously
// Solution: Single-flight with locks

const sleep = (ms: number) => new Promise(resolve => setTimeout(resolve, ms));

async function getUserWithLock(userId: string): Promise<User> {
  const cacheKey = `user:${userId}`;
  const lockKey = `lock:${cacheKey}`;

  // Try cache first
  const cached = await redis.get(cacheKey);
  if (cached) return JSON.parse(cached);

  // Try to acquire lock (SET key value EX 10 NX)
  const acquired = await redis.set(lockKey, '1', 'EX', 10, 'NX');

  if (acquired) {
    try {
      const user = await db.users.findById(userId);
      await redis.setex(cacheKey, 3600, JSON.stringify(user));
      return user;
    } finally {
      await redis.del(lockKey);
    }
  }

  // Wait for other request to populate cache
  await sleep(100);
  return getUserWithLock(userId);
}
Stale-While-Revalidate#
async function getUserSWR(userId: string): Promise<User> {
  const cacheKey = `user:${userId}`;
  const staleKey = `stale:${cacheKey}`;

  const cached = await redis.get(cacheKey);
  if (cached) return JSON.parse(cached);

  // Check for stale data
  const stale = await redis.get(staleKey);

  if (stale) {
    // Return stale data immediately; refresh in background
    refreshUser(userId).catch(() => {}); // fire-and-forget
    return JSON.parse(stale);
  }

  // No cache at all
  return refreshUser(userId);
}

async function refreshUser(userId: string): Promise<User> {
  const user = await db.users.findById(userId);
  const json = JSON.stringify(user);

  await redis
    .multi()
    .setex(`user:${userId}`, 300, json) // 5 min fresh
    .setex(`stale:user:${userId}`, 3600, json) // 1 hour stale
    .exec();

  return user;
}
Caching Strategies#
Cache-Aside (Lazy Loading)#
1. Check cache
2. If miss, load from database
3. Store in cache
4. Return data
Best for: Read-heavy workloads
Write-Through#
1. Write to cache
2. Write to database
3. Return success
Best for: Data that's read immediately after write
Write-Behind (Write-Back)#
1. Write to cache
2. Return success
3. Asynchronously write to database
Best for: Write-heavy workloads (with durability trade-off)
Refresh-Ahead#
1. Track cache expiration
2. Proactively refresh before expiry
3. Users always get cached data
Best for: Predictable access patterns
Query Caching#
Full Response Caching#
import { createHash } from 'crypto';
import { graphql } from 'graphql';

function getCacheKey(query: string, variables: object): string {
  const hash = createHash('sha256')
    .update(JSON.stringify({ query, variables }))
    .digest('hex');
  return `query:${hash}`;
}

async function executeQuery(query: string, variables: object) {
  const cacheKey = getCacheKey(query, variables);

  const cached = await redis.get(cacheKey);
  if (cached) return JSON.parse(cached);

  const result = await graphql({ schema, source: query, variableValues: variables });

  // Don't cache failed executions
  if (!result.errors) {
    await redis.setex(cacheKey, 60, JSON.stringify(result));
  }

  return result;
}
Entity Caching#
// Cache entities, not queries
class CachedRepository {
  async findById(id: string): Promise<Entity> {
    const cached = await redis.get(`entity:${id}`);
    if (cached) return JSON.parse(cached);

    const entity = await db.entities.findById(id);
    await redis.setex(`entity:${id}`, 3600, JSON.stringify(entity));

    return entity;
  }

  async findByIds(ids: string[]): Promise<Entity[]> {
    // Multi-get from cache
    const keys = ids.map(id => `entity:${id}`);
    const cached = await redis.mget(...keys);

    const result: Entity[] = [];
    const missing: string[] = [];

    cached.forEach((value, i) => {
      if (value) {
        result[i] = JSON.parse(value);
      } else {
        missing.push(ids[i]);
      }
    });

    // Fetch missing from database, cache them, and merge into place
    if (missing.length) {
      const fetched = await db.entities.findByIds(missing);
      const pipeline = redis.pipeline();
      for (const entity of fetched) {
        pipeline.setex(`entity:${entity.id}`, 3600, JSON.stringify(entity));
        result[ids.indexOf(entity.id)] = entity;
      }
      await pipeline.exec();
    }

    return result;
  }
}
Invalidation Strategies#
Time-Based (TTL)#
// Simple but may serve stale data
await redis.setex('key', 3600, value);
Event-Based#
// Invalidate when data changes
async function updateProduct(id: string, data: ProductUpdate) {
  await db.products.update(id, data);

  // Invalidate related caches
  await redis.del(`product:${id}`);
  await redis.del(`products:category:${data.categoryId}`);
  await redis.del('products:featured');

  // Or publish event
  await redis.publish('cache:invalidate', JSON.stringify({
    type: 'product',
    id,
  }));
}
Tag-Based#
// Cache with tags for bulk invalidation
async function cacheWithTags(key: string, value: any, tags: string[]) {
  await redis.set(key, JSON.stringify(value));

  for (const tag of tags) {
    await redis.sadd(`tag:${tag}`, key);
  }
}

async function invalidateByTag(tag: string) {
  const keys = await redis.smembers(`tag:${tag}`);
  if (keys.length) {
    await redis.del(...keys);
  }
  await redis.del(`tag:${tag}`);
}

// Usage
await cacheWithTags('product:123', product, ['products', 'featured']);
await invalidateByTag('products'); // Invalidates all products
Monitoring#
Cache Metrics#
// Track hit/miss rates (metrics: your stats client, e.g. StatsD or Prometheus)
async function getWithMetrics(key: string) {
  const value = await redis.get(key);

  if (value) {
    metrics.increment('cache.hit', { key_prefix: key.split(':')[0] });
  } else {
    metrics.increment('cache.miss', { key_prefix: key.split(':')[0] });
  }

  return value;
}

// Alert on low hit rates
// Target: > 90% hit rate for most caches
Conclusion#
Caching is powerful but requires careful consideration of consistency, invalidation, and monitoring. Start with simple strategies, measure hit rates, and evolve based on actual usage patterns.
Remember: the best cache is one you don't need. Before caching, consider if the underlying data source can be made faster. Cache judiciously—every cache introduces complexity and potential consistency issues.