caching
Total installs: 4
Weekly installs: 4
Site rank: #53550
Install command:
npx skills add https://github.com/dsantiagomj/dsmj-ai-toolkit --skill caching
Install distribution by agent: github-copilot 4 · codex 4 · kimi-cli 4 · amp 4 · gemini-cli 4 · opencode 4
Skill Documentation
Caching – Performance & Invalidation Patterns
Production patterns for Redis, HTTP caching, cache invalidation, and memoization
When to Use This Skill
Use this skill when:
- Implementing Redis or in-memory caching
- Setting up HTTP cache headers
- Designing cache invalidation strategies
- Optimizing database query performance
- Configuring CDN caching
- Implementing memoization patterns
Don’t use this skill when:
- Data must always be fresh (real-time requirements)
- Cache adds more complexity than value
- Storage costs exceed compute savings
Critical Patterns
Pattern 1: Cache-Aside (Lazy Loading)
When: Most common pattern for database caching
// ✅ GOOD: Cache-aside pattern with Redis
import Redis from 'ioredis';
const redis = new Redis(process.env.REDIS_URL);
async function getUser(userId: string): Promise<User> {
const cacheKey = `user:${userId}`;
// 1. Try cache first
const cached = await redis.get(cacheKey);
if (cached) {
return JSON.parse(cached);
}
// 2. Cache miss - fetch from database
const user = await db.user.findUnique({ where: { id: userId } });
if (!user) {
throw new NotFoundError('User');
}
// 3. Store in cache with TTL
await redis.setex(cacheKey, 3600, JSON.stringify(user)); // 1 hour TTL
return user;
}
// ✅ GOOD: Generic cache wrapper
async function withCache<T>(
key: string,
ttlSeconds: number,
fetcher: () => Promise<T>
): Promise<T> {
const cached = await redis.get(key);
if (cached) {
return JSON.parse(cached);
}
const data = await fetcher();
await redis.setex(key, ttlSeconds, JSON.stringify(data));
return data;
}
// Usage
const user = await withCache(
`user:${userId}`,
3600,
() => db.user.findUnique({ where: { id: userId } })
);
// ❌ BAD: No TTL (cache lives forever)
await redis.set(cacheKey, JSON.stringify(user)); // Never expires!
Pattern 2: Cache Invalidation
When: Keeping cache in sync with data changes
// ✅ GOOD: Invalidate on write
async function updateUser(userId: string, data: UpdateUserInput) {
// Update database
const user = await db.user.update({
where: { id: userId },
data,
});
// Invalidate cache
await redis.del(`user:${userId}`);
// Also invalidate related caches
await redis.del(`user-profile:${userId}`);
await redis.del(`user-settings:${userId}`);
return user;
}
// ✅ GOOD: Pattern-based invalidation
async function invalidateUserCaches(userId: string) {
// SCAN walks the keyspace in batches; KEYS would block Redis on large datasets
const stream = redis.scanStream({ match: `user:${userId}:*`, count: 100 });
for await (const keys of stream) {
if (keys.length > 0) {
await redis.del(...keys);
}
}
}
// ✅ GOOD: Write-through cache (update cache on write)
async function updateProduct(id: string, data: UpdateProductInput) {
const product = await db.product.update({
where: { id },
data,
});
// Update cache with new data (instead of just deleting)
await redis.setex(`product:${id}`, 3600, JSON.stringify(product));
return product;
}
// ❌ BAD: Forgetting to invalidate
async function updateUser(userId: string, data: UpdateUserInput) {
const user = await db.user.update({ where: { id: userId }, data });
// Cache still has old data!
return user;
}
Pattern 3: HTTP Cache Headers
When: Caching API responses at CDN/browser level
// ✅ GOOD: Immutable static assets
// Next.js: next.config.js
module.exports = {
async headers() {
return [
{
source: '/_next/static/:path*',
headers: [
{
key: 'Cache-Control',
value: 'public, max-age=31536000, immutable',
},
],
},
];
},
};
// ✅ GOOD: API responses with revalidation
// Express/Next.js API route
export async function GET(request: Request) {
const products = await getProducts();
return Response.json(products, {
headers: {
'Cache-Control': 'public, s-maxage=60, stale-while-revalidate=300',
// CDN caches for 60s, serves stale for 5min while revalidating
},
});
}
// ✅ GOOD: Private user data (no CDN caching)
export async function GET(request: Request) {
const user = await getCurrentUser();
return Response.json(user, {
headers: {
'Cache-Control': 'private, no-cache, no-store, must-revalidate',
},
});
}
// ✅ GOOD: ETag for conditional requests
import { createHash } from 'crypto';
export async function GET(request: Request) {
const data = await getData();
const etag = createHash('md5').update(JSON.stringify(data)).digest('hex');
// Check if client has current version
if (request.headers.get('if-none-match') === etag) {
return new Response(null, { status: 304 });
}
return Response.json(data, {
headers: {
'ETag': etag,
'Cache-Control': 'public, max-age=0, must-revalidate',
},
});
}
// ❌ BAD: Caching user-specific data publicly
return Response.json(userDashboard, {
headers: { 'Cache-Control': 'public, max-age=3600' }, // Other users see this!
});
Pattern 4: Memoization
When: Caching function results in memory
// ✅ GOOD: Simple memoization
function memoize<T extends (...args: any[]) => any>(fn: T): T {
const cache = new Map<string, ReturnType<T>>();
return ((...args: Parameters<T>): ReturnType<T> => {
const key = JSON.stringify(args);
if (cache.has(key)) {
return cache.get(key)!;
}
const result = fn(...args);
cache.set(key, result);
return result;
}) as T;
}
// Usage
const expensiveCalculation = memoize((n: number) => {
// Complex computation
return fibonacci(n);
});
// ✅ GOOD: LRU cache with size limit
import { LRUCache } from 'lru-cache';
const cache = new LRUCache<string, User>({
max: 500, // Max 500 items
ttl: 1000 * 60 * 5, // 5 minutes
});
async function getUser(id: string): Promise<User | null> {
const cached = cache.get(id);
if (cached) return cached;
const user = await db.user.findUnique({ where: { id } });
if (user) cache.set(id, user);
return user;
}
// ✅ GOOD: React useMemo for expensive renders
function ProductList({ products, filters }) {
const filteredProducts = useMemo(() => {
return products.filter(p => matchesFilters(p, filters));
}, [products, filters]);
return <div>{filteredProducts.map(p => <ProductCard key={p.id} product={p} />)}</div>;
}
// ❌ BAD: Unbounded cache (memory leak)
const cache = new Map(); // Grows forever!
Pattern 5: Cache Stampede Prevention
When: Preventing thundering herd on cache expiry
// ✅ GOOD: Mutex lock to prevent stampede
import Redlock from 'redlock';
const redlock = new Redlock([redis], {
retryCount: 3,
retryDelay: 200,
});
async function getDataWithLock(key: string): Promise<Data> {
// Try cache first
const cached = await redis.get(key);
if (cached) {
return JSON.parse(cached);
}
// Acquire lock before fetching
const lockKey = `lock:${key}`;
let lock;
try {
lock = await redlock.acquire([lockKey], 5000);
// Check cache again (another process might have populated it)
const cachedAgain = await redis.get(key);
if (cachedAgain) {
return JSON.parse(cachedAgain);
}
// Fetch and cache
const data = await fetchExpensiveData();
await redis.setex(key, 3600, JSON.stringify(data));
return data;
} finally {
if (lock) await lock.release();
}
}
// ✅ GOOD: Stale-while-revalidate pattern
async function getWithSWR(key: string, ttl: number, staleTtl: number) {
const cached = await redis.get(key);
if (cached) {
const { data, timestamp } = JSON.parse(cached);
const age = Date.now() - timestamp;
// Fresh - return immediately
if (age < ttl * 1000) {
return data;
}
// Stale but usable - return and refresh in background
if (age < staleTtl * 1000) {
refreshInBackground(key, ttl); // Don't await
return data;
}
}
// No cache or too stale - fetch synchronously
return await fetchAndCache(key, ttl);
}
// ❌ BAD: All requests hit database on cache miss
async function getData() {
const cached = await redis.get('data');
if (cached) return JSON.parse(cached);
// 1000 concurrent requests all hit database!
const data = await db.data.findMany();
await redis.setex('data', 60, JSON.stringify(data));
return data;
}
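Besides locking and stale-while-revalidate, a cheap complementary mitigation (an addition, not from the skill text; the jitteredTtl helper is illustrative) is to randomize TTLs so keys written at the same time don't all expire at the same instant:

```typescript
// Pick an expiry somewhere in [base, base * (1 + spread)] so a batch of keys
// cached together misses gradually instead of all at once.
function jitteredTtl(baseSeconds: number, spread = 0.1): number {
  return Math.round(baseSeconds * (1 + spread * Math.random()));
}

// e.g. await redis.setex(key, jitteredTtl(3600), JSON.stringify(data));
```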
Code Examples
For complete, production-ready examples, see references/examples.md:
- Redis Session Store
- Query Result Caching
- CDN Cache with Vercel
- LRU Cache with TTL
Anti-Patterns
Don’t: Cache Without TTL
// ❌ BAD: No expiration
await redis.set('user:123', JSON.stringify(user));
// ✅ GOOD: Always set TTL
await redis.setex('user:123', 3600, JSON.stringify(user));
Don’t: Cache Errors
// ❌ BAD: Caching error responses
const result = await fetchData().catch(() => ({ error: true }));
await redis.setex(key, 3600, JSON.stringify(result)); // Error cached for 1 hour!
// ✅ GOOD: Only cache successful responses
try {
const result = await fetchData();
await redis.setex(key, 3600, JSON.stringify(result));
return result;
} catch (error) {
// Don't cache errors
throw error;
}
Don’t: Ignore Cache Serialization
// ❌ BAD: Caching non-serializable data
const user = await getUser();
user.getFullName = () => `${user.firstName} ${user.lastName}`;
await redis.set(key, JSON.stringify(user)); // JSON.stringify silently drops the method
// ✅ GOOD: Cache plain data, reconstruct derived values on retrieval
await redis.setex(key, 3600, JSON.stringify(user));
const plain = JSON.parse(await redis.get(key));
const fullName = `${plain.firstName} ${plain.lastName}`; // derive on read
Quick Reference
| Strategy | Use Case | Example |
|---|---|---|
| Cache-aside | Database queries | Check cache, fetch if miss |
| Write-through | Frequent reads | Update cache on every write |
| Write-behind | High write volume | Queue writes, batch to DB |
| HTTP s-maxage | API responses | CDN caching |
| stale-while-revalidate | High availability | Serve stale, refresh background |
| LRU cache | Memory caching | Evict least recently used |
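Write-behind is the only strategy in this table without an example in the skill. A minimal in-memory sketch (all names here are illustrative, assuming an injected batch-flush callback standing in for the database write):

```typescript
type Flush<T> = (batch: Map<string, T>) => Promise<void>;

// Write-behind: writes land in the cache immediately and are queued,
// then flushed to the backing store in batches.
class WriteBehindCache<T> {
  private cache = new Map<string, T>();
  private dirty = new Map<string, T>();

  constructor(private flush: Flush<T>, private batchSize = 100) {}

  get(key: string): T | undefined {
    return this.cache.get(key);
  }

  async set(key: string, value: T): Promise<void> {
    this.cache.set(key, value); // cache updated synchronously
    this.dirty.set(key, value); // queued for the backing store
    if (this.dirty.size >= this.batchSize) {
      await this.flushNow();
    }
  }

  async flushNow(): Promise<void> {
    if (this.dirty.size === 0) return;
    const batch = this.dirty;
    this.dirty = new Map();
    await this.flush(batch); // one batched write instead of N individual ones
  }
}

// Usage sketch: the flush callback would batch-write to the database
async function demo() {
  const store = new WriteBehindCache<number>(async (batch) => {
    // e.g. db.counter.updateMany(...) with the batched entries
    console.log('flushing', batch.size, 'entries');
  }, 100);
  await store.set('views:home', 42);
}
```

A real implementation also needs a timer-based flush and a shutdown hook so queued writes aren't lost when the batch never fills; that trade-off (possible data loss on crash) is why write-behind suits high-volume, loss-tolerant writes.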
Resources
Related Skills:
- performance: Overall performance optimization
- redis: Redis-specific patterns (if that skill exists)
- observability: Cache hit/miss monitoring
Keywords
caching, redis, cache-invalidation, http-cache, cdn, memoization, ttl, lru, stale-while-revalidate, cache-aside