What Is Caching?
Caching stores frequently accessed data in fast storage, typically memory, to avoid expensive recomputation or repeated database queries. Browser caches store static assets. CDN caches serve content from edge locations. Application caches (Redis, Memcached) store query results and session data.
How Caching Works
Without caching: every page load queries the database (50ms) and renders the template (20ms), for a total of 70ms. With a Redis cache: the first request does the work and caches the result; subsequent requests return in about 0.5ms. Cache for 60 seconds, and invalidate when the underlying data changes.
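The flow above can be sketched as a minimal cache-aside loop. This is an illustrative in-memory version: a plain dict stands in for Redis, and the names (`slow_db_query`, `get_page`, `invalidate`) are hypothetical, not from any particular framework.

```python
import time

CACHE = {}   # key -> (value, expiry timestamp); a dict standing in for Redis
TTL = 60     # seconds, matching the 60-second example above

def slow_db_query(user_id):
    # Stand-in for the expensive work: 50ms query + 20ms template render.
    return f"<html>profile for user {user_id}</html>"

def get_page(user_id):
    key = f"page:{user_id}"
    entry = CACHE.get(key)
    if entry and entry[1] > time.time():        # cache hit: fast path
        return entry[0]
    value = slow_db_query(user_id)              # cache miss: do the work
    CACHE[key] = (value, time.time() + TTL)     # cache the result for 60s
    return value

def invalidate(user_id):
    # Called on data changes so the next request rebuilds the page.
    CACHE.pop(f"page:{user_id}", None)
```

The first `get_page(1)` call pays the full cost and populates the cache; every call within the next 60 seconds returns the cached result without touching the database.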
Caching layers stack: browser cache → CDN → application cache (Redis) → database cache (query cache). Each layer prevents unnecessary work at the next level.
Why Developers Use Caching
Add caching when you identify performance bottlenecks. Cache database query results, API responses, computed values, and rendered HTML. But remember: cache invalidation is one of the hardest problems in computer science.
Key Concepts
- Cache Hit/Miss — Hit: data found in cache (fast). Miss: data not in cache, must be computed/fetched (slow)
- TTL (Time-To-Live) — How long cached data remains valid — balance freshness vs performance
- Cache Invalidation — Removing stale cached data when the underlying data changes — the hardest part of caching
- Cache-Aside Pattern — Application checks cache first; on miss, fetches from database and populates cache
Frequently Asked Questions
When should I add caching?
When you have repeated expensive operations — slow database queries, external API calls, or complex computations. Profile first, cache second.
How do I handle cache invalidation?
Use TTL-based expiration for most cases, and event-driven invalidation (deleting the cache entry on database write) when real-time accuracy matters. Combining both works best: writes keep the cache fresh, while the TTL bounds staleness if an invalidation event is ever missed.
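The combined strategy can be sketched in a few lines. This is an in-memory illustration, not a production pattern: `db`, `read_user`, and `write_user` are hypothetical stand-ins, and a dict plays the role of both the database and the cache.

```python
import time

cache = {}                 # key -> (value, expires_at)
db = {"user:1": "Alice"}   # stand-in for the database

def read_user(key, ttl=60):
    entry = cache.get(key)
    if entry and entry[1] > time.time():
        return entry[0]                      # hit and still fresh
    value = db[key]                          # miss or expired: re-fetch
    cache[key] = (value, time.time() + ttl)  # TTL-based expiration
    return value

def write_user(key, value):
    db[key] = value
    # Event-driven invalidation: delete the entry on write so the
    # next read re-fetches; the TTL is the fallback if this is missed.
    cache.pop(key, None)
```

A read after a write sees the new value immediately because the write deleted the stale entry; even without that delete, the entry would expire on its own within `ttl` seconds.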