Commit 544221a

DOC-5851 added initial tutorial text (needs a lot of work)
1 parent ba4becb commit 544221a

File tree: 2 files changed, +264 -0 lines changed

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
---
title: Use cases
description: Learn how to develop with Redis
linkTitle: Use cases
---

Lines changed: 259 additions & 0 deletions
@@ -0,0 +1,259 @@
---
description: Learn how to implement the cache-aside pattern with Redis for improved application performance
linkTitle: Cache-Aside Pattern
title: Cache-Aside Pattern Tutorial
weight: 1
---

The **cache-aside pattern** (also called "lazy loading") is a caching strategy where your application manages both the cache and the underlying data store. This tutorial teaches you how to implement this pattern with Redis to improve application performance.

## What is the Cache-Aside Pattern?

The cache-aside pattern works by:

1. **Check the cache** - When your application needs data, it first checks Redis
2. **Cache miss** - If the data isn't cached, fetch it from your data store (database, API, etc.)
3. **Store in cache** - Save the data in Redis with a time-to-live (TTL)
4. **Return the data** - Send the data to the client
5. **Cache hit** - On subsequent requests, return the cached data immediately

```
Request for Data
        │
        ▼
Check Redis Cache
        │
        ├─ Cache Hit  → Return cached value (fast!)
        │
        └─ Cache Miss → Fetch from Data Store
                              │
                              ▼
                        Store in Redis with TTL
                              │
                              ▼
                        Return value
```

## When to Use Cache-Aside

Cache-aside is ideal for:

- **Read-heavy workloads** - Most requests are reads, not writes
- **Tolerable staleness** - Your data can be slightly out of date
- **Heterogeneous data** - Different data types with varying access patterns
- **Resilient systems** - Cache failures shouldn't break your application

**Don't use cache-aside for:**

- Write-heavy workloads (use write-through or write-behind instead; see the sketch after this list)
- Data requiring strict consistency
- Small datasets that fit entirely in memory

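For comparison, here is a minimal, illustrative sketch of a write-through update: the application writes to the database and refreshes the cache in the same operation, so reads never serve stale data. The `save_to_database` helper is a hypothetical placeholder, not part of this tutorial's code.

```python
import json

import redis

r = redis.Redis(host='localhost', port=6379, decode_responses=True)

def save_user_write_through(user_id, user):
    """Write-through: update the database and the cache together on every write."""
    save_to_database(user_id, user)  # Hypothetical database write
    # Keep the cache in sync on every write instead of waiting for a read miss
    r.setex(f'user:{user_id}', 3600, json.dumps(user))
```
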
## Basic Implementation

Here's a simple cache-aside implementation in Python:

```python
import redis
import json

# Connect to Redis
r = redis.Redis(host='localhost', port=6379, decode_responses=True)

def get_user(user_id):
    """Get user data with cache-aside pattern."""
    cache_key = f'user:{user_id}'

    # Step 1: Check cache
    cached_user = r.get(cache_key)
    if cached_user:
        print(f"Cache hit for {cache_key}")
        return json.loads(cached_user)

    # Step 2: Cache miss - fetch from database
    print(f"Cache miss for {cache_key}")
    user = fetch_from_database(user_id)  # Your database query

    # Step 3: Store in cache with 1-hour TTL
    r.setex(cache_key, 3600, json.dumps(user))

    # Step 4: Return data
    return user
```

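To see the pattern in action, call `get_user` twice: the first call misses and populates the cache, the second is served straight from Redis. The `fetch_from_database` stub below is an assumption standing in for your real query.

```python
def fetch_from_database(user_id):
    """Stand-in for a real database query (placeholder for this example)."""
    return {"id": user_id, "name": "Example User"}

print(get_user(42))   # Prints "Cache miss for user:42", then the user dict
print(get_user(42))   # Prints "Cache hit for user:42" and returns the cached copy
```
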
## Cache Invalidation

When data changes, you need to invalidate the cache to prevent stale data:

```python
def update_user(user_id, new_data):
    """Update user and invalidate cache."""
    # Update database
    update_database(user_id, new_data)  # Your database update

    # Invalidate cache
    cache_key = f'user:{user_id}'
    r.delete(cache_key)

    # Next request will fetch fresh data
```

### Pattern-Based Invalidation

Invalidate multiple related keys at once:

```python
def invalidate_user_cache(user_id):
    """Invalidate all cache keys for a user."""
    # Delete the base key used by get_user(), then every sub-key
    # (e.g. user:42:profile); SCAN iterates incrementally and won't
    # block the server the way KEYS would
    r.delete(f'user:{user_id}')
    pattern = f'user:{user_id}:*'
    for key in r.scan_iter(match=pattern):
        r.delete(key)
```

## TTL Management

Set appropriate time-to-live values for different data types:

```python
# Short TTL for frequently changing data (5 minutes)
r.setex('user:session:123', 300, session_data)

# Medium TTL for user profiles (1 hour)
r.setex('user:profile:456', 3600, profile_data)

# Long TTL for reference data (24 hours)
r.setex('product:catalog', 86400, catalog_data)
```

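You can also inspect how long an entry has left with `TTL` and extend it with `EXPIRE` when it's worth keeping warm; the 60-second threshold below is an arbitrary example value:

```python
remaining = r.ttl('user:profile:456')  # Seconds left; -1 means no TTL, -2 means the key is gone
if 0 < remaining < 60:
    r.expire('user:profile:456', 3600)  # Refresh the TTL to a full hour
```
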
## Error Handling

Always handle cache failures gracefully:

```python
def get_user_safe(user_id):
    """Get user with fallback to database on cache failure."""
    try:
        cache_key = f'user:{user_id}'
        cached_user = r.get(cache_key)
        if cached_user:
            return json.loads(cached_user)
    except redis.ConnectionError:
        print("Redis unavailable, falling back to database")
    except Exception as e:
        print(f"Cache error: {e}")

    # Fallback: fetch from database
    return fetch_from_database(user_id)
```

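Related to the retry advice later on this page, recent redis-py releases (4.x and newer) ship retry helpers you can attach to the client so transient connection errors are retried with backoff before your fallback code runs. A minimal sketch, assuming those helpers are available in your installed version:

```python
import redis
from redis.backoff import ExponentialBackoff
from redis.retry import Retry

# Retry failed commands up to 3 times with exponential backoff
r = redis.Redis(
    host='localhost',
    port=6379,
    decode_responses=True,
    retry=Retry(ExponentialBackoff(), 3),
    retry_on_error=[redis.ConnectionError, redis.TimeoutError],
)
```
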
## Performance Metrics

Monitor cache effectiveness with hit/miss ratios:

```python
class CacheMetrics:
    def __init__(self):
        self.hits = 0
        self.misses = 0

    def record_hit(self):
        self.hits += 1

    def record_miss(self):
        self.misses += 1

    def hit_ratio(self):
        total = self.hits + self.misses
        return self.hits / total if total > 0 else 0

metrics = CacheMetrics()

def get_user_tracked(user_id):
    cache_key = f'user:{user_id}'
    cached_user = r.get(cache_key)

    if cached_user:
        metrics.record_hit()
        return json.loads(cached_user)

    metrics.record_miss()
    user = fetch_from_database(user_id)
    r.setex(cache_key, 3600, json.dumps(user))
    return user

# Check performance
print(f"Hit ratio: {metrics.hit_ratio():.2%}")
```

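Redis also tracks hits and misses on the server side, so you can cross-check your application-level numbers against the `INFO stats` section without instrumenting every call path:

```python
stats = r.info('stats')
hits = stats['keyspace_hits']
misses = stats['keyspace_misses']
total = hits + misses

# Server-wide hit ratio across all clients and keys
print(f"Server hit ratio: {hits / total:.2%}" if total else "No lookups yet")
```
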
## Best Practices

1. **Use appropriate TTLs** - Balance freshness vs. cache efficiency
2. **Handle cache failures** - Always fall back to the data store
3. **Monitor hit ratios** - Aim for an 80%+ hit ratio for optimal performance
4. **Invalidate strategically** - Use pattern-based invalidation for related data
5. **Compress large values** - Use gzip for large cached objects (see the sketch after this list)
6. **Use key prefixes** - Organize keys by data type (e.g., `user:`, `product:`)
7. **Implement retry logic** - Handle transient Redis failures gracefully

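As an example of item 5, here is a minimal sketch that gzips the JSON payload before caching it. It assumes a separate client created without `decode_responses=True`, because the cached value is binary:

```python
import gzip
import json

import redis

# Separate connection without decode_responses, since compressed values are raw bytes
rb = redis.Redis(host='localhost', port=6379)

def cache_compressed(key, obj, ttl=3600):
    """Gzip the JSON payload before caching to save memory on large objects."""
    rb.setex(key, ttl, gzip.compress(json.dumps(obj).encode('utf-8')))

def get_compressed(key):
    raw = rb.get(key)
    return json.loads(gzip.decompress(raw).decode('utf-8')) if raw else None
```
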
## Common Pitfalls

### Cache Stampede

When a popular cache entry expires, many concurrent requests hit the database:

```python
import time

# Problem: Multiple requests fetch the same data simultaneously
# Solution: Use locks or probabilistic early expiration
def get_user_with_lock(user_id):
    cache_key = f'user:{user_id}'
    lock_key = f'{cache_key}:lock'

    cached_user = r.get(cache_key)
    if cached_user:
        return json.loads(cached_user)

    # Try to acquire lock (NX = only if absent, 10-second safety expiry)
    if r.set(lock_key, '1', nx=True, ex=10):
        try:
            user = fetch_from_database(user_id)
            r.setex(cache_key, 3600, json.dumps(user))
            return user
        finally:
            r.delete(lock_key)
    else:
        # Wait for lock holder to populate cache, then retry
        time.sleep(0.1)
        return get_user_with_lock(user_id)
```

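The comment above also names probabilistic early expiration as an alternative. A simpler, related mitigation is to add random jitter to TTLs so related hot keys don't all expire at the same instant; a minimal sketch (the ±10% range is an arbitrary example choice):

```python
import random

def setex_with_jitter(key, base_ttl, value):
    """Cache with a randomized TTL so related keys don't expire together."""
    jitter = random.uniform(0.9, 1.1)  # +/-10% around the base TTL
    r.setex(key, int(base_ttl * jitter), value)

setex_with_jitter('user:42', 3600, json.dumps({"id": 42}))
```
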
### Null Value Caching

Cache null values to prevent repeated database queries for keys that don't exist:

```python
def get_user_with_null_cache(user_id):
    cache_key = f'user:{user_id}'
    cached = r.get(cache_key)

    if cached == 'NULL':
        return None  # User doesn't exist

    if cached:
        return json.loads(cached)

    user = fetch_from_database(user_id)

    if user is None:
        r.setex(cache_key, 300, 'NULL')  # Cache null for 5 minutes
    else:
        r.setex(cache_key, 3600, json.dumps(user))

    return user
```

## Next Steps

- Explore [Redis data types]({{< relref "/develop/data-types" >}}) for different caching scenarios
- Implement [Redis Streams]({{< relref "/develop/data-types/streams" >}}) for event logging

## Additional Resources

- [Redis Python Client Documentation](https://redis-py.readthedocs.io/)
- [Redis Commands Reference]({{< relref "/commands" >}})