
Commit 1e43c8a

feat: initial version of server side tower crate
1 parent 12dde46 commit 1e43c8a

File tree: 16 files changed (+3982, -61 lines)


.gitignore

Lines changed: 5 additions & 1 deletion
@@ -4,4 +4,8 @@ Cargo.lock
 http-cacache/
 /.idea
 /public
-/.DS_Store
+**/.DS_Store
+
+# Cache directories from examples
+**/cache/
+**/cache-*/

Cargo.toml

Lines changed: 1 addition & 0 deletions
@@ -6,5 +6,6 @@ members = [
     "http-cache-surf",
     "http-cache-quickcache",
     "http-cache-tower",
+    "http-cache-tower-server",
     "http-cache-ureq"
 ]

docs/src/SUMMARY.md

Lines changed: 3 additions & 1 deletion
@@ -6,11 +6,13 @@
 - [Development](./development/development.md)
   - [Supporting a Backend Cache Manager](./development/supporting-a-backend-cache-manager.md)
   - [Supporting an HTTP Client](./development/supporting-an-http-client.md)
-- [Client Implementations](./clients/clients.md)
+- [Client-Side Caching](./clients/clients.md)
   - [reqwest](./clients/reqwest.md)
   - [surf](./clients/surf.md)
   - [ureq](./clients/ureq.md)
   - [tower](./clients/tower.md)
+- [Server-Side Caching](./server/server.md)
+  - [tower-server](./server/tower-server.md)
 - [Backend Cache Manager Implementations](./managers/managers.md)
   - [cacache](./managers/cacache.md)
   - [moka](./managers/moka.md)

docs/src/clients/clients.md

Lines changed: 14 additions & 1 deletion
@@ -1,4 +1,17 @@
-# Client Implementations
+# Client-Side Caching
+
+These middleware implementations cache responses from external APIs that your application calls. This is different from server-side caching, which caches your own application's responses.
+
+**Use client-side caching when:**
+- Calling external APIs
+- Reducing API rate limit consumption
+- Improving offline support
+- Reducing bandwidth usage
+- Speeding up repeated API calls
+
+**For server-side caching** (caching your own app's responses), see [Server-Side Caching](../server/server.md).
+
+## Available Client Implementations
 
 The following client implementations are provided by this crate:
 

docs/src/introduction.md

Lines changed: 45 additions & 2 deletions
@@ -1,19 +1,62 @@
 # Introduction
 
-`http-cache` is a library that acts as a middleware for caching HTTP responses. It is intended to be used by other libraries to support multiple HTTP clients and backend cache managers, though it does come with multiple optional manager implementations out of the box. `http-cache` is built on top of [`http-cache-semantics`](https://github.com/kornelski/rusty-http-cache-semantics) which parses HTTP headers to correctly compute cacheability of responses.
+`http-cache` is a comprehensive library for HTTP response caching in Rust. It provides both **client-side** and **server-side** caching middleware for multiple HTTP clients and frameworks. Built on top of [`http-cache-semantics`](https://github.com/kornelski/rusty-http-cache-semantics), it correctly implements HTTP cache semantics as defined in RFC 7234.
 
 ## Key Features
 
+- **Client-Side Caching**: Cache responses from external APIs you're calling
+- **Server-Side Caching**: Cache your own application's responses to reduce load
 - **Traditional Caching**: Standard HTTP response caching with full buffering
 - **Streaming Support**: Memory-efficient caching for large responses without full buffering
-- **Cache-Aware Rate Limiting**: Intelligent rate limiting that only applies on cache misses, not cache hits
+- **Cache-Aware Rate Limiting**: Intelligent rate limiting that only applies on cache misses
 - **Multiple Backends**: Support for disk-based (cacache) and in-memory (moka, quick-cache) storage
 - **Client Integrations**: Support for reqwest, surf, tower, and ureq HTTP clients
+- **Server Framework Support**: Tower-based servers (Axum, Hyper, Tonic)
 - **RFC 7234 Compliance**: Proper HTTP cache semantics with respect for cache-control headers
 
+## Client-Side vs Server-Side Caching
+
+### Client-Side Caching
+
+Cache responses from external APIs your application calls:
+
+```rust
+// Example: Caching API responses you fetch
+let client = reqwest::Client::new();
+let cached_client = HttpCache::new(client, cache_manager);
+let response = cached_client.get("https://api.example.com/users").send().await?;
+```
+
+**Use cases:**
+- Reducing calls to external APIs
+- Offline support
+- Bandwidth optimization
+- Rate limit compliance
+
+### Server-Side Caching
+
+Cache responses your application generates:
+
+```rust
+// Example: Caching your own endpoint responses
+let app = Router::new()
+    .route("/users/:id", get(get_user))
+    .layer(ServerCacheLayer::new(cache_manager)); // Cache your responses
+```
+
+**Use cases:**
+- Reducing database queries
+- Caching expensive computations
+- Improving response times
+- Reducing server load
+
+**Critical:** Server-side cache middleware must be placed **after** routing to preserve request context (path parameters, state, etc.).
+
 ## Streaming vs Traditional Caching
 
 The library supports two caching approaches:
 
 - **Traditional Caching** (`CacheManager` trait): Buffers entire responses in memory before caching. Suitable for smaller responses and simpler use cases.
 - **Streaming Caching** (`StreamingCacheManager` trait): Processes responses as streams without full buffering. Ideal for large files, media content, or memory-constrained environments.
+
+Note: Streaming is currently only available for client-side caching. Server-side caching uses buffered responses.

docs/src/server/server.md

Lines changed: 201 additions & 0 deletions
@@ -0,0 +1,201 @@
# Server-Side Caching

Server-side HTTP response caching is fundamentally different from client-side caching. While client-side middleware caches responses from external APIs, server-side middleware caches your own application's responses to reduce load and improve performance.

## What is Server-Side Caching?

Server-side caching stores the responses your application generates so that subsequent identical requests can be served from cache without re-executing expensive operations like database queries or complex computations.

### Example Flow

**Without Server-Side Caching:**
```
Request → Routing → Handler → Database Query → Response (200ms)
Request → Routing → Handler → Database Query → Response (200ms)
Request → Routing → Handler → Database Query → Response (200ms)
```

**With Server-Side Caching:**
```
Request → Routing → Cache MISS → Handler → Database Query → Response (200ms) → Cached
Request → Routing → Cache HIT → Response (2ms)
Request → Routing → Cache HIT → Response (2ms)
```
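
In code, the wiring behind that flow looks roughly like the sketch below. This is a minimal, hypothetical example assuming an Axum application: `ServerCacheLayer` follows the naming used elsewhere in these docs, and `MokaManager` (http-cache's in-memory moka backend) stands in for whichever cache manager you use. Check the `http-cache-tower-server` API documentation for the exact types and constructors.

```rust
use axum::{routing::get, Router};
// Assumed names: `ServerCacheLayer` from http-cache-tower-server (as used in
// the introduction) and `MokaManager` from http-cache's moka backend.
use http_cache::MokaManager;
use http_cache_tower_server::ServerCacheLayer;

// Stand-in handler for an expensive database-backed endpoint.
async fn get_user() -> String {
    "user data".to_string()
}

fn build_app(cache_manager: MokaManager) -> Router {
    Router::new()
        .route("/users/:id", get(get_user))
        // The cache layer wraps the routed service: cache hits short-circuit
        // before the handler, misses fall through to it and get stored.
        .layer(ServerCacheLayer::new(cache_manager))
}
```
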
## Key Differences from Client-Side Caching

| Aspect | Client-Side | Server-Side |
|--------|-------------|-------------|
| **What it caches** | External API responses | Your app's responses |
| **Position** | Before making outbound requests | After routing, before handlers |
| **Use case** | Reduce external API calls | Reduce internal computation |
| **RFC 7234 behavior** | Client cache rules | Shared cache rules |
| **Request extensions** | N/A | Must preserve (path params, state) |

## Available Implementations

Currently, server-side caching is available for:

- **Tower-based servers** (Axum, Hyper, Tonic) - See [tower-server](./tower-server.md)

## When to Use Server-Side Caching

### Good Use Cases ✅

1. **Public API endpoints** with expensive database queries
2. **Read-heavy workloads** where data doesn't change frequently
3. **Dashboard or analytics data** that updates periodically
4. **Static-like content** that requires dynamic generation
5. **Search results** for common queries
6. **Rendered HTML** for public pages

### Avoid Caching ❌

1. **User-specific data** (unless using proper cache key differentiation)
2. **Authenticated endpoints** (without user ID in cache key)
3. **Real-time data** that must always be fresh
4. **Write operations** (POST/PUT/DELETE requests)
5. **Sensitive information** that shouldn't be shared
6. **Session-dependent responses** (without session ID in cache key)

## Security Considerations

Server-side caches are **shared caches** - cached responses are served to ALL users. This is different from client-side caches, which are per-client.

### Critical Security Rule

**Never cache user-specific data without including the user/session identifier in the cache key.**

### Safe Patterns

**Pattern 1: Mark user-specific responses as private**
```rust
async fn user_profile() -> Response {
    (
        [(header::CACHE_CONTROL, "private")], // Won't be cached
        "User profile data"
    ).into_response()
}
```

**Pattern 2: Include user ID in cache key**
```rust
let keyer = CustomKeyer::new(|req: &Request<()>| {
    let user_id = extract_user_id(req);
    format!("{} {} user:{}", req.method(), req.uri().path(), user_id)
});
```
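
The `extract_user_id` helper above is a placeholder. One hypothetical implementation, shown purely for illustration, reads an identity header that upstream authentication middleware has already set:

```rust
use http::Request;

// Hypothetical stand-in for the `extract_user_id` placeholder above: derive
// the per-user part of the cache key from a header set by auth middleware.
// Unauthenticated requests fall into a shared "anonymous" bucket.
fn extract_user_id<B>(req: &Request<B>) -> String {
    req.headers()
        .get("x-user-id")
        .and_then(|value| value.to_str().ok())
        .unwrap_or("anonymous")
        .to_string()
}
```
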
**Pattern 3: Don't cache at all**
```rust
async fn sensitive_data() -> Response {
    (
        [(header::CACHE_CONTROL, "no-store")],
        "Sensitive data"
    ).into_response()
}
```

## RFC 7234 Compliance

Server-side caches implement **shared cache** semantics as defined in RFC 7234:

### Must NOT Cache

- Responses with `Cache-Control: private` (user-specific)
- Responses with `Cache-Control: no-store` (sensitive)
- Responses with `Cache-Control: no-cache` (requires revalidation)
- Non-2xx status codes (errors)
- Responses with `Authorization` header (unless explicitly allowed)

### Must Cache Correctly

- Prefer `s-maxage` over `max-age` (shared cache specific)
- Respect `Vary` headers (content negotiation)
- Handle `Expires` header as fallback
- Support `max-age` and `public` directives
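
For example, a handler can opt into shared-cache behaviour explicitly. The sketch below (the endpoint itself is illustrative) uses the same axum-style responses as the patterns above: `s-maxage=600` governs this shared cache, `max-age=60` still applies to private downstream caches, and `Vary` keys entries on content negotiation.

```rust
use axum::{http::header, response::{IntoResponse, Response}};

// Sketch: a cacheable listing endpoint using shared-cache directives.
// s-maxage=600 lets the shared server-side cache hold it for 10 minutes,
// max-age=60 limits private/browser caches to 1 minute, and Vary ensures
// differently-encoded responses get separate cache entries.
async fn listing() -> Response {
    (
        [
            (header::CACHE_CONTROL, "public, max-age=60, s-maxage=600"),
            (header::VARY, "Accept-Encoding"),
        ],
        "Listing data",
    )
        .into_response()
}
```
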
## Performance Characteristics

### Benefits

- **Reduced database load**: Cached responses don't hit the database
- **Lower CPU usage**: Expensive computations run once
- **Faster response times**: Cache hits are typically <5ms
- **Better scalability**: Handle more requests with same resources

### Considerations

- **Memory usage**: Cached responses stored in memory or disk
- **Stale data**: Cached data may become outdated
- **Cache warming**: Initial requests (cache misses) are slower
- **Invalidation complexity**: Updating cached data can be tricky

## Cache Invalidation Strategies

### Time-Based (TTL)

Set expiration times on cached responses:

```rust
async fn handler() -> Response {
    (
        [(header::CACHE_CONTROL, "max-age=300")], // 5 minutes
        "Response data"
    ).into_response()
}
```

### Event-Based

Manually invalidate cache entries when data changes:

```rust
// After updating user data
cache_manager.delete(&format!("GET /users/{}", user_id)).await?;
```

### Hybrid Approach

Combine TTL with manual invalidation:

- Use TTL for automatic expiration
- Invalidate early when you know data changed
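
A rough sketch of the hybrid approach: the handler keeps the `max-age` TTL from the time-based example, and the write path evicts the entry early using the same `delete` call and key format as the event-based example. `update_user_in_db` is a placeholder for your persistence code, and the exact manager trait bounds may differ in your setup.

```rust
use http_cache::CacheManager;

// Placeholder for the actual write to your database.
async fn update_user_in_db(_user_id: u64) -> Result<(), Box<dyn std::error::Error>> {
    Ok(())
}

// Hybrid invalidation: TTL handles routine expiry, and writes evict the
// cached GET response early so readers never wait out a stale entry.
async fn update_user(
    cache_manager: &impl CacheManager,
    user_id: u64,
) -> Result<(), Box<dyn std::error::Error>> {
    update_user_in_db(user_id).await?;
    cache_manager.delete(&format!("GET /users/{}", user_id)).await?;
    Ok(())
}
```
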
## Best Practices

1. **Start conservative**: Use shorter TTLs initially, increase as you gain confidence
2. **Monitor cache hit rates**: Track X-Cache headers to measure effectiveness
3. **Set size limits**: Prevent cache from consuming too much memory
4. **Use appropriate keyers**: Match cache key strategy to your needs
5. **Document caching behavior**: Make it clear which endpoints are cached
6. **Test cache invalidation**: Ensure updates propagate correctly
7. **Consider cache warming**: Pre-populate cache for common requests
8. **Handle cache failures gracefully**: Application should work even if cache fails
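
For point 3, the in-memory backends can be bounded at construction time. A sketch using http-cache's moka backend (pairing it with the server cache layer is assumed; `1000` is the maximum number of entries):

```rust
use http_cache::MokaManager;
use moka::future::Cache;

// Bound the in-memory cache so it cannot grow without limit.
let manager = MokaManager::new(Cache::new(1000));
```
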
## Monitoring and Debugging

### Enable Cache Status Headers

```rust
let options = ServerCacheOptions {
    cache_status_headers: true,
    ..Default::default()
};
```

This adds `X-Cache` headers to responses:

- `X-Cache: HIT` - Served from cache
- `X-Cache: MISS` - Generated by handler

### Track Metrics

Monitor these key metrics:

- Cache hit rate (hits / total requests)
- Average response time (hits vs misses)
- Cache size and memory usage
- Cache eviction rate
- Stale response rate
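
For the hit rate specifically, a process-local counter driven by the `X-Cache` header (enabled above) is often enough to start with. A minimal sketch using plain standard-library atomics; call `record` wherever your application observes outgoing responses, for example in a small middleware or metrics exporter:

```rust
use std::sync::atomic::{AtomicU64, Ordering};

static HITS: AtomicU64 = AtomicU64::new(0);
static MISSES: AtomicU64 = AtomicU64::new(0);

// Feed this with the value of the X-Cache response header, if present.
fn record(x_cache: Option<&str>) {
    match x_cache {
        Some("HIT") => { HITS.fetch_add(1, Ordering::Relaxed); }
        Some("MISS") => { MISSES.fetch_add(1, Ordering::Relaxed); }
        _ => {}
    }
}

// Hit rate = hits / total requests observed so far.
fn hit_rate() -> f64 {
    let hits = HITS.load(Ordering::Relaxed) as f64;
    let total = hits + MISSES.load(Ordering::Relaxed) as f64;
    if total == 0.0 { 0.0 } else { hits / total }
}
```
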
## Getting Started

See the [tower-server](./tower-server.md) documentation for a detailed implementation guide.
