
Commit a98b88a

PoovarasanSudhir Ravindramohan authored and committed
KOMPIO-2059 Enhance Code Documentation
1 parent 5faf003

File tree

7 files changed: +334 −19 lines changed


README.md

Lines changed: 145 additions & 9 deletions
# Kompress Cache

Redis-based cache that routes write operations to a primary Redis instance and read operations to replicas, with automatic failover to the primary in case of replica failure. This package is designed to work seamlessly with FastAPI, making it ideal for modern Python web applications.

---

## 🔧 Features

- ✅ Async Redis support with graceful error handling
- ✅ Automatic failover from replica to primary
- ✅ Schema validation for cached values using Pydantic
- ✅ Custom cache-miss loading via the `Loadable` interface
- ✅ Support for a primary and multiple replicas
- ✅ Configurable via environment variables
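The replica-to-primary failover can be pictured with a small sketch. This is not the package's actual implementation (that lives in `kompress_cache.cache`); the `FailoverReader` class and the stub clients below are hypothetical, purely to illustrate the routing idea:

```python
import asyncio

class FailoverReader:
    """Sketch of read failover: try each replica, fall back to the primary."""

    def __init__(self, primary, replicas):
        self.primary = primary
        self.replicas = replicas

    async def hget(self, key, field):
        for replica in self.replicas:
            try:
                return await replica.hget(key, field)
            except ConnectionError:
                continue  # replica down: try the next one
        # All replicas failed (or none configured): read from the primary
        return await self.primary.hget(key, field)

# Tiny in-memory stubs standing in for Redis clients
class OkClient:
    def __init__(self, data):
        self.data = data
    async def hget(self, key, field):
        return self.data.get((key, field))

class DownClient:
    async def hget(self, key, field):
        raise ConnectionError("replica unreachable")

primary = OkClient({("users", "1"): "Alice"})
reader = FailoverReader(primary, [DownClient()])
print(asyncio.run(reader.hget("users", "1")))  # Alice
```

Writes always target the primary; only reads take this replica-first path.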
---

## ⚙️ Configuration options - Environment variables

| Environment Variable | Description | Default Value | Comments | Since |
|----------------------|-------------|---------------|----------|-------|
| REDIS_HOST | Hostname/IP address of the primary Redis cache | `localhost` | | 0.1.0 |
| REDIS_PORT | Port of the primary Redis cache | `6379` | | 0.1.0 |
| REDIS_REPLICAS_HOST_PORT | Comma-separated Redis replica `host:port` pairs | | Example: `localhost:6380,localhost:6381`. If no replicas are provided, the primary Redis server is used for both read and write operations. | 0.1.0 |
| REDIS_TIMEOUT | Timeout for a Redis command execution, in seconds | `5` | | 0.1.0 |
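For example, the variables above could be exported before starting the application (the host and port values are illustrative):

```shell
# Primary Redis instance
export REDIS_HOST=localhost
export REDIS_PORT=6379

# Two read replicas; omit this to route reads to the primary as well
export REDIS_REPLICAS_HOST_PORT=localhost:6380,localhost:6381

# Fail a Redis command if it takes longer than 5 seconds
export REDIS_TIMEOUT=5
```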

---

## 💡 Usage


- Ensure that all configurations are set up and the given Redis servers are running.
- The `kompress_cache` logger needs to be configured. For more verbose logs, set the logger level to DEBUG.
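A minimal configuration of that logger might look like this; it is a sketch using only the standard library, and the handler and format string are illustrative choices, not requirements of the package:

```python
import logging

# Configure the "kompress_cache" logger used by the package
logger = logging.getLogger("kompress_cache")
logger.setLevel(logging.DEBUG)  # DEBUG for verbose logs; INFO is quieter

handler = logging.StreamHandler()
handler.setFormatter(
    logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
)
logger.addHandler(handler)
```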
### 🔹 Basic Usage (No Validation)

```
import json

from kompress_cache import get_cache


cache = get_cache()

user = {"id": "1", "name": "Alice"}

# Set data in Redis (uses the primary Redis instance)
await cache.hset("users", user["id"], json.dumps(user))  # Output: 1

user_id = "1"

# Retrieve data from one of the replicas; fails over to primary if needed
user_cache = await cache.hget("users", user_id)  # Output: '{"id": "1", "name": "Alice"}'

if user_cache is None:
    print("User cache not found. Fetching from DB...")
    user_data = get_user_data_from_db(user_id)
else:
    user_data = json.loads(user_cache)

print(user_data)
```

### 🚨 Exception Handling

Redis connection issues are automatically caught and converted to appropriate FastAPI HTTPException errors:

| Redis Error | HTTP Status Code | Message |
|-------------------------|------------------|------------------------|
| `ConnectionError` | 503 | Service Unavailable |
| `TimeoutError` | 504 | Gateway Timeout |
| Other Redis exceptions | 500 | Internal Server Error |

This means you can focus on your logic and let the cache gracefully degrade:

```
from fastapi import FastAPI, HTTPException

from kompress_cache import get_cache, Loadable
from myapp.models import UserModel
from myapp.db import get_user_data_from_db

app = FastAPI()
cache = get_cache()


@app.get("/users/{user_id}")
async def get_user(user_id: str):
    try:
        # UserLoader is the Loadable defined in the next section
        user_loader = UserLoader(user_id)
        return await cache.hget_l("users", user_id, user_loader, UserModel)
    except HTTPException as e:
        if e.status_code in (503, 504):
            # Cache unavailable: fall back to the database directly
            return await get_user_data_from_db(user_id)
        raise
```

### 🔸 Smart Caching with Pydantic + Loader

Let's take it further using the power of Pydantic and cache-miss loaders.

```
from pydantic import BaseModel

from kompress_cache import get_cache, Loadable

cache = get_cache()


# Define your Pydantic model
class UserModel(BaseModel):
    id: str
    name: str


# Create a loader for cache misses
class UserLoader(Loadable):
    def __init__(self, user_id: str):
        self.user_id = user_id

    async def load(self) -> str:
        user_data = await get_user_data_from_db_or_external_api(self.user_id)
        return UserModel(**user_data).model_dump_json()


user_id = "1"
user_loader = UserLoader(user_id)

# Retrieve the user data
user_data = await cache.hget_l("users", user_id, user_loader, UserModel)

# If the data is:
# - not in the cache → it is loaded via the loader and cached.
# - in the cache → it is validated against the given BaseModel, and if it is:
#   - invalid → it is refreshed via the loader.
#   - valid → returned as a validated model instance.

print(user_data.id)    # "1"
print(user_data.name)  # "Alice"
```

### 🔁 Evolving Schema with Automatic Refresh

Imagine you update your user schema to include an `age` field:

```
class UserModel(BaseModel):
    id: str
    name: str
    age: int
```

- If the cache still holds the old schema (without `age`), validation will fail.
- The `hget_l` method will detect this, trigger the loader, and update the cache with the latest schema.

✅ No need to manually invalidate the cache; just update your model!
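The refresh behavior rests on Pydantic validation: JSON cached under the old schema no longer validates against the new model, which is what signals `hget_l` to re-load. A standalone sketch of just that validation step (the cached JSON string here is illustrative):

```python
from pydantic import BaseModel, ValidationError

class UserModel(BaseModel):  # the evolved schema
    id: str
    name: str
    age: int

old_cached_json = '{"id": "1", "name": "Alice"}'  # cached before `age` existed

try:
    UserModel.model_validate_json(old_cached_json)
except ValidationError:
    # At this point hget_l would invoke the loader and overwrite the stale entry
    print("stale cache entry; refreshing via loader")
```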

---

## 🧪 Built for Async

Your cache logic works natively with async functions. Whether you're using FastAPI background tasks or high-throughput endpoints, Redis calls are non-blocking, thanks to `redis.asyncio`.
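Because every call is a coroutine, independent reads can also be issued concurrently, e.g. with `asyncio.gather`. The sketch below uses a hypothetical in-memory `StubCache` in place of a live Redis connection, just to show the pattern:

```python
import asyncio

class StubCache:
    """Stands in for the real cache; hget is async, like redis.asyncio."""
    def __init__(self, data):
        self.data = data

    async def hget(self, key, field):
        await asyncio.sleep(0)  # yield control, as a real network call would
        return self.data.get((key, field))

async def main():
    cache = StubCache({("users", "1"): "Alice", ("users", "2"): "Bob"})
    # Both lookups are in flight at once instead of running back-to-back
    return await asyncio.gather(
        cache.hget("users", "1"),
        cache.hget("users", "2"),
    )

print(asyncio.run(main()))  # ['Alice', 'Bob']
```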

---

## 🤝 Contributing

Pull requests are welcome! For major changes, please open an issue first to discuss what you'd like to change.

---

## 📜 License

[MIT](/LICENSE)

---

## Authors and acknowledgment

kompress_cache/__init__.py

Lines changed: 9 additions & 1 deletion

from redis.asyncio import Redis

from .cache import Cache, Loadable


def get_cache() -> Cache | Redis:
    """Factory method to get an instance of the cache.

    Returns:
        Cache | Redis: A new `Cache` instance configured with primary and replicas.
    """
    return Cache()


__all__ = ["Loadable", "get_cache"]
