
Commit 1479935

Merge pull request #3 from redis-developer/mcp-server-explicit-params
Omnibus of changes
2 parents 25e3149 + 899fde2 commit 1479935

38 files changed: +4710 / -844 lines changed

README.md

Lines changed: 53 additions & 0 deletions
@@ -279,3 +279,56 @@ python -m pytest
3. Commit your changes
4. Push to the branch
5. Create a Pull Request

## Running the Background Task Worker

The Redis Memory Server uses Docket for background task management. There are two ways to run the worker:

### 1. Using the Docket CLI

After installing the package, you can run the worker using the Docket CLI command:

```bash
docket worker --tasks agent_memory_server.docket_tasks:task_collection --docket memory-server
```

You can customize the concurrency and redelivery timeout:

```bash
docket worker --tasks agent_memory_server.docket_tasks:task_collection --concurrency 5 --redelivery-timeout 60 --docket memory-server
```

**NOTE:** The name passed with `--docket` is effectively the name of a task queue where the worker will look for work. This name should match the docket name your API server is using, configured with the `docket_name` setting via environment variable or directly in `agent_memory_server.config.Settings`.
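For example, a minimal sketch of keeping the two names in sync via the settings class named above (how `Settings` loads values, and the exact field set, are not shown in this diff and are assumptions here):

```python
# Hypothetical sketch: the API server and the Docket worker must agree on the
# docket (task queue) name. `docket_name` is the setting named in the NOTE
# above; passing it to the constructor vs. an environment variable is assumed.
from agent_memory_server.config import Settings

settings = Settings(docket_name="memory-server")

# The worker would then be started with the matching queue name:
#   docket worker --tasks agent_memory_server.docket_tasks:task_collection \
#       --docket memory-server
print(settings.docket_name)  # -> "memory-server"
```
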
## Memory Compaction

The memory compaction functionality optimizes storage by merging duplicate and semantically similar memories. This improves retrieval quality and reduces storage costs.

### Key Features

- **Hash-based Deduplication**: Identifies and merges exact duplicate memories using content hashing (see the sketch after this list)
- **Semantic Deduplication**: Finds and merges memories with similar meaning using vector search
- **LLM-powered Merging**: Uses language models to intelligently combine memories
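To make the first two features concrete, here is an illustrative sketch of the underlying ideas only; the real helpers (such as `generate_memory_hash` and the vector-search path) live in the server code and may differ, and the field names and threshold below are assumptions:

```python
# Illustrative sketch: content hashing for exact-duplicate detection and a
# cosine-similarity check for semantic near-duplicates. Field names ("text",
# "user_id", "session_id") and the 0.9 threshold are assumptions.
import hashlib
import math


def content_hash(text: str, user_id: str = "", session_id: str = "") -> str:
    """Deterministic hash of the fields that define an exact-duplicate memory."""
    key = "|".join([text, user_id, session_id])
    return hashlib.sha256(key.encode("utf-8")).hexdigest()


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


# Exact duplicates collapse to the same hash; semantically similar memories are
# candidates for LLM-powered merging once similarity exceeds a threshold.
assert content_hash("User prefers dark mode") == content_hash("User prefers dark mode")
print(cosine_similarity([1.0, 0.0], [0.9, 0.1]) > 0.9)
```
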
### Testing Approach

Testing the memory compaction functionality involves:

1. **Unit Tests**: Testing individual helper functions like `generate_memory_hash` and `merge_memories_with_llm` (a sketch follows this list)
2. **Integration Tests**: Testing the complete workflow with minimal mocking
3. **Mocked Tests**: Using helper functions to test specific parts of the workflow
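As a hedged illustration of the unit-test level, a test for `generate_memory_hash` might assert determinism and sensitivity to content. The import path, model name, and signature below are assumptions based on the names mentioned above, not the repository's actual API:

```python
# Hypothetical unit-test sketch: assumes generate_memory_hash(memory) accepts a
# memory record and returns a stable string, and that a LongTermMemory model
# exists in agent_memory_server.models. Both are assumptions.
from agent_memory_server.long_term_memory import generate_memory_hash  # assumed location
from agent_memory_server.models import LongTermMemory  # assumed model name


def test_generate_memory_hash_is_deterministic():
    memory = LongTermMemory(text="User prefers dark mode", user_id="u1", session_id="s1")
    assert generate_memory_hash(memory) == generate_memory_hash(memory)


def test_generate_memory_hash_changes_with_content():
    a = LongTermMemory(text="User prefers dark mode", user_id="u1", session_id="s1")
    b = LongTermMemory(text="User prefers light mode", user_id="u1", session_id="s1")
    assert generate_memory_hash(a) != generate_memory_hash(b)
```
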
The main integration test (`test_compact_memories_integration`) demonstrates the memory merging functionality without relying on Redis search, which makes it more robust and less prone to environment-specific failures.
### Running Tests

```bash
# Run all tests
python -m pytest tests/test_memory_compaction.py

# Run specific integration test
python -m pytest tests/test_memory_compaction.py::TestMemoryCompaction::test_compact_memories_integration -v
```

agent_memory_server/api.py

Lines changed: 15 additions & 14 deletions
@@ -1,9 +1,10 @@
 from typing import Literal
 
-from fastapi import APIRouter, BackgroundTasks, Depends, HTTPException
+from fastapi import APIRouter, Depends, HTTPException
 
 from agent_memory_server import long_term_memory, messages
 from agent_memory_server.config import settings
+from agent_memory_server.dependencies import get_background_tasks
 from agent_memory_server.llms import get_model_config
 from agent_memory_server.logging import get_logger
 from agent_memory_server.models import (
@@ -16,7 +17,7 @@
     SessionMemory,
     SessionMemoryResponse,
 )
-from agent_memory_server.utils import get_redis_conn
+from agent_memory_server.utils.redis import get_redis_conn
 
 
 logger = get_logger(__name__)
@@ -63,7 +64,7 @@ async def list_sessions(
     Returns:
         List of session IDs
     """
-    redis = get_redis_conn()
+    redis = await get_redis_conn()
 
     total, session_ids = await messages.list_sessions(
         redis=redis,
@@ -101,7 +102,7 @@ async def get_session_memory(
     Returns:
         Conversation history and context
     """
-    redis = get_redis_conn()
+    redis = await get_redis_conn()
 
     # If context_window_max is explicitly provided, use that
     if context_window_max is not None:
@@ -130,19 +131,20 @@ async def get_session_memory(
 async def put_session_memory(
     session_id: str,
     memory: SessionMemory,
-    background_tasks: BackgroundTasks,
+    background_tasks=Depends(get_background_tasks),
 ):
     """
     Set session memory. Replaces existing session memory.
 
     Args:
         session_id: The session ID
         memory: Messages and context to save
+        background_tasks: DocketBackgroundTasks instance (injected automatically)
 
     Returns:
         Acknowledgement response
     """
-    redis = get_redis_conn()
+    redis = await get_redis_conn()
 
     await messages.set_session_memory(
         redis=redis,
@@ -168,7 +170,7 @@ async def delete_session_memory(
     Returns:
         Acknowledgement response
     """
-    redis = get_redis_conn()
+    redis = await get_redis_conn()
     await messages.delete_session_memory(
         redis=redis,
         session_id=session_id,
@@ -179,26 +181,25 @@ async def delete_session_memory(
 
 @router.post("/long-term-memory", response_model=AckResponse)
 async def create_long_term_memory(
-    payload: CreateLongTermMemoryPayload, background_tasks: BackgroundTasks
+    payload: CreateLongTermMemoryPayload,
+    background_tasks=Depends(get_background_tasks),
 ):
     """
     Create a long-term memory
 
     Args:
         payload: Long-term memory payload
+        background_tasks: DocketBackgroundTasks instance (injected automatically)
 
     Returns:
         Acknowledgement response
     """
-    redis = get_redis_conn()
-
     if not settings.long_term_memory:
         raise HTTPException(status_code=400, detail="Long-term memory is disabled")
 
-    await long_term_memory.index_long_term_memories(
-        redis=redis,
+    await background_tasks.add_task(
+        long_term_memory.index_long_term_memories,
         memories=payload.memories,
-        background_tasks=background_tasks,
     )
     return AckResponse(status="ok")
 
@@ -214,7 +215,7 @@ async def search_long_term_memory(payload: SearchPayload):
     Returns:
         List of search results
     """
-    redis = get_redis_conn()
+    redis = await get_redis_conn()
 
     if not settings.long_term_memory:
         raise HTTPException(status_code=400, detail="Long-term memory is disabled")
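
The switch from FastAPI's `BackgroundTasks` to `Depends(get_background_tasks)` means the endpoints now schedule work through an object exposing an awaitable `add_task(...)`, as the `create_long_term_memory` hunk shows. The implementation of `agent_memory_server/dependencies.py` is not part of this excerpt, so the following is only a sketch of the shape implied by the diff; the class name `DocketBackgroundTasks` comes from the docstrings above, but its internals here are assumptions:

```python
# Sketch only: a dependency that returns an object with an awaitable add_task(),
# matching how api.py calls `await background_tasks.add_task(...)` in the diff.
# The real dependencies.py presumably enqueues the task on the Docket queue
# named by settings.docket_name; this stand-in just runs it inline.
from typing import Any, Callable


class DocketBackgroundTasks:
    """Assumed interface: accepts a callable and its arguments for later execution."""

    async def add_task(self, func: Callable[..., Any], *args: Any, **kwargs: Any) -> None:
        # Simplified stand-in: execute immediately instead of enqueueing via Docket.
        await func(*args, **kwargs)


def get_background_tasks() -> DocketBackgroundTasks:
    """FastAPI dependency used as `background_tasks=Depends(get_background_tasks)`."""
    return DocketBackgroundTasks()
```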

agent_memory_server/client/__init__.py

Whitespace-only changes.
