A service that provides memory management for AI applications using Redis. This server helps manage both short-term and long-term memory for AI conversations, with features like automatic topic extraction, entity recognition, and context summarization.
## Features
- **Short-term Memory Management**
  - Configurable window size for recent messages
  - Automatic context summarization using LLMs
  - Token limit management based on model capabilities
- **Long-term Memory**
  - Semantic search capabilities
  - Automatic message indexing
  - Configurable memory retention

## Configuration

The service can be configured using environment variables:

- `REDIS_URL`: URL for Redis connection (default: `redis://localhost:6379`)
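
A minimal sketch of how such a default is typically applied, assuming a standard environment-variable lookup (only `REDIS_URL` comes from the list above):

```python
import os

# Read the Redis connection URL, falling back to the documented
# default when REDIS_URL is not set in the environment.
redis_url = os.environ.get("REDIS_URL", "redis://localhost:6379")
print(redis_url)
```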
### Topic and NER Models

- Topic Extraction: Uses BERTopic with the specified model (default: Wikipedia-trained model)
- Named Entity Recognition: Uses a BERT model fine-tuned on the CoNLL-03 dataset

**Note**: Embedding operations always use OpenAI models, as Anthropic does not provide an embedding API.

## Installation

2. Open a terminal in the project root directory (where the docker-compose.yml file is located).

3. (Optional) Set up your environment variables (such as OPENAI_API_KEY and ANTHROPIC_API_KEY) either in a .env file or by modifying the docker-compose.yml as needed.

4. Build and start the containers by running:

   docker-compose up --build

5. Once the containers are up, the API will be available at http://localhost:8000. You can also access the interactive API documentation at http://localhost:8000/docs.

6. To stop the containers, press Ctrl+C in the terminal and then run:
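
Once the containers are up (step 5), a quick reachability check against the documented docs page can look like this. This is a stdlib-only sketch; the base URL and the `/docs` path are the ones mentioned in step 5, and the helper name is hypothetical:

```python
from urllib.error import URLError
from urllib.request import urlopen


def service_is_up(base_url: str = "http://localhost:8000") -> bool:
    """Return True if the memory server answers on its interactive docs page."""
    try:
        with urlopen(f"{base_url}/docs", timeout=2) as resp:
            return resp.status == 200
    except (URLError, OSError):
        return False


print(service_is_up())
```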