@@ -7,24 +7,15 @@ A Redis-powered memory server built for AI agents and applications. It manages b
 - **Working Memory**
 
   - Session-scoped storage for messages, structured memories, context, and metadata
-  - Automatically summarizes conversations when they exceed a client-configured window size
+  - Automatically summarizes conversations when they exceed a client-configured (or server-managed) window size
   - Supports all major OpenAI and Anthropic models
   - Automatic (background) promotion of structured memories to long-term storage
 
 - **Long-Term Memory**
 
   - Persistent storage for memories across sessions
-  - **Pluggable Vector Store Backends** - Support for multiple vector databases through LangChain VectorStore interface:
-    - **Redis** (default) - RedisStack with RediSearch
-    - **Chroma** - Open-source vector database
-    - **Pinecone** - Managed vector database service
-    - **Weaviate** - Open-source vector search engine
-    - **Qdrant** - Vector similarity search engine
-    - **Milvus** - Cloud-native vector database
-    - **PostgreSQL/PGVector** - PostgreSQL with vector extensions
-    - **LanceDB** - Embedded vector database
-    - **OpenSearch** - Open-source search and analytics suite
-  - Semantic search to retrieve memories with advanced filtering system
+  - Pluggable Vector Store Backends - Support for any LangChain VectorStore (defaults to Redis)
+  - Semantic search to retrieve memories with advanced filtering
   - Filter by session, user ID, namespace, topics, entities, timestamps, and more
   - Supports both exact match and semantic similarity search
   - Automatic topic modeling for stored memories with BERTopic or configured LLM
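
The working-memory behavior amended above (summarize once the conversation exceeds a client-configured or server-managed window) can be sketched roughly as follows. This is a hypothetical illustration, not the server's actual API: the `WorkingMemory` class, `window_size` parameter, and summarizer callback are all invented names.

```python
# Hypothetical sketch of summarize-on-window-overflow; names are illustrative,
# not the actual server API.

class WorkingMemory:
    def __init__(self, window_size=4, summarizer=None):
        # window_size may be supplied by the client or fall back to a
        # server-managed default, as in the change above.
        self.window_size = window_size
        # A real summarizer would call an LLM; this stub just counts messages.
        self.summarizer = summarizer or (
            lambda prev, msgs: f"[summary covering {len(msgs)} older messages]"
        )
        self.summary = None
        self.messages = []

    def append(self, role, content):
        self.messages.append({"role": role, "content": content})
        if len(self.messages) > self.window_size:
            # Fold the oldest messages into a running summary, keeping only
            # the most recent window_size messages verbatim.
            overflow = self.messages[: -self.window_size]
            self.summary = self.summarizer(self.summary, overflow)
            self.messages = self.messages[-self.window_size :]


mem = WorkingMemory(window_size=2)
for i in range(5):
    mem.append("user", f"message {i}")
print(len(mem.messages))  # 2 — only the window remains verbatim
print(mem.summary is not None)  # True — older turns were summarized
```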
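
The pluggable-backend and filtered-search bullets can be sketched with a toy store: anything exposing an interface shaped loosely like LangChain's `VectorStore` (an `add_texts` / `similarity_search` pair) can back long-term memory. Everything here is a self-contained assumption for illustration — the class name, the bag-of-words "embedding", and the exact-match `filter` semantics are not taken from the project.

```python
# Toy in-memory backend illustrating the pluggable VectorStore idea plus
# exact-match metadata filtering combined with semantic similarity.
# All names here are hypothetical, not the project's API.
import math
from collections import Counter


def embed(text):
    # Toy bag-of-words "embedding"; a real backend would call an embedding model.
    return Counter(text.lower().split())


def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


class InMemoryVectorStore:
    def __init__(self):
        self.docs = []  # (vector, text, metadata) triples

    def add_texts(self, texts, metadatas):
        for text, meta in zip(texts, metadatas):
            self.docs.append((embed(text), text, meta))

    def similarity_search(self, query, k=4, filter=None):
        # Exact-match metadata filter (session, user ID, namespace, ...)
        # narrows candidates; semantic similarity ranks what remains.
        qvec = embed(query)
        candidates = [
            (cosine(qvec, vec), text)
            for vec, text, meta in self.docs
            if not filter or all(meta.get(key) == val for key, val in filter.items())
        ]
        return [text for _, text in sorted(candidates, key=lambda p: -p[0])[:k]]


store = InMemoryVectorStore()
store.add_texts(
    ["user prefers dark mode", "meeting moved to Friday"],
    [{"user_id": "u1", "namespace": "prefs"}, {"user_id": "u2", "namespace": "cal"}],
)
print(store.similarity_search("dark mode preference", k=1, filter={"user_id": "u1"}))
# ['user prefers dark mode']
```

Swapping in Redis, Pinecone, or any other backend then only means providing another object with the same two methods.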