[Enhancement]: Add Optional MCP-Orchestrator + Hybrid RAG (Graph + Vector) Memory Layer #8129
CrazyAce25 started this conversation in Feature Requests & Suggestions · Replies: 1 comment
Been thinking about this for a while. Totally agree that vector-only RAG starts falling apart when reasoning depth increases or memory has to cross contexts. We actually implemented something pretty close to what you're describing:
it's MIT-licensed and works in production. A few upstream tool authors have even vouched for it (the Tesseract.js dev included). If you're building in this direction, ping me; happy to share the stack and real usage patterns.
What features would you like to see added?
LibreChat is already one of the most dynamic open-source frontends for agentic workflows. But to truly unlock next-generation memory, fast local deployment, and structured reasoning, I think the following additions would be useful.
They would enable scoped task memory, memory-driven toolchains, and adaptive retrieval, all without sacrificing the modularity that LibreChat users cherish.
🧠 Why This Matters
Vector-only RAG is beginning to show its limits: it struggles as reasoning depth increases and when memory has to cross contexts.
At the same time, the ecosystem is rapidly moving toward HybridRAG, structured graph memory, and memory-aware agents.
🔧 What's Being Proposed
1. ✅ Optional Native MCP-Style Orchestrator
Introduce a lightweight memory/task orchestrator (inspired by ActiveContext) that enables scoped task memory and memory-driven toolchains.
Design Principle: modular and opt-in; it can be enabled, extended, or replaced.
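To make the idea concrete, here is a minimal sketch of what task-scoped memory could look like. Every name here (`TaskOrchestrator`, `MemoryScope`, `remember`, `recall`) is hypothetical, not an existing LibreChat or MCP API; the point is only that memory is created lazily per task and never leaks across tasks.

```python
from dataclasses import dataclass, field

@dataclass
class MemoryScope:
    """Illustrative per-task memory bucket (hypothetical name)."""
    task_id: str
    entries: list = field(default_factory=list)

    def remember(self, text: str) -> None:
        self.entries.append(text)

class TaskOrchestrator:
    """Illustrative opt-in orchestrator: one scope per task."""
    def __init__(self):
        self._scopes = {}

    def scope(self, task_id: str) -> MemoryScope:
        # Scopes are created lazily, so unused tasks cost nothing.
        if task_id not in self._scopes:
            self._scopes[task_id] = MemoryScope(task_id)
        return self._scopes[task_id]

    def recall(self, task_id: str) -> list:
        # Retrieval is limited to the requesting task's scope,
        # so memories never cross task boundaries.
        scope = self._scopes.get(task_id)
        return list(scope.entries) if scope else []

orch = TaskOrchestrator()
orch.scope("ticket-42").remember("user prefers local models")
print(orch.recall("ticket-42"))  # ['user prefers local models']
print(orch.recall("ticket-99"))  # []
```

Because the orchestrator is a thin wrapper around per-task scopes, it could be enabled, extended, or swapped out without touching the rest of the pipeline.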
2. ✅ Hybrid RAG Support (Vector + Graph Retrieval)
Enable agents to retrieve memory from both a vector store and a graph store.
This unlocks adaptive retrieval that can follow relationships between memories instead of relying on embedding similarity alone.
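A toy sketch of how the two retrieval paths could be fused: cosine similarity over a tiny in-memory vector store, plus a one-hop expansion over an adjacency-list graph. The document names, the `0.5` fusion weight, and the single-hop expansion are all illustrative assumptions, not a proposed final design.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical in-memory stores standing in for the vector and graph layers.
VECTORS = {"doc_rag": [1.0, 0.0], "doc_graph": [0.6, 0.8]}
GRAPH = {"doc_rag": ["doc_graph"], "doc_graph": []}

def hybrid_retrieve(query_vec, top_k=2, boost=0.5):
    # Vector step: score every document by embedding similarity.
    scores = {d: cosine(query_vec, v) for d, v in VECTORS.items()}
    # Graph step: boost one-hop neighbours of the best vector hit,
    # so related-but-dissimilar documents can surface.
    best = max(scores, key=scores.get)
    for neighbour in GRAPH.get(best, []):
        scores[neighbour] += boost
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

print(hybrid_retrieve([1.0, 0.0]))  # ['doc_graph', 'doc_rag']
```

Note how the graph boost reorders the results: `doc_graph` scores only 0.6 on similarity but wins after the neighbour boost, which is exactly the kind of cross-context recall vector-only retrieval misses.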
🧩 Design Philosophy: Modularity First
Everything in this proposal follows a plug-and-play approach, just like LibreChat's current vector pipeline: retrieval mode is a single switch (`vectorOnly`, `graphOnly`, `hybrid`). This ensures users who want a lightweight experience stay unburdened, while power users gain the flexibility to build memory-aware agents, even in fully offline environments.
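The opt-in switch could be as small as a single config key. The key name `retrievalMode` and the default are assumptions for illustration; the only real point is that an absent key falls back to today's vector-only behaviour, so lightweight deployments are unaffected.

```python
# Hypothetical config key; `vectorOnly` matches current behaviour.
MODES = {"vectorOnly", "graphOnly", "hybrid"}

def resolve_mode(config: dict) -> str:
    """Validate the retrieval mode, defaulting to vector-only."""
    mode = config.get("retrievalMode", "vectorOnly")
    if mode not in MODES:
        raise ValueError(f"unknown retrievalMode: {mode}")
    return mode

print(resolve_mode({}))                           # vectorOnly
print(resolve_mode({"retrievalMode": "hybrid"}))  # hybrid
```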
✅ Key Benefits
This would make LibreChat the most memory-capable local agent interface available today—and future-proof it for HybridRAG workflows.
🚀 Suggested Implementation Paths
🙌 Final Thoughts
LibreChat already leads the open-source UI space for LLMs. Adding modular support for an MCP-style orchestrator and Hybrid RAG retrieval could unlock scoped task memory, memory-driven toolchains, and adaptive retrieval.
Which components are impacted by your request?
No response
Pictures
No response
Code of Conduct