Description
The repository's current RAG examples follow a linear execution path. These pipelines work well for straightforward queries, but they are not robust: if the retrieval step returns irrelevant documents, the model frequently hallucinates an answer.
Modern "agentic" workflows address this with loops and self-correction, but there is not yet a dedicated tutorial showing how to integrate LangGraph with OpenVINO.
Suggested Plan of Action
I suggest creating a new notebook called `notebooks/llm-agent-langgraph/llm-agent-langgraph.ipynb`.
The goal is to create a Self-Correcting RAG Agent that can grade the relevance of documents it retrieves and, if needed, rewrite its own search queries.
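The state such an agent carries between steps can be sketched as a typed dictionary, the pattern LangGraph commonly uses for graph state. The field names here are illustrative assumptions, not a final design:

```python
from typing import List, TypedDict

class AgentState(TypedDict):
    """Illustrative state passed between agent nodes (fields are assumptions)."""
    question: str         # original user question
    query: str            # current (possibly rewritten) search query
    documents: List[str]  # retrieved context chunks
    retries: int          # number of query rewrites so far
    answer: str           # final generated answer
```

Each node in the graph would receive this state, update the relevant fields, and pass it along.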
- State Management: Use `LangGraph` to maintain the conversation state across steps.
- Conditional Logic: Implement a "Grader" node, backed by an OpenVINO-optimized LLM, that decides whether the retrieved context is adequate.
- Cyclic Execution: If the context is insufficient, the agent loops back to rewrite the query and re-retrieve.

Proposed stack:

- Orchestration: `langgraph`, `langchain`
- Model: `Llama-3` or `Phi-3` (via `optimum-intel` / `openvino-genai`)
- Database: either `Chromadb` or `Faiss`
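The cyclic retrieve → grade → rewrite flow described above can be sketched framework-free. The stub functions below are placeholders standing in for the vector-store lookup and the OpenVINO-backed LLM nodes the notebook would actually use:

```python
# Framework-free sketch of the self-correcting RAG control flow.
# All stubs are illustrative; the notebook would replace them with
# Chroma/FAISS retrieval and OpenVINO-optimized LLM calls.
MAX_RETRIES = 2

def retrieve(query: str) -> list:
    # Stub: would query the vector store in the real notebook.
    corpus = {"openvino": ["OpenVINO optimizes inference on Intel hardware."]}
    return corpus.get(query.lower(), [])

def grade(question: str, documents: list) -> bool:
    # Stub: an LLM "Grader" node would judge relevance here.
    return len(documents) > 0

def rewrite(query: str) -> str:
    # Stub: an LLM would reformulate the search query here.
    return query.strip("?").split()[-1]

def answer_question(question: str, documents: list) -> str:
    # Stub: the OpenVINO-optimized LLM would generate the final answer.
    return documents[0] if documents else "I don't know."

def run_agent(question: str) -> str:
    query = question
    for _ in range(MAX_RETRIES + 1):
        docs = retrieve(query)
        if grade(question, docs):      # conditional edge: context adequate?
            return answer_question(question, docs)
        query = rewrite(query)         # loop back: rewrite and re-retrieve
    return "I don't know."             # give up gracefully rather than hallucinate

print(run_agent("What is OpenVINO?"))
```

In the notebook, this loop becomes a `StateGraph` with `add_conditional_edges` routing from the grader either to generation or back to the rewrite node.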
Value to Community
- Demonstrates the ability of OpenVINO-optimized models to handle complex, multi-step agentic reasoning (not just single-turn generation).
- Provides a template for developers looking to build "Compound AI Systems" on Intel hardware.
I am interested in implementing this notebook. Could you please assign this issue to me? I aim to have a Draft PR ready for review soon.