Local RAG notebook with Ollama, ChromaDB, and optional agentic workflow #3384

Open
Irfan-del-droid wants to merge 7 commits into openvinotoolkit:latest from Irfan-del-droid:feature/local-agentic-rag

Conversation

@Irfan-del-droid

This PR adds a new notebook demonstrating a minimal, fully local Retrieval-Augmented Generation (RAG) pipeline.

Key features:

  • Local LLM inference using Ollama
  • Document embedding and retrieval with ChromaDB
  • End-to-end RAG pipeline implementation
  • Optional agentic workflow using LangGraph
  • Optional OpenVINO integration for optimized inference

The notebook is designed to be:

  • CPU-friendly (no GPU required)
  • fully local (no external APIs)
  • beginner-friendly and modular

Advanced components such as the agentic workflow and OpenVINO integration are clearly marked as optional so the core walkthrough stays accessible to beginners.
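The optional agentic workflow amounts to a control loop that decides whether to retrieve more context or produce an answer. LangGraph expresses this as a state graph of nodes and edges; the sketch below shows the same decide/retrieve/answer control flow in plain Python, with all function and state names being illustrative rather than the notebook's actual code.

```python
# Sketch of the optional agentic loop: a controller inspects the state
# and picks the next action. LangGraph formalizes this as a state graph;
# plain Python is used here to show the underlying control flow.
def agent_step(state):
    # Decide the next action: retrieve until we have context, then answer.
    if not state["context"]:
        return "retrieve"
    return "answer"

def run_agent(question, retrieve, answer, max_steps=3):
    state = {"question": question, "context": [], "answer": None}
    for _ in range(max_steps):
        action = agent_step(state)
        if action == "retrieve":
            state["context"].extend(retrieve(state["question"]))
        else:
            state["answer"] = answer(state)
            break
    return state

result = run_agent(
    "What does Ollama do?",
    retrieve=lambda q: ["Ollama runs LLMs locally."],
    answer=lambda s: f"Based on {len(s['context'])} snippet(s): it runs models locally.",
)
print(result["answer"])
# → Based on 1 snippet(s): it runs models locally.
```

In the LangGraph version, `agent_step` would become a conditional edge and `retrieve`/`answer` would become graph nodes, but the loop-until-answer shape is the same.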

This contribution aims to provide a practical introduction to local-first AI workflows aligned with OpenVINO notebook standards.

Happy to refine based on feedback.

@review-notebook-app

Check out this pull request on ReviewNB

See visual diffs & provide feedback on Jupyter Notebooks.


@Irfan-del-droid
Author

Hi! This notebook focuses on a minimal local RAG pipeline with optional extensions for agentic workflows and OpenVINO optimization. Please let me know if any adjustments are needed.
