MohamedMohy10/RAG-Chatbot
📄 AI Knowledge Base Chatbot

Overview

This project is a Retrieval-Augmented Generation (RAG) AI chatbot that can answer questions over uploaded PDFs.

  • Upload multiple PDFs
  • Search across documents
  • Get answers grounded in the documents
  • Track conversation history per PDF
  • See sources for each answer

Features

  • ✅ Upload PDFs and manage multiple documents
  • ✅ Ask questions about uploaded PDFs
  • ✅ Multi-PDF chat memory
  • ✅ Source citations for each answer
  • ✅ Fully deployable with Docker

Tech Stack

  • LLM: LLaMA 3 (served locally via Ollama)
  • Framework: LangChain
  • Embeddings: HuggingFace Sentence Transformers
  • Vector DB: Chroma
  • Backend: FastAPI
  • Frontend: Streamlit
  • Deployment: Docker
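
The stack above follows the usual RAG flow: embed the PDF chunks, store them in a vector DB, retrieve the most relevant chunks for a question, and pass them to the LLM as grounding context. The sketch below illustrates only the retrieval step; it uses a toy word-count "embedding" so it is self-contained, whereas the project itself uses HuggingFace Sentence Transformers and Chroma (function names here are illustrative, not from the codebase).

```python
# Minimal sketch of the RAG retrieval step. A toy bag-of-words vector
# stands in for a real sentence-transformer embedding so the example
# runs without any dependencies.
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    # Toy "embedding": a word-count vector (stand-in for a sentence transformer).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, chunks: list[str], k: int = 1) -> list[str]:
    # Rank stored PDF chunks by similarity to the question —
    # conceptually what a Chroma similarity search does.
    q = embed(question)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

chunks = [
    "Invoices must be submitted within 30 days.",
    "The office is closed on public holidays.",
]
context = retrieve("When are invoices due for submission?", chunks)
# The retrieved chunk(s) are then fed to LLaMA 3 as grounding context,
# which is what keeps answers tied to the uploaded documents.
print(context[0])
```

In the real pipeline, LangChain wires these stages together and the retrieved chunks double as the source citations shown with each answer.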

Project Structure

rag-chatbot/
├── backend/
│   ├── backend.py
│   └── requirements.txt
├── frontend/
│   ├── frontend.py
│   └── requirements.txt
├── uploads/        # Folder for uploaded PDFs (auto-created)
├── backend.log     # Generated by logging (auto-created)
├── Dockerfile
└── README.md

Setup Locally

Backend

cd backend
pip install -r requirements.txt
uvicorn backend:app --reload   # serves the API on http://localhost:8000 by default

Frontend

cd frontend
pip install -r requirements.txt
streamlit run frontend.py   # opens the UI on http://localhost:8501 by default

Docker

docker build -t rag-chatbot .
docker run -p 8000:8000 -p 8501:8501 rag-chatbot   # 8000 = FastAPI backend, 8501 = Streamlit frontend

Usage

  1. Open the frontend in your browser
  2. Upload a PDF (or multiple PDFs)
  3. Select a PDF from the sidebar
  4. Ask questions; answers appear with chat history visible
  5. Switch PDFs anytime using the sidebar
  6. See sources for each answer

Note: Chat history is kept per PDF for the duration of the running session; it is not persisted across restarts. Backend logs are written to backend.log.
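
The per-PDF memory described above can be sketched as a simple in-memory mapping from PDF name to a list of question/answer turns. This is an assumption about the shape of the structure, not the actual code in backend.py:

```python
# Sketch of per-PDF chat memory: each uploaded PDF gets its own list of
# (question, answer) turns. In-memory only, so it lasts one session —
# matching the note above. Names here are illustrative.
from collections import defaultdict

# pdf name -> list of (question, answer) turns
chat_history: dict[str, list[tuple[str, str]]] = defaultdict(list)

def record_turn(pdf: str, question: str, answer: str) -> None:
    chat_history[pdf].append((question, answer))

def history_for(pdf: str) -> list[tuple[str, str]]:
    # Selecting a different PDF in the sidebar simply reads a different list.
    return chat_history[pdf]

record_turn("report.pdf", "What is the total?", "The total is $42.")
record_turn("manual.pdf", "How do I reset?", "Hold the button for 5 seconds.")
print(len(history_for("report.pdf")))  # each PDF keeps an independent history
```

Because the histories are keyed by PDF, switching documents in the sidebar never mixes conversations.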


Demo:

(sample demo screenshot)
