Commit 4500032

update README
1 parent b6c22d1 commit 4500032

File tree

1 file changed (+63, -33 lines)

README.md

Lines changed: 63 additions & 33 deletions
@@ -15,30 +15,52 @@ Ragadoc is a privacy-first Streamlit application that lets you chat with your do
 
 ## 🚀 Quick Start
 
-### Prerequisites
+### Model Selection Guide
 
-1. **Install Ollama** (for local AI models):
-```bash
-# macOS
-brew install ollama
-
-# Or download from https://ollama.com
-```
+Choose models based on your system capabilities:
 
-2. **Start Ollama and install required models**:
-```bash
-ollama serve
-
-# Install embedding model (required)
-ollama pull nomic-embed-text
-
-# Install a chat model (choose one)
-ollama pull qwen3:14b        # Recommended
-ollama pull llama3.1:8b      # Alternative
-ollama pull mistral:latest   # Alternative
-```
+| Model Type | Model Name | Size | RAM Required | Use Case |
+|------------|------------|------|--------------|----------|
+| **Embedding** | `nomic-embed-text` | ~274MB | 1GB | **Recommended** - General purpose |
+| **Embedding** | `all-minilm` | ~23MB | 512MB | Lightweight alternative |
+| **Chat** | `qwen3:14b` | ~8.5GB | 16GB | **Recommended** - Large model |
+| **Chat** | `llama3.1:8b` | ~4.7GB | 8GB | Balanced option |
+| **Chat** | `mistral:latest` | ~4.1GB | 8GB | Quick responses |
+| **Chat** | `phi3:mini` | ~2.3GB | 4GB | Low-resource systems |
+
+> **Recommendation**: Use `nomic-embed-text` for embeddings and `qwen3:14b` for chat if your system supports it. For lower-spec systems, try `mistral:latest` or `phi3:mini`.
+
+### Prerequisites (Required for Both Installation Methods)
+
+**1. Install Ollama** (for local AI models):
+```bash
+# macOS
+brew install ollama
+
+# Or download from https://ollama.com
+```
+
+**2. Start Ollama and install required models**:
+```bash
+ollama serve
+
+# Install embedding model (required)
+ollama pull nomic-embed-text
+
+# Install a chat model (see recommendations above)
+ollama pull qwen3:14b
+```
+
+### Installation Options
+
+Choose your preferred installation method:
 
-### Installation
+### Option 1: Direct Installation
+
+**Additional Prerequisites:**
+- Python 3.8+
+
+**Installation Steps:**
 
 1. **Clone the repository**:
 ```bash
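
For the lower-spec rows in the new model table, a minimal sketch of checking available memory and pulling the lighter alternatives (model names are taken from the table; the commands assume Ollama is installed and `ollama serve` is running):

```bash
# Check available RAM before choosing a chat model
free -h               # Linux
sysctl hw.memsize     # macOS (prints total memory in bytes)

# Pull the lightweight alternatives listed in the table
ollama pull all-minilm
ollama pull phi3:mini

# Confirm which models are installed locally
ollama list
```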
@@ -64,6 +86,26 @@ Ragadoc is a privacy-first Streamlit application that lets you chat with your do
 
 4. **Open your browser** to `http://localhost:8501`
 
+### Option 2: Docker Installation
+
+**Additional Prerequisites:**
+- Docker and Docker Compose
+
+**Installation Steps:**
+
+1. **Clone the repository**:
+```bash
+git clone https://github.com/yourusername/ragadoc.git
+cd ragadoc
+```
+
+2. **Start with Docker Compose**:
+```bash
+docker-compose up
+```
+
+3. **Open your browser** to `http://localhost:8501`
+
 ## 📖 How to Use
 
 1. **Upload a PDF** - Drag and drop or browse for your document
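
As a usage note on the Docker option above: `docker-compose up` runs in the foreground. Assuming the repository's `docker-compose.yml` exposes the Streamlit app on port 8501 as step 3 implies, the stack can also be run detached:

```bash
# Start the containers in the background
docker-compose up -d

# Follow the application logs
docker-compose logs -f

# Stop and remove the containers when done
docker-compose down
```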
@@ -116,19 +158,7 @@ Configure in the sidebar:
 - **Chunk Overlap**: Text overlap between chunks (default: 50)
 - **Top-K Results**: Number of relevant chunks to consider (default: 5)
 
-## 🐳 Docker Support
-
-Run Ragadoc in Docker:
-
-```bash
-# Build the image
-docker-compose build
-
-# Start the application
-docker-compose up
-```
 
-Access at `http://localhost:8501`
 
 ## 🧪 Testing
