Commit 515aa0c

Author: marwan37 (committed)
add images for README
1 parent a3be487 · commit 515aa0c

File tree

3 files changed: +17 −9 lines changed


omni-reader/README.md

Lines changed: 17 additions & 9 deletions
````diff
@@ -20,23 +20,18 @@ pip install -r requirements.txt
 
 ### Configuration
 
-1. Ensure Ollama is running with the required models:
+1. Ensure any Ollama models you want to use are pulled, e.g.:
 
 ```bash
-# For using the default Qwen2 model
 ollama pull llama3.2-vision:11b
 ollama pull gemma3:12b
-
-# If using other Ollama models in your config, pull those as well
-# ollama pull llama3:70b
-# ollama pull dolphin-mixtral:8x7b
 ```
 
 2. Set the following environment variables:
 
 ```bash
-MISTRAL_API_KEY=your_mistral_api_key
 OPENAI_API_KEY=your_openai_api_key
+MISTRAL_API_KEY=your_mistral_api_key
 ```
 
 ## 📌 Usage
````
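The two environment variables touched by this hunk could be read at startup along these lines (a minimal sketch, not omni-reader's actual code; the `require_env` helper is hypothetical):

```python
import os


def require_env(name: str) -> str:
    """Fetch a required environment variable or fail with a clear message."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value


# Demo values mirroring the README's placeholders; real keys would
# normally be exported in the shell before launching the app.
os.environ.setdefault("OPENAI_API_KEY", "your_openai_api_key")
os.environ.setdefault("MISTRAL_API_KEY", "your_mistral_api_key")

openai_key = require_env("OPENAI_API_KEY")
mistral_key = require_env("MISTRAL_API_KEY")
```

Failing fast like this surfaces a missing key before any OCR request is made.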
````diff
@@ -66,9 +61,9 @@ models:
   custom_prompt: null # Optional custom prompt for all models
   # Either specify individual models (for backward compatibility)
   model1: "llama3.2-vision:11b" # First model for comparison
-  model2: "gemma3:12b" # Second model for comparison
+  model2: "mistral/pixtral-12b-2409" # Second model for comparison
   # Or specify multiple models as a list (new approach)
-  models: ["llama3.2-vision:11b", "gemma3:12b"]
+  models: ["llama3.2-vision:11b", "mistral/pixtral-12b-2409"]
   ground_truth_model: "gpt-4o-mini" # Model to use for ground truth when source is "openai"
 
   # Ground truth configuration
````
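The `model1`/`model2` versus `models` duality described in the config comments could be normalized along these lines (a sketch under the stated assumptions; `resolve_models` is a hypothetical helper, not omni-reader's implementation):

```python
def resolve_models(cfg: dict) -> list[str]:
    """Return the list of OCR models, supporting both config styles."""
    if cfg.get("models"):
        # New list-based style takes precedence.
        return list(cfg["models"])
    # Backward-compatible style: individual model1/model2 keys.
    return [cfg[key] for key in ("model1", "model2") if cfg.get(key)]


# Mirrors the YAML fragment from the hunk above.
cfg = {
    "model1": "llama3.2-vision:11b",
    "model2": "mistral/pixtral-12b-2409",
    "models": ["llama3.2-vision:11b", "mistral/pixtral-12b-2409"],
}
selected = resolve_models(cfg)
```

With both styles present, the list wins; with only the legacy keys, the helper falls back to them in order.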
````diff
@@ -130,6 +125,19 @@ For interactive use, the project includes a Streamlit app:
 streamlit run app.py
 ```
 
+### Remote Artifact Storage or Running Remotely
+
+For remote artifact storage or running remotely, install the ZenML integrations for your cloud provider.
+
+For example, for AWS, install the AWS integration:
+
+```bash
+zenml integration install aws -y
+zenml integration install s3 -y
+```
+
+And ensure your stack has a remote artifact store.
+
 ## 📋 Pipeline Architecture
 
 The OCR comparison pipeline consists of the following components:
````
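The remote-stack step added in this hunk could be completed roughly as follows, based on ZenML's standard CLI (the store name `s3_store`, stack name `remote_stack`, and bucket path are placeholders, not values from this repo; consult the ZenML docs for your provider):

```shell
# Register an S3 artifact store (bucket path is a placeholder)
zenml artifact-store register s3_store --flavor=s3 --path=s3://your-bucket

# Create a stack that uses it alongside the default orchestrator,
# and make it the active stack
zenml stack register remote_stack -o default -a s3_store --set
```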
[binary image added, 385 KB]

omni-reader/assets/streamlit.png

[binary image added, 1.27 MB]
