
Commit ec1b26c

docs: add ollama as embedding providers (#1656)
1 parent 5ceeebf commit ec1b26c

File tree: 1 file changed, +16 -0 lines changed


docs/docs/ops/litellm.md

Lines changed: 16 additions & 0 deletions
@@ -107,6 +107,22 @@ The connector automatically creates the appropriate `vector(N)` column. See the

Below are common providers with their model strings and configuration. The `litellm` module is re-exported from `cocoindex.ops.litellm` for setting provider-specific variables. See the [LiteLLM embedding docs](https://docs.litellm.ai/docs/embedding/supported_embedding) for the full list.
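For example, a provider-level setting such as `api_base` is a plain attribute on the `litellm` module; the sketch below (illustrative only, not taken from this page) assigns it directly before any embedding call:

```python
# Minimal sketch: provider-level variables are plain attributes on the litellm module.
# `api_base` is a standard LiteLLM setting; pointing it at Ollama's default local
# endpoint is just an example value, not something this doc prescribes.
import litellm

litellm.api_base = "http://localhost:11434"
```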

### Ollama
| Model                  | Model string                    |
|------------------------|---------------------------------|
| Nomic Embed Text       | `ollama/nomic-embed-text`       |
| MXBai Embed Large      | `ollama/mxbai-embed-large`      |
| All MiniLM             | `ollama/all-minilm`             |
| Snowflake Arctic Embed | `ollama/snowflake-arctic-embed` |
| BGE M3                 | `ollama/bge-m3`                 |

No API key required. Ollama must be running locally (default `http://localhost:11434`). Pull the model first with `ollama pull <model-name>`.
```python
# Assumes `LiteLLMEmbedder` is imported from cocoindex's LiteLLM integration;
# no API key is needed, only a locally running Ollama server.
embedder = LiteLLMEmbedder("ollama/nomic-embed-text", api_base="http://localhost:11434")
```
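For reference, a hedged sketch of the direct LiteLLM call that the `ollama/...` model string maps to, assuming `nomic-embed-text` has already been pulled and the Ollama server is listening on its default port:

```python
# Hedged sketch using LiteLLM directly rather than the wrapper shown above.
# Assumes `ollama pull nomic-embed-text` has been run and Ollama is serving locally.
import litellm

response = litellm.embedding(
    model="ollama/nomic-embed-text",
    input=["CocoIndex builds incremental indexing pipelines."],
    api_base="http://localhost:11434",
)

# LiteLLM returns an OpenAI-style embedding response.
vector = response.data[0]["embedding"]
print(len(vector))  # dimensionality of the embedding vector
```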
### OpenAI

| Model | Model string |
