Add remote client to langchain_ollama #24497
antoninoLorenzo announced in Ideas
Feature request
In the Ollama classes in langchain_ollama there is currently no way to set a remote client. For example, in OllamaEmbeddings: instead of calling ollama.embedding(...) directly, there should be a configured ollama.Client; the default endpoint would still be localhost, but for remote Ollama inference one could then do something like OllamaEmbeddings(model='...', host='...').
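A minimal sketch of what this could look like, assuming a hypothetical `host` parameter on the wrapper (it does not exist in langchain_ollama today) and the existing `ollama.Client` from the ollama Python package; model name and endpoint URL are placeholders:

```python
import ollama
from langchain_ollama import OllamaEmbeddings

# Default behaviour stays the same: the wrapper talks to a local Ollama server.
local_embeddings = OllamaEmbeddings(model="nomic-embed-text")

# Proposed: point the wrapper at a remote Ollama instance instead.
remote_embeddings = OllamaEmbeddings(
    model="nomic-embed-text",
    host="http://ollama.example.internal:11434",  # hypothetical parameter and endpoint
)
vectors = remote_embeddings.embed_documents(["hello world"])

# Internally, the wrapper would hold a configured ollama.Client rather than
# calling module-level functions, so the endpoint can be overridden.
client = ollama.Client(host="http://ollama.example.internal:11434")
response = client.embeddings(model="nomic-embed-text", prompt="hello world")
```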
Motivation
This functionality would make the Ollama wrappers more flexible and would improve the library experience for open-source solutions.
Proposal (If applicable)
No response