[feature request] Allow passing custom HTTP headers to langchain_ollama.ChatOllama(...) (and pass ChatOllama(api_key=) to the Authorization: Bearer ... header)
#30643
Closed
vadimkantorov
announced in Ideas
Replies: 1 comment
-
While ChatOllama doesn't have a direct headers or api_key parameter, it does accept a client_kwargs dict that is forwarded to the underlying ollama client. To add an authorization header, you would pass a dictionary to client_kwargs.
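For example (a minimal sketch, assuming a reverse proxy that checks the token; the proxy URL and key are placeholders):

```python
from langchain_ollama import ChatOllama

my_custom_api_key = "..."  # placeholder: the token your proxy checks

llm = ChatOllama(
    model="llama3",
    base_url="https://my-ollama-proxy.example.com",  # placeholder proxy URL
    # client_kwargs is forwarded to the underlying ollama Client/AsyncClient,
    # which accepts a headers= keyword and passes it on to httpx
    client_kwargs={"headers": {"Authorization": f"Bearer {my_custom_api_key}"}},
)
```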
-
Feature request
Classes like ChatOpenAI/ChatGoogleGenerativeAI support accepting an api_key= constructor argument and pass it into the appropriate HTTP header. The vanilla Ollama server currently does not support checking an API key from HTTP headers, but people still roll custom nginx wrapper proxies.
I propose to still allow passing an extra headers= dict argument (allowing one to pass in dict(Authorization=f"Bearer {my_custom_api_key}")).
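The proposed usage would look roughly like this (hypothetical; headers= does not exist on ChatOllama today):

```python
from langchain_ollama import ChatOllama

my_custom_api_key = "..."  # placeholder

# headers= is the suggested new argument, not an existing one
llm = ChatOllama(
    model="llama3",
    base_url="https://ollama.example.com",
    headers=dict(Authorization=f"Bearer {my_custom_api_key}"),
)
```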
I noticed that ChatOllama(...) does not throw if I pass in api_key=, but it appears that it is simply ignored and is not passed to any HTTP header: https://github.com/langchain-ai/langchain/blob/master/libs/partners/ollama/langchain_ollama/chat_models.py

For comparison, for Mistral, the Authorization header is constructed from self.mistral_api_key: https://github.com/langchain-ai/langchain/blob/master/libs/partners/mistralai/langchain_mistralai/chat_models.py#L526-L536

Ollama's Client class actually supports a headers= argument: https://github.com/ollama/ollama-python/blob/main/ollama/_client.py#L73

So ChatOllama could just enable passing it through to the underlying Client/AsyncClient in _set_clients.
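A rough sketch of what that passthrough could look like (hypothetical and simplified; self.headers here is the proposed new attribute, not an existing one):

```python
from ollama import AsyncClient, Client

class ChatOllama:  # sketch of the relevant method only
    def _set_clients(self) -> None:
        # Forward the proposed headers dict to both clients; ollama's
        # Client/AsyncClient already accept a headers= keyword (see link above)
        self._client = Client(host=self.base_url, headers=self.headers)
        self._async_client = AsyncClient(host=self.base_url, headers=self.headers)
```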
Thanks!
Motivation
Basic protection of a publicly exposed Ollama server
Proposal (If applicable)
No response