Connecting to custom endpoint (LiteLLM) via plugins fails with 401 error #6282
sanjeevkumarraob started this conversation in Help Wanted
Hi All,
I am using LiteLLM as a proxy to connect to Claude in AWS Bedrock. I have a custom secure tool to connect to other apps in my company. Earlier we were on LibreChat 0.7.4, and we are now in the process of upgrading. I have seen that direct Bedrock connectivity and LiteLLM can be configured as endpoints, but I don't want to use that approach. I want the request to go through the tool first, and then, if the request needs more information, have it connect to the proxy to get the data.

With the new langchain/openai package versions, this is not working. I tried setting the base URL as described in https://python.langchain.com/api_reference/openai/chat_models/langchain_openai.chat_models.base.ChatOpenAI.html#langchain_openai.chat_models.base.ChatOpenAI, but the requests still fail with a 401 error.
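For reference, this is roughly how I am constructing the client. The proxy URL, virtual key, and model alias below are placeholders for our internal values:

```python
from langchain_openai import ChatOpenAI

# Placeholders for illustration:
# - "http://litellm-proxy.internal:4000" stands in for our LiteLLM proxy URL
# - "sk-litellm-virtual-key" stands in for the LiteLLM virtual key
# - "bedrock-claude" stands in for the model alias configured in LiteLLM
llm = ChatOpenAI(
    model="bedrock-claude",
    base_url="http://litellm-proxy.internal:4000",  # LiteLLM's OpenAI-compatible endpoint
    api_key="sk-litellm-virtual-key",               # sent as "Authorization: Bearer <key>"
)

response = llm.invoke("Hello from behind the proxy")
print(response.content)
```

My guess is that the 401 means the key is not reaching the proxy when the call goes through our tool, but I have not been able to confirm this.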
Has anyone tried this approach or run into similar errors? How can I resolve this issue?
TIA