Use LiteLLM as a proxy for all providers #9499
Unanswered
menardorama asked this question in Q&A
Hi,
I am deploying lots of isolated LibreChat instances, and having to log in to each LLM provider is getting quite annoying.
We are already using LiteLLM for LLM API access so that we can plug in other tools; for instance, we use wildcard routing (https://docs.litellm.ai/docs/wildcard_routing).
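For context, the LiteLLM side looks roughly like this (a sketch only; the providers and environment variable names are just examples following the wildcard routing docs):

```yaml
# LiteLLM proxy config.yaml (sketch): wildcard routing per provider,
# so any model under a provider prefix is forwarded without listing each one.
model_list:
  - model_name: "openai/*"        # matches e.g. openai/gpt-5
    litellm_params:
      model: "openai/*"
      api_key: os.environ/OPENAI_API_KEY
  - model_name: "anthropic/*"     # matches any anthropic/... model
    litellm_params:
      model: "anthropic/*"
      api_key: os.environ/ANTHROPIC_API_KEY
```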
As LibreChat is configured today, I need to define each provider's config as environment variables for the legacy endpoints (OpenAI, Anthropic...) and as custom endpoints for the others (Mistral, Groq...).
The thing is, using wildcard routing implies that instead of defining, for instance, gpt-5, we have to define openai/gpt-5; this breaks the default naming of what is displayed in LibreChat.
I saw that I might be able to define the models using modelSpecs, but that doesn't sound easy to me.
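If I read the docs correctly, the modelSpecs route would mean maintaining an entry per model in librechat.yaml, something like this (a sketch; I haven't verified every field, and "LiteLLM" stands for whatever name the custom endpoint is given):

```yaml
# librechat.yaml (sketch): modelSpecs to give the prefixed model names
# friendly display labels; field names as I understand the LibreChat docs.
modelSpecs:
  list:
    - name: "gpt-5-litellm"      # internal identifier (assumed unique)
      label: "GPT-5"             # what users would see in the model menu
      preset:
        endpoint: "LiteLLM"      # name of the custom endpoint (assumed)
        model: "openai/gpt-5"    # the wildcard-routed model name
```

Doing that for every model across every provider is what sounds tedious to me.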
My idea is to proxy nearly everything through LiteLLM so as to gather LLM usage at a single point and hopefully get more precise consumption metrics (especially where billing is involved), while keeping the per-provider endpoint categories (in the menu, with the icons).
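The closest I can get today is a single custom endpoint pointed at the proxy, which loses the per-provider grouping and icons (again a sketch; the baseURL, key variable, and version are placeholders for our deployment):

```yaml
# librechat.yaml (sketch): one custom endpoint routing everything through
# the LiteLLM proxy; hostname and key variable are placeholders.
version: 1.2.1
endpoints:
  custom:
    - name: "LiteLLM"
      apiKey: "${LITELLM_MASTER_KEY}"
      baseURL: "http://litellm:4000/v1"  # LiteLLM's OpenAI-compatible API
      models:
        default: ["openai/gpt-5"]
        fetch: true                      # pull the model list from the proxy
      titleConvo: true
      titleModel: "current_model"
      modelDisplayLabel: "LiteLLM"
```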
So my question is: has anybody already done this? Can you share your librechat.yaml?
If it's not possible, @danny-avila, does it make sense to add it to the roadmap?
Thanks
Replies: 1 comment

@danny-avila do you have any idea?