Rate Limit Fallback for OpenWebUI compatibility #14066
Unanswered · nicolasj92 asked this question in Q&A
Hi,
I am using LiteLLM as the router, API key provider, and rate limiter behind an OpenWebUI chat interface.
It works pretty well, but one odd behavior is that when the API key rate limit configured in LiteLLM is reached, OpenWebUI just hangs. I think it either cannot process the 429 error or the error is not transmitted in streaming mode.
Is there some way to just return a default output that informs the user that the rate limit was reached?
Any ideas are appreciated.
Thanks!
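One possible direction (a minimal sketch, not a confirmed LiteLLM or OpenWebUI feature): intercept the rate-limit error at the point where the completion call is made and substitute a friendly default message, so the UI always receives renderable text instead of an unhandled 429. The `RateLimitError` class and `call_llm` callable below are hypothetical stand-ins for whatever client and exception your setup actually uses.

```python
# Hypothetical sketch: wrap a chat-completion call so that a 429-style
# rate-limit error yields a default message instead of hanging the UI.

class RateLimitError(Exception):
    """Stand-in for the 429 error raised by the proxy client (hypothetical)."""

FALLBACK_MESSAGE = (
    "Rate limit reached for your API key. "
    "Please wait a moment and try again."
)

def complete_with_fallback(call_llm, prompt):
    """Return the model's reply, or the fallback text on a rate-limit error."""
    try:
        return call_llm(prompt)
    except RateLimitError:
        return FALLBACK_MESSAGE

# Usage: simulate a backend that is currently rate limited.
def rate_limited_backend(prompt):
    raise RateLimitError("429 Too Many Requests")

print(complete_with_fallback(rate_limited_backend, "hello"))
```

In a streaming setup the same idea would apply, except the fallback string would be yielded as a single streamed chunk so the client's stream handler still completes normally.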