LiteLLM + Ollama + lobe-chat implementation issues. #1345
adan89lion asked this question in Q&A (Unanswered, 0 replies)
Hi,
I'm trying to use lobe-chat as a frontend for Ollama, proxied via LiteLLM. However, I've been encountering a 500 Internal Server Error.
The following are my docker-compose and config files:
LiteLLM Docker Compose File
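(A minimal sketch of such a compose file, following the LiteLLM proxy docs; the image tag, config path, and port here are illustrative assumptions, not the exact file from this setup.)

```yaml
# Hypothetical minimal LiteLLM proxy service; image tag, config path,
# and port 4000 are assumptions based on the LiteLLM documentation.
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    volumes:
      # Mount the proxy config (see the config sketch below) into the container
      - ./litellm-config.yaml:/app/config.yaml
    command: ["--config", "/app/config.yaml", "--port", "4000"]
    ports:
      - "4000:4000"
```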
LiteLLM Config File
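(Likewise as a sketch: a LiteLLM config that routes to Ollama generally takes this shape; the model name and the `api_base` hostname are illustrative assumptions.)

```yaml
# Sketch of a LiteLLM proxy config routing to an Ollama backend.
# "llama2" and the "ollama" hostname are placeholders for illustration.
model_list:
  - model_name: llama2
    litellm_params:
      model: ollama/llama2
      # Use the Ollama container's hostname when both services share a Docker network
      api_base: http://ollama:11434
```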
lobe-chat Docker Compose File
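(And a rough lobe-chat compose sketch pointing at the LiteLLM proxy through `OPENAI_PROXY_URL`; the API key value and the `litellm` hostname are placeholders.)

```yaml
# Hypothetical lobe-chat service routed through the LiteLLM proxy above.
services:
  lobe-chat:
    image: lobehub/lobe-chat
    environment:
      # LiteLLM accepts any key unless a master key is configured; placeholder value
      - OPENAI_API_KEY=sk-placeholder
      # Send lobe-chat's OpenAI-compatible requests to the LiteLLM proxy
      - OPENAI_PROXY_URL=http://litellm:4000/v1
    ports:
      - "3210:3210"
```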
Accessing the model list produces the following error for LiteLLM:

And lobe-chat returned the following error: