How to get a simple setup running? (It crashes for me with "Router.atext_completion()" error) #9726
I am trying to get LiteLLM proxy working. Here is my setup:
Running
Additionally, the docker logs show me:
I am unsure what this "prompt" parameter is. The request body should be a normal OpenAI API request. Does anyone have an idea what's wrong here?
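For context on the "prompt" parameter: the `/completions` route is the legacy text-completion endpoint and expects a top-level `prompt` field, whereas chat-style bodies carry `messages`. A minimal sketch of the kind of request that reproduces the error, where the proxy address, key, and model alias are assumed placeholders and not the original setup:

```python
import requests

# Hypothetical reproduction of the failing call; "localhost:4000", "sk-1234",
# and "gpt-4o" are placeholders, not the original poster's configuration.
resp = requests.post(
    "http://localhost:4000/completions",  # legacy text-completion endpoint
    headers={"Authorization": "Bearer sk-1234"},
    json={
        "model": "gpt-4o",
        # A chat-style body carries "messages"; the /completions route expects
        # a top-level "prompt" field instead, which matches the error in the logs.
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
print(resp.status_code, resp.text)
```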
Answered by SwiftedMind on Apr 3, 2025
I am stupid, I have to use `chat/completions`, not `completions`. Sorry! It works now 🎉
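For reference, a minimal sketch of the working call, assuming the proxy listens on localhost:4000 with master key sk-1234 and a model alias gpt-4o (all placeholders; substitute your own values):

```python
from openai import OpenAI

# Point the standard OpenAI client at the LiteLLM proxy.
client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

# The client targets /chat/completions, which is the endpoint that accepts
# chat-style request bodies with "messages".
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello through the LiteLLM proxy"}],
)
print(response.choices[0].message.content)
```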