- hey @phasmax how would you want to specify this?
- @krrishdholakia Hi, I may have missed this in the docs, but can we also pass the request message from a client like OpenWebUI into the "prompt_variables" setting dynamically? I didn't find a way to do that with the LiteLLM proxy. Also, could we route the result to another LLM model before sending the response back to the client? A sketch of the first part is below.
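A minimal sketch of the first ask, assuming the LiteLLM proxy runs at localhost:4000 and a Langfuse-managed prompt sits behind the model name "langfuse-model" (both placeholders). Forwarding the request message into "prompt_variables" via the OpenAI client's extra_body is an assumption here, not confirmed proxy behavior; an automatic mapping is exactly what this comment is asking about.

```python
from openai import OpenAI

# Placeholder proxy address and key; adjust to your LiteLLM deployment.
client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

# The message a client such as OpenWebUI would send.
user_message = "Summarize our latest release notes."

response = client.chat.completions.create(
    model="langfuse-model",  # placeholder model_name from the proxy config
    messages=[{"role": "user", "content": user_message}],
    # Assumption: forwarding the request message into prompt_variables
    # explicitly, since the proxy does not obviously do this on its own.
    extra_body={"prompt_variables": {"user_message": user_message}},
)
print(response.choices[0].message.content)
```

extra_body is the OpenAI SDK's standard escape hatch for provider-specific parameters, so this pattern degrades gracefully if the proxy ignores the field.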
- Langfuse prompt management (prompt retrieval by ID) works well. The only thing missing is the ability to also specify the prompt label, i.e. in Langfuse you can have "production" and "stage" labels for versions of a prompt.
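For reference, a sketch of how this could look. Retrieval by ID with prompt_variables is what this comment says works today; the prompt_label argument is the hypothetical addition being requested, not an existing LiteLLM parameter, and the exact kwarg names should be checked against the LiteLLM docs for your version.

```python
import litellm

response = litellm.completion(
    # Assumed spelling of LiteLLM's Langfuse prompt-management route.
    model="langfuse/gpt-4o",
    prompt_id="my-prompt",                 # retrieval by ID, which works today
    prompt_variables={"name": "example"},  # placeholder variables
    # Hypothetical: the missing knob this comment requests, pinning the
    # fetched prompt to a Langfuse version label such as "production" or "stage".
    prompt_label="production",
    messages=[{"role": "user", "content": "hi"}],
)
```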