Custom API URLs & Langfuse Support #212
Closed
mrjasonroy started this conversation in Ideas
Replies: 2 comments
-
We have an OpenAI-compatible provider option, which supports LiteLLM.
-
https://v4.ai-sdk.dev/providers/observability/langfuse#langfuse-observability
-
Many companies (and individuals) use proxies like LiteLLM or Kong in front of LLMs to control spend, monitor usage, and meet compliance requirements. These typically work by exposing a custom base API URL and API keys.
Proposal:
Companies that don't use a proxy will often use Langfuse, which writes to standard OpenTelemetry endpoints, to see how their tools are being used; this is fairly simple to set up using Next.js's observability options.
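The Langfuse route above could be wired up in a Next.js `instrumentation.ts` file. This is only a sketch following the linked AI SDK guide; the package names (`@vercel/otel`, `langfuse-vercel`), the `LangfuseExporter` API, and the service name are assumptions to verify against the current Langfuse docs:

```typescript
// instrumentation.ts (Next.js) — a sketch, assuming the langfuse-vercel
// exporter described in the linked AI SDK guide; verify package names
// and options against the current Langfuse documentation.
import { registerOTel } from "@vercel/otel";
import { LangfuseExporter } from "langfuse-vercel";

export function register() {
  registerOTel({
    serviceName: "my-app", // hypothetical service name
    // The exporter is typically configured through environment variables
    // such as LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_BASEURL.
    traceExporter: new LangfuseExporter(),
  });
}
```

Individual AI SDK calls would then opt in to tracing (per the linked guide) with `experimental_telemetry: { isEnabled: true }`.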
I am proposing adding both options that are configurable through environment variables to further enhance the platform.
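To illustrate the environment-variable approach, a small helper could resolve the proxy settings and fall back to the public OpenAI endpoint when no proxy is configured. The variable names `LLM_BASE_URL` and `LLM_API_KEY` are hypothetical, not an existing convention of the project:

```typescript
// Sketch: resolve provider settings from environment variables.
// LLM_BASE_URL / LLM_API_KEY are hypothetical names for illustration.
export interface ProviderSettings {
  baseURL: string;
  apiKey: string | undefined;
}

export function resolveProviderSettings(
  env: Record<string, string | undefined> = process.env,
): ProviderSettings {
  return {
    // Fall back to the public OpenAI endpoint when no proxy is set.
    baseURL: env.LLM_BASE_URL ?? "https://api.openai.com/v1",
    apiKey: env.LLM_API_KEY ?? env.OPENAI_API_KEY,
  };
}
```

The resolved settings could then be passed to an OpenAI-compatible client, e.g. `createOpenAI({ baseURL, apiKey })` from `@ai-sdk/openai`, so the same code targets either the real API or a LiteLLM/Kong proxy.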