Replies: 2 comments 2 replies
-
Is a Langflow MCP server like any other MCP server? If so, it would just be a tool for a model that supports MCP servers, right? (OpenAI/Anthropic for now, I think.)
1 reply
-
Not sure. What's the API spec? @flefevre
1 reply
-
Hi LiteLLM team,
I wanted to ask whether LiteLLM currently supports referencing or routing requests to agents built with Langflow's agent system or its MCP server.
Specifically:
Is it possible to integrate a Langflow agent endpoint as a "model" in LiteLLM's configuration?
Similarly, can LiteLLM call a Langflow MCP server endpoint in the same way it would call OpenAI or other custom endpoints?
If not currently supported, would this kind of integration fall within the project's scope?
The goal would be to use Langflow-constructed agents as "models" that LiteLLM can route to based on model name or use them in conjunction with LiteLLM’s proxy features.
If it is already possible, could you share some tips, or better yet, a tutorial in the documentation?
Thanks for your time, and I look forward to your thoughts on this!
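For context, here is a rough sketch of what such a configuration might look like in a LiteLLM proxy `config.yaml`, assuming the Langflow agent were exposed behind an OpenAI-compatible chat-completions endpoint (the URL, model name, and key below are hypothetical placeholders, and whether Langflow actually exposes such an endpoint is exactly the open question):

```yaml
# Hypothetical sketch: register a Langflow agent as a "model" in LiteLLM's
# proxy config, treating it as a custom OpenAI-compatible endpoint.
model_list:
  - model_name: langflow-agent          # name clients use when calling the proxy
    litellm_params:
      model: openai/langflow-agent      # "openai/" prefix = OpenAI-compatible custom endpoint
      api_base: http://localhost:7860/v1  # placeholder Langflow URL, not a real endpoint
      api_key: "dummy-key"              # placeholder; depends on Langflow's auth
```

If Langflow's API is not OpenAI-compatible, a thin translation shim (or a LiteLLM custom provider) would presumably be needed in between.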
Best,
François