Dynamically discover tools and optimized tool calling #10814
Replies: 1 comment
This discussion was automatically closed because the community moved to community.vercel.com/ai-sdk
So MCP servers are getting a lot of heat for overloading contexts with tool schemas and confusing LLMs. Anthropic suggests an optimized tool-calling approach ("Code execution with MCP: Building more efficient agents") that reduces cost and increases efficiency. That said, it only works with Anthropic models, is still tied to MCP servers, and is architecturally quite complex.
But it inspired me to this idea: https://github.com/christianalfoni/ai-code-tools.
What if we could keep the existing API for tools? Instead of MCP servers, we just do what we normally do: integrate with APIs for our exact needs, and the LLM automatically discovers and then efficiently and safely executes them.
In my mind this reduces complexity, drastically simplifies the model, and gives back more control over what the LLM is allowed to do. Just provide isolated tools for whatever local/external data you want.
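To make the idea concrete, here is a minimal sketch of the two-phase pattern: the model first sees only a compact catalog of tool names and one-line descriptions, and the full JSON schema is loaded only for the tool it actually picks. All names here (`ToolRegistry`, `discover`, `describe`, `execute`) are my own assumptions for illustration, not APIs from the AI SDK or the linked repo:

```typescript
// Hypothetical registry that exposes tools to the model in two phases
// instead of inlining every JSON schema into the prompt up front.

type ToolEntry = {
  name: string;
  description: string;             // short blurb shown during discovery
  schema: Record<string, unknown>; // full JSON schema, fetched on demand
  run: (args: any) => Promise<unknown>;
};

class ToolRegistry {
  private tools = new Map<string, ToolEntry>();

  register(entry: ToolEntry) {
    this.tools.set(entry.name, entry);
  }

  // Phase 1: a compact catalog (name + one-liner) keeps the context
  // small no matter how many tools are registered.
  discover(): { name: string; description: string }[] {
    return [...this.tools.values()].map(({ name, description }) => ({
      name,
      description,
    }));
  }

  // Phase 2: the full schema is returned only for the chosen tool.
  describe(name: string) {
    const t = this.tools.get(name);
    if (!t) throw new Error(`Unknown tool: ${name}`);
    return { name: t.name, schema: t.schema };
  }

  // Execution stays inside the tool's own isolated function, so the
  // LLM can only reach what we explicitly wired up.
  async execute(name: string, args: unknown) {
    const t = this.tools.get(name);
    if (!t) throw new Error(`Unknown tool: ${name}`);
    return t.run(args);
  }
}

// Usage: an isolated tool wrapping exactly the API access we allow.
const registry = new ToolRegistry();
registry.register({
  name: "getWeather",
  description: "Current weather for a city",
  schema: { type: "object", properties: { city: { type: "string" } } },
  run: async ({ city }) => ({ city, tempC: 21 }), // stubbed, no real API call
});
```

The point of the design is that adding a hundredth tool costs the prompt one extra catalog line, not one extra full schema, while execution remains gated through functions we wrote ourselves.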
Just curious what people think about this.