Replies: 1 comment
Response from ADK Answering Agent (experimental, answer may be inaccurate)

TLDR: You can use an …

Hello! That's an excellent question regarding token optimization for tool responses. You are correct that the default behavior in ADK is to serialize the entire object returned by a tool and include it in the … While I couldn't find a specific ADK schema that defines … Using …
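One possible way to trim MCP tool output before it reaches the model is a response-rewriting callback. The sketch below is only an illustration: it assumes ADK's `after_tool_callback` hook with its documented signature, and assumes the tool response arrives as the `model_dump()` dictionary described in the question below; the `drop_structured_content` function is hypothetical, not an existing API.

```python
# Sketch only: assumes the MCP tool response reaches this callback as a plain dict.
from typing import Any, Dict, Optional

from google.adk.agents import LlmAgent
from google.adk.tools.base_tool import BaseTool
from google.adk.tools.tool_context import ToolContext


def drop_structured_content(
    tool: BaseTool,
    args: Dict[str, Any],
    tool_context: ToolContext,
    tool_response: Dict[str, Any],
) -> Optional[Dict[str, Any]]:
    """Hypothetical callback: keep only `content` when both representations are present."""
    if (
        isinstance(tool_response, dict)
        and "content" in tool_response
        and "structuredContent" in tool_response
    ):
        trimmed = dict(tool_response)
        trimmed.pop("structuredContent", None)
        return trimmed  # returning a dict replaces the original tool response
    return None  # returning None keeps the original response unchanged


agent = LlmAgent(
    name="mcp_agent",
    model="gemini-2.0-flash",
    # tools=[...],  # MCP toolset would be attached here
    after_tool_callback=drop_structured_content,
)
```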
When an MCP tool returns both `content` and `structuredContent`, the current implementation passes the entire response object to the LLM context. This happens because the current MCP tool implementation appears to convert the full response via `model_dump()` and pass the resulting dictionary directly to `FunctionResponse` without any filtering, resulting in unnecessary token consumption (potentially doubling the context used for each tool response).

Questions for discussion:
- Should only `content` be passed to the LLM by default?
- If `structuredContent` is needed, should including it be opt-in?

Would appreciate thoughts on the intended behavior here.
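To make the duplication concrete, here is a minimal sketch using the MCP Python SDK's types; the `strip_structured_content` helper is hypothetical and only illustrates the kind of filtering being asked about, not an existing ADK or MCP API.

```python
# Minimal sketch of the duplication, using mcp.types from the MCP Python SDK.
from mcp.types import CallToolResult, TextContent

result = CallToolResult(
    content=[TextContent(type="text", text='{"temp_c": 21, "condition": "sunny"}')],
    structuredContent={"temp_c": 21, "condition": "sunny"},
)

# Roughly what gets serialized today: both representations of the same data.
full_payload = result.model_dump(exclude_none=True)


def strip_structured_content(payload: dict) -> dict:
    """Hypothetical filter: keep only `content` when both fields carry the same data."""
    trimmed = dict(payload)
    trimmed.pop("structuredContent", None)
    return trimmed


# Roughly half the tokens for this response once the duplicate field is dropped.
filtered_payload = strip_structured_content(full_payload)
print(len(str(full_payload)), len(str(filtered_payload)))
```

The sketch keeps `content` only to mirror the first question above; whether the default should instead keep `structuredContent`, or make either one opt-in, is exactly what is being asked.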