**Please make sure you read the contribution guide and file the issues in the right place.**
Contribution guide.
Describe the bug
The `FunctionTool` parameter filtering logic incorrectly filters out parameters for functions that use `**kwargs`, causing CrewAI-style tools to receive empty parameter dictionaries. This breaks integration with CrewAI tools that use the `*args, **kwargs` pattern.
To Reproduce
Please share a minimal code and data to reproduce your problem.
Steps to reproduce the behavior:
"""Main agent for the speaker research system following ADK multi-agent patterns."""
from google.adk.agents import LlmAgent
from google.adk.tools import FunctionTool
from google.adk.tools.crewai_tool import CrewaiTool
from crewai_tools import SerperDevTool
# Instantiate the CrewAI tool
serper_tool_instance = SerperDevTool()
sweb_search_tool = CrewaiTool(
tool=serper_tool_instance,
name='WebSearch',
description='Searches the web for information.'
)
# Main root agent that coordinates the entire system
root_agent = LlmAgent(
model="gemini-2.5-flash",
name='research_agent',
description='A specialized agent for researching and evaluating.',
instruction='''You are a research agent. When you need to search for information use WebSearch tool.'',
tools=[web_search_tool]
)
In src/google/adk/tools/function_tool.py, line 109, the parameter filtering logic:

args_to_call = {k: v for k, v in args_to_call.items() if k in valid_params}

only allows parameters whose names match the function's signature. For a function defined with `*args, **kwargs`, the valid parameter names are `{"self", "args", "kwargs"}`, so concrete parameters such as `search_query` get filtered out before the call.
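For context, here is a minimal, self-contained illustration of that filtering behavior (the `run` function and the values below are illustrative, not taken from the ADK or CrewAI source):

```python
import inspect

# A CrewAI-style entry point: concrete arguments arrive via **kwargs.
def run(self, *args, **kwargs):
    return kwargs.get("search_query")

# Mirror of the filter quoted above: keep only keys that appear
# verbatim among the signature's parameter names.
valid_params = set(inspect.signature(run).parameters)
args_to_call = {"search_query": "ADK function tools"}
filtered = {k: v for k, v in args_to_call.items() if k in valid_params}

print(valid_params)  # {'self', 'args', 'kwargs'}
print(filtered)      # {} -- 'search_query' is dropped before the call
```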
Expected behavior
A clear and concise description of what you expected to happen.
Functions that accept `**kwargs` should receive all provided parameters (except `self` and `tool_context`) in their `kwargs` dictionary.
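One possible adjustment, sketched here as a standalone helper rather than a patch against function_tool.py (the name `filter_args` and the excluded names are assumptions based on the behavior described above):

```python
import inspect

def filter_args(func, args_to_call):
    """Sketch: only filter by name when the callable has no **kwargs catch-all."""
    params = inspect.signature(func).parameters
    has_var_keyword = any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
    )
    if has_var_keyword:
        # The callable can absorb arbitrary keyword arguments, so pass
        # everything through except the framework-managed names.
        return {
            k: v for k, v in args_to_call.items()
            if k not in ('self', 'tool_context')
        }
    # Current behavior: keep only keys matching signature parameter names.
    return {k: v for k, v in args_to_call.items() if k in params}
```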
Desktop (please complete the following information):
- OS: macOS
- Python version: 3.13
- ADK version: 1.15.1 (current main branch)
Model Information:
- Are you using LiteLLM: No
- Which model is being used: `gemini-2.5-flash`