Description
Describe the bug
I am trying to connect ADK to a self-hosted vLLM server. Following the guide, this goes through LiteLLM. Everything looks good, except that the remote MCP server does not work.
I checked the Events tab in the dev UI: ADK recognizes how many tools belong to the MCP server, and the model even replies that it is using the tools to do something, but it never actually does.
I have tested many kinds of models, such as Llama 4 and Granite; none of them can run the MCP tools.
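For reference, `mcp_toolsets` is built roughly like the sketch below. The URL is a placeholder, and the connection-params class name varies across ADK versions (older releases expose `SseServerParams`; newer ones use `SseConnectionParams`/`StreamableHTTPConnectionParams`):

```python
from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset, SseServerParams

# Placeholder URL; the real server is a remote MCP endpoint (SSE transport here).
mcp_toolsets = [
    MCPToolset(
        connection_params=SseServerParams(url="https://my-mcp-host.example.com/sse"),
    )
]
```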
When I use LiteLLM, the model does not call the tools:
```python
from google.adk.agents import LlmAgent
from google.adk.models.lite_llm import LiteLlm

root_agent = LlmAgent(
    model=LiteLlm(
        model=MODEL_ID,
        api_base=MODEL_API,
        extra_headers={"Authorization": f"Bearer {MODEL_TOKEN}"},
    ),
    name="root_agent",
    instruction=system_instruction,
    tools=mcp_toolsets,
)
```

When I use Gemini instead, everything works fine:
```python
from google.adk.agents import Agent

root_agent = Agent(
    model='gemini-3-pro-preview',
    name='root_agent',
    instruction=system_instruction,
    tools=mcp_toolsets,
)
```

To Reproduce
Steps to reproduce the behavior:
- Use LiteLLM to connect to a self-hosted vLLM server
- Add a remote MCP server (streamable HTTP or SSE transport)
- Ask the model to fetch some information via the MCP tools (a scripted version of this check is sketched below)
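Outside the dev UI, the same check can be scripted. A minimal sketch, assuming an `InMemoryRunner` around the `root_agent` from above (session handling simplified; `event.get_function_calls()` is the documented way to inspect tool calls on an event):

```python
import asyncio

from google.adk.runners import InMemoryRunner
from google.genai import types


async def main():
    runner = InMemoryRunner(agent=root_agent, app_name="mcp-repro")
    session = await runner.session_service.create_session(
        app_name="mcp-repro", user_id="u1"
    )
    message = types.Content(
        role="user",
        parts=[types.Part(text="Use the MCP tools to look something up.")],
    )
    saw_tool_call = False
    async for event in runner.run_async(
        user_id="u1", session_id=session.id, new_message=message
    ):
        # With the LiteLLM/vLLM agent this list stays empty; with Gemini it doesn't.
        if event.get_function_calls():
            saw_tool_call = True
            print("function calls:", event.get_function_calls())
    print("tool call observed:", saw_tool_call)


asyncio.run(main())
```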
Expected behavior
The model should actually invoke the MCP tools and return real results, just as it does when the agent runs on Gemini.
Desktop (please complete the following information):
- OS: [e.g. macOS, Linux, Windows]
- Python version (`python -V`):
- ADK version (`pip show google-adk`):
Model Information:
- Are you using LiteLLM: Yes
- Which model is being used (e.g. gemini-2.5-pro): Llama 4 and Granite, self-hosted via vLLM; also tested gemini-3-pro-preview, which works
Additional context
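For completeness, the constants referenced in the first snippet have roughly this shape. All values below are illustrative placeholders, not my real endpoint or token; the `hosted_vllm/` prefix is LiteLLM's provider prefix for vLLM's OpenAI-compatible server:

```python
# Illustrative placeholders only; the real endpoint, token, and model differ.
MODEL_ID = "hosted_vllm/meta-llama/Llama-4-Scout-17B-16E-Instruct"
MODEL_API = "https://my-vllm-host.example.com/v1"
MODEL_TOKEN = "sk-..."  # bearer token for the self-hosted server
system_instruction = "You are a helpful assistant. Use the MCP tools when needed."
```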