Description
Bug Summary
I am unable to call tools on my Virtual Server when following the official langchain_mcp_adapters documentation:
https://docs.langchain.com/oss/python/langchain/mcp#model-context-protocol-mcp
Affected Component
Select the area of the project impacted:
- mcpgateway - API
- mcpgateway - UI (admin panel)
- mcpgateway.wrapper - stdio wrapper
- Federation or Transports
- CLI, Makefiles, or shell scripts
- Container setup (Docker/Podman/Compose)
- Other (explain below)
Steps to Reproduce
Environment:
- fastmcp 2.11.3
- langchain-mcp-adapters 0.1.14
- langgraph 1.0.2
The issue can be reproduced by following the steps below:
- Create a basic MCP server using FastMCP:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather", port=8010, host="0.0.0.0")

@mcp.tool()
async def get_weather(location: str) -> dict:
    """Get weather for location."""
    return {"result": "It's always sunny in New York"}

if __name__ == "__main__":
    mcp.run(transport="streamable-http")
```
- Register this MCP server with the gateway and create a public Virtual Server exposing this tool, which yields the server URL:
http://{docker_hostname}:4444/servers/4fec000a5ec2456aae2c71e0f2920311/mcp
- Use the Virtual Server URL in LangChain:

```python
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain.agents import create_agent

async def main():
    client = MultiServerMCPClient(
        {
            "gateway": {
                "transport": "streamable_http",
                "url": f"http://{docker_hostname}:4444/servers/4fec000a5ec2456aae2c71e0f2920311/mcp",
                "headers": {
                    "Authorization": f"Bearer {jwt_token}"
                },
            }
        }
    )
    tools = await client.get_tools()
    agent = create_agent(llm, tools)
    response = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "what is the weather in nyc?"}]}
    )
    print(response)

asyncio.run(main())
```
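One difference worth noting between the two setups: the gateway does not expose the tool under its original name. In the failing trace below, the model addresses functions.weather-get-weather rather than get_weather. The naming rule sketched here is an assumption inferred from that trace, not documented gateway behavior:

```python
def gateway_tool_name(server_slug: str, tool_name: str) -> str:
    """Assumed mcpgateway naming scheme (inferred from the trace, where
    get_weather on the server "weather" shows up as weather-get-weather):
    hyphen-join the server slug and the tool name, with underscores in
    the tool name normalized to hyphens."""
    return f"{server_slug}-{tool_name.replace('_', '-')}"

print(gateway_tool_name("weather", "get_weather"))
# weather-get-weather
```

If the model mishandles hyphenated tool names, this renaming alone could explain why the direct URL works while the gateway URL does not.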
After running, I got the following response. The tool_calls were never actually executed, and no ToolMessage was generated:

```
{'messages': [HumanMessage(content='what is the weather in nyc?', additional_kwargs={}, response_metadata={}, id='3962d25f-21a3-48d9-8fca-c864aaf6389f'), AIMessage(content='<|start|>assistant<|channel|>commentary to=functions.weather-get-weather <|constrain|>json<|message|>{\n  "location": "New York City"\n}<|call|>', additional_kwargs={}, response_metadata={'token_usage': {'prompt_tokens': 127, 'total_tokens': 189, 'completion_tokens': 62, 'completion_tokens_details': None, 'prompt_tokens_details': None}, 'model': 'Turbo', 'finish_reason': 'FinishReason.tool_calls'}, id='lc_run--05316aa0-997d-45c6-818e-f08a83c988b8-0', usage_metadata={'input_tokens': 127, 'output_tokens': 62, 'total_tokens': 189})]}
```
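The content string above is raw model channel markup: the tool call is present as text, but it was never parsed into AIMessage.tool_calls. A quick sanity check on the captured string confirms this (the regex here is purely illustrative, not part of any library):

```python
import json
import re

# Content string captured from the failing AIMessage above.
raw = ('<|start|>assistant<|channel|>commentary to=functions.weather-get-weather '
       '<|constrain|>json<|message|>{\n  "location": "New York City"\n}<|call|>')

# Pull out the target tool name and the JSON arguments embedded in the markup.
m = re.search(r'to=functions\.([\w-]+).*?<\|message\|>(\{.*\})<\|call\|>', raw, re.DOTALL)
name = m.group(1)
args = json.loads(m.group(2))
print(name, args)
# weather-get-weather {'location': 'New York City'}
```

So the model did emit a well-formed call, targeting the hyphenated gateway name; the breakdown is in turning that into an executed tool call.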
At the same time, if I directly use the FastMCP URL http://{my_ip}:8010/mcp, I get the following output:

```
{'messages': [HumanMessage(content='what is the weather in nyc?', additional_kwargs={}, response_metadata={}, id='2326dd2d-f265-49cd-b54a-2772b7a2c04e'), AIMessage(content='', additional_kwargs={}, response_metadata={'token_usage': {'prompt_tokens': 126, 'total_tokens': 193, 'completion_tokens': 67, 'completion_tokens_details': None, 'prompt_tokens_details': None}, 'model': 'Turbo', 'finish_reason': 'FinishReason.tool_calls'}, id='lc_run--9dc34bf9-87c1-459f-9f5b-7e720088943f-0', tool_calls=[{'name': 'get_weather', 'args': {'location': 'New York City'}, 'id': 'call_c895fc2f70bd4795af8b0e5c', 'type': 'tool_call'}], usage_metadata={'input_tokens': 126, 'output_tokens': 67, 'total_tokens': 193}), ToolMessage(content='{\n  "result": "It\'s always sunny in New York"\n}', name='get_weather', id='287dcb07-11fb-4a54-9c58-1ab384e262d3', tool_call_id='call_c895fc2f70bd4795af8b0e5c'), AIMessage(content='The current weather in New York City is: **"It\'s always sunny in New York."**', additional_kwargs={}, response_metadata={'token_usage': {'prompt_tokens': 176, 'total_tokens': 247, 'completion_tokens': 71, 'completion_tokens_details': None, 'prompt_tokens_details': None}, 'model': 'Turbo', 'finish_reason': 'FinishReason.stop'}, id='lc_run--2f28eaa1-896e-4074-96c4-42b5f64e0d19-0', usage_metadata={'input_tokens': 176, 'output_tokens': 71, 'total_tokens': 247})]}
```
As you can see, the tool is called successfully via the original MCP URL, but not via the Virtual Server MCP URL. Please help resolve this issue.
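For reference, the ToolMessage content in the working trace matches what the tool coroutine returns when invoked directly, with no transport in between (re-declaring the function from the FastMCP server above):

```python
import asyncio

async def get_weather(location: str) -> dict:
    """Get weather for location (copied from the FastMCP server above)."""
    return {"result": "It's always sunny in New York"}

print(asyncio.run(get_weather("nyc")))
# {'result': "It's always sunny in New York"}
```

This rules out the tool logic itself; the failure is somewhere between the gateway's Virtual Server endpoint and the LangChain client.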
Expected Behavior
What should have happened instead?
LangChain should be able to call tools normally via the Virtual Server MCP URL.
Logs / Error Output
Environment Info
You can retrieve most of this from the /version endpoint.
| Key | Value |
|---|---|
| Version or commit | v0.9.0 and v0.8.0 |
| Runtime | Python 3.12 |
| Platform / OS | Windows |
| Container | Docker |
Additional Context (optional)
- The gateway Docker container runs with `--network=host`
- I can successfully call the tool from the admin UI via MCP Servers > Test