
create_agent did not remove the reasoning from the message list #33834

@angular-moon

Checked other resources

  • This is a bug, not a usage question.
  • I added a clear and descriptive title that summarizes this issue.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
  • This is not related to the langchain-community package.
  • I read what a minimal reproducible example is (https://stackoverflow.com/help/minimal-reproducible-example).
  • I posted a self-contained, minimal, reproducible example. A maintainer can copy it and run it AS IS.

Example Code

from langchain.agents import create_agent
from langchain.chat_models import init_chat_model
from langchain_core.messages import HumanMessage

# Enable thinking on the model
llm = init_chat_model(
    openai_api_key=Config.MODEL_API_KEY,
    model=Config.MODEL_NAME or "doubao-seed-1-6-251015",
    model_provider="openai",
    base_url=Config.MODEL_API_BASE_URL,
    temperature=0,
    extra_body={"thinking": {"type": "enabled"}},
    stream_usage=True,
    use_responses_api=True,
)


def get_weather(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"


weather_agent = create_agent(
    model=llm,
    tools=[get_weather],
    system_prompt="You are a helpful assistant",
)

"""
BadRequestError: Error code: 400 - {'error': {'code': 'InvalidParameter', 'message': 'The parameter `input` specified in the request are not valid: item reasoning is not supported. Request id: 02176234456560087423b8d6cd1ec64c1b9b82c48fdb45880e407', 'param': 'input', 'type': 'BadRequest'}}
During task with name 'model' and id '50decdd0-1d8c-d2ba-bd63-ba6d7fa8df3b'
"""
result = weather_agent.invoke(
    input={"messages": [HumanMessage(content="What's the weather like in San Francisco?")]}
)

for msg in result["messages"]:
    print(msg.type, msg.content_blocks)
    print("\n")

Error Message and Stack Trace (if applicable)

Cell In[9], line 25
11 weather_agent = create_agent(
12     model=llm,
13     tools=[get_weather],
14     system_prompt="You are a helpful assistant",
15 )
17 # result = llm.invoke("What's the weather like in San Francisco?")
18
19 # print(result);
(...) 23
24 # Run the agent
---> 25 result = weather_agent.invoke(
26     input={"messages": [HumanMessage(content="What's the weather like in San Francisco?")]}
27 )
29 for msg in result["messages"]:
30     print(msg.type, msg.content_blocks)

File c:\Users\Administrator\Desktop\langchain_demo.venv\Lib\site-packages\langgraph\pregel\main.py:3094, in Pregel.invoke(self, input, config, context, stream_mode, print_mode, output_keys, interrupt_before, interrupt_after, durability, **kwargs)
3091 chunks: list[dict[str, Any] | Any] = []
3092 interrupts: list[Interrupt] = []
-> 3094 for chunk in self.stream(
3095     input,
3096     config,
...
1049 break
1051 assert response is not None, "could not resolve response (should never happen)"
BadRequestError: Error code: 400 - {
    'error': {
        'code': 'InvalidParameter',
        'message': 'The parameter input specified in the request are not valid: item reasoning is not supported. Request id: 02176234456560087423b8d6cd1ec64c1b9b82c48fdb45880e407',
        'param': 'input',
        'type': 'BadRequest'
    }
}
During task with name 'model' and id '50decdd0-1d8c-d2ba-bd63-ba6d7fa8df3b'

Description

The agent created with create_agent raises a BadRequestError on invoke: the reasoning item kept in the message list is sent back in the request `input`, and this provider rejects it ("item reasoning is not supported").
Expectation: create_agent should handle the message list properly and automatically strip reasoning items before calling the LLM API.
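A possible interim workaround (a minimal sketch, assuming reasoning is surfaced as content blocks of type "reasoning" on AI messages, as the content_blocks output above suggests; the strip_reasoning helper below is hypothetical and would still need to be wired in before each model call, e.g. via whatever pre-model hook or middleware create_agent exposes):

from langchain_core.messages import AIMessage, AnyMessage


def strip_reasoning(messages: list[AnyMessage]) -> list[AnyMessage]:
    """Drop reasoning content blocks from AI messages so providers that
    reject reasoning input items do not receive them back."""
    cleaned: list[AnyMessage] = []
    for msg in messages:
        if isinstance(msg, AIMessage) and isinstance(msg.content, list):
            blocks = [
                block
                for block in msg.content
                if not (isinstance(block, dict) and block.get("type") == "reasoning")
            ]
            msg = msg.model_copy(update={"content": blocks})
        cleaned.append(msg)
    return cleaned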

System Info

System Information

OS: Windows
OS Version: 10.0.26200
Python Version: 3.13.1 (main, Jan 14 2025, 22:47:35) [MSC v.1942 64 bit (AMD64)]

Package Information

langchain_core: 1.0.2
langchain: 1.0.3
langsmith: 0.4.38
langchain_openai: 1.0.1
langgraph_sdk: 0.2.9

Optional packages not installed

langserve

Other Dependencies

claude-agent-sdk: Installed. No version info available.
httpx: 0.28.1
jsonpatch: 1.33
langchain-anthropic: Installed. No version info available.
langchain-aws: Installed. No version info available.
langchain-community: Installed. No version info available.
langchain-deepseek: Installed. No version info available.
langchain-fireworks: Installed. No version info available.
langchain-google-genai: Installed. No version info available.
langchain-google-vertexai: Installed. No version info available.
langchain-groq: Installed. No version info available.
langchain-huggingface: Installed. No version info available.
langchain-mistralai: Installed. No version info available.
langchain-ollama: Installed. No version info available.
langchain-perplexity: Installed. No version info available.
langchain-together: Installed. No version info available.
langchain-xai: Installed. No version info available.
langgraph: 1.0.2
langsmith-pyo3: Installed. No version info available.
openai: 2.6.1
openai-agents: Installed. No version info available.
opentelemetry-api: Installed. No version info available.
opentelemetry-exporter-otlp-proto-http: Installed. No version info available.
opentelemetry-sdk: Installed. No version info available.
orjson: 3.11.4
packaging: 25.0
pydantic: 2.12.3
pytest: Installed. No version info available.
pyyaml: 6.0.3
requests: 2.32.5
requests-toolbelt: 1.0.0
rich: Installed. No version info available.
tenacity: 9.1.2
tiktoken: 0.12.0
typing-extensions: 4.15.0
vcrpy: Installed. No version info available.
zstandard: 0.25.0
