Deepseek-v3.2 throws an error when running react_agent with extra_body={"thinking": {"type": "enabled"}}. #6521

@heyboy905

Description

Checked other resources

  • This is a bug, not a usage question. For questions, please use the LangChain Forum (https://forum.langchain.com/).
  • I added a clear and detailed title that summarizes the issue.
  • I read what a minimal reproducible example is (https://stackoverflow.com/help/minimal-reproducible-example).
  • I included a self-contained, minimal example that demonstrates the issue INCLUDING all the relevant imports. The code runs AS IS to reproduce the issue.

Example Code

from langchain_openai import ChatOpenAI

def make_llm():
    api_key = "xxx"
    api_base = "xxx"
    model = "deepseek/deepseek-v3.2"
    return ChatOpenAI(
        base_url=api_base,
        api_key=api_key,
        model=model,
        extra_body={"thinking": {"type": "enabled"}},
    )
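For context, a minimal sketch of how an OpenAI-compatible client typically merges `extra_body` into the final chat-completions request payload (the `build_payload` helper and its behavior are illustrative assumptions, not the actual `openai` library internals). This shows where the `thinking` field ends up in the JSON the server receives:

```python
import json

# Hypothetical sketch: extra_body keys are merged into the top level
# of the chat-completions request body, alongside model and messages.
def build_payload(model, messages, extra_body=None):
    payload = {"model": model, "messages": messages}
    if extra_body:
        payload.update(extra_body)  # "thinking" lands at the top level
    return payload

payload = build_payload(
    "deepseek/deepseek-v3.2",
    [{"role": "user", "content": "hi"}],
    extra_body={"thinking": {"type": "enabled"}},
)
print(json.dumps(payload, indent=2))
```

If the serving backend does not recognize a top-level `thinking` field on this model, a 400 like the one above is a plausible result.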

Error Message and Stack Trace (if applicable)

2025-12-01 21:51:30 - INFO - logger_v2 - aid=- wsid=- - [Step 2] ────────────────────────────────────────
2025-12-01 21:51:30 - INFO - logger_v2 - aid=- wsid=- - [RunLoop] ❌ Error: thread_id=test_session_100142336476783, error=Error code: 400 - {'message': 'unknown error in the model inference server trace_id: 86c6abebf34bcbaae44608dac30d9121', 'type': 'api_error'}
Traceback (most recent call last):
  File "e:\multi-turn-agent\agent\react_agent.py", line 599, in _execute_with_langsmith_tracking
    return await self._process_agent_stream(initial, step_count)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\multi-turn-agent\agent\react_agent.py", line 634, in _process_agent_stream
    async for event in self._agent.astream(initial, config=self.get_fresh_config()):
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\langgraph\pregel\main.py", line 2939, in astream
    async for _ in runner.atick(
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\langgraph\pregel\_runner.py", line 295, in atick
    await arun_with_retry(
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\langgraph\pregel\_retry.py", line 137, in arun_with_retry
    return await task.proc.ainvoke(task.input, config)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\langgraph\_internal\_runnable.py", line 706, in ainvoke
    input = await asyncio.create_task(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\langgraph\_internal\_runnable.py", line 465, in ainvoke
    ret = await asyncio.create_task(coro, context=context)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\langgraph\prebuilt\chat_agent_executor.py", line 655, in acall_model
    response = cast(AIMessage, await static_model.ainvoke(model_input, config))  # type: ignore[union-attr]
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\langchain_core\runnables\base.py", line 3290, in ainvoke
    input_ = await coro_with_context(part(), context, create_task=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\langchain_core\runnables\base.py", line 5723, in ainvoke
    return await self.bound.ainvoke(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\langchain_core\language_models\chat_models.py", line 417, in ainvoke
    llm_result = await self.agenerate_prompt(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\langchain_core\language_models\chat_models.py", line 1034, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\langchain_core\language_models\chat_models.py", line 992, in agenerate
    raise exceptions[0]
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\langchain_core\language_models\chat_models.py", line 1162, in _agenerate_with_cache
    result = await self._agenerate(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\langchain_openai\chat_models\base.py", line 824, in _agenerate
    response = await self.async_client.create(**payload)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\openai\resources\chat\completions\completions.py", line 2028, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\openai\_base_client.py", line 1748, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\anaconda\envs\multi_turn_agent\Lib\site-packages\openai\_base_client.py", line 1555, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'message': 'unknown error in the model inference server trace_id: 86c6abebf34bcbaae44608dac30d9121', 'type': 'api_error'}
During task with name 'agent' and id 'b56f4fa6-3273-8be6-3c92-f7e01e1707c1'

Description

DeepSeek-V3.2 returns a 400 error when run through react_agent with extra_body={"thinking": {"type": "enabled"}}.

Package versions:

langchain==0.3.27
langchain-community==0.3.27
langchain-core==0.3.74
langchain-openai==0.2.9
langchain-text-splitters==0.3.9
langgraph==0.6.4
langgraph-checkpoint==2.1.1
langgraph-prebuilt==0.6.4
langgraph-sdk==0.2.0
langsmith==0.4.13
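One possible mitigation while the server-side error is unresolved is to drop the `thinking` field and retry when the request is rejected. The sketch below is hypothetical: `BadRequestMock` stands in for `openai.BadRequestError`, and `fake_send` simulates a backend that rejects payloads containing `thinking`.

```python
# Hypothetical workaround sketch, not LangChain API: retry a model call
# without the "thinking" field when the server rejects it.
class BadRequestMock(Exception):
    """Stand-in for openai.BadRequestError in this sketch."""

def call_with_thinking_fallback(send, payload):
    """Call send(payload); on a 400-style error, retry without 'thinking'."""
    try:
        return send(payload)
    except BadRequestMock:
        fallback = {k: v for k, v in payload.items() if k != "thinking"}
        return send(fallback)

# Fake server that rejects any payload containing "thinking".
def fake_send(payload):
    if "thinking" in payload:
        raise BadRequestMock("unknown error in the model inference server")
    return {"ok": True, "echo": payload}

resp = call_with_thinking_fallback(
    fake_send,
    {"model": "deepseek/deepseek-v3.2", "thinking": {"type": "enabled"}},
)
```

In a real agent this would wrap the `ChatOpenAI` invocation; the trade-off is that the retry silently loses the thinking behavior.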

System Info

python react_agent.py

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working), pending (awaiting review/confirmation by maintainer)
