
langchain-core 1.2.14 stream_mode == "messages" tool args parse failed #35395

@zxs-learn

Checked other resources

  • This is a bug, not a usage question.
  • I added a clear and descriptive title that summarizes this issue.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
  • This is not related to the langchain-community package.
  • I posted a self-contained, minimal, reproducible example. A maintainer can copy it and run it AS IS.

Package (Required)

  • langchain
  • langchain-openai
  • langchain-anthropic
  • langchain-classic
  • langchain-core
  • langchain-model-profiles
  • langchain-tests
  • langchain-text-splitters
  • langchain-chroma
  • langchain-deepseek
  • langchain-exa
  • langchain-fireworks
  • langchain-groq
  • langchain-huggingface
  • langchain-mistralai
  • langchain-nomic
  • langchain-ollama
  • langchain-openrouter
  • langchain-perplexity
  • langchain-qdrant
  • langchain-xai
  • Other / not sure / general

Related Issues / PRs

No response

Reproduction Steps / Example Code (Python)

from typing import Any
import asyncio
import os
from langchain.agents import create_agent
from langchain.messages import AIMessage, AIMessageChunk, AnyMessage, ToolMessage
from langchain_openai import ChatOpenAI

from dotenv import find_dotenv, load_dotenv

from langchain.tools import tool


load_dotenv(find_dotenv())

@tool(description="useful for when you want to get the weather in a given city")
def get_weather(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"


def _render_message_chunk(token: AIMessageChunk) -> None:
    if token.text:
        print(token.text, end="|")

    if token.tool_call_chunks:
        print(token.tool_call_chunks)


def _render_completed_message(message: AnyMessage) -> None:
    if isinstance(message, AIMessage) and message.tool_calls:
        print(f"\nTool calls: {message.tool_calls}")

    if isinstance(message, ToolMessage):
        print(f"\nTool response: {message.content_blocks}")

llm = ChatOpenAI(
    model=os.getenv("DEFAULT_MODEL", ""),
    api_key=os.getenv("OPENAI_API_KEY", ""),
    base_url=os.getenv("OPENAI_API_BASE", ""),
)


async def main():
    agent = create_agent(
        llm,
        system_prompt="You are a helpful assistant that can get the weather in a given city.",
        tools=[get_weather],
    )

    input_message = {
        "role": "user",
        "content": "What is the weather in Boston?",
    }

    async for stream_mode, data in agent.astream(
        {"messages": [input_message]},
        stream_mode=["messages", "updates"],
    ):
        if stream_mode == "messages":
            token, metadata = data
            if isinstance(token, AIMessageChunk):
                _render_message_chunk(token)

        if stream_mode == "updates":
            for source, update in data.items():
                if source in ("model", "tools"):
                    _render_completed_message(update["messages"][-1])


if __name__ == "__main__":
    asyncio.run(main())

Error Message and Stack Trace (if applicable)

With langchain-core == 1.2.13 the output is correct:
[{'name': 'get_weather', 'args': '', 'id': 'call_8786ef62dbe741afae2953', 'index': 0, 'type': 'tool_call_chunk'}]
[{'name': '', 'args': '{"city": "', 'id': 'call_8786ef62dbe741afae2953', 'index': 0, 'type': 'tool_call_chunk'}]
[{'name': None, 'args': 'Boston"}', 'id': '', 'index': 0, 'type': 'tool_call_chunk'}]

Tool calls: [{'name': 'get_weather', 'args': {'city': 'Boston'}, 'id': 'call_8786ef62dbe741afae2953', 'type': 'tool_call'}]

Tool response: [{'type': 'text', 'text': "It's always sunny in Boston!"}]
Actually|, Boston|'s| weather varies| throughout the year—it|'s not always sunny!| The city experiences all| four seasons, with cold|, snowy winters and| warm, humid summers|. If you'd like| current conditions (temperature|, cloud cover, precipitation|, etc.), I can| fetch the real-time weather for| you. Would you like that|?|

but with langchain-core == 1.2.14 it returns:

[{'name': 'get_weather', 'args': '', 'id': 'call_69347ed8e4234f42a9eba6', 'index': 0, 'type': 'tool_call_chunk'}]
[{'name': '', 'args': '{"city": "', 'id': 'call_69347ed8e4234f42a9eba6', 'index': 0, 'type': 'tool_call_chunk'}]
[{'name': None, 'args': 'Boston"}', 'id': '', 'index': 0, 'type': 'tool_call_chunk'}]

Tool calls: [{'name': 'get_weather', 'args': {'city': 'Boston'}, 'id': 'call_69347ed8e4234f42a9eba6', 'type': 'tool_call'}]

Tool response: [{'type': 'text', 'text': "It's always sunny in Boston!"}]
Actually|, Boston|'s| weather varies| throughout the year—it|'s not always sunny!| It experiences all four| seasons, with cold|, snowy winters and| warm, humid summers|. If you'd like| current conditions (temperature|, cloud cover, precipitation|, etc.), I can| fetch the latest weather for| Boston right now. Would| you like that?
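For context, a minimal pure-Python sketch of how streamed tool-call chunks like the ones above are expected to accumulate into a parsed tool call. This is not langchain-core's actual implementation; `merge_tool_call_chunks` is a hypothetical helper that mimics the intended behavior (concatenate `args` fragments per `index`, then parse the JSON):

```python
import json

def merge_tool_call_chunks(chunks):
    """Accumulate streamed tool_call_chunk dicts by index, then parse the args JSON."""
    acc = {}
    for c in chunks:
        slot = acc.setdefault(c["index"], {"name": "", "args": "", "id": ""})
        # name/id arrive only on some chunks; args fragments are concatenated
        if c.get("name"):
            slot["name"] += c["name"]
        if c.get("args"):
            slot["args"] += c["args"]
        if c.get("id"):
            slot["id"] = c["id"]
    return [
        {"name": s["name"], "args": json.loads(s["args"]), "id": s["id"], "type": "tool_call"}
        for s in acc.values()
    ]

# The three chunks from the output above (ids shortened)
chunks = [
    {"name": "get_weather", "args": "", "id": "call_1", "index": 0, "type": "tool_call_chunk"},
    {"name": "", "args": '{"city": "', "id": "call_1", "index": 0, "type": "tool_call_chunk"},
    {"name": None, "args": 'Boston"}', "id": "", "index": 0, "type": "tool_call_chunk"},
]
print(merge_tool_call_chunks(chunks))
# [{'name': 'get_weather', 'args': {'city': 'Boston'}, 'id': 'call_1', 'type': 'tool_call'}]
```

If 1.2.14 fails to produce this merged result from the same chunks, the regression is in the accumulation/parsing step rather than in what the model streams.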

Description

(Same output as in the section above: correct tool-call parsing on langchain-core 1.2.13, broken on 1.2.14.)

By the way, I am using Qwen-series models (qwen-plus, qwen3-max, etc.).

System Info

python -m langchain_core.sys_info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 25.2.0: Tue Nov 18 21:09:40 PST 2025; root:xnu-12377.61.12~1/RELEASE_ARM64_T6000
Python Version: 3.11.13 (main, Sep 18 2025, 19:53:58) [Clang 20.1.4 ]

Package Information

langchain_core: 1.2.13
langchain: 1.2.10
langsmith: 0.7.6
langchain_openai: 1.1.10
langgraph_sdk: 0.3.8

Optional packages not installed

langserve

Other Dependencies

httpx: 0.28.1
jsonpatch: 1.33
langgraph: 1.0.9
openai: 2.21.0
orjson: 3.11.7
packaging: 26.0
pydantic: 2.12.5
pyyaml: 6.0.3
requests: 2.32.5
requests-toolbelt: 1.0.0
tenacity: 9.1.4
tiktoken: 0.12.0
typing-extensions: 4.15.0
uuid-utils: 0.14.1
xxhash: 3.6.0
zstandard: 0.25.0

Metadata

Assignees

No one assigned

    Labels

    bug (Related to a bug, vulnerability, unexpected error with an existing feature), core (`langchain-core` package issues & PRs), external
