Commit 22e6d90

langchain_mistralai: Include finish_reason in response metadata when parsing MistralAI chunks toAIMessageChunk (#31667)
## Description

- When parsing MistralAI chunk dicts into LangChain `AIMessageChunk` objects via the `_convert_chunk_to_message_chunk` utility function, the `finish_reason` was not being included in `response_metadata`, as it is for other providers.
- This PR adds a one-line fix to include the finish reason.
- Fixes: #31666
1 parent 7ff4050 commit 22e6d90

File tree

1 file changed: +2 −1 lines changed


libs/partners/mistralai/langchain_mistralai/chat_models.py

Lines changed: 2 additions & 1 deletion
```diff
@@ -271,7 +271,8 @@ def _convert_chunk_to_message_chunk(
     if _choice.get("finish_reason") is not None and isinstance(
         chunk.get("model"), str
     ):
-        response_metadata["model_name"] = chunk.get("model")
+        response_metadata["model_name"] = chunk["model"]
+        response_metadata["finish_reason"] = _choice["finish_reason"]
     return AIMessageChunk(
         content=content,
         additional_kwargs=additional_kwargs,
