langchain_mistralai: Include finish_reason in response metadata when parsing MistralAI chunks to `AIMessageChunk` (#31667)

## Description
<!-- What does this pull request accomplish? -->
- When parsing MistralAI chunk dicts into LangChain `AIMessageChunk`
objects via the `_convert_chunk_to_message_chunk` utility function, the
`finish_reason` was not being included in `response_metadata`, as it is
for other providers.
- This PR adds a one-line fix to include the finish reason.

- fixes: https://github.com/langchain-ai/langchain/issues/31666
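The fix can be illustrated with a minimal, self-contained sketch (this is not the actual `langchain_mistralai` source; the function and dict shapes below are simplified assumptions mirroring the diff):

```python
from typing import Any


def convert_chunk_metadata(chunk: dict) -> dict[str, Any]:
    """Sketch of the metadata-building step for a streamed chunk.

    A streamed chunk dict carries one choice whose ``finish_reason`` is
    non-null only on the final chunk of the stream.
    """
    _choice = chunk["choices"][0]
    response_metadata: dict[str, Any] = {}
    if _choice.get("finish_reason") is not None and isinstance(
        chunk.get("model"), str
    ):
        response_metadata["model_name"] = chunk["model"]
        # The one-line fix: surface finish_reason in response_metadata,
        # matching the behavior of other provider integrations.
        response_metadata["finish_reason"] = _choice["finish_reason"]
    return response_metadata


# The final chunk of a stream now yields both keys:
final_chunk = {
    "model": "mistral-small-latest",
    "choices": [{"delta": {"content": ""}, "finish_reason": "stop"}],
}
print(convert_chunk_metadata(final_chunk))
# → {'model_name': 'mistral-small-latest', 'finish_reason': 'stop'}
```

Before the fix, the same input produced only `{'model_name': ...}`, so downstream code inspecting `response_metadata["finish_reason"]` worked for other providers but not for MistralAI.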
Author: Saran Connolly, 2025-06-20 20:41:20 +01:00, committed by GitHub
Parent: 7ff405077d
Commit: 22e6d90937


```diff
@@ -271,7 +271,8 @@ def _convert_chunk_to_message_chunk(
     if _choice.get("finish_reason") is not None and isinstance(
         chunk.get("model"), str
     ):
-        response_metadata["model_name"] = chunk.get("model")
+        response_metadata["model_name"] = chunk["model"]
+        response_metadata["finish_reason"] = _choice["finish_reason"]
     return AIMessageChunk(
         content=content,
         additional_kwargs=additional_kwargs,
```