langchain_mistralai: Include finish_reason in response metadata when parsing MistralAI chunks to AIMessageChunk (#31667)
## Description

- When parsing MistralAI chunk dicts into LangChain `AIMessageChunk` objects via the `_convert_chunk_to_message_chunk` utility function, the `finish_reason` was not being included in `response_metadata` as it is for other providers (see the usage sketch below).
- This PR adds a one-line fix to include the finish reason.
- Fixes: https://github.com/langchain-ai/langchain/issues/31666
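For context, here is a minimal usage sketch (not part of this change) of how the new field surfaces to callers when streaming; the model name and the environment setup (`MISTRAL_API_KEY`) are assumptions:

```python
from langchain_mistralai import ChatMistralAI

# Illustrative model name; assumes MISTRAL_API_KEY is set in the environment.
llm = ChatMistralAI(model="mistral-small-latest")

aggregate = None
for chunk in llm.stream("Say hello in one word."):
    # Chunks are AIMessageChunk objects; adding them merges response_metadata.
    aggregate = chunk if aggregate is None else aggregate + chunk

# Before this fix only "model_name" was populated here; with it, "finish_reason"
# (e.g. "stop") is included as well, matching other providers.
print(aggregate.response_metadata)
```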
parent 7ff405077d
commit 22e6d90937
```diff
@@ -271,7 +271,8 @@ def _convert_chunk_to_message_chunk(
         if _choice.get("finish_reason") is not None and isinstance(
             chunk.get("model"), str
         ):
-            response_metadata["model_name"] = chunk.get("model")
+            response_metadata["model_name"] = chunk["model"]
+            response_metadata["finish_reason"] = _choice["finish_reason"]
         return AIMessageChunk(
             content=content,
             additional_kwargs=additional_kwargs,
```
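As a follow-up illustration, here is a self-contained sketch of the branch being patched, re-implemented standalone rather than calling the library's private helper; the sample chunk dict is hypothetical:

```python
from typing import Any, Dict


def build_response_metadata(chunk: Dict[str, Any]) -> Dict[str, Any]:
    """Mirror the patched logic: record the model name and finish reason once
    the provider reports a finish_reason on a streamed chunk."""
    _choice = chunk["choices"][0]
    response_metadata: Dict[str, Any] = {}
    if _choice.get("finish_reason") is not None and isinstance(chunk.get("model"), str):
        response_metadata["model_name"] = chunk["model"]
        response_metadata["finish_reason"] = _choice["finish_reason"]
    return response_metadata


# Hypothetical final chunk in the shape returned by the Mistral streaming API.
final_chunk = {
    "model": "mistral-small-latest",
    "choices": [{"delta": {"content": ""}, "finish_reason": "stop"}],
}
print(build_response_metadata(final_chunk))
# -> {'model_name': 'mistral-small-latest', 'finish_reason': 'stop'}
```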