fix(deepseek): convert tool output arrays to strings (#31913)

## Description
When `ChatDeepSeek` invokes a tool that returns a list, the request fails with an
`openai.UnprocessableEntityError` because the JSON body cannot be
deserialized.

The root of the problem is that `ChatDeepSeek` uses `BaseChatOpenAI`
internally, but the two APIs are not identical: OpenAI's `v1/chat/completions`
accepts arrays as tool results, while the DeepSeek API does not.

As a solution, I added a `_get_request_payload` method to `ChatDeepSeek` that
inherits the behavior from `BaseChatOpenAI` but adds a step to stringify
tool message content when the content is an array. I also added a unit
test for this.
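
The transformation itself is small. As a standalone sketch of the same logic (the `stringify_tool_content` helper name is hypothetical; the body mirrors what the patch does to the request payload):

```python
import json


def stringify_tool_content(messages: list[dict]) -> list[dict]:
    """Serialize list-valued tool message content to a JSON string,
    since the DeepSeek API only accepts strings as tool results."""
    for message in messages:
        if message["role"] == "tool" and isinstance(message["content"], list):
            message["content"] = json.dumps(message["content"])
    return messages


messages = stringify_tool_content(
    [{"role": "tool", "tool_call_id": "call_1", "content": ["item1", "item2"]}]
)
print(messages[0]["content"])  # prints ["item1", "item2"] as a single string
```

Non-tool messages and tool messages whose content is already a string pass through unchanged.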

The linked issue contains the full reproducible example provided by the
reporter. After this change, it works as expected.

Source: [Deepseek
docs](https://api-docs.deepseek.com/api/create-chat-completion/)


![image](https://github.com/user-attachments/assets/a59ed3e7-6444-46d1-9dcf-97e40e4e8952)

Source: [OpenAI
docs](https://platform.openai.com/docs/api-reference/chat/create)


![image](https://github.com/user-attachments/assets/728f4fc6-e1a3-4897-b39f-6f1ade07d3dc)


## Issue
Fixes #31394

## Dependencies:
No new dependencies.

## Twitter handle:
Don't have one.

---------

Co-authored-by: Mason Daugherty <github@mdrxy.com>
Co-authored-by: Mason Daugherty <mason@langchain.dev>
nikk0o046 2025-07-16 19:19:44 +03:00 committed by GitHub
parent 96bf8262e2
commit b1c7de98f5
2 changed files with 31 additions and 1 deletions
```diff
@@ -2,6 +2,7 @@
 from __future__ import annotations
 
+import json
 from collections.abc import Iterator
 from json import JSONDecodeError
 from typing import Any, Literal, Optional, TypeVar, Union
 
```
```diff
@@ -218,6 +219,19 @@ class ChatDeepSeek(BaseChatOpenAI):
         self.async_client = self.root_async_client.chat.completions
         return self
 
+    def _get_request_payload(
+        self,
+        input_: LanguageModelInput,
+        *,
+        stop: Optional[list[str]] = None,
+        **kwargs: Any,
+    ) -> dict:
+        payload = super()._get_request_payload(input_, stop=stop, **kwargs)
+        for message in payload["messages"]:
+            if message["role"] == "tool" and isinstance(message["content"], list):
+                message["content"] = json.dumps(message["content"])
+        return payload
+
     def _create_chat_result(
         self,
         response: Union[dict, openai.BaseModel],
```


```diff
@@ -5,7 +5,7 @@ from __future__ import annotations
 from typing import Any, Literal, Union
 from unittest.mock import MagicMock
 
-from langchain_core.messages import AIMessageChunk
+from langchain_core.messages import AIMessageChunk, ToolMessage
 from langchain_tests.unit_tests import ChatModelUnitTests
 from openai import BaseModel
 from openai.types.chat import ChatCompletionMessage
```
```diff
@@ -217,3 +217,19 @@ class TestChatDeepSeekCustomUnit:
             msg = "Expected chunk_result not to be None"
             raise AssertionError(msg)
         assert chunk_result.message.additional_kwargs.get("reasoning_content") is None
+
+    def test_get_request_payload(self) -> None:
+        """Test that tool message content is converted from list to string."""
+        chat_model = ChatDeepSeek(model="deepseek-chat", api_key=SecretStr("api_key"))
+
+        tool_message = ToolMessage(content=[], tool_call_id="test_id")
+        payload = chat_model._get_request_payload([tool_message])
+        assert payload["messages"][0]["content"] == "[]"
+
+        tool_message = ToolMessage(content=["item1", "item2"], tool_call_id="test_id")
+        payload = chat_model._get_request_payload([tool_message])
+        assert payload["messages"][0]["content"] == '["item1", "item2"]'
+
+        tool_message = ToolMessage(content="test string", tool_call_id="test_id")
+        payload = chat_model._get_request_payload([tool_message])
+        assert payload["messages"][0]["content"] == "test string"
```