partner: ChatDeepSeek on openrouter not returning reasoning (#30240)

The DeepSeek model does not return reasoning content when hosted on OpenRouter
(Issue [30067](https://github.com/langchain-ai/langchain/issues/30067)).

The following code did not return reasoning:

```python
import os

from langchain_deepseek import ChatDeepSeek

llm = ChatDeepSeek(
    model="deepseek/deepseek-r1:nitro",
    api_base="https://openrouter.ai/api/v1",
    api_key=os.getenv("OPENROUTER_API_KEY"),
)
messages = [
    {"role": "system", "content": "You are an assistant."},
    {"role": "user", "content": "9.11 and 9.8, which is greater? Explain the reasoning behind this decision."},
]
response = llm.invoke(messages, extra_body={"include_reasoning": True})
print(response.content)
print(f"REASONING: {response.additional_kwargs.get('reasoning_content', '')}")
print(response)
```

The fix is to extract the reasoning from
`response.choices[0].message.model_extra` (non-streaming) and from
`choices[0].delta["reasoning"]` (streaming), and to place it in the response's
`additional_kwargs` under the existing `reasoning_content` key. The change is
really just the addition of a couple of short `if` statements.
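
For illustration only, a minimal standalone sketch of that extraction order; `extract_reasoning`, `raw_message`, and `delta` are hypothetical names, not part of the ChatDeepSeek implementation:

```python
from typing import Any, Optional


def extract_reasoning(raw_message: Any, delta: Optional[dict] = None) -> Optional[str]:
    """Sketch of the extraction order, not the actual ChatDeepSeek code."""
    # Native DeepSeek API: reasoning is a first-class `reasoning_content` attribute.
    if content := getattr(raw_message, "reasoning_content", None):
        return content

    # OpenRouter (non-streaming): reasoning arrives as an undeclared extra field,
    # so pydantic parks it in the message's `model_extra` dict under "reasoning".
    model_extra = getattr(raw_message, "model_extra", None)
    if isinstance(model_extra, dict) and (reasoning := model_extra.get("reasoning")):
        return reasoning

    # OpenRouter (streaming): reasoning is on the chunk delta under "reasoning".
    if delta and (reasoning := delta.get("reasoning")):
        return reasoning

    return None
```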

---------

Co-authored-by: andrasfe <andrasf94@gmail.com>
Co-authored-by: Chester Curme <chester.curme@gmail.com>
Commit b5f49df86a (parent 4852ab8d0a), authored by Andras L Ferenczi on 2025-03-21 09:35:37 -07:00 and committed by GitHub.
3 changed files with 187 additions and 5 deletions.

The relevant hunks in `ChatDeepSeek`:

```diff
@@ -228,6 +228,15 @@ class ChatDeepSeek(BaseChatOpenAI):
             rtn.generations[0].message.additional_kwargs["reasoning_content"] = (
                 response.choices[0].message.reasoning_content  # type: ignore
             )
+        # Handle use via OpenRouter
+        elif hasattr(response.choices[0].message, "model_extra"):  # type: ignore
+            model_extra = response.choices[0].message.model_extra  # type: ignore
+            if isinstance(model_extra, dict) and (
+                reasoning := model_extra.get("reasoning")
+            ):
+                rtn.generations[0].message.additional_kwargs["reasoning_content"] = (
+                    reasoning
+                )
         return rtn
```
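
The non-streaming branch relies on pydantic's `model_extra`, which collects fields the response model does not declare. A minimal sketch of why that is where OpenRouter's reasoning ends up; the `Message` class below is a hypothetical stand-in for the OpenAI SDK's pydantic message model, not the real one:

```python
from pydantic import BaseModel, ConfigDict


class Message(BaseModel):
    """Hypothetical stand-in for a pydantic message model that allows extra fields."""

    model_config = ConfigDict(extra="allow")
    role: str
    content: str


# OpenRouter returns a "reasoning" field the model does not declare, so pydantic
# keeps it out of the typed attributes and exposes it via `model_extra`.
msg = Message(
    role="assistant",
    content="9.8 is greater.",
    reasoning="Compare the tenths digit: 8 > 1.",
)
print(msg.model_extra)  # {'reasoning': 'Compare the tenths digit: 8 > 1.'}
```

The streaming path gets the same treatment, reading the OpenRouter reasoning off the chunk delta: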
```diff
@@ -244,11 +253,17 @@ class ChatDeepSeek(BaseChatOpenAI):
         )
         if (choices := chunk.get("choices")) and generation_chunk:
             top = choices[0]
-            if reasoning_content := top.get("delta", {}).get("reasoning_content"):
-                if isinstance(generation_chunk.message, AIMessageChunk):
+            if isinstance(generation_chunk.message, AIMessageChunk):
+                if reasoning_content := top.get("delta", {}).get("reasoning_content"):
                     generation_chunk.message.additional_kwargs["reasoning_content"] = (
                         reasoning_content
                     )
+                # Handle use via OpenRouter
+                elif reasoning := top.get("delta", {}).get("reasoning"):
+                    generation_chunk.message.additional_kwargs["reasoning_content"] = (
+                        reasoning
+                    )
         return generation_chunk

     def _stream(
```
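
For completeness, a hedged usage sketch of the streaming side, reusing `llm` and `messages` from the reproduction above; the accumulation variables are illustrative only:

```python
reasoning_parts: list[str] = []
answer_parts: list[str] = []

# Each yielded chunk is an AIMessageChunk; after the fix, OpenRouter's per-delta
# "reasoning" surfaces under the same "reasoning_content" additional_kwargs key.
for chunk in llm.stream(messages, extra_body={"include_reasoning": True}):
    if reasoning := chunk.additional_kwargs.get("reasoning_content"):
        reasoning_parts.append(reasoning)
    if chunk.content:
        answer_parts.append(chunk.content)

print("REASONING:", "".join(reasoning_parts))
print("ANSWER:", "".join(answer_parts))
```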