DeepSeek models do not return reasoning when hosted on OpenRouter (Issue [30067](https://github.com/langchain-ai/langchain/issues/30067)). The following code did not return any reasoning:

```python
import os

from langchain_deepseek import ChatDeepSeek

llm = ChatDeepSeek(
    model="deepseek/deepseek-r1:nitro",
    api_base="https://openrouter.ai/api/v1",
    api_key=os.getenv("OPENROUTER_API_KEY"),
)
messages = [
    {"role": "system", "content": "You are an assistant."},
    {"role": "user", "content": "9.11 and 9.8, which is greater? Explain the reasoning behind this decision."},
]
response = llm.invoke(messages, extra_body={"include_reasoning": True})
print(response.content)
print(f"REASONING: {response.additional_kwargs.get('reasoning_content', '')}")
print(response)
```

The fix is to extract the reasoning from `response.choices[0].message["model_extra"]` (non-streaming) and from `choices[0].delta["reasoning"]` (streaming), and place it in the response's `additional_kwargs`. The change is really just the addition of a couple of one-line `if` statements.

---------

Co-authored-by: andrasfe <andrasf94@gmail.com>
Co-authored-by: Chester Curme <chester.curme@gmail.com>
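
For illustration, a minimal standalone sketch of the idea behind those `if` statements (this is not the actual patched `langchain-deepseek` code; the helper name `merge_openrouter_reasoning` and the dict-shaped payloads are assumptions, while the `model_extra` / `reasoning` / `reasoning_content` field names follow the description above):

```python
def merge_openrouter_reasoning(choice: dict, additional_kwargs: dict) -> None:
    """Copy OpenRouter-style reasoning into the kwargs that ChatDeepSeek exposes.

    OpenRouter returns the reasoning under a "reasoning" key on the message's
    model_extra (non-streaming) or on the chunk's delta (streaming); the fix
    copies it into additional_kwargs["reasoning_content"] when DeepSeek's own
    reasoning_content field is absent.
    """
    message = choice.get("message") or {}
    delta = choice.get("delta") or {}

    # Non-streaming path: extra provider fields live in the message's model_extra.
    extra = message.get("model_extra") or {}
    if "reasoning" in extra and "reasoning_content" not in additional_kwargs:
        additional_kwargs["reasoning_content"] = extra["reasoning"]

    # Streaming path: each chunk's delta may carry a "reasoning" field.
    if "reasoning" in delta and "reasoning_content" not in additional_kwargs:
        additional_kwargs["reasoning_content"] = delta["reasoning"]


# Example with a fake non-streaming choice payload:
kwargs: dict = {}
merge_openrouter_reasoning(
    {
        "message": {
            "content": "9.8 is greater than 9.11.",
            "model_extra": {"reasoning": "Compare the tenths digit: 8 > 1 ..."},
        }
    },
    kwargs,
)
print(kwargs["reasoning_content"])
```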