langchain_openai: Make sure the response from the async client in the astream method of ChatOpenAI is properly awaited in case of "include_response_headers=True" (#26031)

- **Description:** This is a **one-line change**: the
`self.async_client.with_raw_response.create(**payload)` call is not
properly awaited within the `_astream` method. `_agenerate` already
awaits this call; the `await` was likely forgotten in `_astream`.
  - **Issue:** Not applicable
  - **Dependencies:** No dependencies required.

(If no one reviews your PR within a few days, please @-mention one of
baskaryan, efriis, eyurtsev, ccurme, vbarda, hwchase17.)

---------

Co-authored-by: Chester Curme <chester.curme@gmail.com>
Commit af11fbfbf6 (parent c812237217)
Author: Friso H. Kingma
Committed: 2024-09-04 15:26:48 +02:00, via GitHub
2 changed files with 35 additions and 3 deletions


@@ -757,7 +757,7 @@ class BaseChatOpenAI(BaseChatModel):
             )
             return
         if self.include_response_headers:
-            raw_response = self.async_client.with_raw_response.create(**payload)
+            raw_response = await self.async_client.with_raw_response.create(**payload)
             response = raw_response.parse()
             base_generation_info = {"headers": dict(raw_response.headers)}
         else:
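
A minimal sketch of why the missing `await` matters: calling an `async` method without `await` returns a coroutine object, not the response, so any attribute access (such as `.headers`) fails downstream. The classes below are hypothetical stand-ins for the SDK's async client and raw-response wrapper, not the real OpenAI API:

```python
import asyncio

# Hypothetical stand-ins; names are illustrative, not the real SDK.
class RawResponse:
    headers = {"x-request-id": "abc123"}

    def parse(self):
        return {"choices": []}

class AsyncClient:
    async def create(self, **payload):
        return RawResponse()

async def main():
    client = AsyncClient()

    # Bug: without `await`, `create()` yields a coroutine object, so
    # `raw_response.headers` would raise AttributeError downstream.
    pending = client.create()
    assert asyncio.iscoroutine(pending)
    pending.close()  # silence the "coroutine was never awaited" warning

    # Fix: awaiting yields the actual raw-response wrapper.
    raw_response = await client.create()
    return dict(raw_response.headers)

print(asyncio.run(main()))  # {'x-request-id': 'abc123'}
```

This is also why the bug only surfaced with `include_response_headers=True`: only that branch touches the raw-response object directly.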