Allow chat models that do not return token usage (#7907)

- Description: Allows using chat models that do not return token usage in their response
- Issue: [#7900](https://github.com/hwchase17/langchain/issues/7900)
- Dependencies: None
- Tag maintainer: @agola11 @hwchase17 
- Twitter handle: @alonsosilva

---------

Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com>
Co-authored-by: William FH <13333726+hinthornw@users.noreply.github.com>
Author: Alonso Silva Allende
Date: 2023-07-19 03:12:09 +02:00 (committed by GitHub)
Parent: bdf0c2267f
Commit: 1152f4d48b


@@ -391,7 +391,8 @@ class ChatOpenAI(BaseChatModel):
                 generation_info=dict(finish_reason=res.get("finish_reason")),
             )
             generations.append(gen)
-        llm_output = {"token_usage": response["usage"], "model_name": self.model_name}
+        token_usage = response.get("usage", {})
+        llm_output = {"token_usage": token_usage, "model_name": self.model_name}
         return ChatResult(generations=generations, llm_output=llm_output)

     async def _agenerate(
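
The change above swaps direct indexing (`response["usage"]`, which raises `KeyError` when a provider omits the field) for `dict.get` with a default. A minimal standalone sketch of the same pattern, using a hypothetical `build_llm_output` helper and made-up response dicts rather than real OpenAI payloads:

```python
def build_llm_output(response: dict, model_name: str) -> dict:
    # dict.get returns {} when the provider omits "usage",
    # instead of raising KeyError as response["usage"] would.
    token_usage = response.get("usage", {})
    return {"token_usage": token_usage, "model_name": model_name}


# A response that reports token usage, and one that does not
# (e.g. a proxy or local model that returns no usage field).
with_usage = {"usage": {"total_tokens": 42}}
without_usage = {}

print(build_llm_output(with_usage, "gpt-3.5-turbo"))
print(build_llm_output(without_usage, "local-model"))
```

With the old indexing, the second call would have crashed; with `.get`, downstream consumers simply see an empty `token_usage` dict.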