Add support for ChatOpenAI models in Infino callback handler (#11608)

**Description:** This PR adds support for ChatOpenAI models in the
Infino callback handler. In particular, this PR implements the
`on_chat_model_start` callback, so that ChatOpenAI models are supported.
With this change, the Infino callback handler can be used to track latency,
errors, and prompt tokens for ChatOpenAI models as well, in addition to its
existing support for OpenAI and other non-chat models. The existing
example notebook is updated to show how to use this integration.
cc/ @naman-modi @savannahar68
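
For reference, a minimal usage sketch of the integration is shown below. It is not the updated notebook itself; it assumes an Infino server is reachable at its default endpoint, that `OPENAI_API_KEY` is set, and that the `InfinoCallbackHandler` constructor accepts `model_id` and `model_version` (the identifier values here are purely illustrative).

```python
from langchain.callbacks import InfinoCallbackHandler
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

# Hypothetical model_id/model_version values, used only for illustration.
handler = InfinoCallbackHandler(
    model_id="chat-openai-demo",
    model_version="0.1",
    verbose=False,
)

chat = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)

# The handler's on_chat_model_start fires for this chat call, so Infino can
# record the prompt, prompt tokens, latency, and any errors.
response = chat(
    [HumanMessage(content="What is the capital of France?")],
    callbacks=[handler],
)
print(response.content)
```

Alternatively, the handler can be passed once via `ChatOpenAI(callbacks=[handler])` so that every call through that model instance is tracked.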

**Issue:** https://github.com/langchain-ai/langchain/issues/11607 

**Dependencies:** None

**Tag maintainer:** @hwchase17 

**Twitter handle:** [@vkakade](https://twitter.com/vkakade)
Commit dd0cd98861 (parent d0603c86b6), authored by Vinay Kakade on 2023-10-12 02:30:54 +05:30, committed by GitHub.
2 changed files with 339 additions and 44 deletions.