community: Add token_usage and model_name metadata to ChatZhipuAI stream() and astream() response (#27677)

Thank you for contributing to LangChain!


- **Description:** Add `token_usage` and `model_name` metadata to the
`ChatZhipuAI` `stream()` and `astream()` responses
- **Issue:** None
- **Dependencies:** None
- **Twitter handle:** None


- [ ] **Add tests and docs**: If you're adding a new integration, please
include
1. a test for the integration, preferably unit tests that do not rely on
network access,
2. an example notebook showing its use. It lives in
`docs/docs/integrations` directory.


- [ ] **Lint and test**: Run `make format`, `make lint` and `make test`
from the root of the package(s) you've modified. See contribution
guidelines for more: https://python.langchain.com/docs/contributing/

Additional guidelines:
- Make sure optional dependencies are imported within a function.
- Please do not add dependencies to pyproject.toml files (even optional
ones) unless they are required for unit tests.
- Most PRs should not touch more than one package.
- Changes should be backwards compatible.
- If you are adding something to community, do not re-import it in
langchain.

If no one reviews your PR within a few days, please @-mention one of
baskaryan, efriis, eyurtsev, ccurme, vbarda, hwchase17.

Co-authored-by: jianfehuang <jianfehuang@tencent.com>
随风枫叶 2024-10-30 22:34:33 +08:00 committed by GitHub
parent 8a5807a6b4
commit 18cfb4c067

@@ -591,13 +591,19 @@ class ChatZhipuAI(BaseChatModel):
             if len(chunk["choices"]) == 0:
                 continue
             choice = chunk["choices"][0]
+            usage = chunk.get("usage", None)
+            model_name = chunk.get("model", "")
             chunk = _convert_delta_to_message_chunk(
                 choice["delta"], default_chunk_class
             )
             finish_reason = choice.get("finish_reason", None)
             generation_info = (
-                {"finish_reason": finish_reason}
+                {
+                    "finish_reason": finish_reason,
+                    "token_usage": usage,
+                    "model_name": model_name,
+                }
                 if finish_reason is not None
                 else None
             )
@@ -678,13 +684,19 @@ class ChatZhipuAI(BaseChatModel):
             if len(chunk["choices"]) == 0:
                 continue
             choice = chunk["choices"][0]
+            usage = chunk.get("usage", None)
+            model_name = chunk.get("model", "")
             chunk = _convert_delta_to_message_chunk(
                 choice["delta"], default_chunk_class
             )
             finish_reason = choice.get("finish_reason", None)
             generation_info = (
-                {"finish_reason": finish_reason}
+                {
+                    "finish_reason": finish_reason,
+                    "token_usage": usage,
+                    "model_name": model_name,
+                }
                 if finish_reason is not None
                 else None
            )