Mirror of https://github.com/hwchase17/langchain.git (synced 2025-09-22 11:00:37 +00:00)
standard-tests[patch]: require model_name in response_metadata if returns_usage_metadata (#30497)
We are implementing a token-counting callback handler in `langchain-core` that is intended to work with all chat models supporting usage metadata. The callback will aggregate usage metadata by model, which requires each response to include the model name in its metadata. To support this, if a model `returns_usage_metadata`, we now check that it also includes a string model name under the `"model_name"` key of its `response_metadata`. More context: https://github.com/langchain-ai/langchain/pull/30487
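For context, a minimal sketch of the kind of check this describes, written as a standalone helper (`check_model_name_in_response_metadata` is a hypothetical name; the actual standard-test implementation may differ):

```python
from langchain_core.messages import AIMessage


def check_model_name_in_response_metadata(
    result: AIMessage, returns_usage_metadata: bool
) -> None:
    # Hypothetical sketch: if the model reports usage metadata, its response
    # metadata should also carry the underlying model name as a non-empty
    # string under the "model_name" key.
    if returns_usage_metadata:
        assert result.usage_metadata is not None
        model_name = result.response_metadata.get("model_name")
        assert isinstance(model_name, str) and model_name
```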
@@ -412,6 +412,9 @@ class ChatModelUnitTests(ChatModelTests):
             def returns_usage_metadata(self) -> bool:
                 return False
 
+        Models supporting ``usage_metadata`` should also return the name of the
+        underlying model in the ``response_metadata`` of the AIMessage.
+
     .. dropdown:: supports_anthropic_inputs
 
         Boolean property indicating whether the chat model supports Anthropic-style
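As an illustration of the documented expectation (the model name and token counts below are made up), a conforming AIMessage could look like this:

```python
from langchain_core.messages import AIMessage

msg = AIMessage(
    content="Hello!",
    usage_metadata={"input_tokens": 5, "output_tokens": 2, "total_tokens": 7},
    response_metadata={"model_name": "my-model-001"},  # name of the underlying model
)
assert isinstance(msg.response_metadata["model_name"], str)
```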