anthropic: support for code execution, MCP connector, files API features (#31340)

Support for the new [batch of beta
features](https://www.anthropic.com/news/agent-capabilities-api)
released yesterday:

- [Code
execution](https://docs.anthropic.com/en/docs/agents-and-tools/tool-use/code-execution-tool)
- [MCP
connector](https://docs.anthropic.com/en/docs/agents-and-tools/mcp-connector)
- [Files
API](https://docs.anthropic.com/en/docs/build-with-claude/files)

Also verified support for [prompt cache
TTL](https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching#1-hour-cache-duration-beta).
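
For reference, a minimal sketch of the one-hour TTL usage verified here (the `"..."` text is a placeholder for the content to cache):

```python
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(
    model="claude-3-7-sonnet-20250219",
    betas=["extended-cache-ttl-2025-04-11"],
)

messages = [
    {
        "role": "user",
        "content": [
            {
                "type": "text",
                "text": "...",  # placeholder: the content to cache
                "cache_control": {"type": "ephemeral", "ttl": "1h"},
            }
        ],
    }
]
llm.invoke(messages)
```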
Authored by ccurme on 2025-05-27 12:45:45 -04:00, committed by GitHub
parent 1ebcbf1d11
commit 580986b260
11 changed files with 824 additions and 41 deletions

@@ -41,6 +41,8 @@ jobs:
FIREWORKS_API_KEY: ${{ secrets.FIREWORKS_API_KEY }}
GOOGLE_API_KEY: ${{ secrets.GOOGLE_API_KEY }}
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
ANTHROPIC_FILES_API_IMAGE_ID: ${{ secrets.ANTHROPIC_FILES_API_IMAGE_ID }}
ANTHROPIC_FILES_API_PDF_ID: ${{ secrets.ANTHROPIC_FILES_API_PDF_ID }}
AZURE_OPENAI_API_VERSION: ${{ secrets.AZURE_OPENAI_API_VERSION }}
AZURE_OPENAI_API_BASE: ${{ secrets.AZURE_OPENAI_API_BASE }}
AZURE_OPENAI_API_KEY: ${{ secrets.AZURE_OPENAI_API_KEY }}


@@ -344,6 +344,8 @@ jobs:
fail-fast: false # Continue testing other partners if one fails
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
ANTHROPIC_FILES_API_IMAGE_ID: ${{ secrets.ANTHROPIC_FILES_API_IMAGE_ID }}
ANTHROPIC_FILES_API_PDF_ID: ${{ secrets.ANTHROPIC_FILES_API_PDF_ID }}
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
AZURE_OPENAI_API_VERSION: ${{ secrets.AZURE_OPENAI_API_VERSION }}
AZURE_OPENAI_API_BASE: ${{ secrets.AZURE_OPENAI_API_BASE }}


@@ -127,6 +127,8 @@ jobs:
env:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
ANTHROPIC_FILES_API_IMAGE_ID: ${{ secrets.ANTHROPIC_FILES_API_IMAGE_ID }}
ANTHROPIC_FILES_API_PDF_ID: ${{ secrets.ANTHROPIC_FILES_API_PDF_ID }}
AZURE_OPENAI_API_VERSION: ${{ secrets.AZURE_OPENAI_API_VERSION }}
AZURE_OPENAI_API_BASE: ${{ secrets.AZURE_OPENAI_API_BASE }}
AZURE_OPENAI_API_KEY: ${{ secrets.AZURE_OPENAI_API_KEY }}


@@ -325,6 +325,102 @@
"ai_msg.tool_calls"
]
},
{
"cell_type": "markdown",
"id": "535a16e4-cd5a-479f-b315-37c816ec4387",
"metadata": {},
"source": [
"## Multimodal\n",
"\n",
"Claude supports image and PDF inputs as content blocks, both in Anthropic's native format (see docs for [vision](https://docs.anthropic.com/en/docs/build-with-claude/vision#base64-encoded-image-example) and [PDF support](https://docs.anthropic.com/en/docs/build-with-claude/pdf-support)) as well as LangChain's [standard format](/docs/how_to/multimodal_inputs/).\n",
"\n",
"### Files API\n",
"\n",
"Claude also supports interactions with files through its managed [Files API](https://docs.anthropic.com/en/docs/build-with-claude/files). See examples below.\n",
"\n",
"The Files API can also be used to upload files to a container for use with Claude's built-in code-execution tools. See the [code execution](#code-execution) section below, for details.\n",
"\n",
"<details>\n",
"<summary>Images</summary>\n",
"\n",
"```python\n",
"# Upload image\n",
"\n",
"import anthropic\n",
"\n",
"client = anthropic.Anthropic()\n",
"file = client.beta.files.upload(\n",
" # Supports image/jpeg, image/png, image/gif, image/webp\n",
" file=(\"image.png\", open(\"/path/to/image.png\", \"rb\"), \"image/png\"),\n",
")\n",
"image_file_id = file.id\n",
"\n",
"\n",
"# Run inference\n",
"from langchain_anthropic import ChatAnthropic\n",
"\n",
"llm = ChatAnthropic(\n",
" model=\"claude-sonnet-4-20250514\",\n",
" betas=[\"files-api-2025-04-14\"],\n",
")\n",
"\n",
"input_message = {\n",
" \"role\": \"user\",\n",
" \"content\": [\n",
" {\n",
" \"type\": \"text\",\n",
" \"text\": \"Describe this image.\",\n",
" },\n",
" {\n",
" \"type\": \"image\",\n",
" \"source\": {\n",
" \"type\": \"file\",\n",
" \"file_id\": image_file_id,\n",
" },\n",
" },\n",
" ],\n",
"}\n",
"llm.invoke([input_message])\n",
"```\n",
"\n",
"</details>\n",
"\n",
"<details>\n",
"<summary>PDFs</summary>\n",
"\n",
"```python\n",
"# Upload document\n",
"\n",
"import anthropic\n",
"\n",
"client = anthropic.Anthropic()\n",
"file = client.beta.files.upload(\n",
" file=(\"document.pdf\", open(\"/path/to/document.pdf\", \"rb\"), \"application/pdf\"),\n",
")\n",
"pdf_file_id = file.id\n",
"\n",
"\n",
"# Run inference\n",
"from langchain_anthropic import ChatAnthropic\n",
"\n",
"llm = ChatAnthropic(\n",
" model=\"claude-sonnet-4-20250514\",\n",
" betas=[\"files-api-2025-04-14\"],\n",
")\n",
"\n",
"input_message = {\n",
" \"role\": \"user\",\n",
" \"content\": [\n",
" {\"type\": \"text\", \"text\": \"Describe this document.\"},\n",
" {\"type\": \"document\", \"source\": {\"type\": \"file\", \"file_id\": pdf_file_id}}\n",
" ],\n",
"}\n",
"llm.invoke([input_message])\n",
"```\n",
"\n",
"</details>"
]
},
{
"cell_type": "markdown",
"id": "6e36d25c-f358-49e5-aefa-b99fbd3fec6b",
@@ -454,6 +550,27 @@
"print(f\"\\nSecond:\\n{usage_2}\")"
]
},
{
"cell_type": "markdown",
"id": "9678656f-1ec4-4bf1-bf62-bbd49eb5c4e7",
"metadata": {},
"source": [
":::tip Extended caching\n",
"\n",
" The cache lifetime is 5 minutes by default. If this is too short, you can apply one hour caching by enabling the `\"extended-cache-ttl-2025-04-11\"` beta header:\n",
"\n",
" ```python\n",
" llm = ChatAnthropic(\n",
" model=\"claude-3-7-sonnet-20250219\",\n",
" # highlight-next-line\n",
" betas=[\"extended-cache-ttl-2025-04-11\"],\n",
" )\n",
" ```\n",
" and specifying `\"cache_control\": {\"type\": \"ephemeral\", \"ttl\": \"1h\"}`.\n",
"\n",
":::"
]
},
{
"cell_type": "markdown",
"id": "141ce9c5-012d-4502-9d61-4a413b5d959a",
@@ -953,6 +1070,159 @@
"response = llm_with_tools.invoke(\"How do I update a web app to TypeScript 5.5?\")"
]
},
{
"cell_type": "markdown",
"id": "1478cdc6-2e52-4870-80f9-b4ddf88f2db2",
"metadata": {},
"source": [
"### Code execution\n",
"\n",
"Claude can use a [code execution tool](https://docs.anthropic.com/en/docs/agents-and-tools/tool-use/code-execution-tool) to execute Python code in a sandboxed environment.\n",
"\n",
":::info Code execution is supported since ``langchain-anthropic>=0.3.14``\n",
"\n",
":::"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "2ce13632-a2da-439f-a429-f66481501630",
"metadata": {},
"outputs": [],
"source": [
"from langchain_anthropic import ChatAnthropic\n",
"\n",
"llm = ChatAnthropic(\n",
" model=\"claude-sonnet-4-20250514\",\n",
" betas=[\"code-execution-2025-05-22\"],\n",
")\n",
"\n",
"tool = {\"type\": \"code_execution_20250522\", \"name\": \"code_execution\"}\n",
"llm_with_tools = llm.bind_tools([tool])\n",
"\n",
"response = llm_with_tools.invoke(\n",
" \"Calculate the mean and standard deviation of \" \"[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]\"\n",
")"
]
},
{
"cell_type": "markdown",
"id": "24076f91-3a3d-4e53-9618-429888197061",
"metadata": {},
"source": [
"<details>\n",
"<summary>Use with Files API</summary>\n",
"\n",
"Using the Files API, Claude can write code to access files for data analysis and other purposes. See example below:\n",
"\n",
"```python\n",
"# Upload file\n",
"\n",
"import anthropic\n",
"\n",
"client = anthropic.Anthropic()\n",
"file = client.beta.files.upload(\n",
" file=open(\"/path/to/sample_data.csv\", \"rb\")\n",
")\n",
"file_id = file.id\n",
"\n",
"\n",
"# Run inference\n",
"from langchain_anthropic import ChatAnthropic\n",
"\n",
"llm = ChatAnthropic(\n",
" model=\"claude-sonnet-4-20250514\",\n",
" betas=[\"code-execution-2025-05-22\"],\n",
")\n",
"\n",
"tool = {\"type\": \"code_execution_20250522\", \"name\": \"code_execution\"}\n",
"llm_with_tools = llm.bind_tools([tool])\n",
"\n",
"input_message = {\n",
" \"role\": \"user\",\n",
" \"content\": [\n",
" {\n",
" \"type\": \"text\",\n",
" \"text\": \"Please plot these data and tell me what you see.\",\n",
" },\n",
" {\n",
" \"type\": \"container_upload\",\n",
" \"file_id\": file_id,\n",
" },\n",
" ]\n",
"}\n",
"llm_with_tools.invoke([input_message])\n",
"```\n",
"\n",
"Note that Claude may generate files as part of its code execution. You can access these files using the Files API:\n",
"```python\n",
"# Take all file outputs for demonstration purposes\n",
"file_ids = []\n",
"for block in response.content:\n",
" if block[\"type\"] == \"code_execution_tool_result\":\n",
" file_ids.extend(\n",
" content[\"file_id\"]\n",
" for content in block.get(\"content\", {}).get(\"content\", [])\n",
" if \"file_id\" in content\n",
" )\n",
"\n",
"for i, file_id in enumerate(file_ids):\n",
" file_content = client.beta.files.download(file_id)\n",
" file_content.write_to_file(f\"/path/to/file_{i}.png\")\n",
"```\n",
"\n",
"</details>"
]
},
{
"cell_type": "markdown",
"id": "040f381a-1768-479a-9a5e-aa2d7d77e0d5",
"metadata": {},
"source": [
"### Remote MCP\n",
"\n",
"Claude can use a [MCP connector tool](https://docs.anthropic.com/en/docs/agents-and-tools/mcp-connector) for model-generated calls to remote MCP servers.\n",
"\n",
":::info Remote MCP is supported since ``langchain-anthropic>=0.3.14``\n",
"\n",
":::"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "22fc4a89-e6d8-4615-96cb-2e117349aebf",
"metadata": {},
"outputs": [],
"source": [
"from langchain_anthropic import ChatAnthropic\n",
"\n",
"mcp_servers = [\n",
" {\n",
" \"type\": \"url\",\n",
" \"url\": \"https://mcp.deepwiki.com/mcp\",\n",
" \"name\": \"deepwiki\",\n",
" \"tool_configuration\": { # optional configuration\n",
" \"enabled\": True,\n",
" \"allowed_tools\": [\"ask_question\"],\n",
" },\n",
" \"authorization_token\": \"PLACEHOLDER\", # optional authorization\n",
" }\n",
"]\n",
"\n",
"llm = ChatAnthropic(\n",
" model=\"claude-sonnet-4-20250514\",\n",
" betas=[\"mcp-client-2025-04-04\"],\n",
" mcp_servers=mcp_servers,\n",
")\n",
"\n",
"response = llm.invoke(\n",
" \"What transport protocols does the 2025-03-26 version of the MCP \"\n",
" \"spec (modelcontextprotocol/modelcontextprotocol) support?\"\n",
")"
]
},
{
"cell_type": "markdown",
"id": "2fd5d545-a40d-42b1-ad0c-0a79e2536c9b",


@@ -129,6 +129,7 @@ def _format_for_tracing(messages: list[BaseMessage]) -> list[BaseMessage]:
            isinstance(block, dict)
            and block.get("type") == "image"
            and is_data_content_block(block)
            and block.get("source_type") != "id"
        ):
            if message_to_trace is message:
                message_to_trace = message.model_copy()


@@ -193,6 +193,7 @@ def test_configurable_with_default() -> None:
        "name": None,
        "disable_streaming": False,
        "model": "claude-3-sonnet-20240229",
        "mcp_servers": None,
        "max_tokens": 1024,
        "temperature": None,
        "thinking": None,
@@ -203,6 +204,7 @@ def test_configurable_with_default() -> None:
        "stop_sequences": None,
        "anthropic_api_url": "https://api.anthropic.com",
        "anthropic_api_key": SecretStr("bar"),
        "betas": None,
        "default_headers": None,
        "model_kwargs": {},
        "streaming": False,


@@ -1,4 +1,5 @@
import copy
import json
import re
import warnings
from collections.abc import AsyncIterator, Iterator, Mapping, Sequence
@@ -100,6 +101,7 @@ def _is_builtin_tool(tool: Any) -> bool:
        "computer_",
        "bash_",
        "web_search_",
        "code_execution_",
    ]
    return any(tool_type.startswith(prefix) for prefix in _builtin_tool_prefixes)
@@ -219,6 +221,14 @@ def _format_data_content_block(block: dict) -> dict:
                    "data": block["data"],
                },
            }
        elif block["source_type"] == "id":
            formatted_block = {
                "type": "image",
                "source": {
                    "type": "file",
                    "file_id": block["id"],
                },
            }
        else:
            raise ValueError(
                "Anthropic only supports 'url' and 'base64' source_type for image "
@@ -252,6 +262,14 @@
                    "data": block["text"],
                },
            }
        elif block["source_type"] == "id":
            formatted_block = {
                "type": "document",
                "source": {
                    "type": "file",
                    "file_id": block["id"],
                },
            }
        else:
            raise ValueError(f"Block of type {block['type']} is not supported.")
@@ -340,6 +358,29 @@ def _format_messages(
                else:
                    block.pop("text", None)
                content.append(block)
            elif block["type"] in ("server_tool_use", "mcp_tool_use"):
                formatted_block = {
                    k: v
                    for k, v in block.items()
                    if k
                    in (
                        "type",
                        "id",
                        "input",
                        "name",
                        "server_name",  # for mcp_tool_use
                        "cache_control",
                    )
                }
                # Attempt to parse streamed output
                if block.get("input") == {} and "partial_json" in block:
                    try:
                        input_ = json.loads(block["partial_json"])
                        if input_:
                            formatted_block["input"] = input_
                    except json.JSONDecodeError:
                        pass
                content.append(formatted_block)
            elif block["type"] == "text":
                text = block.get("text", "")
                # Only add non-empty strings for now as empty ones are not
@@ -375,6 +416,25 @@
                    [HumanMessage(block["content"])]
                )[1][0]["content"]
                content.append({**block, **{"content": tool_content}})
            elif block["type"] in (
                "code_execution_tool_result",
                "mcp_tool_result",
                "web_search_tool_result",
            ):
                content.append(
                    {
                        k: v
                        for k, v in block.items()
                        if k
                        in (
                            "type",
                            "content",
                            "tool_use_id",
                            "is_error",  # for mcp_tool_result
                            "cache_control",
                        )
                    }
                )
            else:
                content.append(block)
        else:
@@ -472,20 +532,21 @@ class ChatAnthropic(BaseChatModel):
    **NOTE**: Any param which is not explicitly supported will be passed directly to the
    ``anthropic.Anthropic.messages.create(...)`` API every time the model is
    invoked. For example:

    .. code-block:: python

        from langchain_anthropic import ChatAnthropic
        import anthropic

        ChatAnthropic(..., extra_headers={}).invoke(...)

        # results in underlying API call of:
        anthropic.Anthropic(..).messages.create(..., extra_headers={})

        # which is also equivalent to:
        ChatAnthropic(...).invoke(..., extra_headers={})

    Invoke:
        .. code-block:: python
@@ -645,6 +706,35 @@
        "After examining both images carefully, I can see that they are actually identical."

    .. dropdown:: Files API

        You can also pass in files that are managed through Anthropic's
        `Files API <https://docs.anthropic.com/en/docs/build-with-claude/files>`_:

        .. code-block:: python

            from langchain_anthropic import ChatAnthropic

            llm = ChatAnthropic(
                model="claude-sonnet-4-20250514",
                betas=["files-api-2025-04-14"],
            )

            input_message = {
                "role": "user",
                "content": [
                    {
                        "type": "text",
                        "text": "Describe this image.",
                    },
                    {
                        "type": "image",
                        "source_type": "id",
                        "id": "file_abc123...",
                    },
                ],
            }
            llm.invoke([input_message])

    PDF input:
        See `multimodal guides <https://python.langchain.com/docs/how_to/multimodal_inputs/>`_
        for more detail.
@@ -681,6 +771,35 @@
        "This appears to be a simple document..."

    .. dropdown:: Files API

        You can also pass in files that are managed through Anthropic's
        `Files API <https://docs.anthropic.com/en/docs/build-with-claude/files>`_:

        .. code-block:: python

            from langchain_anthropic import ChatAnthropic

            llm = ChatAnthropic(
                model="claude-sonnet-4-20250514",
                betas=["files-api-2025-04-14"],
            )

            input_message = {
                "role": "user",
                "content": [
                    {
                        "type": "text",
                        "text": "Describe this document.",
                    },
                    {
                        "type": "file",
                        "source_type": "id",
                        "id": "file_abc123...",
                    },
                ],
            }
            llm.invoke([input_message])

    Extended thinking:
        Claude 3.7 Sonnet supports an
        `extended thinking <https://docs.anthropic.com/en/docs/build-with-claude/extended-thinking>`_
@@ -797,7 +916,7 @@
        or by setting ``stream_usage=False`` when initializing ChatAnthropic.

    Prompt caching:
        See LangChain `docs <https://python.langchain.com/docs/integrations/chat/anthropic/#built-in-tools>`_
        for more detail.

        .. code-block:: python
@@ -834,6 +953,24 @@
            {'cache_read': 0, 'cache_creation': 1458}

    .. dropdown:: Extended caching

        The cache lifetime is 5 minutes by default. If this is too short, you can
        apply one-hour caching by enabling the ``"extended-cache-ttl-2025-04-11"``
        beta header:

        .. code-block:: python

            llm = ChatAnthropic(
                model="claude-3-7-sonnet-20250219",
                betas=["extended-cache-ttl-2025-04-11"],
            )

        and specifying ``"cache_control": {"type": "ephemeral", "ttl": "1h"}``.

        See `Claude documentation <https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching#1-hour-cache-duration-beta>`_
        for detail.
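
        A sketch of where the TTL is specified (the ``"..."`` text is a
        placeholder for the content to cache):

        .. code-block:: python

            messages = [
                {
                    "role": "user",
                    "content": [
                        {
                            "type": "text",
                            "text": "...",  # placeholder: the content to cache
                            "cache_control": {"type": "ephemeral", "ttl": "1h"},
                        }
                    ],
                }
            ]
            llm.invoke(messages)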
    Token-efficient tool use (beta):
        See LangChain `docs <https://python.langchain.com/docs/integrations/chat/anthropic/>`_
        for more detail.
@@ -875,7 +1012,7 @@
        See LangChain `docs <https://python.langchain.com/docs/integrations/chat/anthropic/>`_
        for more detail.
    .. dropdown:: Web search

        .. code-block:: python
@@ -890,7 +1027,53 @@
            "How do I update a web app to TypeScript 5.5?"
        )

    .. dropdown:: Code execution
        .. code-block:: python

            llm = ChatAnthropic(
                model="claude-sonnet-4-20250514",
                betas=["code-execution-2025-05-22"],
            )

            tool = {"type": "code_execution_20250522", "name": "code_execution"}
            llm_with_tools = llm.bind_tools([tool])

            response = llm_with_tools.invoke(
                "Calculate the mean and standard deviation of [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]"
            )

    .. dropdown:: Remote MCP

        .. code-block:: python

            from langchain_anthropic import ChatAnthropic

            mcp_servers = [
                {
                    "type": "url",
                    "url": "https://mcp.deepwiki.com/mcp",
                    "name": "deepwiki",
                    "tool_configuration": {  # optional configuration
                        "enabled": True,
                        "allowed_tools": ["ask_question"],
                    },
                    "authorization_token": "PLACEHOLDER",  # optional authorization
                }
            ]

            llm = ChatAnthropic(
                model="claude-sonnet-4-20250514",
                betas=["mcp-client-2025-04-04"],
                mcp_servers=mcp_servers,
            )

            response = llm.invoke(
                "What transport protocols does the 2025-03-26 version of the MCP "
                "spec (modelcontextprotocol/modelcontextprotocol) support?"
            )

    .. dropdown:: Text editor

        .. code-block:: python
@@ -986,6 +1169,13 @@
    default_headers: Optional[Mapping[str, str]] = None
    """Headers to pass to the Anthropic clients, will be used for every API call."""

    betas: Optional[list[str]] = None
    """List of beta features to enable. If specified, invocations will be routed
    through ``client.beta.messages.create``.

    Example: ``betas=["mcp-client-2025-04-04"]``
    """

    model_kwargs: dict[str, Any] = Field(default_factory=dict)

    streaming: bool = False
@@ -1000,6 +1190,13 @@
    """Parameters for Claude reasoning,
    e.g., ``{"type": "enabled", "budget_tokens": 10_000}``"""

    mcp_servers: Optional[list[dict[str, Any]]] = None
    """List of MCP servers to use for the request.

    Example: ``mcp_servers=[{"type": "url", "url": "https://mcp.example.com/mcp",
    "name": "example-mcp"}]``
    """
    @property
    def _llm_type(self) -> str:
        """Return type of chat model."""
@@ -1007,7 +1204,10 @@
    @property
    def lc_secrets(self) -> dict[str, str]:
        return {
            "anthropic_api_key": "ANTHROPIC_API_KEY",
            "mcp_servers": "ANTHROPIC_MCP_SERVERS",
        }

    @classmethod
    def is_lc_serializable(cls) -> bool:
@@ -1099,6 +1299,8 @@
            "top_k": self.top_k,
            "top_p": self.top_p,
            "stop_sequences": stop or self.stop_sequences,
            "betas": self.betas,
            "mcp_servers": self.mcp_servers,
            "system": system,
            **self.model_kwargs,
            **kwargs,
@@ -1107,6 +1309,18 @@
            payload["thinking"] = self.thinking
        return {k: v for k, v in payload.items() if v is not None}
    def _create(self, payload: dict) -> Any:
        if "betas" in payload:
            return self._client.beta.messages.create(**payload)
        else:
            return self._client.messages.create(**payload)

    async def _acreate(self, payload: dict) -> Any:
        if "betas" in payload:
            return await self._async_client.beta.messages.create(**payload)
        else:
            return await self._async_client.messages.create(**payload)
    def _stream(
        self,
        messages: list[BaseMessage],
@@ -1121,17 +1335,19 @@
            kwargs["stream"] = True
        payload = self._get_request_payload(messages, stop=stop, **kwargs)
        try:
            stream = self._create(payload)
            coerce_content_to_string = (
                not _tools_in_params(payload)
                and not _documents_in_params(payload)
                and not _thinking_in_params(payload)
            )
            block_start_event = None
            for event in stream:
                msg, block_start_event = _make_message_chunk_from_anthropic_event(
                    event,
                    stream_usage=stream_usage,
                    coerce_content_to_string=coerce_content_to_string,
                    block_start_event=block_start_event,
                )
                if msg is not None:
                    chunk = ChatGenerationChunk(message=msg)
@@ -1155,17 +1371,19 @@
            kwargs["stream"] = True
        payload = self._get_request_payload(messages, stop=stop, **kwargs)
        try:
            stream = await self._acreate(payload)
            coerce_content_to_string = (
                not _tools_in_params(payload)
                and not _documents_in_params(payload)
                and not _thinking_in_params(payload)
            )
            block_start_event = None
            async for event in stream:
                msg, block_start_event = _make_message_chunk_from_anthropic_event(
                    event,
                    stream_usage=stream_usage,
                    coerce_content_to_string=coerce_content_to_string,
                    block_start_event=block_start_event,
                )
                if msg is not None:
                    chunk = ChatGenerationChunk(message=msg)
@@ -1234,7 +1452,7 @@
            return generate_from_stream(stream_iter)
        payload = self._get_request_payload(messages, stop=stop, **kwargs)
        try:
            data = self._create(payload)
        except anthropic.BadRequestError as e:
            _handle_anthropic_bad_request(e)
        return self._format_output(data, **kwargs)
@@ -1253,7 +1471,7 @@
            return await agenerate_from_stream(stream_iter)
        payload = self._get_request_payload(messages, stop=stop, **kwargs)
        try:
            data = await self._acreate(payload)
        except anthropic.BadRequestError as e:
            _handle_anthropic_bad_request(e)
        return self._format_output(data, **kwargs)
@@ -1722,8 +1940,10 @@ def convert_to_anthropic_tool(
def _tools_in_params(params: dict) -> bool:
    return (
        "tools" in params
        or ("extra_body" in params and params["extra_body"].get("tools"))
        or "mcp_servers" in params
    )
@@ -1772,7 +1992,8 @@
    *,
    stream_usage: bool = True,
    coerce_content_to_string: bool,
    block_start_event: Optional[anthropic.types.RawMessageStreamEvent] = None,
) -> tuple[Optional[AIMessageChunk], Optional[anthropic.types.RawMessageStreamEvent]]:
    """Convert Anthropic event to AIMessageChunk.

    Note that not all events will result in a message chunk. In these cases
@@ -1800,7 +2021,17 @@
    elif (
        event.type == "content_block_start"
        and event.content_block is not None
        and event.content_block.type
        in (
            "tool_use",
            "code_execution_tool_result",
            "document",
            "redacted_thinking",
            "mcp_tool_use",
            "mcp_tool_result",
            "server_tool_use",
            "web_search_tool_result",
        )
    ):
        if coerce_content_to_string:
            warnings.warn("Received unexpected tool content block.")
@@ -1820,6 +2051,7 @@
            content=[content_block],
            tool_call_chunks=tool_call_chunks,  # type: ignore
        )
        block_start_event = event
    elif event.type == "content_block_delta":
        if event.delta.type in ("text_delta", "citations_delta"):
            if coerce_content_to_string and hasattr(event.delta, "text"):
@@ -1849,16 +2081,23 @@
        elif event.delta.type == "input_json_delta":
            content_block = event.delta.model_dump()
            content_block["index"] = event.index
            if (
                (block_start_event is not None)
                and hasattr(block_start_event, "content_block")
                and (block_start_event.content_block.type == "tool_use")
            ):
                tool_call_chunk = create_tool_call_chunk(
                    index=event.index,
                    id=None,
                    name=None,
                    args=event.delta.partial_json,
                )
                tool_call_chunks = [tool_call_chunk]
            else:
                tool_call_chunks = []
            message_chunk = AIMessageChunk(
                content=[content_block],
                tool_call_chunks=tool_call_chunks,  # type: ignore
            )
    elif event.type == "message_delta" and stream_usage:
        usage_metadata = UsageMetadata(
@@ -1877,7 +2116,7 @@
    else:
        pass
    return message_chunk, block_start_event

@deprecated(since="0.1.0", removal="1.0.0", alternative="ChatAnthropic")


@@ -7,7 +7,7 @@ authors = []
license = { text = "MIT" }
requires-python = ">=3.9"
dependencies = [
    "anthropic<1,>=0.52.0",
    "langchain-core<1.0.0,>=0.3.59",
    "pydantic<3.0.0,>=2.7.4",
]


@@ -1,6 +1,7 @@
"""Test ChatAnthropic chat model."""

import json
import os
from base64 import b64encode
from typing import Optional
@@ -863,3 +864,206 @@ def test_image_tool_calling() -> None:
    ]
    llm = ChatAnthropic(model="claude-3-5-sonnet-latest")
    llm.bind_tools([color_picker]).invoke(messages)
# TODO: set up VCR
def test_web_search() -> None:
    pytest.skip()
    llm = ChatAnthropic(model="claude-3-5-sonnet-latest")

    tool = {"type": "web_search_20250305", "name": "web_search", "max_uses": 1}
    llm_with_tools = llm.bind_tools([tool])

    input_message = {
        "role": "user",
        "content": [
            {
                "type": "text",
                "text": "How do I update a web app to TypeScript 5.5?",
            }
        ],
    }
    response = llm_with_tools.invoke([input_message])
    block_types = {block["type"] for block in response.content}
    assert block_types == {"text", "server_tool_use", "web_search_tool_result"}

    # Test streaming
    full: Optional[BaseMessageChunk] = None
    for chunk in llm_with_tools.stream([input_message]):
        assert isinstance(chunk, AIMessageChunk)
        full = chunk if full is None else full + chunk
    assert isinstance(full, AIMessageChunk)
    assert isinstance(full.content, list)
    block_types = {block["type"] for block in full.content}  # type: ignore[index]
    assert block_types == {"text", "server_tool_use", "web_search_tool_result"}

    # Test we can pass back in
    next_message = {
        "role": "user",
        "content": "Please repeat the last search, but focus on sources from 2024.",
    }
    _ = llm_with_tools.invoke(
        [input_message, full, next_message],
    )


def test_code_execution() -> None:
    pytest.skip()
    llm = ChatAnthropic(
        model="claude-sonnet-4-20250514",
        betas=["code-execution-2025-05-22"],
    )

    tool = {"type": "code_execution_20250522", "name": "code_execution"}
    llm_with_tools = llm.bind_tools([tool])

    input_message = {
        "role": "user",
        "content": [
            {
                "type": "text",
                "text": (
                    "Calculate the mean and standard deviation of "
                    "[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]"
                ),
            }
        ],
    }
    response = llm_with_tools.invoke([input_message])
    block_types = {block["type"] for block in response.content}
    assert block_types == {"text", "server_tool_use", "code_execution_tool_result"}

    # Test streaming
    full: Optional[BaseMessageChunk] = None
    for chunk in llm_with_tools.stream([input_message]):
        assert isinstance(chunk, AIMessageChunk)
        full = chunk if full is None else full + chunk
    assert isinstance(full, AIMessageChunk)
    assert isinstance(full.content, list)
    block_types = {block["type"] for block in full.content}  # type: ignore[index]
    assert block_types == {"text", "server_tool_use", "code_execution_tool_result"}

    # Test we can pass back in
    next_message = {
        "role": "user",
        "content": "Please add more comments to the code.",
    }
    _ = llm_with_tools.invoke(
        [input_message, full, next_message],
    )


def test_remote_mcp() -> None:
    pytest.skip()
    mcp_servers = [
        {
            "type": "url",
            "url": "https://mcp.deepwiki.com/mcp",
            "name": "deepwiki",
            "tool_configuration": {"enabled": True, "allowed_tools": ["ask_question"]},
            "authorization_token": "PLACEHOLDER",
        }
    ]

    llm = ChatAnthropic(
        model="claude-sonnet-4-20250514",
        betas=["mcp-client-2025-04-04"],
        mcp_servers=mcp_servers,
    )

    input_message = {
        "role": "user",
        "content": [
            {
                "type": "text",
                "text": (
                    "What transport protocols does the 2025-03-26 version of the MCP "
                    "spec (modelcontextprotocol/modelcontextprotocol) support?"
                ),
            }
        ],
    }
    response = llm.invoke([input_message])
    block_types = {block["type"] for block in response.content}
    assert block_types == {"text", "mcp_tool_use", "mcp_tool_result"}

    # Test streaming
    full: Optional[BaseMessageChunk] = None
    for chunk in llm.stream([input_message]):
        assert isinstance(chunk, AIMessageChunk)
        full = chunk if full is None else full + chunk
    assert isinstance(full, AIMessageChunk)
    assert isinstance(full.content, list)
    block_types = {block["type"] for block in full.content}
    assert block_types == {"text", "mcp_tool_use", "mcp_tool_result"}

    # Test we can pass back in
    next_message = {
        "role": "user",
        "content": "Please query the same tool again, but add 'please' to your query.",
    }
    _ = llm.invoke(
        [input_message, full, next_message],
    )


@pytest.mark.parametrize("block_format", ["anthropic", "standard"])
def test_files_api_image(block_format: str) -> None:
    image_file_id = os.getenv("ANTHROPIC_FILES_API_IMAGE_ID")
    if not image_file_id:
        pytest.skip()
    llm = ChatAnthropic(
        model="claude-sonnet-4-20250514",
        betas=["files-api-2025-04-14"],
    )
    if block_format == "anthropic":
        block = {
            "type": "image",
            "source": {
                "type": "file",
                "file_id": image_file_id,
            },
        }
    else:
        # standard block format
        block = {
            "type": "image",
            "source_type": "id",
            "id": image_file_id,
        }
    input_message = {
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image."},
            block,
        ],
    }
    _ = llm.invoke([input_message])


@pytest.mark.parametrize("block_format", ["anthropic", "standard"])
def test_files_api_pdf(block_format: str) -> None:
    pdf_file_id = os.getenv("ANTHROPIC_FILES_API_PDF_ID")
    if not pdf_file_id:
        pytest.skip()
    llm = ChatAnthropic(
        model="claude-sonnet-4-20250514",
        betas=["files-api-2025-04-14"],
    )
    if block_format == "anthropic":
        block = {"type": "document", "source": {"type": "file", "file_id": pdf_file_id}}
    else:
        # standard block format
        block = {
            "type": "file",
            "source_type": "id",
            "id": pdf_file_id,
        }
    input_message = {
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this document."},
            block,
        ],
    }
    _ = llm.invoke([input_message])


@@ -2,7 +2,7 @@
import os
from typing import Any, Callable, Literal, Optional, cast
from unittest.mock import MagicMock, patch

import anthropic
import pytest
@@ -10,6 +10,8 @@ from anthropic.types import Message, TextBlock, Usage
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage, ToolMessage
from langchain_core.runnables import RunnableBinding
from langchain_core.tools import BaseTool
from langchain_core.tracers.base import BaseTracer
from langchain_core.tracers.schemas import Run
from pydantic import BaseModel, Field, SecretStr
from pytest import CaptureFixture, MonkeyPatch
@@ -994,3 +996,63 @@ def test_usage_metadata_standardization() -> None:
    assert result["input_tokens"] == 0
    assert result["output_tokens"] == 0
    assert result["total_tokens"] == 0
class FakeTracer(BaseTracer):
    def __init__(self) -> None:
        super().__init__()
        self.chat_model_start_inputs: list = []

    def _persist_run(self, run: Run) -> None:
        """Persist a run."""
        pass

    def on_chat_model_start(self, *args: Any, **kwargs: Any) -> Run:
        self.chat_model_start_inputs.append({"args": args, "kwargs": kwargs})
        return super().on_chat_model_start(*args, **kwargs)


def test_mcp_tracing() -> None:
    # Test we exclude sensitive information from traces
    mcp_servers = [
        {
            "type": "url",
            "url": "https://mcp.deepwiki.com/mcp",
            "name": "deepwiki",
            "authorization_token": "PLACEHOLDER",
        }
    ]
    llm = ChatAnthropic(
        model="claude-sonnet-4-20250514",
        betas=["mcp-client-2025-04-04"],
        mcp_servers=mcp_servers,
    )
    tracer = FakeTracer()
    mock_client = MagicMock()

    def mock_create(*args: Any, **kwargs: Any) -> Message:
        return Message(
            id="foo",
            content=[TextBlock(type="text", text="bar")],
            model="baz",
            role="assistant",
            stop_reason=None,
            stop_sequence=None,
            usage=Usage(input_tokens=2, output_tokens=1),
            type="message",
        )

    mock_client.messages.create = mock_create
    input_message = HumanMessage("Test query")
    with patch.object(llm, "_client", mock_client):
        _ = llm.invoke([input_message], config={"callbacks": [tracer]})

    # Test headers are not traced
    assert len(tracer.chat_model_start_inputs) == 1
    assert "PLACEHOLDER" not in str(tracer.chat_model_start_inputs)

    # Test headers are correctly propagated to request
    payload = llm._get_request_payload([input_message])
    assert payload["mcp_servers"][0]["authorization_token"] == "PLACEHOLDER"


@@ -18,7 +18,7 @@ wheels = [
[[package]]
name = "anthropic"
version = "0.52.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "anyio" },
@@ -29,9 +29,9 @@ dependencies = [
    { name = "sniffio" },
    { name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/57/fd/8a9332f5baf352c272494a9d359863a53385a208954c1a7251a524071930/anthropic-0.52.0.tar.gz", hash = "sha256:f06bc924d7eb85f8a43fe587b875ff58b410d60251b7dc5f1387b322a35bd67b", size = 229372 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/a0/43/172c0031654908bbac2a87d356fff4de1b4947a9b14b9658540b69416417/anthropic-0.52.0-py3-none-any.whl", hash = "sha256:c026daa164f0e3bde36ce9cbdd27f5f1419fff03306be1e138726f42e6a7810f", size = 286076 },
]

[[package]]
@@ -451,7 +451,7 @@ typing = [
[package.metadata]
requires-dist = [
    { name = "anthropic", specifier = ">=0.52.0,<1" },
    { name = "langchain-core", editable = "../../core" },
    { name = "pydantic", specifier = ">=2.7.4,<3.0.0" },
]
@@ -486,7 +486,7 @@ typing = [
[[package]]
name = "langchain-core"
version = "0.3.61"
source = { editable = "../../core" }
dependencies = [
    { name = "jsonpatch" },
@@ -501,10 +501,9 @@ dependencies = [
[package.metadata]
requires-dist = [
    { name = "jsonpatch", specifier = ">=1.33,<2.0" },
    { name = "langsmith", specifier = ">=0.1.126,<0.4" },
    { name = "packaging", specifier = ">=23.2,<25" },
    { name = "pydantic", specifier = ">=2.7.4" },
    { name = "pyyaml", specifier = ">=5.3" },
    { name = "tenacity", specifier = ">=8.1.0,!=8.4.0,<10.0.0" },
    { name = "typing-extensions", specifier = ">=4.7" },