anthropic: support for code execution, MCP connector, files API features (#31340)

Support for the new [batch of beta
features](https://www.anthropic.com/news/agent-capabilities-api)
released yesterday:

- [Code
execution](https://docs.anthropic.com/en/docs/agents-and-tools/tool-use/code-execution-tool)
- [MCP
connector](https://docs.anthropic.com/en/docs/agents-and-tools/mcp-connector)
- [Files
API](https://docs.anthropic.com/en/docs/build-with-claude/files)
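For quick reference, the beta headers and payload shapes these features use can be sketched as plain dictionaries; the values below mirror the examples added in this PR and should be treated as a sketch, not an exhaustive spec:

```python
# Beta headers and tool/server payload shapes for the new features.
# Values mirror the examples in the updated docs in this PR.

CODE_EXECUTION_BETA = "code-execution-2025-05-22"
code_execution_tool = {"type": "code_execution_20250522", "name": "code_execution"}

FILES_API_BETA = "files-api-2025-04-14"  # needed to reference uploaded file IDs

MCP_BETA = "mcp-client-2025-04-04"
mcp_server = {
    "type": "url",
    "url": "https://mcp.deepwiki.com/mcp",
    "name": "deepwiki",
    "tool_configuration": {  # optional
        "enabled": True,
        "allowed_tools": ["ask_question"],
    },
}
```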

Also verified support for [prompt cache
TTL](https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching#1-hour-cache-duration-beta).
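For the cache TTL, the relevant knob is the `cache_control` field on a content block. A minimal sketch of a block opting into the one-hour TTL (assuming the `extended-cache-ttl-2025-04-11` beta is enabled on the model, as in the docs example below):

```python
# Sketch: a content block requesting the 1-hour cache TTL.
# Requires betas=["extended-cache-ttl-2025-04-11"] on ChatAnthropic;
# the default ephemeral cache lifetime is 5 minutes.
cached_block = {
    "type": "text",
    "text": "Long, reusable context goes here...",
    "cache_control": {"type": "ephemeral", "ttl": "1h"},
}
```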
This commit is contained in:
ccurme, 2025-05-27 12:45:45 -04:00 (committed by GitHub)
parent 1ebcbf1d11 · commit 580986b260
11 changed files with 824 additions and 41 deletions


@@ -325,6 +325,102 @@
"ai_msg.tool_calls"
]
},
{
"cell_type": "markdown",
"id": "535a16e4-cd5a-479f-b315-37c816ec4387",
"metadata": {},
"source": [
"## Multimodal\n",
"\n",
"Claude supports image and PDF inputs as content blocks, both in Anthropic's native format (see docs for [vision](https://docs.anthropic.com/en/docs/build-with-claude/vision#base64-encoded-image-example) and [PDF support](https://docs.anthropic.com/en/docs/build-with-claude/pdf-support)) and in LangChain's [standard format](/docs/how_to/multimodal_inputs/).\n",
"\n",
"### Files API\n",
"\n",
"Claude also supports interactions with files through its managed [Files API](https://docs.anthropic.com/en/docs/build-with-claude/files). See examples below.\n",
"\n",
"The Files API can also be used to upload files to a container for use with Claude's built-in code execution tool. See the [code execution](#code-execution) section below for details.\n",
"\n",
"<details>\n",
"<summary>Images</summary>\n",
"\n",
"```python\n",
"# Upload image\n",
"\n",
"import anthropic\n",
"\n",
"client = anthropic.Anthropic()\n",
"file = client.beta.files.upload(\n",
" # Supports image/jpeg, image/png, image/gif, image/webp\n",
" file=(\"image.png\", open(\"/path/to/image.png\", \"rb\"), \"image/png\"),\n",
")\n",
"image_file_id = file.id\n",
"\n",
"\n",
"# Run inference\n",
"from langchain_anthropic import ChatAnthropic\n",
"\n",
"llm = ChatAnthropic(\n",
" model=\"claude-sonnet-4-20250514\",\n",
" betas=[\"files-api-2025-04-14\"],\n",
")\n",
"\n",
"input_message = {\n",
" \"role\": \"user\",\n",
" \"content\": [\n",
" {\n",
" \"type\": \"text\",\n",
" \"text\": \"Describe this image.\",\n",
" },\n",
" {\n",
" \"type\": \"image\",\n",
" \"source\": {\n",
" \"type\": \"file\",\n",
" \"file_id\": image_file_id,\n",
" },\n",
" },\n",
" ],\n",
"}\n",
"llm.invoke([input_message])\n",
"```\n",
"\n",
"</details>\n",
"\n",
"<details>\n",
"<summary>PDFs</summary>\n",
"\n",
"```python\n",
"# Upload document\n",
"\n",
"import anthropic\n",
"\n",
"client = anthropic.Anthropic()\n",
"file = client.beta.files.upload(\n",
" file=(\"document.pdf\", open(\"/path/to/document.pdf\", \"rb\"), \"application/pdf\"),\n",
")\n",
"pdf_file_id = file.id\n",
"\n",
"\n",
"# Run inference\n",
"from langchain_anthropic import ChatAnthropic\n",
"\n",
"llm = ChatAnthropic(\n",
" model=\"claude-sonnet-4-20250514\",\n",
" betas=[\"files-api-2025-04-14\"],\n",
")\n",
"\n",
"input_message = {\n",
" \"role\": \"user\",\n",
" \"content\": [\n",
" {\"type\": \"text\", \"text\": \"Describe this document.\"},\n",
" {\"type\": \"document\", \"source\": {\"type\": \"file\", \"file_id\": pdf_file_id}}\n",
" ],\n",
"}\n",
"llm.invoke([input_message])\n",
"```\n",
"\n",
"</details>"
]
},
{
"cell_type": "markdown",
"id": "6e36d25c-f358-49e5-aefa-b99fbd3fec6b",
@@ -454,6 +550,27 @@
"print(f\"\\nSecond:\\n{usage_2}\")"
]
},
{
"cell_type": "markdown",
"id": "9678656f-1ec4-4bf1-bf62-bbd49eb5c4e7",
"metadata": {},
"source": [
":::tip Extended caching\n",
"\n",
" The cache lifetime is 5 minutes by default. If this is too short, you can apply one-hour caching by enabling the `\"extended-cache-ttl-2025-04-11\"` beta header:\n",
"\n",
" ```python\n",
" llm = ChatAnthropic(\n",
" model=\"claude-3-7-sonnet-20250219\",\n",
" # highlight-next-line\n",
" betas=[\"extended-cache-ttl-2025-04-11\"],\n",
" )\n",
" ```\n",
" and specifying `\"cache_control\": {\"type\": \"ephemeral\", \"ttl\": \"1h\"}` on the content blocks you want cached.\n",
"\n",
":::"
]
},
{
"cell_type": "markdown",
"id": "141ce9c5-012d-4502-9d61-4a413b5d959a",
@@ -953,6 +1070,159 @@
"response = llm_with_tools.invoke(\"How do I update a web app to TypeScript 5.5?\")"
]
},
{
"cell_type": "markdown",
"id": "1478cdc6-2e52-4870-80f9-b4ddf88f2db2",
"metadata": {},
"source": [
"### Code execution\n",
"\n",
"Claude can use a [code execution tool](https://docs.anthropic.com/en/docs/agents-and-tools/tool-use/code-execution-tool) to execute Python code in a sandboxed environment.\n",
"\n",
":::info Code execution is supported since ``langchain-anthropic>=0.3.14``\n",
"\n",
":::"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "2ce13632-a2da-439f-a429-f66481501630",
"metadata": {},
"outputs": [],
"source": [
"from langchain_anthropic import ChatAnthropic\n",
"\n",
"llm = ChatAnthropic(\n",
" model=\"claude-sonnet-4-20250514\",\n",
" betas=[\"code-execution-2025-05-22\"],\n",
")\n",
"\n",
"tool = {\"type\": \"code_execution_20250522\", \"name\": \"code_execution\"}\n",
"llm_with_tools = llm.bind_tools([tool])\n",
"\n",
"response = llm_with_tools.invoke(\n",
" \"Calculate the mean and standard deviation of \" \"[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]\"\n",
")"
]
},
{
"cell_type": "markdown",
"id": "24076f91-3a3d-4e53-9618-429888197061",
"metadata": {},
"source": [
"<details>\n",
"<summary>Use with Files API</summary>\n",
"\n",
"Using the Files API, Claude can write code that accesses files for data analysis and other purposes. See the example below:\n",
"\n",
"```python\n",
"# Upload file\n",
"\n",
"import anthropic\n",
"\n",
"client = anthropic.Anthropic()\n",
"file = client.beta.files.upload(\n",
" file=open(\"/path/to/sample_data.csv\", \"rb\")\n",
")\n",
"file_id = file.id\n",
"\n",
"\n",
"# Run inference\n",
"from langchain_anthropic import ChatAnthropic\n",
"\n",
"llm = ChatAnthropic(\n",
" model=\"claude-sonnet-4-20250514\",\n",
" betas=[\"code-execution-2025-05-22\"],\n",
")\n",
"\n",
"tool = {\"type\": \"code_execution_20250522\", \"name\": \"code_execution\"}\n",
"llm_with_tools = llm.bind_tools([tool])\n",
"\n",
"input_message = {\n",
" \"role\": \"user\",\n",
" \"content\": [\n",
" {\n",
" \"type\": \"text\",\n",
" \"text\": \"Please plot these data and tell me what you see.\",\n",
" },\n",
" {\n",
" \"type\": \"container_upload\",\n",
" \"file_id\": file_id,\n",
" },\n",
" ]\n",
"}\n",
"response = llm_with_tools.invoke([input_message])\n",
"```\n",
"\n",
"Note that Claude may generate files as part of its code execution. You can access these files using the Files API:\n",
"```python\n",
"# Take all file outputs for demonstration purposes\n",
"file_ids = []\n",
"for block in response.content:\n",
" if block[\"type\"] == \"code_execution_tool_result\":\n",
" file_ids.extend(\n",
" content[\"file_id\"]\n",
" for content in block.get(\"content\", {}).get(\"content\", [])\n",
" if \"file_id\" in content\n",
" )\n",
"\n",
"for i, file_id in enumerate(file_ids):\n",
" file_content = client.beta.files.download(file_id)\n",
" file_content.write_to_file(f\"/path/to/file_{i}.png\")\n",
"```\n",
"\n",
"</details>"
]
},
{
"cell_type": "markdown",
"id": "040f381a-1768-479a-9a5e-aa2d7d77e0d5",
"metadata": {},
"source": [
"### Remote MCP\n",
"\n",
"Claude can use the [MCP connector](https://docs.anthropic.com/en/docs/agents-and-tools/mcp-connector) to make model-generated calls to remote MCP servers.\n",
"\n",
":::info Remote MCP is supported since ``langchain-anthropic>=0.3.14``\n",
"\n",
":::"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "22fc4a89-e6d8-4615-96cb-2e117349aebf",
"metadata": {},
"outputs": [],
"source": [
"from langchain_anthropic import ChatAnthropic\n",
"\n",
"mcp_servers = [\n",
" {\n",
" \"type\": \"url\",\n",
" \"url\": \"https://mcp.deepwiki.com/mcp\",\n",
" \"name\": \"deepwiki\",\n",
" \"tool_configuration\": { # optional configuration\n",
" \"enabled\": True,\n",
" \"allowed_tools\": [\"ask_question\"],\n",
" },\n",
" \"authorization_token\": \"PLACEHOLDER\", # optional authorization\n",
" }\n",
"]\n",
"\n",
"llm = ChatAnthropic(\n",
" model=\"claude-sonnet-4-20250514\",\n",
" betas=[\"mcp-client-2025-04-04\"],\n",
" mcp_servers=mcp_servers,\n",
")\n",
"\n",
"response = llm.invoke(\n",
" \"What transport protocols does the 2025-03-26 version of the MCP \"\n",
" \"spec (modelcontextprotocol/modelcontextprotocol) support?\"\n",
")"
]
},
{
"cell_type": "markdown",
"id": "2fd5d545-a40d-42b1-ad0c-0a79e2536c9b",