{
"cells": [
{
"cell_type": "raw",
"id": "afaf8039",
"metadata": {},
"source": [
"---\n",
"sidebar_label: Anthropic\n",
"---"
]
},
{
"cell_type": "markdown",
"id": "e49f1e0d",
"metadata": {},
"source": [
"# ChatAnthropic\n",
"\n",
"This notebook provides a quick overview for getting started with Anthropic [chat models](/docs/concepts/#chat-models). For detailed documentation of all ChatAnthropic features and configurations head to the [API reference](https://api.python.langchain.com/en/latest/chat_models/langchain_anthropic.chat_models.ChatAnthropic.html).\n",
"\n",
"Anthropic has several chat models. You can find information about their latest models and their costs, context windows, and supported input types in the [Anthropic docs](https://docs.anthropic.com/en/docs/models-overview).\n",
"\n",
"\n",
":::info AWS Bedrock and Google VertexAI\n",
"\n",
"Note that certain Anthropic models can also be accessed via AWS Bedrock and Google VertexAI. See the [ChatBedrock](/docs/integrations/chat/bedrock/) and [ChatVertexAI](/docs/integrations/chat/google_vertex_ai_palm/) integrations to use Anthropic models via these services.\n",
"\n",
":::\n",
"\n",
"## Overview\n",
"### Integration details\n",
"\n",
"| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/v0.2/docs/integrations/chat/anthropic) | Package downloads | Package latest |\n",
"| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
"| [ChatAnthropic](https://api.python.langchain.com/en/latest/chat_models/langchain_anthropic.chat_models.ChatAnthropic.html) | [langchain-anthropic](https://api.python.langchain.com/en/latest/anthropic_api_reference.html) | ❌ | beta | ✅ |  |  |\n",
"\n",
"### Model features\n",
"| [Tool calling](/docs/how_to/tool_calling/) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
"| ✅ | ✅ | ❌ | ✅ | ❌ | ❌ | ✅ | ✅ | ✅ | ❌ | \n",
"\n",
"## Setup\n",
"\n",
"To access Anthropic models you'll need to create an Anthropic account, get an API key, and install the `langchain-anthropic` integration package.\n",
"\n",
"### Credentials\n",
"\n",
"Head to https://console.anthropic.com/ to sign up for Anthropic and generate an API key. Once you've done this set the ANTHROPIC_API_KEY environment variable:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "433e8d2b-9519-4b49-b2c4-7ab65b046c94",
"metadata": {},
"outputs": [],
"source": [
"import getpass\n",
"import os\n",
"\n",
"os.environ[\"anthropic_API_KEY\"] = getpass.getpass(\"Enter your Anthropic API key: \")"
]
},
{
"cell_type": "markdown",
"id": "72ee0c4b-9764-423a-9dbf-95129e185210",
"metadata": {},
"source": [
"If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a15d341e-3e26-4ca3-830b-5aab30ed66de",
"metadata": {},
"outputs": [],
"source": [
"# os.environ[\"LANGSMITH_API_KEY\"] = getpass.getpass(\"Enter your LangSmith API key: \")\n",
"# os.environ[\"LANGSMITH_TRACING\"] = \"true\""
]
},
{
"cell_type": "markdown",
"id": "0730d6a1-c893-4840-9817-5e5251676d5d",
"metadata": {},
"source": [
"### Installation\n",
"\n",
"The LangChain Anthropic integration lives in the `langchain-anthropic` package:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "652d6238-1f87-422a-b135-f5abbb8652fc",
"metadata": {},
"outputs": [],
"source": [
"%pip install -qU langchain-anthropic"
]
},
{
"cell_type": "markdown",
"id": "a38cde65-254d-4219-a441-068766c0d4b5",
"metadata": {},
"source": [
"## Instantiation\n",
"\n",
"Now we can instantiate our model object and generate chat completions:"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "cb09c344-1836-4e0c-acf8-11d13ac1dbae",
"metadata": {},
"outputs": [],
"source": [
"from langchain_anthropic import ChatAnthropic\n",
"\n",
"llm = ChatAnthropic(\n",
" model=\"claude-3-sonnet-20240229\",\n",
" temperature=0,\n",
" max_tokens=1024,\n",
" timeout=None,\n",
" max_retries=2,\n",
" # other params...\n",
")"
]
},
{
"cell_type": "markdown",
"id": "2b4f3e15",
"metadata": {},
"source": [
"## Invocation\n"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "62e0dbc3",
"metadata": {
"tags": []
},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content=\"Voici la traduction en français :\\n\\nJ'aime la programmation.\", response_metadata={'id': 'msg_013qztabaFADNnKsHR1rdrju', 'model': 'claude-3-sonnet-20240229', 'stop_reason': 'end_turn', 'stop_sequence': None, 'usage': {'input_tokens': 29, 'output_tokens': 21}}, id='run-a22ab30c-7e09-48f5-bc27-a08a9d8f7fa1-0', usage_metadata={'input_tokens': 29, 'output_tokens': 21, 'total_tokens': 50})"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"messages = [\n",
" (\n",
" \"system\",\n",
" \"You are a helpful assistant that translates English to French. Translate the user sentence.\",\n",
" ),\n",
" (\"human\", \"I love programming.\"),\n",
"]\n",
"ai_msg = llm.invoke(messages)\n",
"ai_msg"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "d86145b3-bfef-46e8-b227-4dda5c9c2705",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Voici la traduction en français :\n",
"\n",
"J'aime la programmation.\n"
]
}
],
"source": [
"print(ai_msg.content)"
]
},
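{
"cell_type": "markdown",
"id": "streaming-async-note",
"metadata": {},
"source": [
"## Streaming and async\n",
"\n",
"The features table above lists token-level streaming and native async support. A minimal sketch, reusing `llm` and `messages` from the cells above, via the standard Runnable methods `.stream()` and `.ainvoke()`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "streaming-async-example",
"metadata": {},
"outputs": [],
"source": [
"# Token-level streaming: chunks arrive as they are generated.\n",
"for chunk in llm.stream(messages):\n",
"    print(chunk.content, end=\"\", flush=True)\n",
"\n",
"# Native async (top-level await works in IPython notebooks):\n",
"# ai_msg = await llm.ainvoke(messages)"
]
},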
{
"cell_type": "markdown",
"id": "18e2bfc0-7e78-4528-a73f-499ac150dca8",
"metadata": {},
"source": [
"## Chaining\n",
"\n",
"We can [chain](/docs/how_to/sequence/) our model with a prompt template like so:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "e197d1d7-a070-4c96-9f8a-a0e86d046e0b",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='Ich liebe Programmieren.', response_metadata={'id': 'msg_01FWrA8w9HbjqYPTQ7VryUnp', 'model': 'claude-3-sonnet-20240229', 'stop_reason': 'end_turn', 'stop_sequence': None, 'usage': {'input_tokens': 23, 'output_tokens': 11}}, id='run-b749bf20-b46d-4d62-ac73-f59adab6dd7e-0', usage_metadata={'input_tokens': 23, 'output_tokens': 11, 'total_tokens': 34})"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from langchain_core.prompts import ChatPromptTemplate\n",
"\n",
"prompt = ChatPromptTemplate.from_messages(\n",
" [\n",
" (\n",
" \"system\",\n",
" \"You are a helpful assistant that translates {input_language} to {output_language}.\",\n",
" ),\n",
" (\"human\", \"{input}\"),\n",
" ]\n",
")\n",
"\n",
"chain = prompt | llm\n",
"chain.invoke(\n",
" {\n",
" \"input_language\": \"English\",\n",
" \"output_language\": \"German\",\n",
" \"input\": \"I love programming.\",\n",
" }\n",
")"
]
},
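{
"cell_type": "markdown",
"id": "multimodal-note",
"metadata": {},
"source": [
"## Multimodal inputs\n",
"\n",
"The features table above marks image input as supported. As a minimal sketch (the image URL below is only a placeholder; see the [multimodal inputs guide](/docs/how_to/multimodal_inputs/) for details), an image can be passed as a base64-encoded content block on a `HumanMessage`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "multimodal-example",
"metadata": {},
"outputs": [],
"source": [
"import base64\n",
"from urllib.request import urlopen\n",
"\n",
"from langchain_core.messages import HumanMessage\n",
"\n",
"# Placeholder URL -- substitute any publicly accessible JPEG.\n",
"image_url = \"https://example.com/weather-photo.jpg\"\n",
"image_data = base64.b64encode(urlopen(image_url).read()).decode(\"utf-8\")\n",
"\n",
"message = HumanMessage(\n",
"    content=[\n",
"        {\"type\": \"text\", \"text\": \"Describe the weather in this image.\"},\n",
"        {\n",
"            \"type\": \"image_url\",\n",
"            \"image_url\": {\"url\": f\"data:image/jpeg;base64,{image_data}\"},\n",
"        },\n",
"    ]\n",
")\n",
"llm.invoke([message])"
]
},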
{
"cell_type": "markdown",
"id": "d1ee55bc-ffc8-4cfa-801c-993953a08cfd",
"metadata": {},
"source": [
"## Content blocks\n",
"\n",
"One key difference to note between Anthropic models and most others is that the contents of a single Anthropic AI message can either be a single string or a **list of content blocks**. For example when an Anthropic model invokes a tool, the tool invocation is part of the message content (as well as being exposed in the standardized `AIMessage.tool_calls`):"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "4a374a24-2534-4e6f-825b-30fab7bbe0cb",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"[{'text': \"Okay, let's use the GetWeather tool to check the current temperatures in Los Angeles and New York City.\",\n",
" 'type': 'text'},\n",
" {'id': 'toolu_01Tnp5tL7LJZaVyQXKEjbqcC',\n",
" 'input': {'location': 'Los Angeles, CA'},\n",
" 'name': 'GetWeather',\n",
" 'type': 'tool_use'}]"
]
},
"execution_count": 10,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from langchain_core.pydantic_v1 import BaseModel, Field\n",
"\n",
"\n",
"class GetWeather(BaseModel):\n",
" \"\"\"Get the current weather in a given location\"\"\"\n",
"\n",
" location: str = Field(..., description=\"The city and state, e.g. San Francisco, CA\")\n",
"\n",
"\n",
"llm_with_tools = llm.bind_tools([GetWeather])\n",
"ai_msg = llm_with_tools.invoke(\"Which city is hotter today: LA or NY?\")\n",
"ai_msg.content"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "6b4a1ead-952c-489f-a8d4-355d3fb55f3f",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"[{'name': 'GetWeather',\n",
" 'args': {'location': 'Los Angeles, CA'},\n",
" 'id': 'toolu_01Tnp5tL7LJZaVyQXKEjbqcC'}]"
]
},
"execution_count": 11,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"ai_msg.tool_calls"
]
},
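{
"cell_type": "markdown",
"id": "structured-output-note",
"metadata": {},
"source": [
"## Structured output\n",
"\n",
"The features table also marks [structured output](/docs/how_to/structured_output/) as supported. A brief sketch, reusing the `GetWeather` schema defined above: `with_structured_output` wraps the tool-calling flow shown above and returns a parsed object instead of raw content blocks."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "structured-output-example",
"metadata": {},
"outputs": [],
"source": [
"# Binds GetWeather as a tool and parses the model's tool call back into the schema.\n",
"structured_llm = llm.with_structured_output(GetWeather)\n",
"structured_llm.invoke(\"What's the weather like in Boston, MA?\")"
]
},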
{
"cell_type": "markdown",
"id": "3a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
"metadata": {},
"source": [
"## API reference\n",
"\n",
"For detailed documentation of all ChatAnthropic features and configurations head to the API reference: https://api.python.langchain.com/en/latest/chat_models/langchain_anthropic.chat_models.ChatAnthropic.html"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
}
},
"nbformat": 4,
"nbformat_minor": 5
}