docs: anthropic quickstart (#18440)

Bagatur 2024-03-03 13:59:28 -08:00 committed by GitHub
parent 74f3908182
commit db47b5deee
2 changed files with 64 additions and 6 deletions


@@ -65,10 +65,10 @@ We will link to relevant docs.
## LLM Chain
We'll show how to use models available via API, like OpenAI and Cohere, and local open source models, using integrations like Ollama.
We'll show how to use models available via API, like OpenAI, and local open source models, using integrations like Ollama.
<Tabs>
<TabItem value="openai" label="OpenAI (API)" default>
<TabItem value="openai" label="OpenAI" default>
First we'll need to import the LangChain x OpenAI integration package.
@@ -115,7 +115,36 @@ llm = Ollama(model="llama2")
```
</TabItem>
<TabItem value="cohere" label="Cohere (API)" default>
<TabItem value="anthropic" label="Anthropic">
First we'll need to install the LangChain x Anthropic integration package.
```shell
pip install langchain-anthropic
```
Accessing the API requires an API key, which you can get by creating an account [here](https://claude.ai/login). Once we have a key, we'll want to set it as an environment variable by running:
```shell
export ANTHROPIC_API_KEY="..."
```
We can then initialize the model:
```python
from langchain_anthropic import ChatAnthropic
llm = ChatAnthropic(model="claude-2.1", temperature=0.2, max_tokens=1024)
```
If you'd prefer not to set an environment variable, you can pass the key in directly via the `anthropic_api_key` named parameter when instantiating the `ChatAnthropic` class:
```python
llm = ChatAnthropic(anthropic_api_key="...")
```
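Either way, once the model is initialized you can give it a quick test. Here's a minimal sketch (assuming the API key is configured and reusing the same `claude-2.1` settings as above; the prompt is just an illustrative example) that sends a single prompt and prints the reply:
```python
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(model="claude-2.1", temperature=0.2, max_tokens=1024)

# Chat models accept a plain string via the Runnable interface;
# the result is an AIMessage whose text lives in `.content`.
response = llm.invoke("How can LangSmith help with testing?")
print(response.content)
```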
</TabItem>
<TabItem value="cohere" label="Cohere">
First we'll need to import the Cohere SDK package.


@@ -7,7 +7,7 @@ sidebar_position: 0
The quick start will cover the basics of working with language models. It will introduce the two different types of models - LLMs and ChatModels. It will then cover how to use PromptTemplates to format the inputs to these models, and how to use Output Parsers to work with the outputs. For a deeper conceptual guide into these topics - please see [this documentation](./concepts)
## Models
For this getting started guide, we will provide two options: using OpenAI (a popular model available via API) or using a local open source model.
For this getting started guide, we will provide a few options: using an API like Anthropic or OpenAI, or using a local open source model via Ollama.
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
@@ -62,6 +62,35 @@ from langchain_community.chat_models import ChatOllama
llm = Ollama(model="llama2")
chat_model = ChatOllama()
```
</TabItem>
<TabItem value="anthropic" label="Anthropic (chat model only)">
First we'll need to install the LangChain x Anthropic integration package.
```shell
pip install langchain-anthropic
```
Accessing the API requires an API key, which you can get by creating an account [here](https://claude.ai/login). Once we have a key, we'll want to set it as an environment variable by running:
```shell
export ANTHROPIC_API_KEY="..."
```
We can then initialize the model:
```python
from langchain_anthropic import ChatAnthropic
chat_model = ChatAnthropic(model="claude-2.1", temperature=0.2, max_tokens=1024)
```
If you'd prefer not to set an environment variable, you can pass the key in directly via the `anthropic_api_key` named parameter when instantiating the `ChatAnthropic` class:
```python
chat_model = ChatAnthropic(anthropic_api_key="...")
```
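As a quick check that everything is wired up, here's a minimal sketch (assuming `ANTHROPIC_API_KEY` is set; the message content is just an illustrative example) that passes a list of messages to the chat model:
```python
from langchain_core.messages import HumanMessage
from langchain_anthropic import ChatAnthropic

chat_model = ChatAnthropic(model="claude-2.1", temperature=0.2, max_tokens=1024)

# Chat models take a list of messages and return an AIMessage.
messages = [HumanMessage(content="What would be a good company name for a company that makes colorful socks?")]
print(chat_model.invoke(messages).content)
```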
</TabItem>
@@ -84,7 +113,7 @@ We can then initialize the model:
```python
from langchain_community.chat_models import ChatCohere
llm = ChatCohere()
chat_model = ChatCohere()
```
If you'd prefer not to set an environment variable you can pass the key in directly via the `cohere_api_key` named parameter when initiating the Cohere LLM class:
@@ -92,7 +121,7 @@ If you'd prefer not to set an environment variable you can pass the key in direc
```python
from langchain_community.chat_models import ChatCohere
llm = ChatCohere(cohere_api_key="...")
chat_model = ChatCohere(cohere_api_key="...")
```
</TabItem>