docs: Adds pointers from LLM pages to equivalent chat model pages (#22759)

@baskaryan
This commit is contained in:
Jacob Lee 2024-06-10 14:13:22 -07:00 committed by GitHub
parent 7f180f996b
commit 89804c3026
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
10 changed files with 102 additions and 54 deletions


@@ -3,10 +3,15 @@
  {
   "cell_type": "raw",
   "id": "602a52a4",
-  "metadata": {},
+  "metadata": {
+   "vscode": {
+    "languageId": "raw"
+   }
+  },
   "source": [
    "---\n",
    "sidebar_label: Anthropic\n",
+   "sidebar_class_name: hidden\n",
    "---"
   ]
  },
@@ -17,9 +22,13 @@
   "source": [
    "# AnthropicLLM\n",
    "\n",
-   "This example goes over how to use LangChain to interact with `Anthropic` models.\n",
-   "\n",
-   "NOTE: AnthropicLLM only supports legacy Claude 2 models. To use the newest Claude 3 models, please use [`ChatAnthropic`](/docs/integrations/chat/anthropic) instead.\n",
+   ":::caution\n",
+   "You are currently on a page documenting the use of Anthropic legacy Claude 2 models as [text completion models](/docs/concepts/#llms). The latest and most popular Anthropic models are [chat completion models](/docs/concepts/#chat-models).\n",
+   "\n",
+   "You are probably looking for [this page instead](/docs/integrations/chat/anthropic/).\n",
+   ":::\n",
+   "\n",
+   "This example goes over how to use LangChain to interact with `Anthropic` models.\n",
    "\n",
    "## Installation"
   ]
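The distinction the added callout draws — legacy text completion versus chat completion — comes down to the shape of the input. A pure-Python illustration, independent of any SDK (the `Human:`/`Assistant:` turn markers follow the legacy Claude 2 completion convention; treat the exact format as an assumption):

```python
def legacy_completion_prompt(user_text: str) -> str:
    # A text completion model takes one flat string and returns a raw
    # continuation, so conversation structure must be encoded in the prompt.
    return f"\n\nHuman: {user_text}\n\nAssistant:"

# A chat completion model instead takes structured role/content messages.
chat_messages = [{"role": "user", "content": "Why is the sky blue?"}]

print(legacy_completion_prompt("Why is the sky blue?"))
```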


@@ -11,6 +11,12 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
+   ":::caution\n",
+   "You are currently on a page documenting the use of Amazon Bedrock models as [text completion models](/docs/concepts/#llms). Many popular models available on Bedrock are [chat completion models](/docs/concepts/#chat-models).\n",
+   "\n",
+   "You may be looking for [this page instead](/docs/integrations/chat/bedrock/).\n",
+   ":::\n",
+   "\n",
    ">[Amazon Bedrock](https://aws.amazon.com/bedrock/) is a fully managed service that offers a choice of \n",
    "> high-performing foundation models (FMs) from leading AI companies like `AI21 Labs`, `Anthropic`, `Cohere`, \n",
    "> `Meta`, `Stability AI`, and `Amazon` via a single API, along with a broad set of capabilities you need to \n",
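Because Bedrock fronts many providers behind one API, which page applies depends on the underlying model. Bedrock model IDs are provider-namespaced, so a hypothetical helper like this (not part of any library) can tell you whose model you are holding:

```python
def bedrock_provider(model_id: str) -> str:
    # Bedrock model IDs take the form "<provider>.<model>[-<version>]",
    # e.g. "anthropic.claude-v2" or "meta.llama2-13b-chat-v1".
    return model_id.split(".", 1)[0]

print(bedrock_provider("anthropic.claude-v2"))  # anthropic
```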


@@ -7,6 +7,12 @@
   "source": [
    "# Cohere\n",
    "\n",
+   ":::caution\n",
+   "You are currently on a page documenting the use of Cohere models as [text completion models](/docs/concepts/#llms). Many popular Cohere models are [chat completion models](/docs/concepts/#chat-models).\n",
+   "\n",
+   "You may be looking for [this page instead](/docs/integrations/chat/cohere/).\n",
+   ":::\n",
+   "\n",
    ">[Cohere](https://cohere.ai/about) is a Canadian startup that provides natural language processing models that help companies improve human-machine interactions.\n",
    "\n",
    "Head to the [API reference](https://api.python.langchain.com/en/latest/llms/langchain_community.llms.cohere.Cohere.html) for detailed documentation of all attributes and methods."


@@ -7,6 +7,12 @@
   "source": [
    "# Fireworks\n",
    "\n",
+   ":::caution\n",
+   "You are currently on a page documenting the use of Fireworks models as [text completion models](/docs/concepts/#llms). Many popular Fireworks models are [chat completion models](/docs/concepts/#chat-models).\n",
+   "\n",
+   "You may be looking for [this page instead](/docs/integrations/chat/fireworks/).\n",
+   ":::\n",
+   "\n",
    ">[Fireworks](https://app.fireworks.ai/) accelerates product development on generative AI by creating an innovative AI experiment and production platform. \n",
    "\n",
    "This example goes over how to use LangChain to interact with `Fireworks` models."


@@ -25,6 +25,12 @@
   "id": "bead5ede-d9cc-44b9-b062-99c90a10cf40",
   "metadata": {},
   "source": [
+   ":::caution\n",
+   "You are currently on a page documenting the use of Google models as [text completion models](/docs/concepts/#llms). Many popular Google models are [chat completion models](/docs/concepts/#chat-models).\n",
+   "\n",
+   "You may be looking for [this page instead](/docs/integrations/chat/google_generative_ai/).\n",
+   ":::\n",
+   "\n",
    "A guide on using [Google Generative AI](https://developers.generativeai.google/) models with Langchain. Note: It's separate from Google Cloud Vertex AI [integration](/docs/integrations/llms/google_vertex_ai_palm)."
   ]
  },


@@ -15,6 +15,12 @@
   "source": [
    "# Google Cloud Vertex AI\n",
    "\n",
+   ":::caution\n",
+   "You are currently on a page documenting the use of Google Vertex [text completion models](/docs/concepts/#llms). Many Google models are [chat completion models](/docs/concepts/#chat-models).\n",
+   "\n",
+   "You may be looking for [this page instead](/docs/integrations/chat/google_vertex_ai_palm/).\n",
+   ":::\n",
+   "\n",
    "**Note:** This is separate from the `Google Generative AI` integration, it exposes [Vertex AI Generative API](https://cloud.google.com/vertex-ai/docs/generative-ai/learn/overview) on `Google Cloud`.\n",
    "\n",
    "VertexAI exposes all foundational models available in google cloud:\n",


@@ -6,6 +6,12 @@
   "source": [
    "# Ollama\n",
    "\n",
+   ":::caution\n",
+   "You are currently on a page documenting the use of Ollama models as [text completion models](/docs/concepts/#llms). Many popular Ollama models are [chat completion models](/docs/concepts/#chat-models).\n",
+   "\n",
+   "You may be looking for [this page instead](/docs/integrations/chat/ollama/).\n",
+   ":::\n",
+   "\n",
    "[Ollama](https://ollama.ai/) allows you to run open-source large language models, such as Llama 2, locally.\n",
    "\n",
    "Ollama bundles model weights, configuration, and data into a single package, defined by a Modelfile. \n",
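The completion-versus-chat split exists at Ollama's own HTTP API as well: `/api/generate` takes a flat prompt while `/api/chat` takes role-tagged messages (endpoint and field names as documented by Ollama; verify against your installed version). A sketch of the two request shapes:

```python
import json

# Only the JSON shapes are built here; no request is sent.
generate_payload = {
    "model": "llama2",
    "prompt": "Why is the sky blue?",  # completion: one flat prompt string
    "stream": False,
}
chat_payload = {
    "model": "llama2",
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
    "stream": False,
}

print(json.dumps(chat_payload, indent=2))
```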


@@ -7,6 +7,12 @@
   "source": [
    "# Together AI\n",
    "\n",
+   ":::caution\n",
+   "You are currently on a page documenting the use of Together AI models as [text completion models](/docs/concepts/#llms). Many popular Together AI models are [chat completion models](/docs/concepts/#chat-models).\n",
+   "\n",
+   "You may be looking for [this page instead](/docs/integrations/chat/together/).\n",
+   ":::\n",
+   "\n",
    "[Together AI](https://www.together.ai/) offers an API to query [50+ leading open-source models](https://docs.together.ai/docs/inference-models) in a couple lines of code.\n",
    "\n",
    "This example goes over how to use LangChain to interact with Together AI models."


@@ -14,6 +14,18 @@ pip install -U langchain-anthropic
 
 You need to set the `ANTHROPIC_API_KEY` environment variable.
 You can get an Anthropic API key [here](https://console.anthropic.com/settings/keys)
+
+## Chat Models
+
+### ChatAnthropic
+
+See a [usage example](/docs/integrations/chat/anthropic).
+
+```python
+from langchain_anthropic import ChatAnthropic
+
+model = ChatAnthropic(model='claude-3-opus-20240229')
+```
 
 ## LLMs
 
 ### [Legacy] AnthropicLLM
 
@@ -28,17 +40,3 @@ from langchain_anthropic import AnthropicLLM
 
 model = AnthropicLLM(model='claude-2.1')
 ```
-
-## Chat Models
-
-### ChatAnthropic
-
-See a [usage example](/docs/integrations/chat/anthropic).
-
-```python
-from langchain_anthropic import ChatAnthropic
-
-model = ChatAnthropic(model='claude-3-opus-20240229')
-```
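The rule of thumb the reordered sections encode — current Claude 3 models via `ChatAnthropic`, legacy Claude 2 via `AnthropicLLM` — reduces to a model-name check. A hypothetical helper (not part of `langchain-anthropic`) makes the mapping explicit:

```python
def anthropic_class_for(model: str) -> str:
    # Claude 2.x models are legacy text completion models; Claude 3 and
    # later are chat models, per the guidance above.
    if model.startswith("claude-2"):
        return "AnthropicLLM"
    return "ChatAnthropic"

print(anthropic_class_for("claude-2.1"))              # AnthropicLLM
print(anthropic_class_for("claude-3-opus-20240229"))  # ChatAnthropic
```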


@@ -2,45 +2,10 @@
 
 All functionality related to [Google Cloud Platform](https://cloud.google.com/) and other `Google` products.
 
-## LLMs
+## Chat models
 
 We recommend individual developers to start with Gemini API (`langchain-google-genai`) and move to Vertex AI (`langchain-google-vertexai`) when they need access to commercial support and higher rate limits. If you're already Cloud-friendly or Cloud-native, then you can get started in Vertex AI straight away.
-Please, find more information [here](https://ai.google.dev/gemini-api/docs/migrate-to-cloud).
-
-### Google Generative AI
-
-Access GoogleAI `Gemini` models such as `gemini-pro` and `gemini-pro-vision` through the `GoogleGenerativeAI` class.
-
-Install python package.
-
-```bash
-pip install langchain-google-genai
-```
-
-See a [usage example](/docs/integrations/llms/google_ai).
-
-```python
-from langchain_google_genai import GoogleGenerativeAI
-```
-
-### Vertex AI Model Garden
-
-Access `PaLM` and hundreds of OSS models via `Vertex AI Model Garden` service.
-
-We need to install `langchain-google-vertexai` python package.
-
-```bash
-pip install langchain-google-vertexai
-```
-
-See a [usage example](/docs/integrations/llms/google_vertex_ai_palm#vertex-model-garden).
-
-```python
-from langchain_google_vertexai import VertexAIModelGarden
-```
-
-## Chat models
+Please see [here](https://ai.google.dev/gemini-api/docs/migrate-to-cloud) for more information.
 
 ### Google Generative AI
@@ -107,6 +72,40 @@ See a [usage example](/docs/integrations/chat/google_vertex_ai_palm).
 
 from langchain_google_vertexai import ChatVertexAI
 ```
+
+## LLMs
+
+### Google Generative AI
+
+Access GoogleAI `Gemini` models such as `gemini-pro` and `gemini-pro-vision` through the `GoogleGenerativeAI` class.
+
+Install python package.
+
+```bash
+pip install langchain-google-genai
+```
+
+See a [usage example](/docs/integrations/llms/google_ai).
+
+```python
+from langchain_google_genai import GoogleGenerativeAI
+```
+
+### Vertex AI Model Garden
+
+Access `PaLM` and hundreds of OSS models via `Vertex AI Model Garden` service.
+
+We need to install `langchain-google-vertexai` python package.
+
+```bash
+pip install langchain-google-vertexai
+```
+
+See a [usage example](/docs/integrations/llms/google_vertex_ai_palm#vertex-model-garden).
+
+```python
+from langchain_google_vertexai import VertexAIModelGarden
+```
+
 ## Embedding models
 
 ### Google Generative AI Embeddings
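The Gemini-API-first recommendation on this page amounts to a simple decision rule. A hypothetical sketch of it (package names taken from this page; the boolean condition paraphrases the recommendation, it is not an official API):

```python
def google_package(needs_commercial_support_or_higher_limits: bool) -> str:
    # Per the recommendation: start with the Gemini API, and move to
    # Vertex AI when you need commercial support and higher rate limits.
    if needs_commercial_support_or_higher_limits:
        return "langchain-google-vertexai"
    return "langchain-google-genai"

print(google_package(False))  # langchain-google-genai
```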