docs: Adds pointers from LLM pages to equivalent chat model pages (#22759)
@baskaryan
This commit is contained in:
parent
7f180f996b
commit
89804c3026
@@ -3,10 +3,15 @@
{
"cell_type": "raw",
"id": "602a52a4",
"metadata": {},
"metadata": {
"vscode": {
"languageId": "raw"
}
},
"source": [
"---\n",
"sidebar_label: Anthropic\n",
"sidebar_class_name: hidden\n",
"---"
]
},
@@ -17,9 +22,13 @@
"source": [
"# AnthropicLLM\n",
"\n",
"This example goes over how to use LangChain to interact with `Anthropic` models.\n",
":::caution\n",
"You are currently on a page documenting the use of Anthropic legacy Claude 2 models as [text completion models](/docs/concepts/#llms). The latest and most popular Anthropic models are [chat completion models](/docs/concepts/#chat-models).\n",
"\n",
"NOTE: AnthropicLLM only supports legacy Claude 2 models. To use the newest Claude 3 models, please use [`ChatAnthropic`](/docs/integrations/chat/anthropic) instead.\n",
"You are probably looking for [this page instead](/docs/integrations/chat/anthropic/).\n",
":::\n",
"\n",
"This example goes over how to use LangChain to interact with `Anthropic` models.\n",
"\n",
"## Installation"
]
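The caution added above distinguishes the legacy text-completion interface from the chat interface. A minimal sketch of the difference, using the model names from the pages being edited and assuming an `ANTHROPIC_API_KEY` is set:

```python
from langchain_anthropic import AnthropicLLM, ChatAnthropic

# Legacy text-completion interface: Claude 2 models only, returns a plain string.
legacy_llm = AnthropicLLM(model="claude-2.1")
print(legacy_llm.invoke("Tell me a joke about bears"))

# Chat-completion interface: supports the Claude 3 family, returns an AIMessage.
chat_model = ChatAnthropic(model="claude-3-opus-20240229")
print(chat_model.invoke("Tell me a joke about bears").content)
```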
@@ -11,6 +11,12 @@
"cell_type": "markdown",
"metadata": {},
"source": [
":::caution\n",
"You are currently on a page documenting the use of Amazon Bedrock models as [text completion models](/docs/concepts/#llms). Many popular models available on Bedrock are [chat completion models](/docs/concepts/#chat-models).\n",
"\n",
"You may be looking for [this page instead](/docs/integrations/chat/bedrock/).\n",
":::\n",
"\n",
">[Amazon Bedrock](https://aws.amazon.com/bedrock/) is a fully managed service that offers a choice of \n",
"> high-performing foundation models (FMs) from leading AI companies like `AI21 Labs`, `Anthropic`, `Cohere`, \n",
"> `Meta`, `Stability AI`, and `Amazon` via a single API, along with a broad set of capabilities you need to \n",
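The same text-vs-chat distinction applies to the Bedrock page. A rough sketch, assuming the `langchain-aws` package and configured AWS credentials; the model IDs are illustrative examples, not taken from this diff:

```python
from langchain_aws import BedrockLLM, ChatBedrock

# Text-completion interface for Bedrock models (this page).
llm = BedrockLLM(model_id="anthropic.claude-v2")
print(llm.invoke("Summarize what Amazon Bedrock is in one sentence."))

# Chat-completion interface, the one the new caution points readers to.
chat = ChatBedrock(model_id="anthropic.claude-3-sonnet-20240229-v1:0")
print(chat.invoke("Summarize what Amazon Bedrock is in one sentence.").content)
```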
@@ -7,6 +7,12 @@
"source": [
"# Cohere\n",
"\n",
":::caution\n",
"You are currently on a page documenting the use of Cohere models as [text completion models](/docs/concepts/#llms). Many popular Cohere models are [chat completion models](/docs/concepts/#chat-models).\n",
"\n",
"You may be looking for [this page instead](/docs/integrations/chat/cohere/).\n",
":::\n",
"\n",
">[Cohere](https://cohere.ai/about) is a Canadian startup that provides natural language processing models that help companies improve human-machine interactions.\n",
"\n",
"Head to the [API reference](https://api.python.langchain.com/en/latest/llms/langchain_community.llms.cohere.Cohere.html) for detailed documentation of all attributes and methods."
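For Cohere, the equivalent pair looks roughly like this. A sketch assuming a `COHERE_API_KEY` is set; the `langchain_cohere` package and the model names are assumptions, only the `langchain_community.llms` class is referenced by the page itself:

```python
from langchain_community.llms import Cohere   # text-completion interface (this page)
from langchain_cohere import ChatCohere       # chat-completion interface (linked page)

llm = Cohere(model="command")
chat = ChatCohere(model="command-r")

print(llm.invoke("Explain retrieval-augmented generation in one sentence."))
print(chat.invoke("Explain retrieval-augmented generation in one sentence.").content)
```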
@@ -7,6 +7,12 @@
"source": [
"# Fireworks\n",
"\n",
":::caution\n",
"You are currently on a page documenting the use of Fireworks models as [text completion models](/docs/concepts/#llms). Many popular Fireworks models are [chat completion models](/docs/concepts/#chat-models).\n",
"\n",
"You may be looking for [this page instead](/docs/integrations/chat/fireworks/).\n",
":::\n",
"\n",
">[Fireworks](https://app.fireworks.ai/) accelerates product development on generative AI by creating an innovative AI experiment and production platform. \n",
"\n",
"This example goes over how to use LangChain to interact with `Fireworks` models."
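A comparable sketch for Fireworks, assuming the `langchain-fireworks` package and a `FIREWORKS_API_KEY`; the model path is an illustrative example, not taken from this diff:

```python
from langchain_fireworks import Fireworks, ChatFireworks

# Text-completion interface (this page) vs. chat-completion interface (linked page).
llm = Fireworks(model="accounts/fireworks/models/mixtral-8x7b-instruct")
chat = ChatFireworks(model="accounts/fireworks/models/mixtral-8x7b-instruct")

print(llm.invoke("Name three open-source language models."))
print(chat.invoke("Name three open-source language models.").content)
```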
@@ -25,6 +25,12 @@
"id": "bead5ede-d9cc-44b9-b062-99c90a10cf40",
"metadata": {},
"source": [
":::caution\n",
"You are currently on a page documenting the use of Google models as [text completion models](/docs/concepts/#llms). Many popular Google models are [chat completion models](/docs/concepts/#chat-models).\n",
"\n",
"You may be looking for [this page instead](/docs/integrations/chat/google_generative_ai/).\n",
":::\n",
"\n",
"A guide on using [Google Generative AI](https://developers.generativeai.google/) models with Langchain. Note: It's separate from Google Cloud Vertex AI [integration](/docs/integrations/llms/google_vertex_ai_palm)."
]
},
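For the Gemini API page, the corresponding pair is roughly as follows; a sketch assuming the `langchain-google-genai` package and a `GOOGLE_API_KEY`:

```python
from langchain_google_genai import GoogleGenerativeAI, ChatGoogleGenerativeAI

llm = GoogleGenerativeAI(model="gemini-pro")        # text-completion interface (this page)
chat = ChatGoogleGenerativeAI(model="gemini-pro")   # chat-completion interface (linked page)

print(llm.invoke("What is the capital of France?"))
print(chat.invoke("What is the capital of France?").content)
```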
@@ -15,6 +15,12 @@
"source": [
"# Google Cloud Vertex AI\n",
"\n",
":::caution\n",
"You are currently on a page documenting the use of Google Vertex [text completion models](/docs/concepts/#llms). Many Google models are [chat completion models](/docs/concepts/#chat-models).\n",
"\n",
"You may be looking for [this page instead](/docs/integrations/chat/google_vertex_ai_palm/).\n",
":::\n",
"\n",
"**Note:** This is separate from the `Google Generative AI` integration, it exposes [Vertex AI Generative API](https://cloud.google.com/vertex-ai/docs/generative-ai/learn/overview) on `Google Cloud`.\n",
"\n",
"VertexAI exposes all foundational models available in google cloud:\n",
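For Vertex AI, a minimal sketch assuming the `langchain-google-vertexai` package and Google Cloud application-default credentials; the model name is an illustrative example:

```python
from langchain_google_vertexai import VertexAI, ChatVertexAI

llm = VertexAI(model_name="gemini-pro")        # text-completion interface (this page)
chat = ChatVertexAI(model_name="gemini-pro")   # chat-completion interface (linked page)

print(llm.invoke("List three Vertex AI foundation models."))
print(chat.invoke("List three Vertex AI foundation models.").content)
```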
@@ -6,6 +6,12 @@
"source": [
"# Ollama\n",
"\n",
":::caution\n",
"You are currently on a page documenting the use of Ollama models as [text completion models](/docs/concepts/#llms). Many popular Ollama models are [chat completion models](/docs/concepts/#chat-models).\n",
"\n",
"You may be looking for [this page instead](/docs/integrations/chat/ollama/).\n",
":::\n",
"\n",
"[Ollama](https://ollama.ai/) allows you to run open-source large language models, such as Llama 2, locally.\n",
"\n",
"Ollama bundles model weights, configuration, and data into a single package, defined by a Modelfile. \n",
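For Ollama, the two interfaces look roughly like this; a sketch that assumes a local Ollama server with the `llama2` model pulled, with class locations reflecting the `langchain_community` package of this era:

```python
from langchain_community.llms import Ollama              # text-completion interface (this page)
from langchain_community.chat_models import ChatOllama   # chat-completion interface (linked page)

llm = Ollama(model="llama2")
chat = ChatOllama(model="llama2")

print(llm.invoke("Why is the sky blue?"))
print(chat.invoke("Why is the sky blue?").content)
```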
@@ -7,6 +7,12 @@
"source": [
"# Together AI\n",
"\n",
":::caution\n",
"You are currently on a page documenting the use of Together AI models as [text completion models](/docs/concepts/#llms). Many popular Together AI models are [chat completion models](/docs/concepts/#chat-models).\n",
"\n",
"You may be looking for [this page instead](/docs/integrations/chat/together/).\n",
":::\n",
"\n",
"[Together AI](https://www.together.ai/) offers an API to query [50+ leading open-source models](https://docs.together.ai/docs/inference-models) in a couple lines of code.\n",
"\n",
"This example goes over how to use LangChain to interact with Together AI models."
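And for Together AI, a sketch assuming the `langchain-together` package and a `TOGETHER_API_KEY`; the class pairing and model identifier are assumptions, not shown in this diff:

```python
from langchain_together import Together, ChatTogether

llm = Together(model="mistralai/Mixtral-8x7B-Instruct-v0.1")       # text completion (this page)
chat = ChatTogether(model="mistralai/Mixtral-8x7B-Instruct-v0.1")  # chat completion (linked page)

print(llm.invoke("Name two open-source language models."))
print(chat.invoke("Name two open-source language models.").content)
```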
@@ -14,6 +14,18 @@ pip install -U langchain-anthropic
You need to set the `ANTHROPIC_API_KEY` environment variable.
You can get an Anthropic API key [here](https://console.anthropic.com/settings/keys)

## Chat Models

### ChatAnthropic

See a [usage example](/docs/integrations/chat/anthropic).

```python
from langchain_anthropic import ChatAnthropic

model = ChatAnthropic(model='claude-3-opus-20240229')
```

## LLMs

### [Legacy] AnthropicLLM
@@ -28,17 +40,3 @@ from langchain_anthropic import AnthropicLLM

model = AnthropicLLM(model='claude-2.1')
```

## Chat Models

### ChatAnthropic

See a [usage example](/docs/integrations/chat/anthropic).

```python
from langchain_anthropic import ChatAnthropic

model = ChatAnthropic(model='claude-3-opus-20240229')
```
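Since the page requires the `ANTHROPIC_API_KEY` environment variable, a minimal sketch of the usual setup pattern; the getpass prompt is just one option, not something this diff adds:

```python
import getpass
import os

# Prompt for the key only if it is not already exported in the environment.
if "ANTHROPIC_API_KEY" not in os.environ:
    os.environ["ANTHROPIC_API_KEY"] = getpass.getpass("Anthropic API key: ")
```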
@@ -2,45 +2,10 @@

All functionality related to [Google Cloud Platform](https://cloud.google.com/) and other `Google` products.

## LLMs
## Chat models

We recommend individual developers to start with Gemini API (`langchain-google-genai`) and move to Vertex AI (`langchain-google-vertexai`) when they need access to commercial support and higher rate limits. If you’re already Cloud-friendly or Cloud-native, then you can get started in Vertex AI straight away.
Please, find more information [here](https://ai.google.dev/gemini-api/docs/migrate-to-cloud).

### Google Generative AI

Access GoogleAI `Gemini` models such as `gemini-pro` and `gemini-pro-vision` through the `GoogleGenerativeAI` class.

Install python package.

```bash
pip install langchain-google-genai
```

See a [usage example](/docs/integrations/llms/google_ai).

```python
from langchain_google_genai import GoogleGenerativeAI
```

### Vertex AI Model Garden

Access `PaLM` and hundreds of OSS models via `Vertex AI Model Garden` service.

We need to install `langchain-google-vertexai` python package.

```bash
pip install langchain-google-vertexai
```

See a [usage example](/docs/integrations/llms/google_vertex_ai_palm#vertex-model-garden).

```python
from langchain_google_vertexai import VertexAIModelGarden
```


## Chat models
Please see [here](https://ai.google.dev/gemini-api/docs/migrate-to-cloud) for more information.

### Google Generative AI

@@ -107,6 +72,40 @@ See a [usage example](/docs/integrations/chat/google_vertex_ai_palm).
from langchain_google_vertexai import ChatVertexAI
```

## LLMs

### Google Generative AI

Access GoogleAI `Gemini` models such as `gemini-pro` and `gemini-pro-vision` through the `GoogleGenerativeAI` class.

Install python package.

```bash
pip install langchain-google-genai
```

See a [usage example](/docs/integrations/llms/google_ai).

```python
from langchain_google_genai import GoogleGenerativeAI
```

### Vertex AI Model Garden

Access `PaLM` and hundreds of OSS models via `Vertex AI Model Garden` service.

We need to install `langchain-google-vertexai` python package.

```bash
pip install langchain-google-vertexai
```

See a [usage example](/docs/integrations/llms/google_vertex_ai_palm#vertex-model-garden).

```python
from langchain_google_vertexai import VertexAIModelGarden
```

## Embedding models

### Google Generative AI Embeddings
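As a follow-up to the `VertexAIModelGarden` import shown above, a hypothetical usage sketch; the `project` and `endpoint_id` values are placeholders for a model deployed from the Model Garden, not identifiers from this commit:

```python
from langchain_google_vertexai import VertexAIModelGarden

# Point the wrapper at a Model Garden deployment; both values are placeholders.
llm = VertexAIModelGarden(project="my-gcp-project", endpoint_id="1234567890")
print(llm.invoke("What is the meaning of life?"))
```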