diff --git a/docs/docs/integrations/llms/anthropic.ipynb b/docs/docs/integrations/llms/anthropic.ipynb
index 36c3fe4b80f..77c359e8712 100644
--- a/docs/docs/integrations/llms/anthropic.ipynb
+++ b/docs/docs/integrations/llms/anthropic.ipynb
@@ -3,10 +3,15 @@
   {
    "cell_type": "raw",
    "id": "602a52a4",
-   "metadata": {},
+   "metadata": {
+    "vscode": {
+     "languageId": "raw"
+    }
+   },
    "source": [
     "---\n",
     "sidebar_label: Anthropic\n",
+    "sidebar_class_name: hidden\n",
     "---"
    ]
   },
@@ -17,9 +22,13 @@
    "source": [
     "# AnthropicLLM\n",
     "\n",
-    "This example goes over how to use LangChain to interact with `Anthropic` models.\n",
+    ":::caution\n",
+    "You are currently on a page documenting the use of Anthropic legacy Claude 2 models as [text completion models](/docs/concepts/#llms). The latest and most popular Anthropic models are [chat completion models](/docs/concepts/#chat-models).\n",
     "\n",
-    "NOTE: AnthropicLLM only supports legacy Claude 2 models. To use the newest Claude 3 models, please use [`ChatAnthropic`](/docs/integrations/chat/anthropic) instead.\n",
+    "You are probably looking for [this page instead](/docs/integrations/chat/anthropic/).\n",
+    ":::\n",
+    "\n",
+    "This example goes over how to use LangChain to interact with `Anthropic` models.\n",
     "\n",
     "## Installation"
    ]
diff --git a/docs/docs/integrations/llms/bedrock.ipynb b/docs/docs/integrations/llms/bedrock.ipynb
index 2da8a3ff9e7..d751311452b 100644
--- a/docs/docs/integrations/llms/bedrock.ipynb
+++ b/docs/docs/integrations/llms/bedrock.ipynb
@@ -11,6 +11,12 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
+    ":::caution\n",
+    "You are currently on a page documenting the use of Amazon Bedrock models as [text completion models](/docs/concepts/#llms). Many popular models available on Bedrock are [chat completion models](/docs/concepts/#chat-models).\n",
+    "\n",
+    "You may be looking for [this page instead](/docs/integrations/chat/bedrock/).\n",
+    ":::\n",
+    "\n",
     ">[Amazon Bedrock](https://aws.amazon.com/bedrock/) is a fully managed service that offers a choice of \n",
     "> high-performing foundation models (FMs) from leading AI companies like `AI21 Labs`, `Anthropic`, `Cohere`, \n",
     "> `Meta`, `Stability AI`, and `Amazon` via a single API, along with a broad set of capabilities you need to \n",
diff --git a/docs/docs/integrations/llms/cohere.ipynb b/docs/docs/integrations/llms/cohere.ipynb
index b5592040c92..2199ac99519 100644
--- a/docs/docs/integrations/llms/cohere.ipynb
+++ b/docs/docs/integrations/llms/cohere.ipynb
@@ -7,6 +7,12 @@
    "source": [
     "# Cohere\n",
     "\n",
+    ":::caution\n",
+    "You are currently on a page documenting the use of Cohere models as [text completion models](/docs/concepts/#llms). Many popular Cohere models are [chat completion models](/docs/concepts/#chat-models).\n",
+    "\n",
+    "You may be looking for [this page instead](/docs/integrations/chat/cohere/).\n",
+    ":::\n",
+    "\n",
     ">[Cohere](https://cohere.ai/about) is a Canadian startup that provides natural language processing models that help companies improve human-machine interactions.\n",
     "\n",
     "Head to the [API reference](https://api.python.langchain.com/en/latest/llms/langchain_community.llms.cohere.Cohere.html) for detailed documentation of all attributes and methods."
diff --git a/docs/docs/integrations/llms/fireworks.ipynb b/docs/docs/integrations/llms/fireworks.ipynb
index fc84b59a838..532ab6874ca 100644
--- a/docs/docs/integrations/llms/fireworks.ipynb
+++ b/docs/docs/integrations/llms/fireworks.ipynb
@@ -7,6 +7,12 @@
    "source": [
     "# Fireworks\n",
     "\n",
+    ":::caution\n",
+    "You are currently on a page documenting the use of Fireworks models as [text completion models](/docs/concepts/#llms). Many popular Fireworks models are [chat completion models](/docs/concepts/#chat-models).\n",
+    "\n",
+    "You may be looking for [this page instead](/docs/integrations/chat/fireworks/).\n",
+    ":::\n",
+    "\n",
     ">[Fireworks](https://app.fireworks.ai/) accelerates product development on generative AI by creating an innovative AI experiment and production platform. \n",
     "\n",
     "This example goes over how to use LangChain to interact with `Fireworks` models."
diff --git a/docs/docs/integrations/llms/google_ai.ipynb b/docs/docs/integrations/llms/google_ai.ipynb
index c248d6f5fbe..a3ebe9dfec3 100644
--- a/docs/docs/integrations/llms/google_ai.ipynb
+++ b/docs/docs/integrations/llms/google_ai.ipynb
@@ -25,6 +25,12 @@
    "id": "bead5ede-d9cc-44b9-b062-99c90a10cf40",
    "metadata": {},
    "source": [
+    ":::caution\n",
+    "You are currently on a page documenting the use of Google models as [text completion models](/docs/concepts/#llms). Many popular Google models are [chat completion models](/docs/concepts/#chat-models).\n",
+    "\n",
+    "You may be looking for [this page instead](/docs/integrations/chat/google_generative_ai/).\n",
+    ":::\n",
+    "\n",
     "A guide on using [Google Generative AI](https://developers.generativeai.google/) models with Langchain. Note: It's separate from Google Cloud Vertex AI [integration](/docs/integrations/llms/google_vertex_ai_palm)."
    ]
   },
diff --git a/docs/docs/integrations/llms/google_vertex_ai_palm.ipynb b/docs/docs/integrations/llms/google_vertex_ai_palm.ipynb
index c08be084e10..d55b4514798 100644
--- a/docs/docs/integrations/llms/google_vertex_ai_palm.ipynb
+++ b/docs/docs/integrations/llms/google_vertex_ai_palm.ipynb
@@ -15,6 +15,12 @@
    "source": [
     "# Google Cloud Vertex AI\n",
     "\n",
+    ":::caution\n",
+    "You are currently on a page documenting the use of Google Vertex [text completion models](/docs/concepts/#llms). Many Google models are [chat completion models](/docs/concepts/#chat-models).\n",
+    "\n",
+    "You may be looking for [this page instead](/docs/integrations/chat/google_vertex_ai_palm/).\n",
+    ":::\n",
+    "\n",
     "**Note:** This is separate from the `Google Generative AI` integration, it exposes [Vertex AI Generative API](https://cloud.google.com/vertex-ai/docs/generative-ai/learn/overview) on `Google Cloud`.\n",
     "\n",
     "VertexAI exposes all foundational models available in google cloud:\n",
diff --git a/docs/docs/integrations/llms/ollama.ipynb b/docs/docs/integrations/llms/ollama.ipynb
index e80ce6e4b77..68a0845376b 100644
--- a/docs/docs/integrations/llms/ollama.ipynb
+++ b/docs/docs/integrations/llms/ollama.ipynb
@@ -6,6 +6,12 @@
    "source": [
     "# Ollama\n",
     "\n",
+    ":::caution\n",
+    "You are currently on a page documenting the use of Ollama models as [text completion models](/docs/concepts/#llms). Many popular Ollama models are [chat completion models](/docs/concepts/#chat-models).\n",
+    "\n",
+    "You may be looking for [this page instead](/docs/integrations/chat/ollama/).\n",
+    ":::\n",
+    "\n",
     "[Ollama](https://ollama.ai/) allows you to run open-source large language models, such as Llama 2, locally.\n",
     "\n",
     "Ollama bundles model weights, configuration, and data into a single package, defined by a Modelfile. \n",
diff --git a/docs/docs/integrations/llms/together.ipynb b/docs/docs/integrations/llms/together.ipynb
index 4a3b07e57d1..19c306baa85 100644
--- a/docs/docs/integrations/llms/together.ipynb
+++ b/docs/docs/integrations/llms/together.ipynb
@@ -7,6 +7,12 @@
    "source": [
     "# Together AI\n",
     "\n",
+    ":::caution\n",
+    "You are currently on a page documenting the use of Together AI models as [text completion models](/docs/concepts/#llms). Many popular Together AI models are [chat completion models](/docs/concepts/#chat-models).\n",
+    "\n",
+    "You may be looking for [this page instead](/docs/integrations/chat/together/).\n",
+    ":::\n",
+    "\n",
     "[Together AI](https://www.together.ai/) offers an API to query [50+ leading open-source models](https://docs.together.ai/docs/inference-models) in a couple lines of code.\n",
     "\n",
     "This example goes over how to use LangChain to interact with Together AI models."
diff --git a/docs/docs/integrations/platforms/anthropic.mdx b/docs/docs/integrations/platforms/anthropic.mdx
index 456b158d133..bc38f55065c 100644
--- a/docs/docs/integrations/platforms/anthropic.mdx
+++ b/docs/docs/integrations/platforms/anthropic.mdx
@@ -14,6 +14,18 @@ pip install -U langchain-anthropic
 You need to set the `ANTHROPIC_API_KEY` environment variable.
 You can get an Anthropic API key [here](https://console.anthropic.com/settings/keys)
 
+## Chat Models
+
+### ChatAnthropic
+
+See a [usage example](/docs/integrations/chat/anthropic).
+
+```python
+from langchain_anthropic import ChatAnthropic
+
+model = ChatAnthropic(model='claude-3-opus-20240229')
+```
+
 ## LLMs
 
 ### [Legacy] AnthropicLLM
@@ -28,17 +40,3 @@ from langchain_anthropic import AnthropicLLM
 
 model = AnthropicLLM(model='claude-2.1')
 ```
-
-## Chat Models
-
-### ChatAnthropic
-
-See a [usage example](/docs/integrations/chat/anthropic).
-
-```python
-from langchain_anthropic import ChatAnthropic
-
-model = ChatAnthropic(model='claude-3-opus-20240229')
-```
-
-
diff --git a/docs/docs/integrations/platforms/google.mdx b/docs/docs/integrations/platforms/google.mdx
index b8936c064f0..bdb8f61e3d4 100644
--- a/docs/docs/integrations/platforms/google.mdx
+++ b/docs/docs/integrations/platforms/google.mdx
@@ -2,45 +2,10 @@ All functionality related to [Google Cloud Platform](https://cloud.google.com/) and other `Google` products.
 
-## LLMs
+## Chat models
 
 We recommend individual developers to start with Gemini API (`langchain-google-genai`) and move to Vertex AI (`langchain-google-vertexai`) when they need access to commercial support and higher rate limits. If you’re already Cloud-friendly or Cloud-native, then you can get started in Vertex AI straight away.
-Please, find more information [here](https://ai.google.dev/gemini-api/docs/migrate-to-cloud).
-
-### Google Generative AI
-
-Access GoogleAI `Gemini` models such as `gemini-pro` and `gemini-pro-vision` through the `GoogleGenerativeAI` class.
-
-Install python package.
-
-```bash
-pip install langchain-google-genai
-```
-
-See a [usage example](/docs/integrations/llms/google_ai).
-
-```python
-from langchain_google_genai import GoogleGenerativeAI
-```
-
-### Vertex AI Model Garden
-
-Access `PaLM` and hundreds of OSS models via `Vertex AI Model Garden` service.
-
-We need to install `langchain-google-vertexai` python package.
-
-```bash
-pip install langchain-google-vertexai
-```
-
-See a [usage example](/docs/integrations/llms/google_vertex_ai_palm#vertex-model-garden).
-
-```python
-from langchain_google_vertexai import VertexAIModelGarden
-```
-
-
-## Chat models
+Please see [here](https://ai.google.dev/gemini-api/docs/migrate-to-cloud) for more information.
 
 ### Google Generative AI
 
@@ -107,6 +72,40 @@ See a [usage example](/docs/integrations/chat/google_vertex_ai_palm).
 from langchain_google_vertexai import ChatVertexAI
 ```
 
+## LLMs
+
+### Google Generative AI
+
+Access GoogleAI `Gemini` models such as `gemini-pro` and `gemini-pro-vision` through the `GoogleGenerativeAI` class.
+
+Install python package.
+
+```bash
+pip install langchain-google-genai
+```
+
+See a [usage example](/docs/integrations/llms/google_ai).
+
+```python
+from langchain_google_genai import GoogleGenerativeAI
+```
+
+### Vertex AI Model Garden
+
+Access `PaLM` and hundreds of OSS models via `Vertex AI Model Garden` service.
+
+We need to install `langchain-google-vertexai` python package.
+
+```bash
+pip install langchain-google-vertexai
+```
+
+See a [usage example](/docs/integrations/llms/google_vertex_ai_palm#vertex-model-garden).
+
+```python
+from langchain_google_vertexai import VertexAIModelGarden
+```
+
 ## Embedding models
 
 ### Google Generative AI Embeddings