docs: providers updates 1 (#20256)

- Providers pages: added missing integrations; fixed formatting
- `mistralai` converted from notebook to .mdx format
Leonid Ganeline 2024-05-13 08:54:51 -07:00 committed by GitHub
parent 15cb1133e7
commit 4c48732f94
9 changed files with 184 additions and 111 deletions

View File

@@ -1,43 +1,44 @@
# Anthropic
>[Anthropic](https://www.anthropic.com/) is an AI safety and research company, and is the creator of `Claude`.
This page covers all integrations between `Anthropic` models and `LangChain`.
## Installation and Setup
To use `Anthropic` models, you need to install a python package:
```bash
pip install -U langchain-anthropic
```
You need to set the `ANTHROPIC_API_KEY` environment variable.
You can get an Anthropic API key [here](https://console.anthropic.com/settings/keys)
## LLMs
### [Legacy] AnthropicLLM
**NOTE**: `AnthropicLLM` only supports legacy `Claude 2` models.
To use the newest `Claude 3` models, please use `ChatAnthropic` instead.
See a [usage example](/docs/integrations/llms/anthropic).
```python
from langchain_anthropic import AnthropicLLM
model = AnthropicLLM(model='claude-2.1')
```
## Chat Models
### ChatAnthropic
See a [usage example](/docs/integrations/chat/anthropic).
```python
from langchain_anthropic import ChatAnthropic
model = ChatAnthropic(model='claude-3-opus-20240229')
```
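For orientation only, here is a minimal end-to-end sketch of the chat model above; it assumes `ANTHROPIC_API_KEY` is exported and that the opus model id is still available:
```python
from langchain_anthropic import ChatAnthropic

# Assumes ANTHROPIC_API_KEY is set in the environment.
model = ChatAnthropic(model='claude-3-opus-20240229')

# .invoke() accepts a plain string (or a list of messages) and returns an AIMessage.
response = model.invoke("In one sentence, what does Anthropic build?")
print(response.content)
```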

View File

@@ -268,6 +268,29 @@ See a [usage example](/docs/integrations/memory/aws_dynamodb).
from langchain.memory import DynamoDBChatMessageHistory
```
## Graphs
### Amazon Neptune with Cypher
See a [usage example](/docs/integrations/graphs/amazon_neptune_open_cypher).
```python
from langchain_community.graphs import NeptuneGraph
from langchain_community.graphs import NeptuneAnalyticsGraph
from langchain.chains import NeptuneOpenCypherQAChain
```
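As a rough sketch of how these pieces fit together (the endpoint, model id, and Bedrock chat model below are placeholders, not part of this page):
```python
from langchain_aws import ChatBedrock  # any LangChain chat model can drive the chain
from langchain_community.graphs import NeptuneGraph
from langchain.chains import NeptuneOpenCypherQAChain

# Placeholder Neptune endpoint; point this at your own cluster.
graph = NeptuneGraph(host="my-cluster.cluster-abc123.us-east-1.neptune.amazonaws.com", port=8182)
llm = ChatBedrock(model_id="anthropic.claude-3-sonnet-20240229-v1:0")

# The chain generates openCypher from the question, runs it on Neptune,
# and has the LLM phrase the query result as an answer.
chain = NeptuneOpenCypherQAChain.from_llm(llm=llm, graph=graph)
chain.invoke("How many nodes does the graph contain?")
```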
### Amazon Neptune with SPARQL
See a [usage example](/docs/integrations/graphs/amazon_neptune_sparql).
```python
from langchain_community.graphs import NeptuneRdfGraph
from langchain.chains.graph_qa.neptune_sparql import NeptuneSparqlQAChain
```
## Callbacks
### SageMaker Tracking

View File

@@ -317,6 +317,24 @@ from langchain_community.agent_toolkits import PowerBIToolkit
from langchain_community.utilities.powerbi import PowerBIDataset
```
## Graphs
### Azure Cosmos DB for Apache Gremlin
We need to install a python package.
```bash
pip install gremlinpython
```
See a [usage example](/docs/integrations/graphs/azure_cosmosdb_gremlin).
```python
from langchain_community.graphs import GremlinGraph
from langchain_community.graphs.graph_document import GraphDocument, Node, Relationship
```
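To show what these classes are for, a small hand-rolled sketch (the endpoint, database, and key are hypothetical, and the constructor arguments are assumptions based on the usual Cosmos DB Gremlin connection settings):
```python
from langchain_community.graphs import GremlinGraph
from langchain_community.graphs.graph_document import GraphDocument, Node, Relationship
from langchain_core.documents import Document

# Hypothetical Cosmos DB Gremlin endpoint and credentials.
graph = GremlinGraph(
    url="wss://my-account.gremlin.cosmos.azure.com:443/",
    username="/dbs/my-database/colls/my-graph",
    password="my-access-key",
)

# Build a tiny graph document by hand and push it into the store.
alice = Node(id="alice", type="Person")
project = Node(id="langchain", type="Project")
doc = GraphDocument(
    nodes=[alice, project],
    relationships=[Relationship(source=alice, target=project, type="CONTRIBUTES_TO")],
    source=Document(page_content="Alice contributes to LangChain."),
)
graph.add_graph_documents([doc])
```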
## Utilities
### Bing Search API

View File

@@ -19,13 +19,26 @@ pip install langchain-ai21
See a [usage example](/docs/integrations/llms/ai21).
### AI21 LLM
```python
from langchain_ai21 import AI21LLM
```
### AI21 Contextual Answer
You can use AI21's contextual answers model: given a text or document serving as context
and a question, it returns an answer based entirely on that context.
```python
from langchain_ai21 import AI21ContextualAnswers
```
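A short sketch of the contextual-answers runnable (the context and question strings are made up; `AI21_API_KEY` is assumed to be set):
```python
from langchain_ai21 import AI21ContextualAnswers

# Assumes AI21_API_KEY is set in the environment.
tsm = AI21ContextualAnswers()

# The runnable answers strictly from the supplied context.
response = tsm.invoke({
    "context": "The warranty covers replacement parts for two years from the purchase date.",
    "question": "How long are replacement parts covered?",
})
```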
## Chat models
### AI21 Chat
See a [usage example](/docs/integrations/chat/ai21).
```python
@@ -34,9 +47,21 @@ from langchain_ai21 import ChatAI21
```
## Embedding models
### AI21 Embeddings
See a [usage example](/docs/integrations/text_embedding/ai21).
```python
from langchain_ai21 import AI21Embeddings
```
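For completeness, a minimal sketch of the standard embeddings interface with this class (`AI21_API_KEY` assumed to be set):
```python
from langchain_ai21 import AI21Embeddings

# Assumes AI21_API_KEY is set in the environment.
embeddings = AI21Embeddings()

# Embed a single query and a small batch of documents.
query_vector = embeddings.embed_query("What is retrieval augmented generation?")
doc_vectors = embeddings.embed_documents([
    "RAG augments a prompt with retrieved documents.",
    "Embeddings map text to dense vectors.",
])
```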
## Text splitters
### AI21 Semantic Text Splitter
See a [usage example](/docs/integrations/document_transformers/ai21_semantic_text_splitter).
```python
from langchain_ai21 import AI21SemanticTextSplitter
```
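A brief sketch of the splitter in use (the sample text is illustrative; `AI21_API_KEY` assumed to be set):
```python
from langchain_ai21 import AI21SemanticTextSplitter

# Assumes AI21_API_KEY is set in the environment.
splitter = AI21SemanticTextSplitter()

text = (
    "LangChain integrates with many model providers. "
    "Semantic splitting groups sentences about the same topic into one chunk, "
    "instead of cutting at a fixed character count."
)
# split_text returns a list of semantically coherent chunks.
chunks = splitter.split_text(text)
```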

View File

@@ -48,3 +48,11 @@ See a [usage example](/docs/integrations/vectorstores/baiducloud_vector_search).
```python
from langchain_community.vectorstores import BESVectorStore
```
### Baidu VectorDB
See a [usage example](/docs/integrations/vectorstores/baiduvectordb).
```python
from langchain_community.vectorstores import BaiduVectorDB
```

View File

@@ -4,6 +4,7 @@ This page covers how to use the [Serper](https://serper.dev) Google Search API w
It is broken into two parts: setup, and then references to the specific Google Serper wrapper.
## Setup
- Go to [serper.dev](https://serper.dev) to sign up for a free account
- Get the API key and set it as an environment variable (`SERPER_API_KEY`); a usage sketch follows below
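A minimal sketch of the wrapper this page refers to (assuming `SERPER_API_KEY` is already exported):
```python
from langchain_community.utilities import GoogleSerperAPIWrapper

# Assumes SERPER_API_KEY is set in the environment.
search = GoogleSerperAPIWrapper()

# run() returns a plain-text answer; results() returns the raw JSON payload.
print(search.run("What is the capital of British Columbia?"))
```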

View File

@@ -1,78 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# MistralAI\n",
"\n",
"Mistral AI is a platform that offers hosting for their powerful open source models.\n",
"\n",
"You can access them via their [API](https://docs.mistral.ai/api/).\n",
"\n",
"A valid [API key](https://console.mistral.ai/users/api-keys/) is needed to communicate with the API.\n",
"\n",
"You will also need the `langchain-mistralai` package:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%pip install -qU langchain-core langchain-mistralai"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"id": "y8ku6X96sebl"
},
"outputs": [],
"source": [
"from langchain_mistralai import ChatMistralAI, MistralAIEmbeddings"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"See the docs for their\n",
"\n",
"- [Chat Model](/docs/integrations/chat/mistralai)\n",
"- [Embeddings Model](/docs/integrations/text_embedding/mistralai)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": []
}
],
"metadata": {
"colab": {
"provenance": []
},
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.11"
}
},
"nbformat": 4,
"nbformat_minor": 1
}

View File

@@ -0,0 +1,34 @@
# MistralAI
>[Mistral AI](https://docs.mistral.ai/api/) is a platform that offers hosting for their powerful open source models.
## Installation and Setup
A valid [API key](https://console.mistral.ai/users/api-keys/) is needed to communicate with the API.
You will also need the `langchain-mistralai` package:
```bash
pip install langchain-mistralai
```
## Chat models
### ChatMistralAI
See a [usage example](/docs/integrations/chat/mistralai).
```python
from langchain_mistralai.chat_models import ChatMistralAI
```
## Embedding models
### MistralAIEmbeddings
See a [usage example](/docs/integrations/text_embedding/mistralai).
```python
from langchain_mistralai import MistralAIEmbeddings
```
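As a quick orientation, a combined sketch of both classes (model names are illustrative; `MISTRAL_API_KEY` is assumed to be set):
```python
from langchain_mistralai import ChatMistralAI, MistralAIEmbeddings

# Assumes MISTRAL_API_KEY is set in the environment.
chat = ChatMistralAI(model="mistral-small-latest")
print(chat.invoke("Say hello in French.").content)

embeddings = MistralAIEmbeddings(model="mistral-embed")
vector = embeddings.embed_query("Bonjour le monde")
```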

View File

@@ -4,21 +4,28 @@
> and external source, providing optimized search results and generative answers.
> It can handle video and audio transcription, image content extraction, and document parsing.
## Installation and Setup
We need to install the `nucliadb-protos` package to use the `Nuclia Understanding API`.
```bash
pip install nucliadb-protos
```
We need to have a `Nuclia account`.
We can create one for free at [https://nuclia.cloud](https://nuclia.cloud),
and then [create a NUA key](https://docs.nuclia.dev/docs/docs/using/understanding/intro).
## Document Transformer
### Nuclia
>`Nuclia Understanding API` document transformer splits text into paragraphs and sentences,
> identifies entities, provides a summary of the text and generates embeddings for all the sentences.
To use the Nuclia document transformer, we need to instantiate a `NucliaUnderstandingAPI`
tool with `enable_ml` set to `True`:
@@ -28,10 +35,44 @@ from langchain_community.tools.nuclia import NucliaUnderstandingAPI
nua = NucliaUnderstandingAPI(enable_ml=True)
```
See a [usage example](/docs/integrations/document_transformers/nuclia_transformer).
```python
from langchain_community.document_transformers.nuclia_text_transform import NucliaTextTransformer
```
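A rough sketch of the transformer in use, assuming it wraps the `nua` tool above and exposes the asynchronous `atransform_documents` method:
```python
import asyncio

from langchain_community.document_transformers.nuclia_text_transform import NucliaTextTransformer
from langchain_community.tools.nuclia import NucliaUnderstandingAPI
from langchain_core.documents import Document

# Assumes the Nuclia zone and NUA key are configured for the tool.
nua = NucliaUnderstandingAPI(enable_ml=True)
transformer = NucliaTextTransformer(nua)

docs = [Document(page_content="Nuclia indexes unstructured data from any internal or external source.")]

# Splits, annotates, and summarizes the documents via the Understanding API.
transformed = asyncio.run(transformer.atransform_documents(docs))
```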
## Document Loaders
### Nuclia loader
See a [usage example](/docs/integrations/document_loaders/nuclia).
```python
from langchain_community.document_loaders.nuclia import NucliaLoader
```
## Vector store
### NucliaDB
We need to install a python package:
```bash
pip install nuclia
```
See a [usage example](/docs/integrations/vectorstores/nucliadb).
```python
from langchain_community.vectorstores.nucliadb import NucliaDB
```
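A minimal sketch against a hosted knowledge box (the parameter names below are assumptions for the managed service, and the identifiers are placeholders):
```python
from langchain_community.vectorstores.nucliadb import NucliaDB

# Hypothetical hosted knowledge box and key.
ndb = NucliaDB(knowledge_box="my-knowledge-box-id", local=False, api_key="my-nua-key")

ndb.add_texts(["Nuclia stores and indexes sentences as vectors."])
results = ndb.similarity_search("How does Nuclia index text?")
```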
## Tools
### Nuclia Understanding
See a [usage example](/docs/integrations/tools/nuclia).
```python
from langchain_community.tools.nuclia import NucliaUnderstandingAPI
```
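As an illustration of the tool's push/pull flow (the action dictionary mirrors the tool's usage example, but treat the exact keys as an assumption; the file path is a placeholder):
```python
from langchain_community.tools.nuclia import NucliaUnderstandingAPI

# Assumes the Nuclia zone and NUA key are configured; the file path is a placeholder.
nua = NucliaUnderstandingAPI(enable_ml=True)

# Push a local file for processing, then pull the extracted data once it is ready.
nua.run({"action": "push", "id": "report", "path": "./report.pdf", "text": None})
data = nua.run({"action": "pull", "id": "report", "path": None, "text": None})
```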