docs: partner packages (#16960)
This commit is contained in:
parent 06660bc78c
commit afdd636999
@@ -16,7 +16,8 @@ cp ../cookbook/README.md src/pages/cookbook.mdx
 mkdir -p docs/templates
 cp ../templates/docs/INDEX.md docs/templates/index.md
 poetry run python scripts/copy_templates.py
-wget https://raw.githubusercontent.com/langchain-ai/langserve/main/README.md -O docs/langserve.md
+wget -q https://raw.githubusercontent.com/langchain-ai/langserve/main/README.md -O docs/langserve.md
+wget -q https://raw.githubusercontent.com/langchain-ai/langgraph/main/README.md -O docs/langgraph.md
 
 yarn
 
@@ -19,7 +19,19 @@
     "\n",
     "This notebook covers how to get started with MistralAI chat models, via their [API](https://docs.mistral.ai/api/).\n",
     "\n",
-    "A valid [API key](https://console.mistral.ai/users/api-keys/) is needed to communicate with the API."
+    "A valid [API key](https://console.mistral.ai/users/api-keys/) is needed to communicate with the API.\n",
+    "\n",
+    "You will need the `langchain-mistralai` package to use the API. You can install it via pip:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "eb978a7e",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "%pip install -qU langchain-core langchain-mistralai"
    ]
   },
   {
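For context on what this new install cell enables, here is a minimal chat-call sketch. It is not part of the diff; it assumes `langchain-mistralai` is installed, `MISTRAL_API_KEY` is set, and the model name is only illustrative:

# Minimal sketch, not part of the commit: a first chat call with the MistralAI partner package.
# Assumes `langchain-mistralai` is installed and MISTRAL_API_KEY is set; the model name is an assumption.
from langchain_mistralai import ChatMistralAI

chat = ChatMistralAI(model="mistral-small")  # illustrative model name
response = chat.invoke("Summarize what a partner package is in one sentence.")
print(response.content)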
docs/docs/integrations/platforms/index.mdx (new file, 22 lines)
@@ -0,0 +1,22 @@
+# Providers
+
+LangChain integrates with many providers
+
+## Partner Packages
+
+- [OpenAI](/docs/integrations/platforms/openai)
+- [Anthropic](/docs/integrations/platforms/anthropic)
+- [Google](/docs/integrations/platforms/google)
+- [MistralAI](/docs/integrations/providers/mistralai)
+- [NVIDIA AI](/docs/integrations/providers/nvidia)
+- [Together AI](/docs/integrations/providers/together)
+- [Robocorp](/docs/integrations/providers/robocorp)
+- [Exa Search](/docs/integrations/providers/exa_search)
+- [Nomic](/docs/integrations/providers/nomic)
+
+
+## Featured Community Providers
+
+- [AWS](/docs/integrations/platforms/aws)
+- [Hugging Face](/docs/integrations/platforms/huggingface)
+- [Microsoft](/docs/integrations/platforms/microsoft)
docs/docs/integrations/providers/exa_search.ipynb (new file, 77 lines)
@@ -0,0 +1,77 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Exa Search\n",
+    "\n",
+    "Exa's search integration exists in its own [partner package](https://pypi.org/project/langchain-exa/). You can install it with:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "%pip install -qU langchain-exa"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "In order to use the package, you will also need to set the `EXA_API_KEY` environment variable to your Exa API key.\n",
+    "\n",
+    "## Retriever\n",
+    "\n",
+    "You can use the [`ExaSearchRetriever`](/docs/integrations/tools/exa_search#using-exasearchretriever) in a standard retrieval pipeline. You can import it as follows"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {
+    "id": "y8ku6X96sebl"
+   },
+   "outputs": [],
+   "source": [
+    "from langchain_exa import ExaSearchRetriever"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Tools\n",
+    "\n",
+    "You can use Exa as an agent tool as described in the [Exa tool calling docs](/docs/integrations/tools/exa_search#using-the-exa-sdk-as-langchain-agent-tools).\n"
+   ]
+  }
+ ],
+ "metadata": {
+  "colab": {
+   "provenance": []
+  },
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.10.11"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 1
+}
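As a hedged illustration of the retrieval step this new notebook points to, a minimal sketch follows. It is not part of the commit; it assumes `langchain-exa` is installed, `EXA_API_KEY` is set, and that `k` is the result-count parameter:

# Minimal sketch, not part of the commit: ExaSearchRetriever in a simple retrieval step.
# Assumes `langchain-exa` is installed and EXA_API_KEY is set; `k` (number of results) is an assumed parameter.
from langchain_exa import ExaSearchRetriever

retriever = ExaSearchRetriever(k=3)
docs = retriever.invoke("recent work on retrieval-augmented generation")
for doc in docs:
    print(doc.page_content[:120])  # preview each retrieved document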
docs/docs/integrations/providers/mistralai.ipynb (new file, 78 lines)
@@ -0,0 +1,78 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# MistralAI\n",
+    "\n",
+    "Mistral AI is a platform that offers hosting for their powerful open source models.\n",
+    "\n",
+    "You can access them via their [API](https://docs.mistral.ai/api/).\n",
+    "\n",
+    "A valid [API key](https://console.mistral.ai/users/api-keys/) is needed to communicate with the API.\n",
+    "\n",
+    "You will also need the `langchain-mistralai` package:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "%pip install -qU langchain-core langchain-mistralai"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {
+    "id": "y8ku6X96sebl"
+   },
+   "outputs": [],
+   "source": [
+    "from langchain_mistralai import ChatMistralAI, MistralAIEmbeddings"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "See the docs for their\n",
+    "\n",
+    "- [Chat Model](/docs/integrations/chat/mistralai)\n",
+    "- [Embeddings Model](/docs/integrations/text_embedding/mistralai)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": []
+  }
+ ],
+ "metadata": {
+  "colab": {
+   "provenance": []
+  },
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.10.11"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 1
+}
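The new provider page imports both `ChatMistralAI` and `MistralAIEmbeddings`; since the chat side is sketched above, here is a minimal embeddings sketch. It is not part of the commit and assumes `MISTRAL_API_KEY` is set and the package's default embedding model:

# Minimal sketch, not part of the commit: embedding a query with MistralAIEmbeddings.
# Assumes MISTRAL_API_KEY is set; uses the package's default embedding model.
from langchain_mistralai import MistralAIEmbeddings

embeddings = MistralAIEmbeddings()
vector = embeddings.embed_query("What does the langchain-mistralai package provide?")
print(len(vector))  # embedding dimensionality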
@@ -11,6 +11,22 @@
     "- Atlas: their Visual Data Engine\n",
     "- GPT4All: their Open Source Edge Language Model Ecosystem\n",
     "\n",
+    "The Nomic integration exists in its own [partner package](https://pypi.org/project/langchain-nomic/). You can install it with:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "%pip install -qU langchain-nomic"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
     "Currently, you can import their hosted [embedding model](/docs/integrations/text_embedding/nomic) as follows:"
    ]
   },
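To show where this new install cell leads, a minimal sketch of the hosted embedding model follows. It is not part of the commit; the environment variable and the model identifier are assumptions:

# Minimal sketch, not part of the commit: querying Nomic's hosted embedding model via langchain-nomic.
# Assumes NOMIC_API_KEY is set; the model identifier is an assumption.
from langchain_nomic import NomicEmbeddings

embeddings = NomicEmbeddings(model="nomic-embed-text-v1")
vector = embeddings.embed_query("Hello from the Nomic integration")
print(len(vector))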
docs/docs/integrations/providers/together.ipynb (new file, 78 lines)
@@ -0,0 +1,78 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Together AI\n",
+    "\n",
+    "> The Together API makes it easy to fine-tune or run leading open-source models with a couple lines of code. We have integrated the world’s leading open-source models, including Llama-2, RedPajama, Falcon, Alpaca, Stable Diffusion XL, and more. Read more: https://together.ai\n",
+    "\n",
+    "To use, you'll need an API key which you can find here:\n",
+    "https://api.together.xyz/settings/api-keys. This can be passed in as init param\n",
+    "``together_api_key`` or set as environment variable ``TOGETHER_API_KEY``.\n",
+    "\n",
+    "Together API reference: https://docs.together.ai/reference/inference\n",
+    "\n",
+    "You will also need to install the `langchain-together` integration package:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "%pip install --upgrade --quiet langchain-together"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {
+    "id": "y8ku6X96sebl"
+   },
+   "outputs": [],
+   "source": [
+    "from __module_name__ import (\n",
+    "    Together,  # LLM\n",
+    "    TogetherEmbeddings,\n",
+    ")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "See the docs for their\n",
+    "\n",
+    "- [LLM](/docs/integrations/llms/together)\n",
+    "- [Embeddings Model](/docs/integrations/text_embedding/together)"
+   ]
+  }
+ ],
+ "metadata": {
+  "colab": {
+   "provenance": []
+  },
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.10.11"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 1
+}
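Because the notebook's import cell still uses the `__module_name__` template placeholder, here is a hedged sketch written against the published `langchain_together` package named in the install cell. It is not part of the commit; the model identifiers are assumptions:

# Minimal sketch, not part of the commit: Together LLM and embeddings via langchain-together.
# Assumes TOGETHER_API_KEY is set; both model identifiers are assumptions.
from langchain_together import Together, TogetherEmbeddings

llm = Together(model="mistralai/Mixtral-8x7B-Instruct-v0.1")  # assumed model identifier
print(llm.invoke("Open-source models are useful because"))

embeddings = TogetherEmbeddings(model="togethercomputer/m2-bert-80M-8k-retrieval")  # assumed model identifier
print(len(embeddings.embed_query("partner packages")))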
@@ -60,7 +60,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# Using ExaSearchRetriever\n",
+    "## Using ExaSearchRetriever\n",
     "\n",
     "ExaSearchRetriever is a retriever that uses Exa Search to retrieve relevant documents."
    ]
@@ -87,11 +87,11 @@ module.exports = {
       collapsible: false,
       items: [
         { type: "autogenerated", dirName: "integrations/platforms" },
-        { type: "category", label: "More", collapsed: true, items: [{type:"autogenerated", dirName: "integrations/providers" }]},
+        { type: "category", label: "More", collapsed: true, items: [{type:"autogenerated", dirName: "integrations/providers" }], link: { type: 'generated-index', slug: "integrations/providers", }},
       ],
       link: {
-        type: 'generated-index',
-        slug: "integrations/providers",
+        type: 'doc',
+        id: 'integrations/platforms/index'
       },
     },
     {
@@ -17,9 +17,9 @@
    },
    "outputs": [],
    "source": [
-    "from __module_name__.chat_models import __ModuleName__Chat\n",
-    "from __module_name__.llms import __ModuleName__LLM\n",
-    "from __module_name__.vectorstores import __ModuleName__VectorStore"
+    "from __module_name__ import Chat__ModuleName__\n",
+    "from __module_name__ import __ModuleName__LLM\n",
+    "from __module_name__ import __ModuleName__VectorStore"
    ]
   }
  ],