mirror of https://github.com/hwchase17/langchain.git (synced 2025-09-08 06:23:20 +00:00)

mv popular and additional chains to use cases (#8242)

@@ -14,7 +14,7 @@
 "> using both human and machine feedback. We provide support for each step in the MLOps cycle, \n",
 "> from data labeling to model monitoring.\n",
 "\n",
-"<a target=\"_blank\" href=\"https://colab.research.google.com/github/hwchase17/langchain/blob/master/docs/modules/callbacks/integrations/argilla.html\">\n",
+"<a target=\"_blank\" href=\"https://colab.research.google.com/github/hwchase17/langchain/blob/master/docs/integrations/callbacks/argilla.html\">\n",
 " <img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/>\n",
 "</a>"
 ]

@@ -113,7 +113,7 @@
 "\n",
 "The modules are (from least to most complex):\n",
 "\n",
-"- [Models](https://python.langchain.com/en/latest/modules/models.html): Supported model types and integrations.\n",
+"- [Models](https://python.langchain.com/docs/modules/model_io/models/): Supported model types and integrations.\n",
 "\n",
 "- [Prompts](https://python.langchain.com/en/latest/modules/prompts.html): Prompt management, optimization, and serialization.\n",
 "\n",

@@ -13,7 +13,7 @@ pip install python-arango
 
 Connect your ArangoDB Database with a Chat Model to get insights on your data.
 
-See the notebook example [here](/docs/modules/chains/additional/graph_arangodb_qa.html).
+See the notebook example [here](/docs/use_cases/graph/graph_arangodb_qa.html).
 
 ```python
 from arango import ArangoClient

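For context, a minimal sketch of the graph QA flow that the relocated notebook covers. The host, credentials, and question below are placeholders (assuming a local ArangoDB instance), not taken from the diff:

```python
from arango import ArangoClient  # pip install python-arango

from langchain.chains import ArangoGraphQAChain
from langchain.chat_models import ChatOpenAI
from langchain.graphs import ArangoGraph

# Connect to a local ArangoDB instance; host and credentials are placeholders.
db = ArangoClient(hosts="http://localhost:8529").db(
    "_system", username="root", password="", verify=True
)

# Wrap the database so the chain can read its graph schema.
graph = ArangoGraph(db)

# Natural-language questions are translated into AQL queries against the graph.
chain = ArangoGraphQAChain.from_llm(ChatOpenAI(temperature=0), graph=graph, verbose=True)
chain.run("How many vertices does the graph contain?")
```
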
@@ -22,7 +22,7 @@ If you don't you can refer to [Argilla - 🚀 Quickstart](https://docs.argilla.i
 
 ## Tracking
 
-See a [usage example of `ArgillaCallbackHandler`](/docs/modules/callbacks/integrations/argilla.html).
+See a [usage example of `ArgillaCallbackHandler`](/docs/integrations/callbacks/argilla.html).
 
 ```python
 from langchain.callbacks import ArgillaCallbackHandler

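A minimal sketch of how the callback is typically wired up. The dataset name and environment variables are placeholders, and the target dataset is assumed to already exist in Argilla:

```python
import os

from langchain.callbacks import ArgillaCallbackHandler
from langchain.llms import OpenAI

# Dataset name and credentials are placeholders; the dataset must already exist in Argilla.
argilla_callback = ArgillaCallbackHandler(
    dataset_name="langchain-dataset",
    api_url=os.environ["ARGILLA_API_URL"],
    api_key=os.environ["ARGILLA_API_KEY"],
)

# Every prompt/completion pair generated by the LLM is logged to the Argilla dataset.
llm = OpenAI(temperature=0.9, callbacks=[argilla_callback])
llm("Tell me a short joke about databases.")
```
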
@@ -28,7 +28,7 @@ from langchain.memory import CassandraChatMessageHistory
 
 ## Memory
 
-See a [usage example](/docs/modules/memory/integrations/cassandra_chat_message_history).
+See a [usage example](/docs/integrations/memory/cassandra_chat_message_history).
 
 ```python
 from langchain.memory import CassandraChatMessageHistory

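A rough sketch of the chat-history usage the moved page documents. The constructor keyword names have changed between LangChain releases, so treat them as assumptions and defer to the linked notebook for the exact signature:

```python
from langchain.memory import CassandraChatMessageHistory

# Keyword arguments below are assumptions (they differ across LangChain versions);
# check the linked notebook for the exact signature in your release.
history = CassandraChatMessageHistory(
    contact_points=["127.0.0.1"],
    session_id="user-123",
    port=9042,
    username="cassandra",
    password="cassandra",
    keyspace_name="chat_history",
    table_name="message_store",
)

history.add_user_message("hi!")
history.add_ai_message("Hello! How can I help?")
print(history.messages)  # messages are persisted in the Cassandra table
```
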
@@ -166,7 +166,7 @@
 "source": [
 "### SQL Database Agent example\n",
 "\n",
-"This example demonstrates the use of the [SQL Database Agent](/docs/modules/agents/toolkits/sql_database.html) for answering questions over a Databricks database."
+"This example demonstrates the use of the [SQL Database Agent](/docs/integrations/toolkits/sql_database.html) for answering questions over a Databricks database."
 ]
 },
 {

@@ -32,11 +32,11 @@ See [MLflow AI Gateway](/docs/ecosystem/integrations/mlflow_ai_gateway).
 Databricks as an LLM provider
 -----------------------------
 
-The notebook [Wrap Databricks endpoints as LLMs](/docs/modules/model_io/models/llms/integrations/databricks.html) illustrates the method to wrap Databricks endpoints as LLMs in LangChain. It supports two types of endpoints: the serving endpoint, which is recommended for both production and development, and the cluster driver proxy app, which is recommended for interactive development.
+The notebook [Wrap Databricks endpoints as LLMs](/docs/integrations/llms/databricks.html) illustrates the method to wrap Databricks endpoints as LLMs in LangChain. It supports two types of endpoints: the serving endpoint, which is recommended for both production and development, and the cluster driver proxy app, which is recommended for interactive development.
 
 Databricks endpoints support Dolly, but are also great for hosting models like MPT-7B or any other models from the Hugging Face ecosystem. Databricks endpoints can also be used with proprietary models like OpenAI to provide a governance layer for enterprises.
 
 Databricks Dolly
 ----------------
 
-Databricks’ Dolly is an instruction-following large language model trained on the Databricks machine learning platform that is licensed for commercial use. The model is available on Hugging Face Hub as databricks/dolly-v2-12b. See the notebook [Hugging Face Hub](/docs/modules/model_io/models/llms/integrations/huggingface_hub.html) for instructions to access it through the Hugging Face Hub integration with LangChain.
+Databricks’ Dolly is an instruction-following large language model trained on the Databricks machine learning platform that is licensed for commercial use. The model is available on Hugging Face Hub as databricks/dolly-v2-12b. See the notebook [Hugging Face Hub](/docs/integrations/llms/huggingface_hub.html) for instructions to access it through the Hugging Face Hub integration with LangChain.

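As a sketch of the endpoint-wrapping pattern described above: the endpoint name is a placeholder, and the code assumes it runs in an environment where Databricks credentials are already configured:

```python
from langchain.llms import Databricks

# Wrap a Databricks model-serving endpoint as a LangChain LLM.
# "dolly" is a placeholder for an endpoint serving, e.g., a Dolly model.
llm = Databricks(endpoint_name="dolly")

llm("How are you?")
```
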
@@ -51,4 +51,4 @@ Momento can be used as a distributed memory store for LLMs.
 
 ### Chat Message History Memory
 
-See [this notebook](/docs/modules/memory/integrations/momento_chat_message_history.html) for a walkthrough of how to use Momento as a memory store for chat message history.
+See [this notebook](/docs/integrations/memory/momento_chat_message_history.html) for a walkthrough of how to use Momento as a memory store for chat message history.

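A minimal sketch of the Momento-backed chat history the relocated notebook walks through. The cache name and session id are placeholders, and a `MOMENTO_AUTH_TOKEN` is assumed to be set in the environment:

```python
from datetime import timedelta

from langchain.memory import MomentoChatMessageHistory

# Cache name and session id are placeholders; assumes MOMENTO_AUTH_TOKEN is set.
# The cache is created on first use if it does not already exist.
history = MomentoChatMessageHistory.from_client_params(
    "user-123", "langchain", timedelta(days=1)
)

history.add_user_message("hi!")
history.add_ai_message("whats up?")
```
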
@@ -31,7 +31,7 @@ db = SQLDatabase.from_uri(conn_str)
 db_chain = SQLDatabaseChain.from_llm(OpenAI(temperature=0), db, verbose=True)
 ```
 
-From here, see the [SQL Chain](/docs/modules/chains/popular/sqlite.html) documentation on how to use it.
+From here, see the [SQL Chain](/docs/use_cases/tabular/sqlite.html) documentation on how to use it.
 
 
 ## LLMCache

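For context, a runnable sketch of the chain shown in the hunk, using a local SQLite file as a stand-in for the real connection string (note that more recent releases move `SQLDatabaseChain` into `langchain_experimental`):

```python
from langchain.chains import SQLDatabaseChain  # newer releases: langchain_experimental.sql
from langchain.llms import OpenAI
from langchain.sql_database import SQLDatabase

# Any SQLAlchemy connection string works; a local SQLite file is used here as a stand-in.
db = SQLDatabase.from_uri("sqlite:///Chinook.db")
db_chain = SQLDatabaseChain.from_llm(OpenAI(temperature=0), db, verbose=True)

db_chain.run("How many employees are there?")
```
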
@@ -58,7 +58,7 @@ For a more detailed walkthrough of this, see [this notebook](/docs/modules/data_
 
 ## Chain
 
-See a [usage example](/docs/modules/chains/additional/moderation).
+See a [usage example](/docs/guides/safety/moderation).
 
 ```python
 from langchain.chains import OpenAIModerationChain

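A minimal sketch of the moderation chain the relocated guide covers:

```python
from langchain.chains import OpenAIModerationChain

# Passes text through the OpenAI moderation endpoint; compliant text is returned unchanged.
moderation_chain = OpenAIModerationChain()
print(moderation_chain.run("This is fine."))

# With error=True the chain raises instead of returning a canned warning when text is flagged.
strict_moderation_chain = OpenAIModerationChain(error=True)
```
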
@@ -106,4 +106,4 @@ Redis can be used to persist LLM conversations.
 For a more detailed walkthrough of the `VectorStoreRetrieverMemory` wrapper, see [this notebook](/docs/modules/memory/integrations/vectorstore_retriever_memory.html).
 
 #### Chat Message History Memory
-For a detailed example of using Redis to cache conversation message history, see [this notebook](/docs/modules/memory/integrations/redis_chat_message_history.html).
+For a detailed example of using Redis to cache conversation message history, see [this notebook](/docs/integrations/memory/redis_chat_message_history.html).

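A short sketch of the Redis-backed chat history the relocated notebook describes; the URL and session id are placeholders for a local Redis instance:

```python
from langchain.memory import RedisChatMessageHistory

# URL and session id are placeholders for a local Redis instance.
history = RedisChatMessageHistory(session_id="user-123", url="redis://localhost:6379/0")

history.add_user_message("hi!")
history.add_ai_message("whats up?")
print(history.messages)
```
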
@@ -9,7 +9,7 @@
 "\n",
 "Natural Language API Toolkits (NLAToolkits) permit LangChain Agents to efficiently plan and combine calls across endpoints. This notebook demonstrates a sample composition of the Speak, Klarna, and Spoonacular APIs.\n",
 "\n",
-"For a detailed walkthrough of the OpenAPI chains wrapped within the NLAToolkit, see the [OpenAPI Operation Chain](/docs/modules/chains/additional/openapi.html) notebook.\n",
+"For a detailed walkthrough of the OpenAPI chains wrapped within the NLAToolkit, see the [OpenAPI Operation Chain](/docs/use_cases/apis/openapi.html) notebook.\n",
 "\n",
 "### First, import dependencies and load the LLM"
 ]

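As a sketch of the composition pattern the notebook describes, wrapping one hosted OpenAPI spec as natural-language tools. The Speak spec URL and the agent setup below are assumptions, not taken from the diff:

```python
from langchain.agents import AgentType, initialize_agent
from langchain.agents.agent_toolkits import NLAToolkit
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)

# Wrap a hosted OpenAPI spec as a set of natural-language tools
# (the spec URL is assumed; swap in the API you want to expose).
speak_toolkit = NLAToolkit.from_llm_and_url(llm, "https://api.speak.com/openapi.yaml")

# Hand the generated tools to a standard zero-shot agent.
agent = initialize_agent(
    speak_toolkit.get_tools(),
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
)
agent.run("How would I politely say 'thank you' in Japanese?")
```
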
@@ -6,7 +6,7 @@
 "source": [
 "# Spark SQL Agent\n",
 "\n",
-"This notebook shows how to use agents to interact with Spark SQL. Similar to the [SQL Database Agent](https://python.langchain.com/en/latest/modules/agents/toolkits/examples/sql_database.html), it is designed to address general inquiries about Spark SQL and facilitate error recovery.\n",
+"This notebook shows how to use agents to interact with Spark SQL. Similar to the [SQL Database Agent](https://python.langchain.com/docs/integrations/toolkits/sql_database), it is designed to address general inquiries about Spark SQL and facilitate error recovery.\n",
 "\n",
 "**NOTE: As this agent is in active development, all answers might not be correct. Additionally, it is not guaranteed that the agent won't perform DML statements on your Spark cluster given certain questions. Be careful running it on sensitive data!**"
 ]

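A rough sketch of the agent setup, assuming a running SparkSession with tables already registered; the import paths, the `SparkSQL` constructor arguments, and the schema name are best-effort assumptions rather than a definitive recipe:

```python
from langchain.agents import create_spark_sql_agent
from langchain.agents.agent_toolkits import SparkSQLToolkit
from langchain.chat_models import ChatOpenAI
from langchain.utilities.spark_sql import SparkSQL

# Assumes a running SparkSession with tables registered under this schema;
# the schema name is a placeholder.
spark_sql = SparkSQL(schema="langchain_example")

llm = ChatOpenAI(temperature=0)
toolkit = SparkSQLToolkit(db=spark_sql, llm=llm)
agent_executor = create_spark_sql_agent(llm=llm, toolkit=toolkit, verbose=True)

agent_executor.run("Describe the titanic table")
```
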
@@ -8,7 +8,7 @@
 "source": [
 "# SQL Database Agent\n",
 "\n",
-"This notebook showcases an agent designed to interact with a SQL database. The agent builds on [SQLDatabaseChain](https://python.langchain.com/docs/modules/chains/popular/sqlite) and is designed to answer more general questions about a database, as well as recover from errors.\n",
+"This notebook showcases an agent designed to interact with a SQL database. The agent builds on [SQLDatabaseChain](https://python.langchain.com/docs/use_cases/tabular/sqlite) and is designed to answer more general questions about a database, as well as recover from errors.\n",
 "\n",
 "Note that, as this agent is in active development, all answers might not be correct. Additionally, it is not guaranteed that the agent won't perform DML statements on your database given certain questions. Be careful running it on sensitive data!\n",
 "\n",

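For reference, a minimal sketch of how this agent is typically constructed; the Chinook SQLite file is a placeholder database:

```python
from langchain.agents import create_sql_agent
from langchain.agents.agent_toolkits import SQLDatabaseToolkit
from langchain.llms import OpenAI
from langchain.sql_database import SQLDatabase

# The Chinook SQLite file is a placeholder; any SQLAlchemy-compatible database works.
db = SQLDatabase.from_uri("sqlite:///Chinook.db")
llm = OpenAI(temperature=0)

toolkit = SQLDatabaseToolkit(db=db, llm=llm)
agent_executor = create_sql_agent(llm=llm, toolkit=toolkit, verbose=True)

agent_executor.run("How many tables are there, and what are they called?")
```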