docs integrations/vectorstores/ cleanup (#13487)

- updated titles to consistent format
- added/updated descriptions and links
- formatted headings
This commit is contained in:
parent 1d2981114f
commit e3a5cd7969
@ -4,13 +4,15 @@
"cell_type": "markdown",
"id": "357f24224a8e818f",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"## Hippo\n",
"# Hippo\n",
"\n",
">[Hippo](https://www.transwarp.cn/starwarp) Please visit our official website for how to run a Hippo instance and\n",
"how to use functionality related to the Hippo vector database\n",
">[Transwarp Hippo](https://www.transwarp.cn/en/subproduct/hippo) is an enterprise-level cloud-native distributed vector database that supports storage, retrieval, and management of massive vector-based datasets. It efficiently solves problems such as vector similarity search and high-density vector clustering. `Hippo` features high availability, high performance, and easy scalability. It has many functions, such as multiple vector search indexes, data partitioning and sharding, data persistence, incremental data ingestion, vector scalar field filtering, and mixed queries. It can effectively meet the high real-time search demands of enterprises for massive vector data\n",
"\n",
"## Getting Started\n",
"\n",
@ -21,12 +23,15 @@
"cell_type": "markdown",
"id": "a92d2ce26df7ac4c",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"## Installing Dependencies\n",
"\n",
"Initially, we require the installation of certain dependencies, such as OpenAI, Langchain, and Hippo-API. Please note, you should install the appropriate versions tailored to your environment."
"Initially, we require the installation of certain dependencies, such as OpenAI, Langchain, and Hippo-API. Please note, that you should install the appropriate versions tailored to your environment."
]
},
{
@ -38,7 +43,10 @@
"end_time": "2023-10-30T06:47:54.718488Z",
"start_time": "2023-10-30T06:47:53.563129Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
@ -59,12 +67,15 @@
"cell_type": "markdown",
"id": "554081137df2c252",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"Note: Python version needs to be >=3.8.\n",
"\n",
"## Best Practice\n",
"## Best Practices\n",
"### Importing Dependency Packages"
]
},
@ -77,7 +88,10 @@
"end_time": "2023-10-30T06:47:56.003409Z",
"start_time": "2023-10-30T06:47:55.998839Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
@ -94,7 +108,10 @@
"cell_type": "markdown",
"id": "dad255dae8aea755",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"### Loading Knowledge Documents"
@ -109,7 +126,10 @@
"end_time": "2023-10-30T06:47:59.027869Z",
"start_time": "2023-10-30T06:47:59.023934Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
@ -122,7 +142,10 @@
"cell_type": "markdown",
"id": "e9b93c330f1c6160",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"### Segmenting the Knowledge Document\n",
@ -139,7 +162,10 @@
"end_time": "2023-10-30T06:48:00.279351Z",
"start_time": "2023-10-30T06:48:00.275763Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
@ -151,7 +177,10 @@
"cell_type": "markdown",
"id": "eefe28c7c993ffdf",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"### Declaring the Embedding Model\n",
@ -167,7 +196,10 @@
"end_time": "2023-10-30T06:48:11.686166Z",
"start_time": "2023-10-30T06:48:11.664355Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
@ -188,7 +220,10 @@
"cell_type": "markdown",
"id": "e60235602ed91d3c",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"### Declaring Hippo Client"
@ -203,7 +238,10 @@
"end_time": "2023-10-30T06:48:48.594298Z",
"start_time": "2023-10-30T06:48:48.585267Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
@ -214,7 +252,10 @@
"cell_type": "markdown",
"id": "43ee6dbd765c3172",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"### Storing the Document"
@ -229,7 +270,10 @@
"end_time": "2023-10-30T06:51:12.661741Z",
"start_time": "2023-10-30T06:51:06.257156Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
@ -257,7 +301,10 @@
"cell_type": "markdown",
"id": "89077cc9763d5dd0",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"### Conducting Knowledge-based Question and Answer\n",
@ -274,7 +321,10 @@
"end_time": "2023-10-30T06:51:28.329351Z",
"start_time": "2023-10-30T06:51:28.318713Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
@ -293,7 +343,10 @@
"cell_type": "markdown",
"id": "a4c5d73016a9db0c",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"### Acquiring Related Knowledge Based on the Question:"
@ -308,7 +361,10 @@
"end_time": "2023-10-30T06:51:33.195634Z",
"start_time": "2023-10-30T06:51:32.196493Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
@ -328,7 +384,10 @@
"cell_type": "markdown",
"id": "e5adbaaa7086d1ae",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"### Constructing a Prompt Template"
@ -343,7 +402,10 @@
"end_time": "2023-10-30T06:51:35.649376Z",
"start_time": "2023-10-30T06:51:35.645763Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
@ -358,7 +420,10 @@
"cell_type": "markdown",
"id": "b36b6a9adbec8a82",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"### Waiting for the Large Language Model to Generate an Answer"
@ -373,7 +438,10 @@
"end_time": "2023-10-30T06:52:17.967885Z",
"start_time": "2023-10-30T06:51:37.692819Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
@ -402,7 +470,10 @@
"ExecuteTime": {
"start_time": "2023-10-30T06:42:42.172639Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": []
@ -410,21 +481,21 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 2
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.6"
"pygments_lexer": "ipython3",
"version": "3.10.12"
}
},
"nbformat": 4,
@ -217,7 +217,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
"version": "3.10.12"
}
},
"nbformat": 4,
@ -3,12 +3,15 @@
{
"cell_type": "markdown",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"# sqlite-vss\n",
"# SQLite-VSS\n",
"\n",
">[sqlite-vss](https://alexgarcia.xyz/sqlite-vss/) is an SQLite extension designed for vector search, emphasizing local-first operations and easy integration into applications without external servers. Leveraging the Faiss library, it offers efficient similarity search and clustering capabilities.\n",
">[SQLite-VSS](https://alexgarcia.xyz/sqlite-vss/) is an `SQLite` extension designed for vector search, emphasizing local-first operations and easy integration into applications without external servers. Leveraging the `Faiss` library, it offers efficient similarity search and clustering capabilities.\n",
"\n",
"This notebook shows how to use the `SQLiteVSS` vector database."
]
@ -17,7 +20,10 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
@ -28,10 +34,13 @@
{
"cell_type": "markdown",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"### Quickstart"
"## Quickstart"
]
},
{
@ -42,7 +51,10 @@
"end_time": "2023-09-06T14:55:55.370351Z",
"start_time": "2023-09-06T14:55:53.547755Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
@ -97,10 +109,13 @@
{
"cell_type": "markdown",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"### Using existing sqlite connection"
"## Using existing SQLite connection"
]
},
{
@ -111,7 +126,10 @@
"end_time": "2023-09-06T14:59:22.086252Z",
"start_time": "2023-09-06T14:59:21.693237Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
@ -166,7 +184,10 @@
"end_time": "2023-09-06T15:01:15.550318Z",
"start_time": "2023-09-06T15:01:15.546428Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
@ -180,7 +201,10 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": []
@ -188,23 +212,23 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 2
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.6"
"pygments_lexer": "ipython3",
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 0
"nbformat_minor": 4
}
@ -7,28 +7,30 @@
"source": [
"# Timescale Vector (Postgres)\n",
"\n",
">[Timescale Vector](https://www.timescale.com/ai?utm_campaign=vectorlaunch&utm_source=langchain&utm_medium=referral) is `PostgreSQL++` vector database for AI applications.\n",
"\n",
"This notebook shows how to use the Postgres vector database `Timescale Vector`. You'll learn how to use TimescaleVector for (1) semantic search, (2) time-based vector search, (3) self-querying, and (4) how to create indexes to speed up queries.\n",
"\n",
"## What is Timescale Vector?\n",
"**[Timescale Vector](https://www.timescale.com/ai?utm_campaign=vectorlaunch&utm_source=langchain&utm_medium=referral) is PostgreSQL++ for AI applications.**\n",
"\n",
"Timescale Vector enables you to efficiently store and query millions of vector embeddings in `PostgreSQL`.\n",
"`Timescale Vector` enables you to efficiently store and query millions of vector embeddings in `PostgreSQL`.\n",
"- Enhances `pgvector` with faster and more accurate similarity search on 100M+ vectors via `DiskANN` inspired indexing algorithm.\n",
"- Enables fast time-based vector search via automatic time-based partitioning and indexing.\n",
"- Provides a familiar SQL interface for querying vector embeddings and relational data.\n",
"\n",
"Timescale Vector is cloud PostgreSQL for AI that scales with you from POC to production:\n",
"`Timescale Vector` is cloud `PostgreSQL` for AI that scales with you from POC to production:\n",
"- Simplifies operations by enabling you to store relational metadata, vector embeddings, and time-series data in a single database.\n",
"- Benefits from rock-solid PostgreSQL foundation with enterprise-grade feature liked streaming backups and replication, high-availability and row-level security.\n",
"- Benefits from rock-solid PostgreSQL foundation with enterprise-grade features like streaming backups and replication, high availability and row-level security.\n",
"- Enables a worry-free experience with enterprise-grade security and compliance.\n",
"\n",
"## How to access Timescale Vector\n",
"Timescale Vector is available on [Timescale](https://www.timescale.com/ai?utm_campaign=vectorlaunch&utm_source=langchain&utm_medium=referral), the cloud PostgreSQL platform. (There is no self-hosted version at this time.)\n",
"\n",
"`Timescale Vector` is available on [Timescale](https://www.timescale.com/ai?utm_campaign=vectorlaunch&utm_source=langchain&utm_medium=referral), the cloud PostgreSQL platform. (There is no self-hosted version at this time.)\n",
"\n",
"LangChain users get a 90-day free trial for Timescale Vector.\n",
"- To get started, [signup](https://console.cloud.timescale.com/signup?utm_campaign=vectorlaunch&utm_source=langchain&utm_medium=referral) to Timescale, create a new database and follow this notebook!\n",
"- See the [Timescale Vector explainer blog](https://www.timescale.com/blog/how-we-made-postgresql-the-best-vector-database/?utm_campaign=vectorlaunch&utm_source=langchain&utm_medium=referral) for more details and performance benchmarks.\n",
"- See the [installation instructions](https://github.com/timescale/python-vector) for more details on using Timescale Vector in python."
"- See the [installation instructions](https://github.com/timescale/python-vector) for more details on using Timescale Vector in Python."
]
},
{
@ -1726,7 +1728,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.16"
"version": "3.10.12"
}
},
"nbformat": 4,
@ -1,5 +1,43 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Vearch\n",
"\n",
">[Vearch](https://vearch.readthedocs.io) is the vector search infrastructure for deeping learning and AI applications.\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Setting up\n",
"\n",
"Follow [instructions](https://vearch.readthedocs.io/en/latest/quick-start-guide.html#)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!pip install vearch\n",
"\n",
"# OR\n",
"\n",
"!pip install vearch_cluster"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Example"
]
},
{
"cell_type": "code",
"execution_count": 2,
@ -16,10 +54,11 @@
}
],
"source": [
"from langchain.vectorstores.vearch import Vearch\n",
"\n",
"from langchain.document_loaders import TextLoader\n",
"from langchain.embeddings.huggingface import HuggingFaceEmbeddings\n",
"from langchain.text_splitter import RecursiveCharacterTextSplitter\n",
"from langchain.vectorstores.vearch import Vearch\n",
"from transformers import AutoModel, AutoTokenizer\n",
"\n",
"# repalce to your local model path\n",
@ -464,7 +503,7 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 3.10.13 ('vearch_cluster_langchain')",
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
@ -478,9 +517,8 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.13"
"version": "3.10.12"
},
"orig_nbformat": 4,
"vscode": {
"interpreter": {
"hash": "f1da10a89896267ed34b497c9568817f36cc7ea79826b5cfca4d96376f5b4835"
@ -488,5 +526,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 2
"nbformat_minor": 4
}
@ -4,27 +4,21 @@
"cell_type": "markdown",
"id": "9eb8dfa6fdb71ef5",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"# Zep\n",
"## VectorStore Example for [Zep](https://docs.getzep.com/) - Fast, scalable building blocks for LLM Apps\n",
"\n",
"### More on Zep:\n",
">[Zep](https://docs.getzep.com/) is an open-source platform for LLM apps. Go from a prototype\n",
">built in LangChain or LlamaIndex, or a custom app, to production in minutes without rewriting code.\n",
"\n",
"Zep is an open source platform for productionizing LLM apps. Go from a prototype\n",
"built in LangChain or LlamaIndex, or a custom app, to production in minutes without\n",
"rewriting code.\n",
"## Key Features:\n",
"\n",
"## Fast, Scalable Building Blocks for LLM Apps\n",
"Zep is an open source platform for productionizing LLM apps. Go from a prototype\n",
"built in LangChain or LlamaIndex, or a custom app, to production in minutes without\n",
"rewriting code.\n",
"\n",
"Key Features:\n",
"\n",
"- **Fast!** Zep operates independently of the your chat loop, ensuring a snappy user experience.\n",
"- **Chat History Memory, Archival, and Enrichment**, populate your prompts with relevant chat history, sumamries, named entities, intent data, and more.\n",
"- **Fast!** `Zep` operates independently of your chat loop, ensuring a snappy user experience.\n",
"- **Chat History Memory, Archival, and Enrichment**, populate your prompts with relevant chat history, summaries, named entities, intent data, and more.\n",
"- **Vector Search over Chat History and Documents** Automatic embedding of documents, chat histories, and summaries. Use Zep's similarity or native MMR Re-ranked search to find the most relevant.\n",
"- **Manage Users and their Chat Sessions** Users and their Chat Sessions are first-class citizens in Zep, allowing you to manage user interactions with your bots or agents easily.\n",
"- **Records Retention and Privacy Compliance** Comply with corporate and regulatory mandates for records retention while ensuring compliance with privacy regulations such as CCPA and GDPR. Fulfill *Right To Be Forgotten* requests with a single API call\n",
@ -34,14 +28,15 @@
"and searching your user's chat history.\n",
"\n",
"## Installation\n",
"Follow the [Zep Quickstart Guide](https://docs.getzep.com/deployment/quickstart/) to install and get started with Zep.\n",
"\n",
"## Usage\n",
"Follow the [Zep Quickstart Guide](https://docs.getzep.com/deployment/quickstart/) to install and get started with Zep.\n",
"\n",
"You'll need your Zep API URL and optionally an API key to use the Zep VectorStore. \n",
"See the [Zep docs](https://docs.getzep.com) for more information.\n",
"\n",
"In the examples below, we're using Zep's auto-embedding feature which automatically embed documents on the Zep server \n",
"## Usage\n",
"\n",
"In the examples below, we're using Zep's auto-embedding feature which automatically embeds documents on the Zep server \n",
"using low-latency embedding models.\n",
"\n",
"## Note\n",
@ -55,7 +50,10 @@
"cell_type": "markdown",
"id": "9a3a11aab1412d98",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"## Load or create a Collection from documents"
@ -70,7 +68,10 @@
"end_time": "2023-08-13T01:07:50.672390Z",
"start_time": "2023-08-13T01:07:48.777799Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
@ -124,7 +125,10 @@
"end_time": "2023-08-13T01:07:53.807663Z",
"start_time": "2023-08-13T01:07:50.671241Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
@ -170,7 +174,10 @@
"cell_type": "markdown",
"id": "94ca9dfa7d0ecaa5",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"## Simarility Search Query over the Collection"
@ -185,7 +192,10 @@
"end_time": "2023-08-13T01:07:54.195988Z",
"start_time": "2023-08-13T01:07:53.808550Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
@ -237,7 +247,10 @@
"cell_type": "markdown",
"id": "e02b61a9af0b2c80",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"## Search over Collection Re-ranked by MMR\n",
@ -254,7 +267,10 @@
"end_time": "2023-08-13T01:07:54.394873Z",
"start_time": "2023-08-13T01:07:54.180901Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
@ -304,7 +320,10 @@
"cell_type": "markdown",
"id": "42455e31d4ab0d68",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"# Filter by Metadata\n",
@ -321,7 +340,10 @@
"end_time": "2023-08-13T01:08:06.323569Z",
"start_time": "2023-08-13T01:07:54.381822Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
@ -367,10 +389,13 @@
"cell_type": "markdown",
"id": "5b225f3ae1e61de8",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"### We see results from both books. Note the `source` metadata"
"We see results from both books. Note the `source` metadata"
]
},
{
@ -382,7 +407,10 @@
"end_time": "2023-08-13T01:08:06.504769Z",
"start_time": "2023-08-13T01:08:06.325435Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
@ -431,10 +459,13 @@
"cell_type": "markdown",
"id": "7b81d7cae351a1ec",
"metadata": {
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"source": [
"### Let's try again using a filter for only the Sherlock Holmes document."
"Now, we set up a filter"
]
},
{
@ -446,7 +477,10 @@
"end_time": "2023-08-13T01:08:06.672836Z",
"start_time": "2023-08-13T01:08:06.505944Z"
},
"collapsed": false
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
@ -515,7 +549,7 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
@ -529,7 +563,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.6"
"version": "3.10.12"
}
},
"nbformat": 4,