docs: integrations/memory consistency (#10255)

- updated titles and descriptions of the `integrations/memory` notebooks
to a consistent, concise format;
- removed
`docs/extras/integrations/memory/motorhead_memory_managed.ipynb` as
a duplicate of
`docs/extras/integrations/memory/motorhead_memory.ipynb`;
- added `integrations/providers` Integration Cards for `dynamodb` and
`motorhead`;
- updated `integrations/providers/redis.mdx` with links;
- renamed several notebooks; updated `vercel.json` to redirect to the new names.
Leonid Ganeline 2023-09-30 16:35:55 -07:00 committed by GitHub
parent 33eb5f8300
commit 240190db3f
19 changed files with 737 additions and 722 deletions

View File

@ -2628,6 +2628,18 @@
"source": "/docs/modules/memory/integrations/cassandra_chat_message_history",
"destination": "/docs/integrations/memory/cassandra_chat_message_history"
},
{
"source": "/docs/integrations/memory/motorhead_memory_managed",
"destination": "/docs/integrations/memory/motorhead_memory"
},
{
"source": "/docs/integrations/memory/dynamodb_chat_message_history",
"destination": "/docs/integrations/memory/aws_dynamodb"
},
{
"source": "/docs/integrations/memory/entity_memory_with_sqlite",
"destination": "/docs/integrations/memory/sqlite"
},
{
"source": "/en/latest/modules/memory/examples/dynamodb_chat_message_history.html",
"destination": "/docs/integrations/memory/dynamodb_chat_message_history"

View File

@ -0,0 +1,371 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "91c6a7ef",
"metadata": {},
"source": [
"# AWS DynamoDB\n",
"\n",
">[Amazon AWS DynamoDB](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/dynamodb/index.html) is a fully managed `NoSQL` database service that provides fast and predictable performance with seamless scalability.\n",
"\n",
"This notebook goes over how to use `DynamoDB` to store chat message history."
]
},
{
"cell_type": "markdown",
"id": "3f608be0",
"metadata": {},
"source": [
"First make sure you have correctly configured the [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html). Then make sure you have installed `boto3`."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3a7e89c2-4c55-4a66-91ec-9bf9a37467eb",
"metadata": {},
"outputs": [],
"source": [
"!pip install boto3"
]
},
{
"cell_type": "markdown",
"id": "030d784f",
"metadata": {},
"source": [
"Next, create the `DynamoDB` Table where we will be storing messages:"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "93ce1811",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0\n"
]
}
],
"source": [
"import boto3\n",
"\n",
"# Get the service resource.\n",
"dynamodb = boto3.resource(\"dynamodb\")\n",
"\n",
"# Create the DynamoDB table.\n",
"table = dynamodb.create_table(\n",
" TableName=\"SessionTable\",\n",
" KeySchema=[{\"AttributeName\": \"SessionId\", \"KeyType\": \"HASH\"}],\n",
" AttributeDefinitions=[{\"AttributeName\": \"SessionId\", \"AttributeType\": \"S\"}],\n",
" BillingMode=\"PAY_PER_REQUEST\",\n",
")\n",
"\n",
"# Wait until the table exists.\n",
"table.meta.client.get_waiter(\"table_exists\").wait(TableName=\"SessionTable\")\n",
"\n",
"# Print out some data about the table.\n",
"print(table.item_count)"
]
},
{
"cell_type": "markdown",
"id": "1a9b310b",
"metadata": {},
"source": [
"## DynamoDBChatMessageHistory"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "d15e3302",
"metadata": {},
"outputs": [],
"source": [
"from langchain.memory.chat_message_histories import DynamoDBChatMessageHistory\n",
"\n",
"history = DynamoDBChatMessageHistory(table_name=\"SessionTable\", session_id=\"0\")\n",
"\n",
"history.add_user_message(\"hi!\")\n",
"\n",
"history.add_ai_message(\"whats up?\")"
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "64fc465e",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"[HumanMessage(content='hi!', additional_kwargs={}, example=False),\n",
" AIMessage(content='whats up?', additional_kwargs={}, example=False),\n",
" HumanMessage(content='hi!', additional_kwargs={}, example=False),\n",
" AIMessage(content='whats up?', additional_kwargs={}, example=False)]"
]
},
"execution_count": 12,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"history.messages"
]
},
{
"cell_type": "markdown",
"id": "955f1b15",
"metadata": {},
"source": [
"## DynamoDBChatMessageHistory with Custom Endpoint URL\n",
"\n",
"Sometimes it is useful to specify the URL to the AWS endpoint to connect to. For instance, when you are running locally against [Localstack](https://localstack.cloud/). For those cases you can specify the URL via the `endpoint_url` parameter in the constructor."
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "225713c8",
"metadata": {},
"outputs": [],
"source": [
"from langchain.memory.chat_message_histories import DynamoDBChatMessageHistory\n",
"\n",
"history = DynamoDBChatMessageHistory(\n",
" table_name=\"SessionTable\",\n",
" session_id=\"0\",\n",
" endpoint_url=\"http://localhost.localstack.cloud:4566\",\n",
")"
]
},
{
"cell_type": "markdown",
"id": "97f8578a",
"metadata": {},
"source": [
"## DynamoDBChatMessageHistory With Different Keys Composite Keys\n",
"The default key for DynamoDBChatMessageHistory is ```{\"SessionId\": self.session_id}```, but you can modify this to match your table design.\n",
"\n",
"### Primary Key Name\n",
"You may modify the primary key by passing in a primary_key_name value in the constructor, resulting in the following:\n",
"```{self.primary_key_name: self.session_id}```\n",
"\n",
"### Composite Keys\n",
"When using an existing DynamoDB table, you may need to modify the key structure from the default of to something including a Sort Key. To do this you may use the ```key``` parameter.\n",
"\n",
"Passing a value for key will override the primary_key parameter, and the resulting key structure will be the passed value.\n"
]
},
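{
"cell_type": "code",
"execution_count": null,
"id": "primary-key-name-sketch",
"metadata": {},
"outputs": [],
"source": [
"# A minimal sketch (added for illustration, not part of the original notebook) of\n",
"# overriding the default primary key name. The \"UserSessionTable\" table and its\n",
"# \"UserId\" hash key are hypothetical; such a table must already exist with that key schema.\n",
"from langchain.memory.chat_message_histories import DynamoDBChatMessageHistory\n",
"\n",
"history_with_custom_key = DynamoDBChatMessageHistory(\n",
" table_name=\"UserSessionTable\",\n",
" session_id=\"0\",\n",
" primary_key_name=\"UserId\",\n",
")\n",
"\n",
"history_with_custom_key.add_user_message(\"hi!\")\n",
"history_with_custom_key.messages"
]
},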
{
"cell_type": "code",
"execution_count": 14,
"id": "088c037c",
"metadata": {
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0\n"
]
},
{
"data": {
"text/plain": [
"[HumanMessage(content='hello, composite dynamodb table!', additional_kwargs={}, example=False)]"
]
},
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from langchain.memory.chat_message_histories import DynamoDBChatMessageHistory\n",
"\n",
"composite_table = dynamodb.create_table(\n",
" TableName=\"CompositeTable\",\n",
" KeySchema=[{\"AttributeName\": \"PK\", \"KeyType\": \"HASH\"}, {\"AttributeName\": \"SK\", \"KeyType\": \"RANGE\"}],\n",
" AttributeDefinitions=[{\"AttributeName\": \"PK\", \"AttributeType\": \"S\"}, {\"AttributeName\": \"SK\", \"AttributeType\": \"S\"}],\n",
" BillingMode=\"PAY_PER_REQUEST\",\n",
")\n",
"\n",
"# Wait until the table exists.\n",
"composite_table.meta.client.get_waiter(\"table_exists\").wait(TableName=\"CompositeTable\")\n",
"\n",
"# Print out some data about the table.\n",
"print(composite_table.item_count)\n",
"\n",
"my_key = {\n",
" \"PK\": \"session_id::0\",\n",
" \"SK\": \"langchain_history\",\n",
"}\n",
"\n",
"composite_key_history = DynamoDBChatMessageHistory(\n",
" table_name=\"CompositeTable\",\n",
" session_id=\"0\",\n",
" endpoint_url=\"http://localhost.localstack.cloud:4566\",\n",
" key=my_key,\n",
")\n",
"\n",
"composite_key_history.add_user_message(\"hello, composite dynamodb table!\")\n",
"\n",
"composite_key_history.messages"
]
},
{
"cell_type": "markdown",
"id": "3b33c988",
"metadata": {},
"source": [
"## Agent with DynamoDB Memory"
]
},
{
"cell_type": "code",
"execution_count": 15,
"id": "f92d9499",
"metadata": {},
"outputs": [],
"source": [
"from langchain.agents import Tool\n",
"from langchain.memory import ConversationBufferMemory\n",
"from langchain.chat_models import ChatOpenAI\n",
"from langchain.agents import initialize_agent\n",
"from langchain.agents import AgentType\n",
"from langchain.utilities import PythonREPL\n",
"from getpass import getpass\n",
"\n",
"message_history = DynamoDBChatMessageHistory(table_name=\"SessionTable\", session_id=\"1\")\n",
"memory = ConversationBufferMemory(\n",
" memory_key=\"chat_history\", chat_memory=message_history, return_messages=True\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 16,
"id": "1167eeba",
"metadata": {},
"outputs": [],
"source": [
"python_repl = PythonREPL()\n",
"\n",
"# You can create the tool to pass to an agent\n",
"tools = [\n",
" Tool(\n",
" name=\"python_repl\",\n",
" description=\"A Python shell. Use this to execute python commands. Input should be a valid python command. If you want to see the output of a value, you should print it out with `print(...)`.\",\n",
" func=python_repl.run,\n",
" )\n",
"]"
]
},
{
"cell_type": "code",
"execution_count": 17,
"id": "fce085c5",
"metadata": {},
"outputs": [
{
"ename": "ValidationError",
"evalue": "1 validation error for ChatOpenAI\n__root__\n Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter. (type=value_error)",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mValidationError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[17], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m llm \u001b[38;5;241m=\u001b[39m \u001b[43mChatOpenAI\u001b[49m\u001b[43m(\u001b[49m\u001b[43mtemperature\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;241;43m0\u001b[39;49m\u001b[43m)\u001b[49m\n\u001b[1;32m 2\u001b[0m agent_chain \u001b[38;5;241m=\u001b[39m initialize_agent(\n\u001b[1;32m 3\u001b[0m tools,\n\u001b[1;32m 4\u001b[0m llm,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 7\u001b[0m memory\u001b[38;5;241m=\u001b[39mmemory,\n\u001b[1;32m 8\u001b[0m )\n",
"File \u001b[0;32m~/Documents/projects/langchain/libs/langchain/langchain/load/serializable.py:74\u001b[0m, in \u001b[0;36mSerializable.__init__\u001b[0;34m(self, **kwargs)\u001b[0m\n\u001b[1;32m 73\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m__init__\u001b[39m(\u001b[38;5;28mself\u001b[39m, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs: Any) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[0;32m---> 74\u001b[0m \u001b[38;5;28;43msuper\u001b[39;49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[38;5;21;43m__init__\u001b[39;49m\u001b[43m(\u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 75\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_lc_kwargs \u001b[38;5;241m=\u001b[39m kwargs\n",
"File \u001b[0;32m~/Documents/projects/langchain/.venv/lib/python3.9/site-packages/pydantic/main.py:341\u001b[0m, in \u001b[0;36mpydantic.main.BaseModel.__init__\u001b[0;34m()\u001b[0m\n",
"\u001b[0;31mValidationError\u001b[0m: 1 validation error for ChatOpenAI\n__root__\n Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter. (type=value_error)"
]
}
],
"source": [
"llm = ChatOpenAI(temperature=0)\n",
"agent_chain = initialize_agent(\n",
" tools,\n",
" llm,\n",
" agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,\n",
" verbose=True,\n",
" memory=memory,\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "952a3103",
"metadata": {},
"outputs": [],
"source": [
"agent_chain.run(input=\"Hello!\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "54c4aaf4",
"metadata": {},
"outputs": [],
"source": [
"agent_chain.run(input=\"Who owns Twitter?\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "f9013118",
"metadata": {},
"outputs": [],
"source": [
"agent_chain.run(input=\"My name is Bob.\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "405e5315",
"metadata": {},
"outputs": [],
"source": [
"agent_chain.run(input=\"Who am I?\")\n"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 5
}

View File

@ -5,15 +5,24 @@
"id": "90cd3ded",
"metadata": {},
"source": [
"# Cassandra Chat Message History\n",
"# Cassandra \n",
"\n",
">[Apache Cassandra®](https://cassandra.apache.org) is a NoSQL, row-oriented, highly scalable and highly available database, well suited for storing large amounts of data.\n",
">[Apache Cassandra®](https://cassandra.apache.org) is a `NoSQL`, row-oriented, highly scalable and highly available database, well suited for storing large amounts of data.\n",
"\n",
"Cassandra is a good choice for storing chat message history because it is easy to scale and can handle a large number of writes.\n",
">`Cassandra` is a good choice for storing chat message history because it is easy to scale and can handle a large number of writes.\n",
"\n",
"This notebook goes over how to use Cassandra to store chat message history.\n",
"\n"
]
},
{
"cell_type": "markdown",
"id": "f507f58b-bf22-4a48-8daf-68d869bcd1ba",
"metadata": {},
"source": [
"## Setting up\n",
"\n",
"To run this notebook you need either a running Cassandra cluster or a DataStax Astra DB instance running in the cloud (you can get one for free at [datastax.com](https://astra.datastax.com)). Check [cassio.org](https://cassio.org/start_here/) for more information."
"To run this notebook you need either a running `Cassandra` cluster or a `DataStax Astra DB` instance running in the cloud (you can get one for free at [datastax.com](https://astra.datastax.com)). Check [cassio.org](https://cassio.org/start_here/) for more information."
]
},
{
@ -31,7 +40,7 @@
"id": "e3d97b65",
"metadata": {},
"source": [
"### Please provide database connection parameters and secrets:"
"### Set up the database connection parameters and secrets"
]
},
{
@ -63,7 +72,7 @@
"id": "55860b2d",
"metadata": {},
"source": [
"#### depending on whether local or cloud-based Astra DB, create the corresponding database connection \"Session\" object"
"Depending on whether local or cloud-based Astra DB, create the corresponding database connection \"Session\" object."
]
},
{
@ -105,7 +114,7 @@
"id": "36c163e8",
"metadata": {},
"source": [
"### Creation and usage of the Chat Message History"
"## Example"
]
},
{

View File

@ -1,352 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "91c6a7ef",
"metadata": {},
"source": [
"# Dynamodb Chat Message History\n",
"\n",
"This notebook goes over how to use Dynamodb to store chat message history."
]
},
{
"cell_type": "markdown",
"id": "3f608be0",
"metadata": {},
"source": [
"First make sure you have correctly configured the [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html). Then make sure you have installed boto3."
]
},
{
"cell_type": "markdown",
"id": "030d784f",
"metadata": {},
"source": [
"Next, create the DynamoDB Table where we will be storing messages:"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "93ce1811",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0\n"
]
}
],
"source": [
"import boto3\n",
"\n",
"# Get the service resource.\n",
"dynamodb = boto3.resource(\"dynamodb\")\n",
"\n",
"# Create the DynamoDB table.\n",
"table = dynamodb.create_table(\n",
" TableName=\"SessionTable\",\n",
" KeySchema=[{\"AttributeName\": \"SessionId\", \"KeyType\": \"HASH\"}],\n",
" AttributeDefinitions=[{\"AttributeName\": \"SessionId\", \"AttributeType\": \"S\"}],\n",
" BillingMode=\"PAY_PER_REQUEST\",\n",
")\n",
"\n",
"# Wait until the table exists.\n",
"table.meta.client.get_waiter(\"table_exists\").wait(TableName=\"SessionTable\")\n",
"\n",
"# Print out some data about the table.\n",
"print(table.item_count)"
]
},
{
"cell_type": "markdown",
"id": "1a9b310b",
"metadata": {},
"source": [
"## DynamoDBChatMessageHistory"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "d15e3302",
"metadata": {},
"outputs": [],
"source": [
"from langchain.memory.chat_message_histories import DynamoDBChatMessageHistory\n",
"\n",
"history = DynamoDBChatMessageHistory(table_name=\"SessionTable\", session_id=\"0\")\n",
"\n",
"history.add_user_message(\"hi!\")\n",
"\n",
"history.add_ai_message(\"whats up?\")"
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "64fc465e",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": "[HumanMessage(content='hi!', additional_kwargs={}, example=False),\n AIMessage(content='whats up?', additional_kwargs={}, example=False),\n HumanMessage(content='hi!', additional_kwargs={}, example=False),\n AIMessage(content='whats up?', additional_kwargs={}, example=False)]"
},
"execution_count": 12,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"history.messages"
]
},
{
"cell_type": "markdown",
"id": "955f1b15",
"metadata": {},
"source": [
"## DynamoDBChatMessageHistory with Custom Endpoint URL\n",
"\n",
"Sometimes it is useful to specify the URL to the AWS endpoint to connect to. For instance, when you are running locally against [Localstack](https://localstack.cloud/). For those cases you can specify the URL via the `endpoint_url` parameter in the constructor."
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "225713c8",
"metadata": {},
"outputs": [],
"source": [
"from langchain.memory.chat_message_histories import DynamoDBChatMessageHistory\n",
"\n",
"history = DynamoDBChatMessageHistory(\n",
" table_name=\"SessionTable\",\n",
" session_id=\"0\",\n",
" endpoint_url=\"http://localhost.localstack.cloud:4566\",\n",
")"
]
},
{
"cell_type": "markdown",
"source": [
"## DynamoDBChatMessageHistory With Different Keys Composite Keys\n",
"The default key for DynamoDBChatMessageHistory is ```{\"SessionId\": self.session_id}```, but you can modify this to match your table design.\n",
"\n",
"### Primary Key Name\n",
"You may modify the primary key by passing in a primary_key_name value in the constructor, resulting in the following:\n",
"```{self.primary_key_name: self.session_id}```\n",
"\n",
"### Composite Keys\n",
"When using an existing DynamoDB table, you may need to modify the key structure from the default of to something including a Sort Key. To do this you may use the ```key``` parameter.\n",
"\n",
"Passing a value for key will override the primary_key parameter, and the resulting key structure will be the passed value.\n"
],
"metadata": {
"collapsed": false
},
"id": "c9bc0693"
},
{
"cell_type": "code",
"execution_count": 14,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0\n"
]
},
{
"data": {
"text/plain": "[HumanMessage(content='hello, composite dynamodb table!', additional_kwargs={}, example=False)]"
},
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from langchain.memory.chat_message_histories import DynamoDBChatMessageHistory\n",
"\n",
"composite_table = dynamodb.create_table(\n",
" TableName=\"CompositeTable\",\n",
" KeySchema=[{\"AttributeName\": \"PK\", \"KeyType\": \"HASH\"}, {\"AttributeName\": \"SK\", \"KeyType\": \"RANGE\"}],\n",
" AttributeDefinitions=[{\"AttributeName\": \"PK\", \"AttributeType\": \"S\"}, {\"AttributeName\": \"SK\", \"AttributeType\": \"S\"}],\n",
" BillingMode=\"PAY_PER_REQUEST\",\n",
")\n",
"\n",
"# Wait until the table exists.\n",
"composite_table.meta.client.get_waiter(\"table_exists\").wait(TableName=\"CompositeTable\")\n",
"\n",
"# Print out some data about the table.\n",
"print(composite_table.item_count)\n",
"\n",
"my_key = {\n",
" \"PK\": \"session_id::0\",\n",
" \"SK\": \"langchain_history\",\n",
"}\n",
"\n",
"composite_key_history = DynamoDBChatMessageHistory(\n",
" table_name=\"CompositeTable\",\n",
" session_id=\"0\",\n",
" endpoint_url=\"http://localhost.localstack.cloud:4566\",\n",
" key=my_key,\n",
")\n",
"\n",
"composite_key_history.add_user_message(\"hello, composite dynamodb table!\")\n",
"\n",
"composite_key_history.messages"
],
"metadata": {
"collapsed": false
},
"id": "a7fa0331"
},
{
"attachments": {},
"cell_type": "markdown",
"id": "3b33c988",
"metadata": {},
"source": [
"## Agent with DynamoDB Memory"
]
},
{
"cell_type": "code",
"execution_count": 15,
"id": "f92d9499",
"metadata": {},
"outputs": [],
"source": [
"from langchain.agents import Tool\n",
"from langchain.memory import ConversationBufferMemory\n",
"from langchain.chat_models import ChatOpenAI\n",
"from langchain.agents import initialize_agent\n",
"from langchain.agents import AgentType\n",
"from langchain.utilities import PythonREPL\n",
"from getpass import getpass\n",
"\n",
"message_history = DynamoDBChatMessageHistory(table_name=\"SessionTable\", session_id=\"1\")\n",
"memory = ConversationBufferMemory(\n",
" memory_key=\"chat_history\", chat_memory=message_history, return_messages=True\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 16,
"id": "1167eeba",
"metadata": {},
"outputs": [],
"source": [
"python_repl = PythonREPL()\n",
"\n",
"# You can create the tool to pass to an agent\n",
"tools = [\n",
" Tool(\n",
" name=\"python_repl\",\n",
" description=\"A Python shell. Use this to execute python commands. Input should be a valid python command. If you want to see the output of a value, you should print it out with `print(...)`.\",\n",
" func=python_repl.run,\n",
" )\n",
"]"
]
},
{
"cell_type": "code",
"execution_count": 17,
"id": "fce085c5",
"metadata": {},
"outputs": [
{
"ename": "ValidationError",
"evalue": "1 validation error for ChatOpenAI\n__root__\n Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter. (type=value_error)",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mValidationError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[17], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m llm \u001b[38;5;241m=\u001b[39m \u001b[43mChatOpenAI\u001b[49m\u001b[43m(\u001b[49m\u001b[43mtemperature\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;241;43m0\u001b[39;49m\u001b[43m)\u001b[49m\n\u001b[1;32m 2\u001b[0m agent_chain \u001b[38;5;241m=\u001b[39m initialize_agent(\n\u001b[1;32m 3\u001b[0m tools,\n\u001b[1;32m 4\u001b[0m llm,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 7\u001b[0m memory\u001b[38;5;241m=\u001b[39mmemory,\n\u001b[1;32m 8\u001b[0m )\n",
"File \u001b[0;32m~/Documents/projects/langchain/libs/langchain/langchain/load/serializable.py:74\u001b[0m, in \u001b[0;36mSerializable.__init__\u001b[0;34m(self, **kwargs)\u001b[0m\n\u001b[1;32m 73\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m__init__\u001b[39m(\u001b[38;5;28mself\u001b[39m, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs: Any) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[0;32m---> 74\u001b[0m \u001b[38;5;28;43msuper\u001b[39;49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[38;5;21;43m__init__\u001b[39;49m\u001b[43m(\u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 75\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_lc_kwargs \u001b[38;5;241m=\u001b[39m kwargs\n",
"File \u001b[0;32m~/Documents/projects/langchain/.venv/lib/python3.9/site-packages/pydantic/main.py:341\u001b[0m, in \u001b[0;36mpydantic.main.BaseModel.__init__\u001b[0;34m()\u001b[0m\n",
"\u001b[0;31mValidationError\u001b[0m: 1 validation error for ChatOpenAI\n__root__\n Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter. (type=value_error)"
]
}
],
"source": [
"llm = ChatOpenAI(temperature=0)\n",
"agent_chain = initialize_agent(\n",
" tools,\n",
" llm,\n",
" agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,\n",
" verbose=True,\n",
" memory=memory,\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "952a3103",
"metadata": {},
"outputs": [],
"source": [
"agent_chain.run(input=\"Hello!\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "54c4aaf4",
"metadata": {},
"outputs": [],
"source": [
"agent_chain.run(input=\"Who owns Twitter?\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "f9013118",
"metadata": {},
"outputs": [],
"source": [
"agent_chain.run(input=\"My name is Bob.\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "405e5315",
"metadata": {},
"outputs": [],
"source": [
"agent_chain.run(input=\"Who am I?\")\n"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
}
},
"nbformat": 4,
"nbformat_minor": 5
}

View File

@ -5,9 +5,13 @@
"id": "91c6a7ef",
"metadata": {},
"source": [
"# Momento Chat Message History\n",
"# Momento Cache\n",
"\n",
"This notebook goes over how to use [Momento Cache](https://gomomento.com) to store chat message history using the `MomentoChatMessageHistory` class. See the Momento [docs](https://docs.momentohq.com/getting-started) for more detail on how to get set up with Momento.\n",
">[Momento Cache](https://docs.momentohq.com/) is the world's first truly serverless caching service. It provides instant elasticity, scale-to-zero \n",
"> capability, and blazing-fast performance. \n",
"\n",
"\n",
"This notebook goes over how to use [Momento Cache](https://www.gomomento.com/services/cache) to store chat message history using the `MomentoChatMessageHistory` class. See the Momento [docs](https://docs.momentohq.com/getting-started) for more detail on how to get set up with Momento.\n",
"\n",
"Note that, by default we will create a cache if one with the given name doesn't already exist.\n",
"\n",
@ -78,7 +82,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
"version": "3.10.12"
}
},
"nbformat": 4,

View File

@ -1,18 +1,35 @@
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"id": "91c6a7ef",
"metadata": {},
"source": [
"# Mongodb Chat Message History\n",
"# MongodDB\n",
"\n",
"This notebook goes over how to use Mongodb to store chat message history.\n",
">`MongoDB` is a source-available cross-platform document-oriented database program. Classified as a NoSQL database program, `MongoDB` uses `JSON`-like documents with optional schemas.\n",
">\n",
">`MongoDB` is developed by MongoDB Inc. and licensed under the Server Side Public License (SSPL). - [Wikipedia](https://en.wikipedia.org/wiki/MongoDB)\n",
"\n",
"MongoDB is a source-available cross-platform document-oriented database program. Classified as a NoSQL database program, MongoDB uses JSON-like documents with optional schemas.\n",
"\n",
"MongoDB is developed by MongoDB Inc. and licensed under the Server Side Public License (SSPL). - [Wikipedia](https://en.wikipedia.org/wiki/MongoDB)"
"This notebook goes over how to use Mongodb to store chat message history.\n"
]
},
{
"cell_type": "markdown",
"id": "2d6ed3c8-b70a-498c-bc9e-41b91797d3b7",
"metadata": {},
"source": [
"## Setting up"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5a7f3b3f-d9b8-4577-a7ef-bdd8ecaedb70",
"metadata": {},
"outputs": [],
"source": [
"!pip install pymongo"
]
},
{
@ -26,6 +43,14 @@
"connection_string = \"mongodb://mongo_user:password123@mongo:27017\""
]
},
{
"cell_type": "markdown",
"id": "a8e63850-3e14-46fe-a59e-be6d6bf8fe61",
"metadata": {},
"source": [
"## Example"
]
},
{
"cell_type": "code",
"execution_count": 4,
@ -83,7 +108,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
"version": "3.10.12"
}
},
"nbformat": 4,

View File

@ -4,13 +4,29 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Motörhead Memory\n",
"[Motörhead](https://github.com/getmetal/motorhead) is a memory server implemented in Rust. It automatically handles incremental summarization in the background and allows for stateless applications.\n",
"# Motörhead\n",
"\n",
">[Motörhead](https://github.com/getmetal/motorhead) is a memory server implemented in Rust. It automatically handles incremental summarization in the background and allows for stateless applications.\n",
"\n",
"## Setup\n",
"\n",
"See instructions at [Motörhead](https://github.com/getmetal/motorhead) for running the server locally.\n",
"\n"
"See instructions at [Motörhead](https://github.com/getmetal/motorhead) for running the server locally."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain.memory.motorhead_memory import MotorheadMemory"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Example"
]
},
{
@ -19,7 +35,6 @@
"metadata": {},
"outputs": [],
"source": [
"from langchain.memory.motorhead_memory import MotorheadMemory\n",
"from langchain.llms import OpenAI\nfrom langchain.chains import LLMChain\nfrom langchain.prompts import PromptTemplate\n",
"\n",
"template = \"\"\"You are a chatbot having a conversation with a human.\n",

View File

@ -1,198 +0,0 @@
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"# Motörhead Memory (Managed)\n",
"[Motörhead](https://github.com/getmetal/motorhead) is a memory server implemented in Rust. It automatically handles incremental summarization in the background and allows for stateless applications.\n",
"\n",
"## Setup\n",
"\n",
"See instructions at [Motörhead](https://docs.getmetal.io/motorhead/introduction) for running the managed version of Motorhead. You can retrieve your `api_key` and `client_id` by creating an account on [Metal](https://getmetal.io).\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"from langchain.memory.motorhead_memory import MotorheadMemory\n",
"from langchain.llms import OpenAI\nfrom langchain.chains import LLMChain\nfrom langchain.prompts import PromptTemplate\n",
"\n",
"template = \"\"\"You are a chatbot having a conversation with a human.\n",
"\n",
"{chat_history}\n",
"Human: {human_input}\n",
"AI:\"\"\"\n",
"\n",
"prompt = PromptTemplate(\n",
" input_variables=[\"chat_history\", \"human_input\"], \n",
" template=template\n",
")\n",
"memory = MotorheadMemory(\n",
" api_key=\"YOUR_API_KEY\",\n",
" client_id=\"YOUR_CLIENT_ID\"\n",
" session_id=\"testing-1\",\n",
" memory_key=\"chat_history\"\n",
")\n",
"\n",
"await memory.init(); # loads previous state from Motörhead 🤘\n",
"\n",
"llm_chain = LLMChain(\n",
" llm=OpenAI(), \n",
" prompt=prompt, \n",
" verbose=True, \n",
" memory=memory,\n",
")\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"\u001b[1m> Entering new LLMChain chain...\u001b[0m\n",
"Prompt after formatting:\n",
"\u001b[32;1m\u001b[1;3mYou are a chatbot having a conversation with a human.\n",
"\n",
"\n",
"Human: hi im bob\n",
"AI:\u001b[0m\n",
"\n",
"\u001b[1m> Finished chain.\u001b[0m\n"
]
},
{
"data": {
"text/plain": [
"' Hi Bob, nice to meet you! How are you doing today?'"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"llm_chain.run(\"hi im bob\")"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"\u001b[1m> Entering new LLMChain chain...\u001b[0m\n",
"Prompt after formatting:\n",
"\u001b[32;1m\u001b[1;3mYou are a chatbot having a conversation with a human.\n",
"\n",
"Human: hi im bob\n",
"AI: Hi Bob, nice to meet you! How are you doing today?\n",
"Human: whats my name?\n",
"AI:\u001b[0m\n",
"\n",
"\u001b[1m> Finished chain.\u001b[0m\n"
]
},
{
"data": {
"text/plain": [
"' You said your name is Bob. Is that correct?'"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"llm_chain.run(\"whats my name?\")"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"\u001b[1m> Entering new LLMChain chain...\u001b[0m\n",
"Prompt after formatting:\n",
"\u001b[32;1m\u001b[1;3mYou are a chatbot having a conversation with a human.\n",
"\n",
"Human: hi im bob\n",
"AI: Hi Bob, nice to meet you! How are you doing today?\n",
"Human: whats my name?\n",
"AI: You said your name is Bob. Is that correct?\n",
"Human: whats for dinner?\n",
"AI:\u001b[0m\n",
"\n",
"\u001b[1m> Finished chain.\u001b[0m\n"
]
},
{
"data": {
"text/plain": [
"\" I'm sorry, I'm not sure what you're asking. Could you please rephrase your question?\""
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"llm_chain.run(\"whats for dinner?\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@ -1,12 +1,13 @@
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"id": "91c6a7ef",
"metadata": {},
"source": [
"# Postgres Chat Message History\n",
"# Postgres\n",
"\n",
">[PostgreSQL](https://en.wikipedia.org/wiki/PostgreSQL) also known as `Postgres`, is a free and open-source relational database management system (RDBMS) emphasizing extensibility and SQL compliance.\n",
"\n",
"This notebook goes over how to use Postgres to store chat message history."
]
@ -57,7 +58,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.2"
"version": "3.10.12"
}
},
"nbformat": 4,

View File

@ -5,9 +5,11 @@
"id": "91c6a7ef",
"metadata": {},
"source": [
"# Redis Chat Message History\n",
"# Redis\n",
"\n",
"This notebook goes over how to use Redis to store chat message history."
">[Redis (Remote Dictionary Server)](https://en.wikipedia.org/wiki/Redis) is an open-source in-memory storage, used as a distributed, in-memory keyvalue database, cache and message broker, with optional durability. Because it holds all data in memory and because of its design, `Redis` offers low-latency reads and writes, making it particularly suitable for use cases that require a cache. Redis is the most popular NoSQL database, and one of the most popular databases overall.\n",
"\n",
"This notebook goes over how to use `Redis` to store chat message history."
]
},
{
@ -73,7 +75,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
"version": "3.10.12"
}
},
"nbformat": 4,

View File

@ -1,17 +1,47 @@
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"# Rockset Chat Message History\n",
"# Rockset\n",
"\n",
"This notebook goes over how to use [Rockset](https://rockset.com/docs) to store chat message history. \n",
">[Rockset](https://rockset.com/product/) is a real-time analytics database service for serving low latency, high concurrency analytical queries at scale. It builds a Converged Index™ on structured and semi-structured data with an efficient store for vector embeddings. Its support for running SQL on schemaless data makes it a perfect choice for running vector search with metadata filters. \n",
"\n",
"\n",
"This notebook goes over how to use [Rockset](https://rockset.com/docs) to store chat message history. \n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Setting up"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!pip install rockset"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To begin, with get your API key from the [Rockset console](https://console.rockset.com/apikeys). Find your API region for the Rockset [API reference](https://rockset.com/docs/rest-api#introduction)."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Example"
]
},
{
"cell_type": "code",
"execution_count": null,
@ -40,7 +70,6 @@
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
@ -57,11 +86,24 @@
}
],
"metadata": {
"language_info": {
"name": "python"
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"orig_nbformat": 4
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 2
"nbformat_minor": 4
}

View File

@ -2,36 +2,58 @@
"cells": [
{
"cell_type": "markdown",
"id": "f22eab3f84cbeb37",
"metadata": {},
"source": [
"# SQL Chat Message History\n",
"# SQL (SQLAlchemy)\n",
"\n",
"This notebook goes over a **SQLChatMessageHistory** class that allows to store chat history in any database supported by SQLAlchemy.\n",
">[Structured Query Language (SQL)](https://en.wikipedia.org/wiki/SQL) is a domain-specific language used in programming and designed for managing data held in a relational database management system (RDBMS), or for stream processing in a relational data stream management system (RDSMS). It is particularly useful in handling structured data, i.e., data incorporating relations among entities and variables.\n",
"\n",
"Please note that to use it with databases other than SQLite, you will need to install the corresponding database driver."
],
"metadata": {
"collapsed": false
},
"id": "f22eab3f84cbeb37"
">[SQLAlchemy](https://github.com/sqlalchemy/sqlalchemy) is an open-source `SQL` toolkit and object-relational mapper (ORM) for the Python programming language released under the MIT License.\n",
"\n",
"This notebook goes over a `SQLChatMessageHistory` class that allows to store chat history in any database supported by `SQLAlchemy`.\n",
"\n",
"Please note that to use it with databases other than `SQLite`, you will need to install the corresponding database driver."
]
},
{
"cell_type": "markdown",
"id": "f8f2830ee9ca1e01",
"metadata": {},
"source": [
"### Basic Usage\n",
"## Basic Usage\n",
"\n",
"To use the storage you need to provide only 2 things:\n",
"\n",
"1. Session Id - a unique identifier of the session, like user name, email, chat id etc.\n",
"2. Connection string - a string that specifies the database connection. It will be passed to SQLAlchemy create_engine function."
],
"metadata": {
"collapsed": false
},
"id": "f8f2830ee9ca1e01"
"2. Connection string - a string that specifies the database connection. It will be passed to SQLAlchemy create_engine function.\n",
"3. Install `SQLAlchemy` python package."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ab016290-3823-4e1b-9610-ae9a1b71cb07",
"metadata": {},
"outputs": [],
"source": [
"!pip install SQLAlchemy"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "4576e914a866fb40",
"metadata": {
"ExecuteTime": {
"end_time": "2023-08-28T10:04:38.077748Z",
"start_time": "2023-08-28T10:04:36.105894Z"
},
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
"from langchain.memory.chat_message_histories import SQLChatMessageHistory\n",
@ -43,23 +65,29 @@
"\n",
"chat_message_history.add_user_message('Hello')\n",
"chat_message_history.add_ai_message('Hi')"
],
"metadata": {
"collapsed": false,
"ExecuteTime": {
"end_time": "2023-08-28T10:04:38.077748Z",
"start_time": "2023-08-28T10:04:36.105894Z"
}
},
"id": "4576e914a866fb40"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "b476688cbb32ba90",
"metadata": {
"ExecuteTime": {
"end_time": "2023-08-28T10:04:38.929396Z",
"start_time": "2023-08-28T10:04:38.915727Z"
},
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
"data": {
"text/plain": "[HumanMessage(content='Hello', additional_kwargs={}, example=False),\n AIMessage(content='Hi', additional_kwargs={}, example=False)]"
"text/plain": [
"[HumanMessage(content='Hello', additional_kwargs={}, example=False),\n",
" AIMessage(content='Hi', additional_kwargs={}, example=False)]"
]
},
"execution_count": 2,
"metadata": {},
@ -68,35 +96,36 @@
],
"source": [
"chat_message_history.messages"
],
"metadata": {
"collapsed": false,
"ExecuteTime": {
"end_time": "2023-08-28T10:04:38.929396Z",
"start_time": "2023-08-28T10:04:38.915727Z"
}
},
"id": "b476688cbb32ba90"
]
},
{
"cell_type": "markdown",
"id": "2e5337719d5614fd",
"metadata": {},
"source": [
"### Custom Storage Format\n",
"## Custom Storage Format\n",
"\n",
"By default, only the session id and message dictionary are stored in the table.\n",
"\n",
"However, sometimes you might want to store some additional information, like message date, author, language etc.\n",
"\n",
"To do that, you can create a custom message converter, by implementing **BaseMessageConverter** interface."
],
"metadata": {
"collapsed": false
},
"id": "2e5337719d5614fd"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "fdfde84c07d071bb",
"metadata": {
"ExecuteTime": {
"end_time": "2023-08-28T10:04:41.510498Z",
"start_time": "2023-08-28T10:04:41.494912Z"
},
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
"from datetime import datetime\n",
@ -165,23 +194,29 @@
"\n",
"chat_message_history.add_user_message('Hello')\n",
"chat_message_history.add_ai_message('Hi')"
],
"metadata": {
"collapsed": false,
"ExecuteTime": {
"end_time": "2023-08-28T10:04:41.510498Z",
"start_time": "2023-08-28T10:04:41.494912Z"
}
},
"id": "fdfde84c07d071bb"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "4a6a54d8a9e2856f",
"metadata": {
"ExecuteTime": {
"end_time": "2023-08-28T10:04:43.497990Z",
"start_time": "2023-08-28T10:04:43.492517Z"
},
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [
{
"data": {
"text/plain": "[HumanMessage(content='Hello', additional_kwargs={}, example=False),\n AIMessage(content='Hi', additional_kwargs={}, example=False)]"
"text/plain": [
"[HumanMessage(content='Hello', additional_kwargs={}, example=False),\n",
" AIMessage(content='Hi', additional_kwargs={}, example=False)]"
]
},
"execution_count": 4,
"metadata": {},
@ -190,44 +225,34 @@
],
"source": [
"chat_message_history.messages"
],
"metadata": {
"collapsed": false,
"ExecuteTime": {
"end_time": "2023-08-28T10:04:43.497990Z",
"start_time": "2023-08-28T10:04:43.492517Z"
}
},
"id": "4a6a54d8a9e2856f"
]
},
{
"cell_type": "markdown",
"id": "622aded629a1adeb",
"metadata": {},
"source": [
"You also might want to change the name of session_id column. In this case you'll need to specify `session_id_field_name` parameter."
],
"metadata": {
"collapsed": false
},
"id": "622aded629a1adeb"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 2
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.6"
"pygments_lexer": "ipython3",
"version": "3.10.12"
}
},
"nbformat": 4,

View File

@ -2,19 +2,32 @@
"cells": [
{
"cell_type": "markdown",
"id": "d464a12a",
"metadata": {
"id": "eg0Hwptz9g5q"
},
"source": [
"# Entity Memory with SQLite storage\n",
"# SQLite\n",
"\n",
"In this walkthrough we'll create a simple conversation chain which uses ConversationEntityMemory backed by a SqliteEntityStore."
],
"id": "d464a12a"
">[SQLite](https://en.wikipedia.org/wiki/SQLite) is a database engine written in the C programming language. It is not a standalone app; rather, it is a library that software developers embed in their apps. As such, it belongs to the family of embedded databases. It is the most widely deployed database engine, as it is used by several of the top web browsers, operating systems, mobile phones, and other embedded systems.\n",
"\n",
"In this walkthrough we'll create a simple conversation chain which uses `ConversationEntityMemory` backed by a `SqliteEntityStore`."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "d0a07a30-028f-4e16-8b11-45b2416f7b0f",
"metadata": {},
"outputs": [],
"source": [
"#!pip install sqlite3"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "db59b901",
"metadata": {
"id": "2wUMSUoF8ffn"
},
@ -25,12 +38,12 @@
"from langchain.memory import ConversationEntityMemory\n",
"from langchain.memory.entity import SQLiteEntityStore\n",
"from langchain.memory.prompt import ENTITY_MEMORY_CONVERSATION_TEMPLATE"
],
"id": "db59b901"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "ca6dee29",
"metadata": {
"id": "8TpJZti99gxV"
},
@ -45,22 +58,22 @@
" memory=memory,\n",
" verbose=True,\n",
")"
],
"id": "ca6dee29"
]
},
{
"cell_type": "markdown",
"id": "f9b4c3a0",
"metadata": {
"id": "HEAHG1L79ca1"
},
"source": [
"Notice the usage of `EntitySqliteStore` as parameter to `entity_store` on the `memory` property."
],
"id": "f9b4c3a0"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "297e78a6",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
@ -111,12 +124,12 @@
],
"source": [
"conversation.run(\"Deven & Sam are working on a hackathon project\")"
],
"id": "297e78a6"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "7e71f1dc",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
@ -139,12 +152,12 @@
],
"source": [
"conversation.memory.entity_store.get(\"Deven\")"
],
"id": "7e71f1dc"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "316f2e8d",
"metadata": {},
"outputs": [
{
@ -160,16 +173,15 @@
],
"source": [
"conversation.memory.entity_store.get(\"Sam\")"
],
"id": "316f2e8d"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b85f8427",
"metadata": {},
"outputs": [],
"source": [],
"id": "b85f8427"
"source": []
}
],
"metadata": {
@ -177,9 +189,9 @@
"provenance": []
},
"kernelspec": {
"display_name": "venv",
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "venv"
"name": "python3"
},
"language_info": {
"codemirror_mode": {
@ -191,9 +203,9 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 5
}

View File

@ -5,13 +5,17 @@
"id": "91c6a7ef",
"metadata": {},
"source": [
"# Streamlit Chat Message History\n",
"# Streamlit\n",
"\n",
"This notebook goes over how to store and use chat message history in a Streamlit app. StreamlitChatMessageHistory will store messages in\n",
">[Streamlit](https://docs.streamlit.io/) is an open-source Python library that makes it easy to create and share beautiful, \n",
"custom web apps for machine learning and data science.\n",
"\n",
"\n",
"This notebook goes over how to store and use chat message history in a `Streamlit` app. `StreamlitChatMessageHistory` will store messages in\n",
"[Streamlit session state](https://docs.streamlit.io/library/api-reference/session-state)\n",
"at the specified `key=`. The default key is `\"langchain_messages\"`.\n",
"\n",
"- Note, StreamlitChatMessageHistory only works when run in a Streamlit app.\n",
"- Note, `StreamlitChatMessageHistory` only works when run in a Streamlit app.\n",
"- You may also be interested in [StreamlitCallbackHandler](/docs/integrations/callbacks/streamlit) for LangChain.\n",
"- For more on Streamlit check out their\n",
"[getting started documentation](https://docs.streamlit.io/library/get-started).\n",
@ -50,7 +54,7 @@
"id": "b60dc735",
"metadata": {},
"source": [
"You can integrate StreamlitChatMessageHistory into ConversationBufferMemory and chains or agents as usual. The history will be persisted across re-runs of the Streamlit app within a given user session. A given StreamlitChatMessageHistory will NOT be persisted or shared across user sessions."
"You can integrate `StreamlitChatMessageHistory` into `ConversationBufferMemory` and chains or agents as usual. The history will be persisted across re-runs of the Streamlit app within a given user session. A given `StreamlitChatMessageHistory` will NOT be persisted or shared across user sessions."
]
},
{
@ -132,9 +136,9 @@
],
"metadata": {
"kernelspec": {
"display_name": "poetry-venv",
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "poetry-venv"
"name": "python3"
},
"language_info": {
"codemirror_mode": {
@ -146,7 +150,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
"version": "3.10.12"
}
},
"nbformat": 4,

View File

@ -4,9 +4,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Xata chat memory\n",
"# Xata\n",
"\n",
"[Xata](https://xata.io) is a serverless data platform, based on PostgreSQL and Elasticsearch. It provides a Python SDK for interacting with your database, and a UI for managing your data. With the `XataChatMessageHistory` class, you can use Xata databases for longer-term persistence of chat sessions.\n",
">[Xata](https://xata.io) is a serverless data platform, based on `PostgreSQL` and `Elasticsearch`. It provides a Python SDK for interacting with your database, and a UI for managing your data. With the `XataChatMessageHistory` class, you can use Xata databases for longer-term persistence of chat sessions.\n",
"\n",
"This notebook covers:\n",
"\n",
@ -318,7 +318,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.9"
"version": "3.10.12"
}
},
"nbformat": 4,

View File

@ -1,26 +1,14 @@
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"# Zep Memory\n",
"# Zep\n",
"\n",
"## REACT Agent Chat Message History with Zep - A long-term memory store for LLM applications.\n",
"\n",
"This notebook demonstrates how to use the [Zep Long-term Memory Store](https://docs.getzep.com/) as memory for your chatbot.\n",
"\n",
"We'll demonstrate:\n",
"\n",
"1. Adding conversation history to the Zep memory store.\n",
"2. Running an agent and having message automatically added to the store.\n",
"3. Viewing the enriched messages.\n",
"4. Vector search over the conversation history.\n",
"\n",
"### More on Zep:\n",
"\n",
"Zep stores, summarizes, embeds, indexes, and enriches conversational AI chat histories, and exposes them via simple, low-latency APIs.\n",
">[Zep](https://docs.getzep.com/) is a long-term memory store for LLM applications.\n",
">\n",
">`Zep` stores, summarizes, embeds, indexes, and enriches conversational AI chat histories, and exposes them via simple, low-latency APIs.\n",
"\n",
"Key Features:\n",
"\n",
@ -32,8 +20,21 @@
"- **Auto-token counting** of memories and summaries, allowing finer-grained control over prompt assembly.\n",
"- Python and JavaScript SDKs.\n",
"\n",
"Zep project: [https://github.com/getzep/zep](https://github.com/getzep/zep)\n",
"Docs: [https://docs.getzep.com/](https://docs.getzep.com/)\n"
"`Zep` project: [https://github.com/getzep/zep](https://github.com/getzep/zep)\n",
"Docs: [https://docs.getzep.com/](https://docs.getzep.com/)\n",
"\n",
"\n",
"## Example\n",
"\n",
"This notebook demonstrates how to use the [Zep Long-term Memory Store](https://docs.getzep.com/) as memory for your chatbot.\n",
"REACT Agent Chat Message History with Zep - A long-term memory store for LLM applications.\n",
"\n",
"We'll demonstrate:\n",
"\n",
"1. Adding conversation history to the Zep memory store.\n",
"2. Running an agent and having message automatically added to the store.\n",
"3. Viewing the enriched messages.\n",
"4. Vector search over the conversation history."
]
},
{
@ -96,7 +97,6 @@
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
@ -143,7 +143,6 @@
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
@ -232,7 +231,6 @@
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
@ -257,16 +255,18 @@
"text": [
"\n",
"\n",
"\u001B[1m> Entering new chain...\u001B[0m\n",
"\u001B[32;1m\u001B[1;3mThought: Do I need to use a tool? No\n",
"AI: Parable of the Sower is a prescient novel that speaks to the challenges facing contemporary society, such as climate change, inequality, and violence. It is a cautionary tale that warns of the dangers of unchecked greed and the need for individuals to take responsibility for their own lives and the lives of those around them.\u001B[0m\n",
"\u001b[1m> Entering new chain...\u001b[0m\n",
"\u001b[32;1m\u001b[1;3mThought: Do I need to use a tool? No\n",
"AI: Parable of the Sower is a prescient novel that speaks to the challenges facing contemporary society, such as climate change, inequality, and violence. It is a cautionary tale that warns of the dangers of unchecked greed and the need for individuals to take responsibility for their own lives and the lives of those around them.\u001b[0m\n",
"\n",
"\u001B[1m> Finished chain.\u001B[0m\n"
"\u001b[1m> Finished chain.\u001b[0m\n"
]
},
{
"data": {
"text/plain": "'Parable of the Sower is a prescient novel that speaks to the challenges facing contemporary society, such as climate change, inequality, and violence. It is a cautionary tale that warns of the dangers of unchecked greed and the need for individuals to take responsibility for their own lives and the lives of those around them.'"
"text/plain": [
"'Parable of the Sower is a prescient novel that speaks to the challenges facing contemporary society, such as climate change, inequality, and violence. It is a cautionary tale that warns of the dangers of unchecked greed and the need for individuals to take responsibility for their own lives and the lives of those around them.'"
]
},
"execution_count": 6,
"metadata": {},
@ -280,7 +280,6 @@
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
@ -341,7 +340,6 @@
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
@ -391,11 +389,14 @@
{
"cell_type": "code",
"execution_count": null,
"outputs": [],
"source": [],
"metadata": {
"collapsed": false
}
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": []
}
],
"metadata": {
@ -414,7 +415,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.4"
"version": "3.10.12"
}
},
"nbformat": 4,

View File

@ -0,0 +1,23 @@
# AWS DynamoDB
>[AWS DynamoDB](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/dynamodb/index.html)
> is a fully managed `NoSQL` database service that provides fast and predictable performance with seamless scalability.
## Installation and Setup
We have to configure the [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html).
We need to install the `boto3` library.
```bash
pip install boto3
```
## Memory
See a [usage example](/docs/integrations/memory/aws_dynamodb).
```python
from langchain.memory import DynamoDBChatMessageHistory
```
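
A minimal usage sketch, mirroring the notebook linked above (it assumes a `SessionTable` table with a `SessionId` hash key already exists):

```python
from langchain.memory import DynamoDBChatMessageHistory

# Store and read back chat messages keyed by session id.
history = DynamoDBChatMessageHistory(table_name="SessionTable", session_id="0")
history.add_user_message("hi!")
history.add_ai_message("whats up?")
history.messages
```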

View File

@ -0,0 +1,16 @@
# Motörhead
>[Motörhead](https://github.com/getmetal/motorhead) is a memory server implemented in Rust. It automatically handles incremental summarization in the background and allows for stateless applications.
## Installation and Setup
See instructions at [Motörhead](https://github.com/getmetal/motorhead) for running the server locally.
## Memory
See a [usage example](/docs/integrations/memory/motorhead_memory).
```python
from langchain.memory import MotorheadMemory
```
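
A minimal sketch of wiring the memory into a chain, assuming a locally running Motörhead server; the `url` value below is an assumption (the managed service uses `api_key` and `client_id` instead):

```python
from langchain.memory import MotorheadMemory

# Assumes a local Motörhead server; adjust the URL to your deployment.
memory = MotorheadMemory(
    session_id="testing-1",
    url="http://localhost:8080",
    memory_key="chat_history",
)
await memory.init()  # loads previous state from Motörhead (run inside an async context, e.g. a notebook)
```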

View File

@ -1,7 +1,10 @@
# Redis
>[Redis](https://redis.com) is an open-source key-value store that can be used as a cache,
> message broker, database, vector database and more.
>[Redis (Remote Dictionary Server)](https://en.wikipedia.org/wiki/Redis) is an open-source in-memory store,
> used as a distributed, in-memory key-value database, cache and message broker, with optional durability.
> Because it holds all data in memory and because of its design, `Redis` offers low-latency reads and writes,
> making it particularly suitable for use cases that require a cache. Redis is the most popular NoSQL database,
> and one of the most popular databases overall.
This page covers how to use the [Redis](https://redis.com) ecosystem within LangChain.
It is broken into two parts: installation and setup, and then references to specific Redis wrappers.
@ -111,7 +114,7 @@ Redis can be used to persist LLM conversations.
#### Vector Store Retriever Memory
For a more detailed walkthrough of the `VectorStoreRetrieverMemory` wrapper, see [this notebook](/docs/modules/memory/integrations/vectorstore_retriever_memory.html).
For a more detailed walkthrough of the `VectorStoreRetrieverMemory` wrapper, see [this notebook](/docs/modules/memory/types/vectorstore_retriever_memory.html).
#### Chat Message History Memory
For a detailed example of using Redis to cache conversation message history, see [this notebook](/docs/integrations/memory/redis_chat_message_history.html).
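
A minimal sketch, assuming a Redis server reachable at the default local URL (the `url` value is an assumption):

```python
from langchain.memory import RedisChatMessageHistory

# Messages are persisted in Redis under the given session id.
history = RedisChatMessageHistory(session_id="my-session", url="redis://localhost:6379/0")
history.add_user_message("hi!")
history.add_ai_message("whats up?")
```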