Mirror of https://github.com/hwchase17/langchain.git (synced 2025-06-25 16:13:25 +00:00)
docs: Message history for Neptune chains (#29260)
Expanded the Amazon Neptune documentation with new sections detailing usage of chat message history with the `create_neptune_opencypher_qa_chain` and `create_neptune_sparql_qa_chain` functions.
This commit is contained in:
parent
d5360b9bd6
commit
36ff83a0b5
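
For orientation, here is a minimal end-to-end sketch of the openCypher flow this commit documents. It is a sketch, not the committed notebook code: the NeptuneGraph import path and constructor arguments, the Neptune endpoint, and the Bedrock model id are placeholders/assumptions not taken from this diff.

import uuid

from langchain_aws import ChatBedrockConverse
from langchain_aws.chains import create_neptune_opencypher_qa_chain
from langchain_aws.graphs import NeptuneGraph  # assumed import path; graph setup is not part of this diff
from langchain_core.chat_history import InMemoryChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory

# Placeholders: substitute your Neptune endpoint and Bedrock model id.
graph = NeptuneGraph(host="<neptune-endpoint>", port=8182)
llm = ChatBedrockConverse(model="<bedrock-model-id>", temperature=0)

# Base QA chain over the openCypher graph.
chain = create_neptune_opencypher_qa_chain(llm=llm, graph=graph)

# One InMemoryChatMessageHistory per conversation thread, keyed by session id.
chats_by_session_id = {}


def get_chat_history(session_id: str) -> InMemoryChatMessageHistory:
    if session_id not in chats_by_session_id:
        chats_by_session_id[session_id] = InMemoryChatMessageHistory()
    return chats_by_session_id[session_id]


# "query" is the input key expected by the base chain.
runnable_with_history = RunnableWithMessageHistory(
    chain,
    get_chat_history,
    input_messages_key="query",
)

session_id = uuid.uuid4()
result = runnable_with_history.invoke(
    {"query": "How many outgoing routes does the Austin airport have?"},
    config={"configurable": {"session_id": session_id}},
)
print(result["result"].content)

The first file in the diff below adds essentially this flow, with live outputs, to the openCypher notebook; the second file adds the analogous section to the SPARQL notebook.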
@@ -70,9 +70,17 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 12,
"metadata": {},
"outputs": [],
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Austin airport has 98 outgoing routes.\n"
]
}
],
"source": [
"from langchain_aws import ChatBedrockConverse\n",
"from langchain_aws.chains import create_neptune_opencypher_qa_chain\n",
@@ -83,13 +91,161 @@
" temperature=0,\n",
")\n",
"\n",
"chain = create_neptune_opencypher_qa_chain(\n",
" llm=llm,\n",
" graph=graph,\n",
")\n",
"chain = create_neptune_opencypher_qa_chain(llm=llm, graph=graph)\n",
"\n",
"result = chain.invoke(\n",
" {\"query\": \"How many outgoing routes does the Austin airport have?\"}\n",
"result = chain.invoke(\"How many outgoing routes does the Austin airport have?\")\n",
"print(result[\"result\"].content)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Adding Message History\n",
"\n",
"The Neptune openCypher QA chain has the ability to be wrapped by [`RunnableWithMessageHistory`](https://python.langchain.com/v0.2/api_reference/core/runnables/langchain_core.runnables.history.RunnableWithMessageHistory.html#langchain_core.runnables.history.RunnableWithMessageHistory). This adds message history to the chain, allowing us to create a chatbot that retains conversation state across multiple invocations.\n",
|
||||
"\n",
|
||||
"To start, we need a way to store and load the message history. For this purpose, each thread will be created as an instance of [`InMemoryChatMessageHistory`](https://python.langchain.com/api_reference/core/chat_history/langchain_core.chat_history.InMemoryChatMessageHistory.html), and stored into a dictionary for repeated access.\n",
|
||||
"\n",
|
||||
"(Also see: https://python.langchain.com/docs/versions/migrating_memory/chat_history/#chatmessagehistory)"
|
||||
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain_core.chat_history import InMemoryChatMessageHistory\n",
"\n",
"chats_by_session_id = {}\n",
"\n",
"\n",
"def get_chat_history(session_id: str) -> InMemoryChatMessageHistory:\n",
" chat_history = chats_by_session_id.get(session_id)\n",
" if chat_history is None:\n",
" chat_history = InMemoryChatMessageHistory()\n",
" chats_by_session_id[session_id] = chat_history\n",
" return chat_history"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, the QA chain and message history storage can be used to create the new `RunnableWithMessageHistory`. Note that we must set `query` as the input key to match the format expected by the base chain."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain_core.runnables.history import RunnableWithMessageHistory\n",
"\n",
"runnable_with_history = RunnableWithMessageHistory(\n",
" chain,\n",
" get_chat_history,\n",
" input_messages_key=\"query\",\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before invoking the chain, a unique `session_id` needs to be generated for the conversation that the new `InMemoryChatMessageHistory` will remember."
|
||||
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import uuid\n",
"\n",
"session_id = uuid.uuid4()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Finally, invoke the message history enabled chain with the `session_id`."
|
||||
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"You can fly directly to 98 destinations from Austin airport.\n"
]
}
],
"source": [
"result = runnable_with_history.invoke(\n",
" {\"query\": \"How many destinations can I fly to directly from Austin airport?\"},\n",
" config={\"configurable\": {\"session_id\": session_id}},\n",
")\n",
"print(result[\"result\"].content)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"As the chain continues to be invoked with the same `session_id`, responses will be returned in the context of previous queries in the conversation.\n"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"You can fly directly to 4 destinations in Europe from Austin airport.\n"
]
}
],
"source": [
"result = runnable_with_history.invoke(\n",
" {\"query\": \"Out of those destinations, how many are in Europe?\"},\n",
" config={\"configurable\": {\"session_id\": session_id}},\n",
")\n",
"print(result[\"result\"].content)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"The four European destinations you can fly to directly from Austin airport are:\n",
"- AMS (Amsterdam Airport Schiphol)\n",
"- FRA (Frankfurt am Main)\n",
"- LGW (London Gatwick)\n",
"- LHR (London Heathrow)\n"
]
}
],
"source": [
"result = runnable_with_history.invoke(\n",
" {\"query\": \"Give me the codes and names of those airports.\"},\n",
" config={\"configurable\": {\"session_id\": session_id}},\n",
")\n",
"print(result[\"result\"].content)"
]
},
@@ -97,7 +253,7 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
@@ -111,7 +267,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
"version": "3.10.13"
}
},
"nbformat": 4,
@@ -48,7 +48,7 @@
"\n",
"Seed the W3C organizational data, W3C org ontology plus some instances. \n",
" \n",
"You will need an S3 bucket in the same region and account. Set `STAGE_BUCKET`as the name of that bucket."
"You will need an S3 bucket in the same region and account as the Neptune cluster. Set `STAGE_BUCKET`as the name of that bucket."
|
||||
]
},
{
@@ -60,11 +60,6 @@
"STAGE_BUCKET = \"<bucket-name>\""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": ""
},
{
"cell_type": "code",
"execution_count": null,
@@ -89,7 +84,50 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Bulk-load the org ttl - both ontology and instances"
"We will use the `%load` magic command from the `graph-notebook` package to insert the W3C data into the Neptune graph. Before running `%load`, use `%%graph_notebook_config` to set the graph connection parameters."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!pip install --upgrade --quiet graph-notebook"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%load_ext graph_notebook.magics"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%graph_notebook_config\n",
"{\n",
" \"host\": \"<neptune-endpoint>\",\n",
" \"neptune_service\": \"neptune-db\",\n",
" \"port\": 8182,\n",
" \"auth_mode\": \"<[DEFAULT|IAM]>\",\n",
" \"load_from_s3_arn\": \"<neptune-cluster-load-role-arn>\",\n",
" \"ssl\": true,\n",
" \"aws_region\": \"<region>\"\n",
"}"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Bulk-load the org ttl - both ontology and instances."
]
},
{
@@ -246,7 +284,9 @@
{
"cell_type": "markdown",
"metadata": {},
"source": "### Create the Neptune Database RDF Graph"
"source": [
"### Create the Neptune Database RDF Graph"
]
},
{
"cell_type": "code",
@@ -297,7 +337,7 @@
" examples=EXAMPLES,\n",
")\n",
"\n",
"result = chain.invoke({\"query\": \"How many organizations are in the graph?\"})\n",
"result = chain.invoke(\"How many organizations are in the graph?\")\n",
"print(result[\"result\"].content)"
]
},
@@ -305,7 +345,6 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Extra questions\n",
"Here are a few more prompts to try on the graph data that was ingested.\n"
]
},
@@ -315,7 +354,8 @@
"metadata": {},
"outputs": [],
"source": [
"chain.invoke({\"query\": \"Are there any mergers or acquisitions?\"})"
"result = chain.invoke(\"Are there any mergers or acquisitions?\")\n",
"print(result[\"result\"].content)"
]
},
{
@@ -324,7 +364,8 @@
"metadata": {},
"outputs": [],
"source": [
"chain.invoke({\"query\": \"Find organizations.\"})"
"result = chain.invoke(\"Find organizations.\")\n",
"print(result[\"result\"].content)"
]
},
{
@@ -333,7 +374,8 @@
"metadata": {},
"outputs": [],
"source": [
"chain.invoke({\"query\": \"Find sites of MegaSystems or MegaFinancial.\"})"
"result = chain.invoke(\"Find sites of MegaSystems or MegaFinancial.\")\n",
"print(result[\"result\"].content)"
]
},
{
@@ -342,7 +384,8 @@
"metadata": {},
"outputs": [],
"source": [
"chain.invoke({\"query\": \"Find a member who is a manager of one or more members.\"})"
"result = chain.invoke(\"Find a member who is a manager of one or more members.\")\n",
"print(result[\"result\"].content)"
]
},
{
@@ -351,7 +394,8 @@
"metadata": {},
"outputs": [],
"source": [
"chain.invoke({\"query\": \"Find five members and their managers.\"})"
"result = chain.invoke(\"Find five members and their managers.\")\n",
"print(result[\"result\"].content)"
]
},
{
@@ -360,17 +404,128 @@
"metadata": {},
"outputs": [],
"source": [
"chain.invoke(\n",
" {\n",
" \"query\": \"Find org units or suborganizations of The Mega Group. What are the sites of those units?\"\n",
" }\n",
"result = chain.invoke(\n",
" \"Find org units or suborganizations of The Mega Group. What are the sites of those units?\"\n",
")\n",
"print(result[\"result\"].content)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Adding Message History\n",
"\n",
"The Neptune SPARQL QA chain has the ability to be wrapped by [`RunnableWithMessageHistory`](https://python.langchain.com/v0.2/api_reference/core/runnables/langchain_core.runnables.history.RunnableWithMessageHistory.html#langchain_core.runnables.history.RunnableWithMessageHistory). This adds message history to the chain, allowing us to create a chatbot that retains conversation state across multiple invocations.\n",
|
||||
"\n",
|
||||
"To start, we need a way to store and load the message history. For this purpose, each thread will be created as an instance of [`InMemoryChatMessageHistory`](https://python.langchain.com/api_reference/core/chat_history/langchain_core.chat_history.InMemoryChatMessageHistory.html), and stored into a dictionary for repeated access.\n",
|
||||
"\n",
|
||||
"(Also see: https://python.langchain.com/docs/versions/migrating_memory/chat_history/#chatmessagehistory)"
|
||||
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain_core.chat_history import InMemoryChatMessageHistory\n",
"\n",
"chats_by_session_id = {}\n",
"\n",
"\n",
"def get_chat_history(session_id: str) -> InMemoryChatMessageHistory:\n",
" chat_history = chats_by_session_id.get(session_id)\n",
" if chat_history is None:\n",
" chat_history = InMemoryChatMessageHistory()\n",
" chats_by_session_id[session_id] = chat_history\n",
" return chat_history"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, the QA chain and message history storage can be used to create the new `RunnableWithMessageHistory`. Note that we must set `query` as the input key to match the format expected by the base chain."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain_core.runnables.history import RunnableWithMessageHistory\n",
"\n",
"runnable_with_history = RunnableWithMessageHistory(\n",
" chain,\n",
" get_chat_history,\n",
" input_messages_key=\"query\",\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before invoking the chain, a unique `session_id` needs to be generated for the conversation that the new `InMemoryChatMessageHistory` will remember.\n"
|
||||
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import uuid\n",
"\n",
"session_id = uuid.uuid4()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Finally, invoke the message history enabled chain with the `session_id`.\n"
|
||||
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"result = runnable_with_history.invoke(\n",
" {\"query\": \"How many org units or suborganizations does the The Mega Group have?\"},\n",
|
||||
" config={\"configurable\": {\"session_id\": session_id}},\n",
|
||||
")\n",
|
||||
"print(result[\"result\"].content)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"As the chain continues to be invoked with the same `session_id`, responses will be returned in the context of previous queries in the conversation.\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"result = runnable_with_history.invoke(\n",
|
||||
" {\"query\": \"List the sites for each of the units.\"},\n",
|
||||
" config={\"configurable\": {\"session_id\": session_id}},\n",
|
||||
")\n",
|
||||
"print(result[\"result\"].content)"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3 (ipykernel)",
|
||||
"display_name": "Python 3",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
@ -384,7 +539,7 @@
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.10.12"
|
||||
"version": "3.10.13"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,