Extend the StreamlitChatMessageHistory docs with a fuller example and… (#8774)

Add more details to the [notebook for
StreamlitChatMessageHistory](https://python.langchain.com/docs/integrations/memory/streamlit_chat_message_history),
including a link to a [running example
app](https://langchain-st-memory.streamlit.app/).

Original PR: https://github.com/langchain-ai/langchain/pull/8497
Joshua Carroll 2023-08-04 14:27:46 -07:00 committed by GitHub
parent 19dfe166c9
commit e5fed7d535
2 changed files with 99 additions and 3 deletions

@@ -71,3 +71,6 @@ or any other local ENV management tool.
Currently `StreamlitCallbackHandler` is geared towards use with a LangChain Agent Executor. Support for additional agent types,
use directly with Chains, etc will be added in the future.
You may also be interested in using
[StreamlitChatMessageHistory](/docs/integrations/memory/streamlit_chat_message_history) for LangChain.
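
For readers wanting to picture the agent-executor pairing mentioned above, here is a minimal sketch (not part of this commit; the model choice, the `ddg-search` tool, and the streaming flag are illustrative assumptions):

```python
import streamlit as st
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.callbacks import StreamlitCallbackHandler
from langchain.llms import OpenAI

llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["ddg-search"])
agent = initialize_agent(tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION)

if prompt := st.chat_input():
    st.chat_message("user").write(prompt)
    with st.chat_message("assistant"):
        # Stream the agent's intermediate thoughts and tool calls into this container.
        st_callback = StreamlitCallbackHandler(st.container())
        response = agent.run(prompt, callbacks=[st_callback])
        st.write(response)
```

Each new `st.container()` gives the callback its own area of the page to render the agent's intermediate steps into.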

@@ -7,8 +7,17 @@
"source": [
"# Streamlit Chat Message History\n",
"\n",
"This notebook goes over how to store and use chat message history in a Streamlit app. StreamlitChatMessageHistory will store messages in\n",
"[Streamlit session state](https://docs.streamlit.io/library/api-reference/session-state)\n",
"at the specified `key=`. The default key is `\"langchain_messages\"`.\n",
"\n",
"- Note, StreamlitChatMessageHistory only works when run in a Streamlit app.\n",
"- You may also be interested in [StreamlitCallbackHandler](/docs/integrations/callbacks/streamlit) for LangChain.\n",
"- For more on Streamlit check out their\n",
"[getting started documentation](https://docs.streamlit.io/library/get-started).\n",
"\n",
"You can see the [full app example running here](https://langchain-st-memory.streamlit.app/), and more examples in\n",
"[github.com/langchain-ai/streamlit-agent](https://github.com/langchain-ai/streamlit-agent)."
]
},
{
@@ -20,7 +29,7 @@
"source": [
"from langchain.memory import StreamlitChatMessageHistory\n",
"\n",
"history = StreamlitChatMessageHistory(key=\"chat_messages\")\n",
"\n",
"history.add_user_message(\"hi!\")\n",
"history.add_ai_message(\"whats up?\")"
@@ -35,6 +44,90 @@
"source": [
"history.messages"
]
},
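
As the intro cell notes, the messages live in Streamlit session state under the chosen key, `"langchain_messages"` by default. Below is a minimal sketch of peeking at that storage from inside a Streamlit app; it assumes the message list is stored directly under the key, which matches the current implementation but is worth verifying against your installed version:

```python
import streamlit as st
from langchain.memory import StreamlitChatMessageHistory

# No key= given, so the default "langchain_messages" session-state key is used.
default_history = StreamlitChatMessageHistory()
default_history.add_user_message("hi!")

# The history is ordinary session-state data, so it can be inspected directly.
st.write(st.session_state["langchain_messages"])
```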
{
"cell_type": "markdown",
"id": "b60dc735",
"metadata": {},
"source": [
"You can integrate StreamlitChatMessageHistory into ConversationBufferMemory and chains or agents as usual. The history will be persisted across re-runs of the Streamlit app within a given user session. A given StreamlitChatMessageHistory will NOT be persisted or shared across user sessions."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "42ab5bf3",
"metadata": {},
"outputs": [],
"source": [
"\n",
"from langchain.memory import ConversationBufferMemory\n",
"from langchain.memory.chat_message_histories import StreamlitChatMessageHistory\n",
"\n",
"# Optionally, specify your own session_state key for storing messages\n",
"msgs = StreamlitChatMessageHistory(key=\"special_app_key\")\n",
"\n",
"memory = ConversationBufferMemory(memory_key=\"history\", chat_memory=msgs)\n",
"if len(msgs.messages) == 0:\n",
" msgs.add_ai_message(\"How can I help you?\")\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a29252de",
"metadata": {},
"outputs": [],
"source": [
"from langchain.chains import LLMChain\n",
"from langchain.llms import OpenAI\n",
"from langchain.prompts import PromptTemplate\n",
"template = \"\"\"You are an AI chatbot having a conversation with a human.\n",
"\n",
"{history}\n",
"Human: {human_input}\n",
"AI: \"\"\"\n",
"prompt = PromptTemplate(input_variables=[\"history\", \"human_input\"], template=template)\n",
"\n",
"# Add the memory to an LLMChain as usual\n",
"llm_chain = LLMChain(llm=OpenAI(), prompt=prompt, memory=memory)"
]
},
{
"cell_type": "markdown",
"id": "7cd99b4b",
"metadata": {},
"source": [
"Conversational Streamlit apps will often re-draw each previous chat message on every re-run. This is easy to do by iterating through `StreamlitChatMessageHistory.messages`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3bdb637b",
"metadata": {},
"outputs": [],
"source": [
"import streamlit as st\n",
"\n",
"for msg in msgs.messages:\n",
" st.chat_message(msg.type).write(msg.content)\n",
"\n",
"if prompt := st.chat_input():\n",
" st.chat_message(\"human\").write(prompt)\n",
"\n",
" # As usual, new messages are added to StreamlitChatMessageHistory when the Chain is called.\n",
" response = llm_chain.run(prompt)\n",
" st.chat_message(\"ai\").write(response)"
]
},
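
To see the per-session persistence described earlier in isolation, here is a small self-contained sketch (the key name and the button are purely illustrative): each button click re-runs the script, yet the count keeps growing because the same session-state entry is read back, while a fresh browser session starts from zero.

```python
import streamlit as st
from langchain.memory.chat_message_histories import StreamlitChatMessageHistory

# Hypothetical key, separate from the app above, just for this demo.
demo_msgs = StreamlitChatMessageHistory(key="rerun_demo_messages")

if st.button("Add a message"):
    # This click triggers a re-run; the message added here survives later re-runs.
    demo_msgs.add_user_message("clicked")

st.write(f"Messages stored this session: {len(demo_msgs.messages)}")
```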
{
"cell_type": "markdown",
"id": "7adaf3d6",
"metadata": {},
"source": [
"**[View the final app](https://langchain-st-memory.streamlit.app/).**"
]
}
],
"metadata": {