docs: Fix StreamlitChatMessageHistory docs to latest API (#16072)

- **Description:** Update [this
page](https://python.langchain.com/docs/integrations/memory/streamlit_chat_message_history)
to use the latest API
  - **Issue:** https://github.com/langchain-ai/langchain/issues/13995
  - **Dependencies:** None
  - **Twitter handle:** @OhSynap
Joshua Carroll 2024-01-17 09:42:10 -08:00 committed by GitHub
parent 8597484195
commit bc0cb1148a
@@ -10,7 +10,6 @@
 ">[Streamlit](https://docs.streamlit.io/) is an open-source Python library that makes it easy to create and share beautiful, \n",
 "custom web apps for machine learning and data science.\n",
 "\n",
-"\n",
 "This notebook goes over how to store and use chat message history in a `Streamlit` app. `StreamlitChatMessageHistory` will store messages in\n",
 "[Streamlit session state](https://docs.streamlit.io/library/api-reference/session-state)\n",
 "at the specified `key=`. The default key is `\"langchain_messages\"`.\n",
@@ -20,6 +19,12 @@
 "- For more on Streamlit check out their\n",
 "[getting started documentation](https://docs.streamlit.io/library/get-started).\n",
 "\n",
+"The integration lives in the `langchain-community` package, so we need to install that. We also need to install `streamlit`.\n",
+"\n",
+"```\n",
+"pip install -U langchain-community streamlit\n",
+"```\n",
+"\n",
 "You can see the [full app example running here](https://langchain-st-memory.streamlit.app/), and more examples in\n",
 "[github.com/langchain-ai/streamlit-agent](https://github.com/langchain-ai/streamlit-agent)."
 ]
@@ -31,7 +36,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"from langchain.memory import StreamlitChatMessageHistory\n",
+"from langchain_community.chat_message_histories import StreamlitChatMessageHistory\n",
 "\n",
 "history = StreamlitChatMessageHistory(key=\"chat_messages\")\n",
 "\n",
@@ -54,7 +59,9 @@
 "id": "b60dc735",
 "metadata": {},
 "source": [
-"You can integrate `StreamlitChatMessageHistory` into `ConversationBufferMemory` and chains or agents as usual. The history will be persisted across re-runs of the Streamlit app within a given user session. A given `StreamlitChatMessageHistory` will NOT be persisted or shared across user sessions."
+"We can easily combine this message history class with [LCEL Runnables](https://python.langchain.com/docs/expression_language/how_to/message_history).\n",
+"\n",
+"The history will be persisted across re-runs of the Streamlit app within a given user session. A given `StreamlitChatMessageHistory` will NOT be persisted or shared across user sessions."
 ]
 },
 {
@@ -64,13 +71,11 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"from langchain.memory import ConversationBufferMemory\n",
 "from langchain_community.chat_message_histories import StreamlitChatMessageHistory\n",
 "\n",
 "# Optionally, specify your own session_state key for storing messages\n",
 "msgs = StreamlitChatMessageHistory(key=\"special_app_key\")\n",
 "\n",
-"memory = ConversationBufferMemory(memory_key=\"history\", chat_memory=msgs)\n",
 "if len(msgs.messages) == 0:\n",
 "    msgs.add_ai_message(\"How can I help you?\")"
 ]
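
For readers following the diff without a Streamlit environment handy, the storage pattern behind `StreamlitChatMessageHistory` can be sketched in plain Python. Everything below is illustrative: the `SketchChatMessageHistory` class and the module-level `session_state` dict are stand-ins for the real LangChain class and Streamlit's `st.session_state`, not actual library code.

```python
# Illustrative stand-in, NOT the real LangChain/Streamlit API.
# `session_state` plays the role of st.session_state: it survives
# script re-runs within one user session but is private to that session.

session_state = {}


class SketchChatMessageHistory:
    def __init__(self, key="langchain_messages"):
        # Messages live in session state under a configurable key,
        # so two histories with the same key share the same list.
        self.key = key
        session_state.setdefault(key, [])

    @property
    def messages(self):
        return session_state[self.key]

    def add_user_message(self, text):
        session_state[self.key].append(("human", text))

    def add_ai_message(self, text):
        session_state[self.key].append(("ai", text))


# First run: seed a greeting if the history is empty.
history = SketchChatMessageHistory(key="chat_messages")
if len(history.messages) == 0:
    history.add_ai_message("How can I help you?")

# A "re-run" constructs a fresh object but sees the same stored messages.
rerun = SketchChatMessageHistory(key="chat_messages")
print(rerun.messages)  # [('ai', 'How can I help you?')]
```

This is why the notebook can call `StreamlitChatMessageHistory(key=...)` at the top of every re-run without losing the conversation: the constructor only seeds an empty list when the key is absent.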
@@ -82,19 +87,34 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"from langchain.chains import LLMChain\n",
-"from langchain.prompts import PromptTemplate\n",
-"from langchain_openai import OpenAI\n",
+"from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder\n",
+"from langchain_core.runnables.history import RunnableWithMessageHistory\n",
+"from langchain_openai import ChatOpenAI\n",
 "\n",
-"template = \"\"\"You are an AI chatbot having a conversation with a human.\n",
+"prompt = ChatPromptTemplate.from_messages(\n",
+"    [\n",
+"        (\"system\", \"You are an AI chatbot having a conversation with a human.\"),\n",
+"        MessagesPlaceholder(variable_name=\"history\"),\n",
+"        (\"human\", \"{question}\"),\n",
+"    ]\n",
+")\n",
 "\n",
-"{history}\n",
-"Human: {human_input}\n",
-"AI: \"\"\"\n",
-"prompt = PromptTemplate(input_variables=[\"history\", \"human_input\"], template=template)\n",
-"\n",
-"# Add the memory to an LLMChain as usual\n",
-"llm_chain = LLMChain(llm=OpenAI(), prompt=prompt, memory=memory)"
+"chain = prompt | ChatOpenAI()"
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"id": "dac3d94f",
+"metadata": {},
+"outputs": [],
+"source": [
+"chain_with_history = RunnableWithMessageHistory(\n",
+"    chain,\n",
+"    lambda session_id: msgs,  # Always return the instance created earlier\n",
+"    input_messages_key=\"question\",\n",
+"    history_messages_key=\"history\",\n",
+")"
 ]
 },
 {
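
The `RunnableWithMessageHistory` wrapper introduced in this hunk looks up a history object per `session_id`, injects its stored messages under `history_messages_key`, and records the new turn after the chain runs. That control flow can be sketched in plain Python; the `with_message_history` function and the echo chain below are made up for illustration and are not the `langchain_core` implementation.

```python
# Illustrative sketch of the wrapping behavior, NOT langchain_core code.

def with_message_history(chain, get_history, input_key, history_key):
    """Wrap `chain` so each call injects and then updates a per-session history."""
    def invoke(inputs, config):
        session_id = config["configurable"]["session_id"]
        history = get_history(session_id)
        # Inject the stored messages under the history key, then run the chain.
        output = chain({**inputs, history_key: list(history)})
        # Record both sides of the turn for the next invocation.
        history.append(("human", inputs[input_key]))
        history.append(("ai", output))
        return output
    return invoke


# Per-session storage; plays the role of the `lambda session_id: msgs` factory.
store = {}

def get_history(session_id):
    return store.setdefault(session_id, [])

# A trivial stand-in chain that just echoes the question.
echo_chain = lambda inputs: f"Echo: {inputs['question']}"

chat = with_message_history(echo_chain, get_history, "question", "history")

reply = chat({"question": "hi"}, {"configurable": {"session_id": "any"}})
print(reply)         # Echo: hi
print(store["any"])  # [('human', 'hi'), ('ai', 'Echo: hi')]
```

This also shows why the notebook can pass a fixed `session_id` of `"any"`: the factory ignores it and always returns the single `msgs` history, which is already scoped to one user session by Streamlit's session state.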
@@ -121,8 +141,9 @@
 "    st.chat_message(\"human\").write(prompt)\n",
 "\n",
 "    # As usual, new messages are added to StreamlitChatMessageHistory when the Chain is called.\n",
-"    response = llm_chain.run(prompt)\n",
-"    st.chat_message(\"ai\").write(response)"
+"    config = {\"configurable\": {\"session_id\": \"any\"}}\n",
+"    response = chain_with_history.invoke({\"question\": prompt}, config)\n",
+"    st.chat_message(\"ai\").write(response.content)"
 ]
 },
 {