add documentation on how to load different chain types (#595)

This commit is contained in:
Harrison Chase
2023-01-12 06:47:38 -08:00
committed by GitHub
parent 956416c150
commit d574bf0a27
6 changed files with 383 additions and 194 deletions

View File

@@ -46,7 +46,7 @@
"metadata": {},
"outputs": [],
"source": [
"qa = VectorDBQA.from_llm(llm=OpenAI(), vectorstore=docsearch)"
"qa = VectorDBQA.from_chain_type(llm=OpenAI(), chain_type=\"stuff\", vectorstore=docsearch)"
]
},
{
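
For readers following the diff alone: the `docsearch` vector store passed to `from_chain_type` above is built earlier in the notebook, outside this hunk. Below is a minimal sketch of that setup; the file name, splitter, and FAISS store are illustrative assumptions, not necessarily what the notebook itself uses.

```python
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.text_splitter import CharacterTextSplitter
from langchain.vectorstores import FAISS

# Load and chunk the source document (file name is illustrative).
with open("state_of_the_union.txt") as f:
    state_of_the_union = f.read()
texts = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0).split_text(state_of_the_union)

# Embed the chunks and index them; this is the `docsearch` object
# the VectorDBQA cells in this notebook rely on.
docsearch = FAISS.from_texts(texts, OpenAIEmbeddings())
```
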
@@ -58,7 +58,7 @@
{
"data": {
"text/plain": [
"' The president said that Ketanji Brown Jackson is one of the nations top legal minds and that she will continue Justice Breyers legacy of excellence.'"
"\" The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, from a family of public school educators and police officers, a consensus builder, and has received a broad range of support from the Fraternal Order of Police to former judges appointed by Democrats and Republicans.\""
]
},
"execution_count": 4,
@@ -71,6 +71,91 @@
"qa.run(query)"
]
},
{
"cell_type": "markdown",
"id": "c28f1f64",
"metadata": {},
"source": [
"## Chain Type\n",
"You can easily specify different chain types to load and use in the VectorDBQA chain. For a more detailed walkthrough of these types, please see [this notebook](question_answering.ipynb).\n",
"\n",
"There are two ways to load different chain types. First, you can specify the chain type argument in the `from_chain_type` method. This allows you to pass in the name of the chain type you want to use. For example, in the below we change the chain type to `map_reduce`."
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "22d2417d",
"metadata": {},
"outputs": [],
"source": [
"qa = VectorDBQA.from_chain_type(llm=OpenAI(), chain_type=\"map_reduce\", vectorstore=docsearch)"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "43204ad1",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"\" The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, from a family of public school educators and police officers, a consensus builder, and has received a broad range of support from the Fraternal Order of Police to former judges appointed by Democrats and Republicans.\""
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"query = \"What did the president say about Ketanji Brown Jackson\"\n",
"qa.run(query)"
]
},
{
"cell_type": "markdown",
"id": "60368f38",
"metadata": {},
"source": [
"The above way allows you to really simply change the chain_type, but it does provide a ton of flexibility over parameters to that chain type. If you want to control those parameters, you can load the chain directly (as you did in [this notebook](question_answering.ipynb)) and then pass that directly to the the VectorDBQA chain with the `combine_documents_chain` parameter. For example:"
]
},
{
"cell_type": "code",
"execution_count": 18,
"id": "7b403f0d",
"metadata": {},
"outputs": [],
"source": [
"from langchain.chains.question_answering import load_qa_chain\n",
"qa_chain = load_qa_chain(OpenAI(temperature=0), chain_type=\"stuff\")\n",
"qa = VectorDBQA(combine_documents_chain=qa_chain, vectorstore=docsearch)"
]
},
{
"cell_type": "code",
"execution_count": 19,
"id": "9e04a9ac",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"\" The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, and from a family of public school educators and police officers. He also said that she is a consensus builder and has received a broad range of support from the Fraternal Order of Police to former judges appointed by Democrats and Republicans.\""
]
},
"execution_count": 19,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"query = \"What did the president say about Ketanji Brown Jackson\"\n",
"qa.run(query)"
]
},
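
The cell above only swaps in a default `stuff` chain, so it does not yet show the extra control the preceding paragraph mentions. One common way to exercise that control is to override the chain's prompt when loading it. The sketch below is hedged: the `prompt` keyword on the stuff-chain loader and the template text are assumptions about the library at the time, not something this commit adds.

```python
from langchain.llms import OpenAI
from langchain.chains import VectorDBQA
from langchain.chains.question_answering import load_qa_chain
from langchain.prompts import PromptTemplate

# A custom prompt for the "stuff" chain; it must expose the same input
# variables the default prompt uses ("context" and "question").
template = (
    "Use the context below to answer the question.\n\n"
    "{context}\n\n"
    "Question: {question}\n"
    "Answer in one sentence:"
)
prompt = PromptTemplate(template=template, input_variables=["context", "question"])

# Load the chain with the overridden prompt, then wire it into VectorDBQA.
qa_chain = load_qa_chain(OpenAI(temperature=0), chain_type="stuff", prompt=prompt)
qa = VectorDBQA(combine_documents_chain=qa_chain, vectorstore=docsearch)
```
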
{
"cell_type": "markdown",
"id": "0b8c37f7",
@@ -87,7 +172,7 @@
"metadata": {},
"outputs": [],
"source": [
"qa = VectorDBQA.from_llm(llm=OpenAI(), vectorstore=docsearch, return_source_documents=True)"
"qa = VectorDBQA.from_chain_type(llm=OpenAI(), chain_type=\"stuff\", return_source_documents=True)"
]
},
{
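
As a usage note for the `return_source_documents=True` variant just above: with two outputs, the chain is called with a dict rather than `.run()`. The keys below reflect how VectorDBQA chains of that era exposed source documents; treat this as a sketch rather than part of the commit.

```python
# With return_source_documents=True the chain returns both the answer and
# the retrieved documents, so call it with a dict instead of .run().
query = "What did the president say about Ketanji Brown Jackson"
result = qa({"query": query})

print(result["result"])            # the generated answer
print(result["source_documents"])  # the Document objects retrieved from docsearch
```
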

View File

@@ -26,7 +26,7 @@
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": 2,
"id": "17d1306e",
"metadata": {},
"outputs": [],
@@ -41,7 +41,7 @@
},
{
"cell_type": "code",
"execution_count": 4,
"execution_count": 3,
"id": "0e745d99",
"metadata": {},
"outputs": [],
@@ -51,7 +51,7 @@
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": 4,
"id": "f42d79dc",
"metadata": {},
"outputs": [],
@@ -63,7 +63,7 @@
},
{
"cell_type": "code",
"execution_count": 6,
"execution_count": 5,
"id": "8aa571ae",
"metadata": {},
"outputs": [],
@@ -73,26 +73,69 @@
},
{
"cell_type": "code",
"execution_count": 8,
"execution_count": 6,
"id": "aa859d4c",
"metadata": {},
"outputs": [],
"source": [
"from langchain import OpenAI\n",
"\n",
"chain = VectorDBQAWithSourcesChain.from_llm(OpenAI(temperature=0), vectorstore=docsearch)"
"chain = VectorDBQAWithSourcesChain.from_chain_type(OpenAI(temperature=0), chain_type=\"stuff\", vectorstore=docsearch)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"execution_count": 7,
"id": "8ba36fa7",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'answer': ' The president thanked Justice Breyer for his service.',\n",
"{'answer': ' The president thanked Justice Breyer for his service.\\n',\n",
" 'sources': '30-pl'}"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"chain({\"question\": \"What did the president say about Justice Breyer\"}, return_only_outputs=True)"
]
},
{
"cell_type": "markdown",
"id": "718ecbda",
"metadata": {},
"source": [
"## Chain Type\n",
"You can easily specify different chain types to load and use in the VectorDBQAWithSourcesChain chain. For a more detailed walkthrough of these types, please see [this notebook](qa_with_sources.ipynb).\n",
"\n",
"There are two ways to load different chain types. First, you can specify the chain type argument in the `from_chain_type` method. This allows you to pass in the name of the chain type you want to use. For example, in the below we change the chain type to `map_reduce`."
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "8b35b30a",
"metadata": {},
"outputs": [],
"source": [
"chain = VectorDBQAWithSourcesChain.from_chain_type(OpenAI(temperature=0), chain_type=\"map_reduce\", vectorstore=docsearch)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "58bd424f",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'answer': ' The president honored Justice Stephen Breyer for his service.\\n',\n",
" 'sources': '30-pl'}"
]
},
@@ -104,11 +147,53 @@
"source": [
"chain({\"question\": \"What did the president say about Justice Breyer\"}, return_only_outputs=True)"
]
},
{
"cell_type": "markdown",
"id": "21e14eed",
"metadata": {},
"source": [
"The above way allows you to really simply change the chain_type, but it does provide a ton of flexibility over parameters to that chain type. If you want to control those parameters, you can load the chain directly (as you did in [this notebook](qa_with_sources.ipynb)) and then pass that directly to the the VectorDBQA chain with the `combine_documents_chain` parameter. For example:"
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "af35f0c6",
"metadata": {},
"outputs": [],
"source": [
"from langchain.chains.qa_with_sources import load_qa_with_sources_chain\n",
"qa_chain = load_qa_with_sources_chain(OpenAI(temperature=0), chain_type=\"stuff\")\n",
"qa = VectorDBQAWithSourcesChain(combine_document_chain=qa_chain, vectorstore=docsearch)"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "c91fdc8a",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'answer': ' The president honored Justice Stephen Breyer for his service.\\n',\n",
" 'sources': '30-pl'}"
]
},
"execution_count": 11,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"chain({\"question\": \"What did the president say about Justice Breyer\"}, return_only_outputs=True)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3.9.0 64-bit ('llm-env')",
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
@@ -122,7 +207,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.0"
"version": "3.10.9"
},
"vscode": {
"interpreter": {