Commit d574bf0

add documentation on how to load different chain types (#595)
1 parent 956416c commit d574bf0

File tree

6 files changed: +383 / -194 lines


docs/modules/chains/combine_docs_examples/vector_db_qa.ipynb

Lines changed: 88 additions & 3 deletions
@@ -46,7 +46,7 @@
"metadata": {},
"outputs": [],
"source": [
- "qa = VectorDBQA.from_llm(llm=OpenAI(), vectorstore=docsearch)"
+ "qa = VectorDBQA.from_chain_type(llm=OpenAI(), chain_type=\"stuff\", vectorstore=docsearch)"
]
},
{
@@ -58,7 +58,7 @@
{
"data": {
"text/plain": [
- "' The president said that Ketanji Brown Jackson is one of the nations top legal minds and that she will continue Justice Breyer’s legacy of excellence.'"
+ "\" The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, from a family of public school educators and police officers, a consensus builder, and has received a broad range of support from the Fraternal Order of Police to former judges appointed by Democrats and Republicans.\""
]
},
"execution_count": 4,
@@ -71,6 +71,91 @@
"qa.run(query)"
]
},
+ {
+ "cell_type": "markdown",
+ "id": "c28f1f64",
+ "metadata": {},
+ "source": [
+ "## Chain Type\n",
+ "You can easily specify different chain types to load and use in the VectorDBQA chain. For a more detailed walkthrough of these types, please see [this notebook](question_answering.ipynb).\n",
+ "\n",
+ "There are two ways to load different chain types. First, you can specify the chain type argument in the `from_chain_type` method. This allows you to pass in the name of the chain type you want to use. For example, below we change the chain type to `map_reduce`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 5,
+ "id": "22d2417d",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "qa = VectorDBQA.from_chain_type(llm=OpenAI(), chain_type=\"map_reduce\", vectorstore=docsearch)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 6,
+ "id": "43204ad1",
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "\" The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, from a family of public school educators and police officers, a consensus builder, and has received a broad range of support from the Fraternal Order of Police to former judges appointed by Democrats and Republicans.\""
+ ]
+ },
+ "execution_count": 6,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "query = \"What did the president say about Ketanji Brown Jackson\"\n",
+ "qa.run(query)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "60368f38",
+ "metadata": {},
+ "source": [
+ "The above way allows you to easily change the chain_type, but it doesn't provide much flexibility over the parameters of that chain type. If you want to control those parameters, you can load the chain directly (as you did in [this notebook](question_answering.ipynb)) and then pass it to the VectorDBQA chain with the `combine_documents_chain` parameter. For example:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 18,
+ "id": "7b403f0d",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from langchain.chains.question_answering import load_qa_chain\n",
+ "qa_chain = load_qa_chain(OpenAI(temperature=0), chain_type=\"stuff\")\n",
+ "qa = VectorDBQA(combine_documents_chain=qa_chain, vectorstore=docsearch)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 19,
+ "id": "9e04a9ac",
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "\" The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, and from a family of public school educators and police officers. He also said that she is a consensus builder and has received a broad range of support from the Fraternal Order of Police to former judges appointed by Democrats and Republicans.\""
+ ]
+ },
+ "execution_count": 19,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "query = \"What did the president say about Ketanji Brown Jackson\"\n",
+ "qa.run(query)"
+ ]
+ },
{
"cell_type": "markdown",
"id": "0b8c37f7",
@@ -87,7 +172,7 @@
87172
"metadata": {},
88173
"outputs": [],
89174
"source": [
90-
"qa = VectorDBQA.from_llm(llm=OpenAI(), vectorstore=docsearch, return_source_documents=True)"
+ "qa = VectorDBQA.from_chain_type(llm=OpenAI(), chain_type=\"stuff\", vectorstore=docsearch, return_source_documents=True)"
]
},
{
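For quick reference, here is a minimal standalone sketch of the two patterns the vector_db_qa.ipynb changes document: naming a chain type via `from_chain_type`, and building the combine-documents chain yourself. This is a sketch under assumptions, not part of the commit: it assumes the langchain API this commit targets, assumes `docsearch` is a vector store built as in the notebook's earlier (unchanged) cells, and the import paths are not shown in this diff.

```python
# Sketch only: assumes the langchain version this commit targets and an
# existing `docsearch` vector store built from the state-of-the-union text.
from langchain import OpenAI
from langchain.chains import VectorDBQA
from langchain.chains.question_answering import load_qa_chain

# Option 1: name the chain type and let from_chain_type build it.
qa = VectorDBQA.from_chain_type(
    llm=OpenAI(), chain_type="map_reduce", vectorstore=docsearch
)

# Option 2: load the question-answering chain directly for full control
# over its parameters, then hand it to VectorDBQA.
qa_chain = load_qa_chain(OpenAI(temperature=0), chain_type="stuff")
qa = VectorDBQA(combine_documents_chain=qa_chain, vectorstore=docsearch)

qa.run("What did the president say about Ketanji Brown Jackson")
```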

docs/modules/chains/combine_docs_examples/vector_db_qa_with_sources.ipynb

Lines changed: 95 additions & 10 deletions
@@ -26,7 +26,7 @@
},
{
"cell_type": "code",
- "execution_count": 3,
+ "execution_count": 2,
"id": "17d1306e",
"metadata": {},
"outputs": [],
@@ -41,7 +41,7 @@
},
{
"cell_type": "code",
- "execution_count": 4,
+ "execution_count": 3,
"id": "0e745d99",
"metadata": {},
"outputs": [],
@@ -51,7 +51,7 @@
},
{
"cell_type": "code",
- "execution_count": 5,
+ "execution_count": 4,
"id": "f42d79dc",
"metadata": {},
"outputs": [],
@@ -63,7 +63,7 @@
},
{
"cell_type": "code",
- "execution_count": 6,
+ "execution_count": 5,
"id": "8aa571ae",
"metadata": {},
"outputs": [],
@@ -73,26 +73,69 @@
},
{
"cell_type": "code",
- "execution_count": 8,
+ "execution_count": 6,
"id": "aa859d4c",
"metadata": {},
"outputs": [],
"source": [
"from langchain import OpenAI\n",
"\n",
- "chain = VectorDBQAWithSourcesChain.from_llm(OpenAI(temperature=0), vectorstore=docsearch)"
+ "chain = VectorDBQAWithSourcesChain.from_chain_type(OpenAI(temperature=0), chain_type=\"stuff\", vectorstore=docsearch)"
]
},
{
"cell_type": "code",
- "execution_count": 9,
+ "execution_count": 7,
"id": "8ba36fa7",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
- "{'answer': ' The president thanked Justice Breyer for his service.',\n",
+ "{'answer': ' The president thanked Justice Breyer for his service.\\n',\n",
+  'sources': '30-pl'}"
+ ]
+ },
+ "execution_count": 7,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "chain({\"question\": \"What did the president say about Justice Breyer\"}, return_only_outputs=True)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "718ecbda",
+ "metadata": {},
+ "source": [
+ "## Chain Type\n",
+ "You can easily specify different chain types to load and use in the VectorDBQAWithSourcesChain chain. For a more detailed walkthrough of these types, please see [this notebook](qa_with_sources.ipynb).\n",
+ "\n",
+ "There are two ways to load different chain types. First, you can specify the chain type argument in the `from_chain_type` method. This allows you to pass in the name of the chain type you want to use. For example, below we change the chain type to `map_reduce`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 8,
+ "id": "8b35b30a",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "chain = VectorDBQAWithSourcesChain.from_chain_type(OpenAI(temperature=0), chain_type=\"map_reduce\", vectorstore=docsearch)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 9,
+ "id": "58bd424f",
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "{'answer': ' The president honored Justice Stephen Breyer for his service.\\n',\n",
 'sources': '30-pl'}"
]
},
@@ -104,11 +147,53 @@
104147
"source": [
105148
"chain({\"question\": \"What did the president say about Justice Breyer\"}, return_only_outputs=True)"
106149
]
150+
},
151+
{
152+
"cell_type": "markdown",
153+
"id": "21e14eed",
154+
"metadata": {},
155+
"source": [
+ "The above way allows you to easily change the chain_type, but it doesn't provide much flexibility over the parameters of that chain type. If you want to control those parameters, you can load the chain directly (as you did in [this notebook](qa_with_sources.ipynb)) and then pass it to the VectorDBQAWithSourcesChain chain with the `combine_document_chain` parameter. For example:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 12,
+ "id": "af35f0c6",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from langchain.chains.qa_with_sources import load_qa_with_sources_chain\n",
+ "qa_chain = load_qa_with_sources_chain(OpenAI(temperature=0), chain_type=\"stuff\")\n",
+ "chain = VectorDBQAWithSourcesChain(combine_document_chain=qa_chain, vectorstore=docsearch)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 11,
+ "id": "c91fdc8a",
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "{'answer': ' The president honored Justice Stephen Breyer for his service.\\n',\n",
+  'sources': '30-pl'}"
+ ]
+ },
+ "execution_count": 11,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "chain({\"question\": \"What did the president say about Justice Breyer\"}, return_only_outputs=True)"
+ ]
}
],
"metadata": {
"kernelspec": {
- "display_name": "Python 3.9.0 64-bit ('llm-env')",
+ "display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
@@ -122,7 +207,7 @@
122207
"name": "python",
123208
"nbconvert_exporter": "python",
124209
"pygments_lexer": "ipython3",
125-
"version": "3.9.0"
210+
"version": "3.10.9"
126211
},
127212
"vscode": {
128213
"interpreter": {
