
Commit b140d16

docs: update ChatAnthropic guide (#31849)
1 parent 2090f85

File tree

1 file changed: +138 -1 lines changed

docs/docs/integrations/chat/anthropic.ipynb

Lines changed: 138 additions & 1 deletion
@@ -893,7 +893,7 @@
    "source": [
     "## Citations\n",
     "\n",
-    "Anthropic supports a [citations](https://docs.anthropic.com/en/docs/build-with-claude/citations) feature that lets Claude attach context to its answers based on source documents supplied by the user. When [document content blocks](https://docs.anthropic.com/en/docs/build-with-claude/citations#document-types) with `\"citations\": {\"enabled\": True}` are included in a query, Claude may generate citations in its response.\n",
+    "Anthropic supports a [citations](https://docs.anthropic.com/en/docs/build-with-claude/citations) feature that lets Claude attach context to its answers based on source documents supplied by the user. When [document](https://docs.anthropic.com/en/docs/build-with-claude/citations#document-types) or `search result` content blocks with `\"citations\": {\"enabled\": True}` are included in a query, Claude may generate citations in its response.\n",
     "\n",
     "### Simple example\n",
     "\n",
@@ -963,6 +963,143 @@
     "response.content"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "id": "4ca82106-69b3-4266-bf23-b2ffba873ee2",
+   "metadata": {},
+   "source": [
+    "### In tool results (agentic RAG)\n",
+    "\n",
+    ":::info Requires ``langchain-anthropic>=0.3.17``\n",
+    "\n",
+    ":::\n",
+    "\n",
+    "Claude supports a [search_result](https://docs.anthropic.com/en/docs/build-with-claude/search-results) content block representing citable results from queries against a knowledge base or other custom source. These content blocks can be passed to Claude both top-line (as in the above example) and within a tool result. This allows Claude to cite elements of its response using the result of a tool call.\n",
+    "\n",
+    "To pass search results in response to tool calls, define a tool that returns a list of `search_result` content blocks in Anthropic's native format. For example:\n",
+    "```python\n",
+    "def retrieval_tool(query: str) -> list[dict]:\n",
+    "    \"\"\"Access my knowledge base.\"\"\"\n",
+    "\n",
+    "    # Run a search (e.g., with a LangChain vector store)\n",
+    "    results = vector_store.similarity_search(query=query, k=2)\n",
+    "\n",
+    "    # Package results into search_result blocks\n",
+    "    return [\n",
+    "        {\n",
+    "            \"type\": \"search_result\",\n",
+    "            \"title\": \"Leave policy\",\n",
+    "            \"source\": \"HR Leave Policy 2025\",\n",
+    "            \"citations\": { \"enabled\": True },\n",
+    "            \"content\": [{\"type\": \"text\", \"text\": doc.page_content}],\n",
+    "        }\n",
+    "        for doc in results\n",
+    "    ]\n",
+    "```\n",
+    "\n",
+    "We also need to specify the `search-results-2025-06-09` beta when instantiating ChatAnthropic. You can see an end-to-end example below.\n",
+    "\n",
+    "<details>\n",
+    "<summary>End to end example with LangGraph</summary>\n",
+    "\n",
+    "Here we demonstrate an end-to-end example in which we populate a LangChain [vector store](/docs/concepts/vectorstores/) with sample documents and equip Claude with a tool that queries those documents.\n",
+    "The tool here takes a search query and a `category` string literal, but any valid tool signature can be used.\n",
+    "\n",
+    "```python\n",
+    "from typing import Literal\n",
+    "\n",
+    "from langchain.chat_models import init_chat_model\n",
+    "from langchain.embeddings import init_embeddings\n",
+    "from langchain_core.documents import Document\n",
+    "from langchain_core.vectorstores import InMemoryVectorStore\n",
+    "from langgraph.checkpoint.memory import InMemorySaver\n",
+    "from langgraph.prebuilt import create_react_agent\n",
+    "\n",
+    "\n",
+    "# Set up vector store\n",
+    "embeddings = init_embeddings(\"openai:text-embedding-3-small\")\n",
+    "vector_store = InMemoryVectorStore(embeddings)\n",
+    "\n",
+    "document_1 = Document(\n",
+    "    id=\"1\",\n",
+    "    page_content=(\n",
+    "        \"To request vacation days, submit a leave request form through the \"\n",
+    "        \"HR portal. Approval will be sent by email.\"\n",
+    "    ),\n",
+    "    metadata={\"category\": \"HR Policy\"},\n",
+    ")\n",
+    "document_2 = Document(\n",
+    "    id=\"2\",\n",
+    "    page_content=\"Managers will review vacation requests within 3 business days.\",\n",
+    "    metadata={\"category\": \"HR Policy\"},\n",
+    ")\n",
+    "document_3 = Document(\n",
+    "    id=\"3\",\n",
+    "    page_content=(\n",
+    "        \"Employees with over 6 months tenure are eligible for 20 paid vacation days \"\n",
+    "        \"per year.\"\n",
+    "    ),\n",
+    "    metadata={\"category\": \"Benefits Policy\"},\n",
+    ")\n",
+    "\n",
+    "documents = [document_1, document_2, document_3]\n",
+    "vector_store.add_documents(documents=documents)\n",
+    "\n",
+    "\n",
+    "# Define tool\n",
+    "async def retrieval_tool(\n",
+    "    query: str, category: Literal[\"HR Policy\", \"Benefits Policy\"]\n",
+    ") -> list[dict]:\n",
+    "    \"\"\"Access my knowledge base.\"\"\"\n",
+    "\n",
+    "    def _filter_function(doc: Document) -> bool:\n",
+    "        return doc.metadata.get(\"category\") == category\n",
+    "\n",
+    "    results = vector_store.similarity_search(\n",
+    "        query=query, k=2, filter=_filter_function\n",
+    "    )\n",
+    "\n",
+    "    return [\n",
+    "        {\n",
+    "            \"type\": \"search_result\",\n",
+    "            \"title\": \"Leave policy\",\n",
+    "            \"source\": \"HR Leave Policy 2025\",\n",
+    "            \"citations\": { \"enabled\": True },\n",
+    "            \"content\": [{\"type\": \"text\", \"text\": doc.page_content}],\n",
+    "        }\n",
+    "        for doc in results\n",
+    "    ]\n",
+    "\n",
+    "\n",
+    "\n",
+    "# Create agent\n",
+    "llm = init_chat_model(\n",
+    "    \"anthropic:claude-3-5-haiku-latest\",\n",
+    "    betas=[\"search-results-2025-06-09\"],\n",
+    ")\n",
+    "\n",
+    "checkpointer = InMemorySaver()\n",
+    "agent = create_react_agent(llm, [retrieval_tool], checkpointer=checkpointer)\n",
+    "\n",
+    "\n",
+    "# Invoke on a query\n",
+    "config = {\"configurable\": {\"thread_id\": \"session_1\"}}\n",
+    "\n",
+    "input_message = {\n",
+    "    \"role\": \"user\",\n",
+    "    \"content\": \"How do I request vacation days?\",\n",
+    "}\n",
+    "async for step in agent.astream(\n",
+    "    {\"messages\": [input_message]},\n",
+    "    config,\n",
+    "    stream_mode=\"values\",\n",
+    "):\n",
+    "    step[\"messages\"][-1].pretty_print()\n",
+    "```\n",
+    "\n",
+    "</details>"
+   ]
+  },
   {
    "cell_type": "markdown",
    "id": "69956596-0e6c-492b-934d-c08ed3c9de9a",
