| Information | Details |
|---|---|
| Agent type | Synchronous |
| Agentic Framework | LangGraph |
| LLM model | Anthropic Claude Haiku 4.5 |
| Components | AgentCore Runtime |
| Example complexity | Easy |
| SDK used | Amazon BedrockAgentCore Python SDK |
This example demonstrates how to integrate a LangGraph agent with AWS Bedrock AgentCore, enabling you to deploy a web search-capable agent as a managed service.
- Python 3.10+
- uv - Fast Python package installer and resolver
- AWS account with Bedrock access
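The `requirements.txt` used below isn't reproduced in this README; based on the imports in the agent code, it plausibly contains something like the following (package names are an assumption inferred from the imports, and pinned versions are omitted):

```text
langgraph
langchain
langchain-community
duckduckgo-search
bedrock-agentcore
```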
```bash
# Install uv if you don't have it already
pip install uv

# Create and activate a virtual environment
uv venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# Install dependencies
uv pip install -r requirements.txt
```

The `langgraph_agent_web_search.py` file contains a LangGraph agent with web search capabilities, integrated with Bedrock AgentCore:
```python
from typing import Annotated

from langchain.chat_models import init_chat_model
from typing_extensions import TypedDict

from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition

# Initialize the LLM with Bedrock
llm = init_chat_model(
    "global.anthropic.claude-haiku-4-5-20251001-v1:0",
    model_provider="bedrock_converse",
)

# Define the web search tool
from langchain_community.tools import DuckDuckGoSearchRun

search = DuckDuckGoSearchRun()
tools = [search]
llm_with_tools = llm.bind_tools(tools)

# Define the graph state: a message list managed by the add_messages reducer
class State(TypedDict):
    messages: Annotated[list, add_messages]

# Build the graph
graph_builder = StateGraph(State)

def chatbot(state: State):
    return {"messages": [llm_with_tools.invoke(state["messages"])]}

graph_builder.add_node("chatbot", chatbot)

tool_node = ToolNode(tools=tools)
graph_builder.add_node("tools", tool_node)

graph_builder.add_conditional_edges("chatbot", tools_condition)
graph_builder.add_edge("tools", "chatbot")
graph_builder.add_edge(START, "chatbot")
graph = graph_builder.compile()

# Integrate with Bedrock AgentCore
from bedrock_agentcore.runtime import BedrockAgentCoreApp

app = BedrockAgentCoreApp()

@app.entrypoint
def agent_invocation(payload, context):
    user_message = {"messages": [{"role": "user", "content": payload.get("prompt", "No prompt found in input")}]}
    output = graph.invoke(user_message)
    return {"result": output["messages"][-1].content}

app.run()
```

```bash
# Configure your agent for deployment
agentcore configure

# Deploy your agent
agentcore launch -e langgraph_agent_web_search.py
```

During configuration, you'll be prompted to:
- Select your AWS region
- Choose a deployment name
- Configure other deployment settings
Once deployed, you can test your agent using:
```bash
agentcore invoke '{"prompt": "What are the latest developments in quantum computing?"}'
```

The agent will:
- Process your query
- Use DuckDuckGo to search for relevant information
- Provide a comprehensive response based on the search results
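The request/response contract of the entrypoint can be sketched in plain Python without any AWS calls; the helper name `extract_prompt` is hypothetical, but the key name and fallback string come from the agent code:

```python
# Mimic how the entrypoint reads the invocation payload: the "prompt"
# key is used if present, otherwise a fixed fallback string is returned.
def extract_prompt(payload: dict) -> str:
    return payload.get("prompt", "No prompt found in input")

payload = {"prompt": "What are the latest developments in quantum computing?"}
print(extract_prompt(payload))   # the user's question
print(extract_prompt({}))        # missing key -> "No prompt found in input"
```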
To remove your deployed agent:
```bash
agentcore destroy
```

This agent uses LangGraph to create a directed graph for agent reasoning:
- The user query is sent to the chatbot node
- The chatbot decides whether to use tools based on the query
- If tools are needed, the query is sent to the tools node
- The tools node executes the search and returns results
- Results are sent back to the chatbot for final response generation
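The branching step above can be sketched without LangGraph. This is a minimal mimic of what `tools_condition` checks, under the assumption that messages are plain dicts and the function name `route` is hypothetical: route to the tools node when the last model message carries tool calls, otherwise end the graph.

```python
# Minimal mimic of LangGraph's tools_condition routing decision:
# "tools" if the last message requested tool calls, else end the run.
def route(messages: list[dict]) -> str:
    last = messages[-1]
    return "tools" if last.get("tool_calls") else "__end__"

# The model asked for a search, so the query goes to the tools node
print(route([{"role": "ai", "tool_calls": [{"name": "duckduckgo_search"}]}]))  # tools

# The model answered directly, so the graph finishes
print(route([{"role": "ai", "content": "Here is your answer."}]))  # __end__
```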
The Bedrock AgentCore framework handles deployment, scaling, and management of the agent in AWS.