
Commit 9a1b285

feat(langgraph): add python integration docs (#14817)
1 parent 8d68286 commit 9a1b285

3 files changed: +136 −4 lines changed

docs/platforms/python/integrations/index.mdx

Lines changed: 3 additions & 1 deletion

@@ -43,7 +43,9 @@ The Sentry SDK uses integrations to hook into the functionality of popular libra
 | <LinkWithPlatformIcon platform="anthropic" label="Anthropic" url="/platforms/python/integrations/anthropic" /> ||
 | <LinkWithPlatformIcon platform="openai" label="OpenAI" url="/platforms/python/integrations/openai" /> ||
 | <LinkWithPlatformIcon platform="openai-agents" label="OpenAI Agents SDK" url="/platforms/python/integrations/openai-agents" /> | |
-| <LinkWithPlatformIcon platform="langchain" label="LangChain" url="/platforms/python/integrations/langchain" /> | |
+| <LinkWithPlatformIcon platform="langchain" label="LangChain" url="/platforms/python/integrations/langchain" /> ||
+| <LinkWithPlatformIcon platform="langchain" label="LangGraph" url="/platforms/python/integrations/langgraph" /> ||
+

 ### Data Processing

Lines changed: 131 additions & 0 deletions

@@ -0,0 +1,131 @@
---
title: LangGraph
description: "Learn about using Sentry for LangGraph."
---
This integration connects Sentry with [LangGraph](https://github.com/langchain-ai/langgraph) in Python.

Once you've installed this SDK, you can use Sentry AI Agents Monitoring, a Sentry dashboard that helps you understand what's going on with your AI requests. Sentry AI Agents Monitoring will automatically collect information about prompts, tools, tokens, and models. Learn more about the [AI Agents Dashboard](/product/insights/ai/agents).

## Install

Install `sentry-sdk` from PyPI with the `langgraph` extra:

```bash {tabTitle:pip}
pip install "sentry-sdk[langgraph]"
```

```bash {tabTitle:uv}
uv add "sentry-sdk[langgraph]"
```
## Configure

If you have the `langgraph` package in your dependencies, the LangGraph integration will be enabled automatically when you initialize the Sentry SDK. For correct token accounting, you need to disable the integration for the model provider you are using (for example, OpenAI or Anthropic):

```python
import sentry_sdk
from sentry_sdk.integrations.openai import OpenAIIntegration

sentry_sdk.init(
    dsn="___PUBLIC_DSN___",
    environment="local",
    traces_sample_rate=1.0,
    send_default_pii=True,
    disabled_integrations=[OpenAIIntegration()],
)
```
## Verify

Verify that the integration works by creating a LangGraph workflow and executing it. In this example, we're creating a simple agent graph that can use a function tool to roll a die.

```python
import random
from typing import Annotated, Literal, TypedDict

import sentry_sdk
from langchain.chat_models import init_chat_model
from langchain_core.messages import AnyMessage, HumanMessage
from langchain_core.tools import tool
from langgraph.graph import END, StateGraph
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode


class State(TypedDict):
    messages: Annotated[list[AnyMessage], add_messages]


@tool
def roll_die(sides: int = 6) -> str:
    """Roll a die with a given number of sides."""
    return f"Rolled a {random.randint(1, sides)} on a {sides}-sided die."


def chatbot(state: State):
    model = init_chat_model("gpt-4o-mini", model_provider="openai")
    return {"messages": [model.bind_tools([roll_die]).invoke(state["messages"])]}


def should_continue(state: State) -> Literal["tools", END]:
    last_message = state["messages"][-1]
    return "tools" if getattr(last_message, "tool_calls", None) else END


with sentry_sdk.start_transaction(name="langgraph-openai"):
    graph_builder = StateGraph(State)
    graph_builder.add_node("chatbot", chatbot)
    graph_builder.add_node("tools", ToolNode([roll_die]))
    graph_builder.set_entry_point("chatbot")
    graph_builder.add_conditional_edges("chatbot", should_continue)
    graph_builder.add_edge("tools", "chatbot")
    graph = graph_builder.compile()
    result = graph.invoke({
        "messages": [
            HumanMessage(content="Hello, my name is Alice! Please roll a six-sided die.")
        ]
    })
    print(result)
```

After running this script, the resulting data should show up in the `"AI Spans"` tab on the `"Explore" > "Traces"` page on sentry.io, and in the [AI Agents Dashboard](/product/insights/ai/agents).

It may take a couple of moments for the data to appear in [sentry.io](https://sentry.io).
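The loop the compiled graph runs — chatbot, then tools whenever the last message requests a tool call, then back to chatbot until nothing more is requested — can be sketched in plain Python. This is an illustrative stand-in for the LangGraph runtime, not the real one; the node functions and the `rolled` flag are invented for the sketch:

```python
# Illustrative sketch of the conditional-edge loop; not the real LangGraph runtime.
END = "__end__"

def fake_chatbot(state):
    # Stand-in for the model node: request one tool call, then stop.
    calls = [] if state.get("rolled") else [{"name": "roll_die", "args": {"sides": 6}}]
    state["messages"].append({"role": "ai", "tool_calls": calls})
    return state

def fake_tools(state):
    # Stand-in for the ToolNode: append the tool result.
    state["messages"].append({"role": "tool", "content": "Rolled a 4 on a 6-sided die."})
    state["rolled"] = True
    return state

def should_continue(state):
    # Same routing rule as the graph's conditional edge.
    return "tools" if state["messages"][-1]["tool_calls"] else END

def run_graph(state):
    node = "chatbot"  # entry point
    while node != END:
        if node == "chatbot":
            state = fake_chatbot(state)
            node = should_continue(state)   # conditional edge
        elif node == "tools":
            state = fake_tools(state)
            node = "chatbot"                # fixed edge: tools -> chatbot
    return state

result = run_graph({"messages": []})
```

The run produces three messages (tool request, tool result, final answer), mirroring one round trip through the real graph.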
## Behavior

- The LangGraph integration will connect Sentry with all supported LangGraph methods automatically.

- All exceptions are reported.
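"All exceptions are reported" follows the usual capture-and-reraise pattern: the error is sent to Sentry and still propagates to your code. A hypothetical stdlib-only sketch of that pattern (the `captured` list and helper names are invented; the real integration uses Sentry's internal wrapping):

```python
# Hypothetical sketch of capture-and-reraise around a graph node.
captured = []  # stand-in for Sentry's event queue

def capture_exception(exc):
    captured.append(type(exc).__name__)

def run_node(node_fn, state):
    try:
        return node_fn(state)
    except Exception as exc:
        capture_exception(exc)  # report the error...
        raise                   # ...then let it propagate to the caller

def broken_tool(state):
    raise ValueError("die has zero sides")

try:
    run_node(broken_tool, {})
except ValueError:
    pass  # the caller still sees the original exception
```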
## Options

By adding `LangGraphIntegration` to your `sentry_sdk.init()` call explicitly, you can set options for `LangGraphIntegration` to change its behavior:

```python
import sentry_sdk
from sentry_sdk.integrations.langgraph import LangGraphIntegration

sentry_sdk.init(
    # ...
    # Add data like inputs and responses;
    # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
    send_default_pii=True,
    integrations=[
        LangGraphIntegration(
            include_prompts=False,  # LLM inputs/outputs will not be sent to Sentry, despite send_default_pii=True
        ),
    ],
)
```

You can pass the following keyword arguments to `LangGraphIntegration()`:

- `include_prompts`

  Controls whether LLM inputs and outputs should be sent to Sentry. Sentry considers this data personally identifiable information (PII) by default. If you want to include the data, set `send_default_pii=True` in the `sentry_sdk.init()` call. To explicitly exclude prompts and outputs despite `send_default_pii=True`, configure the integration with `include_prompts=False`.

  The default is `True`.
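The interaction of the two settings comes down to: prompts reach Sentry only when `send_default_pii` is on and `include_prompts` has not been disabled. A tiny illustrative helper capturing that rule (not part of the SDK):

```python
def prompts_sent(send_default_pii: bool, include_prompts: bool = True) -> bool:
    """Illustrative only: LLM inputs/outputs are sent to Sentry
    only if PII sending is enabled AND include_prompts is not disabled."""
    return send_default_pii and include_prompts

prompts_sent(True)                          # sent
prompts_sent(True, include_prompts=False)   # not sent
prompts_sent(False)                         # not sent, regardless of include_prompts
```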
## Supported Versions

- OpenAI: 1.0+
- Python: 3.9+
- LangGraph: 0.6+

docs/platforms/python/tracing/instrumentation/custom-instrumentation/ai-agents-module.mdx

Lines changed: 2 additions & 3 deletions

@@ -14,10 +14,9 @@ The Python SDK supports automatic instrumentation for some AI libraries. We reco

 - <PlatformLink to="/integrations/anthropic/">Anthropic</PlatformLink>
 - <PlatformLink to="/integrations/openai/">OpenAI</PlatformLink>
-- <PlatformLink to="/integrations/openai-agents/">
-    OpenAI Agents SDK
-  </PlatformLink>
+- <PlatformLink to="/integrations/openai-agents/"> OpenAI Agents SDK </PlatformLink>
 - <PlatformLink to="/integrations/langchain/">LangChain</PlatformLink>
+- <PlatformLink to="/integrations/langgraph/">LangGraph</PlatformLink>

 ## Manual Instrumentation
0 commit comments