
Commit 4e14398

Bring back OpenAI integration page for Python (#14525)
## DESCRIBE YOUR PR

This page was removed until we make the OpenAI integration in Python work with the new OTel-based AI Agents dashboard and the "AI Spans" view in the trace waterfall. The SDK supports this now.

For future me: this is the commit that removed the pages: 3cfe838

## IS YOUR CHANGE URGENT?

Help us prioritize incoming PRs by letting us know when the change needs to go live.

- [ ] Urgent deadline (GA date, etc.):
- [ ] Other deadline:
- [x] None: Not urgent, can wait up to 1 week+

## SLA

- Teamwork makes the dream work, so please add a reviewer to your PRs.
- Please give the docs team up to 1 week to review your PR unless you've added an urgent due date to it. Thanks in advance for your help!

## PRE-MERGE CHECKLIST

*Make sure you've checked the following before merging your changes:*

- [ ] Checked Vercel preview for correctness, including links
- [ ] PR was reviewed and approved by any necessary SMEs (subject matter experts)
- [ ] PR was reviewed and approved by a member of the [Sentry docs team](https://github.com/orgs/getsentry/teams/docs)

## LEGAL BOILERPLATE

Look, I get it. The entity doing business as "Sentry" was incorporated in the State of Delaware in 2015 as Functional Software, Inc. and is gonna need some rights from me in order to utilize my contributions in this here PR. So here's the deal: I retain all rights, title and interest in and to my contributions, and by keeping this boilerplate intact I confirm that Sentry can use, modify, copy, and redistribute my contributions, under Sentry's choice of terms.

## EXTRA RESOURCES

- [Sentry Docs contributor guide](https://docs.sentry.io/contributing/)
1 parent 1c7829b commit 4e14398

File tree

2 files changed: +119 −3 lines changed

docs/platforms/python/integrations/index.mdx

Lines changed: 4 additions & 3 deletions
@@ -38,9 +38,10 @@ The Sentry SDK uses integrations to hook into the functionality of popular libra
### AI

-| | **Auto-enabled** |
-| ------------------------------------------------------------------------------------------------------------------------------------ | :--------------: |
-| <LinkWithPlatformIcon platform="openai-agents" label="OpenAI Agents SDK" url="/platforms/python/integrations/openai-agents" /> | |
+| | **Auto-enabled** |
+| ------------------------------------------------------------------------------------------------------------------------------ | :--------------: |
+| <LinkWithPlatformIcon platform="openai" label="OpenAI" url="/platforms/python/integrations/openai" /> | |
+| <LinkWithPlatformIcon platform="openai-agents" label="OpenAI Agents SDK" url="/platforms/python/integrations/openai-agents" /> | |

### Data Processing

Lines changed: 115 additions & 0 deletions
@@ -0,0 +1,115 @@
---
title: OpenAI
description: "Learn about using Sentry for OpenAI."
sidebar_hidden: true
---

This integration connects Sentry with the [OpenAI Python SDK](https://github.com/openai/openai-python).

Once you've installed this SDK, you can use Sentry AI Agents Monitoring, a Sentry dashboard that helps you understand what's going on with your AI requests.

Sentry AI Monitoring will automatically collect information about prompts, tools, tokens, and models. Learn more about the [AI Agents Dashboard](/product/insights/agents).

## Install

Install `sentry-sdk` from PyPI with the `openai` extra:

```bash {tabTitle:pip}
pip install "sentry-sdk[openai]"
```

```bash {tabTitle:uv}
uv add "sentry-sdk[openai]"
```

## Configure

If you have the `openai` package in your dependencies, the OpenAI integration will be enabled automatically when you initialize the Sentry SDK.

An additional dependency, `tiktoken`, is required if you want to calculate token usage for streaming chat responses.

<PlatformContent includePath="getting-started-config" />
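
For orientation, here is a minimal sketch of a typical `sentry_sdk.init()` call; the include above renders the canonical snippet, the DSN below is a placeholder, and `traces_sample_rate=1.0` is just a common choice for seeing traces while testing:

```python
import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    # Sample all transactions while testing so AI spans are visible;
    # lower this in production.
    traces_sample_rate=1.0,
)
```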

## Verify

Verify that the integration works by making a chat request to OpenAI.

```python
import sentry_sdk
from openai import OpenAI

sentry_sdk.init(...)  # same as above

client = OpenAI(api_key="(your OpenAI key)")

def my_llm_stuff():
    with sentry_sdk.start_transaction(
        name="The result of the AI inference",
        op="ai-inference",
    ):
        print(
            client.chat.completions.create(
                model="gpt-3.5-turbo",
                messages=[{"role": "system", "content": "say hello"}],
            )
            .choices[0]
            .message.content
        )

my_llm_stuff()  # run it so the transaction is actually created and sent
```

After running this script, the resulting data should show up in the "AI Spans" tab on the "Explore" > "Traces" page on Sentry.io.

If you manually created an <PlatformLink to="/tracing/instrumentation/custom-instrumentation/ai-agents-module/#invoke-agent-span">Invoke Agent Span</PlatformLink> (not done in the example above), the data will also show up in the [AI Agents Dashboard](/product/insights/agents).

It may take a couple of moments for the data to appear in [sentry.io](https://sentry.io).
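
If your application streams chat responses, the same verification works; below is a sketch (not part of the original example), assuming `tiktoken` is installed so token usage can be calculated, with an illustrative model name:

```python
import sentry_sdk
from openai import OpenAI

sentry_sdk.init(...)  # same as above

client = OpenAI(api_key="(your OpenAI key)")

with sentry_sdk.start_transaction(name="Streaming AI inference", op="ai-inference"):
    stream = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model name
        messages=[{"role": "user", "content": "say hello"}],
        stream=True,
    )
    for chunk in stream:
        # Each chunk carries an incremental delta of the reply.
        print(chunk.choices[0].delta.content or "", end="")
```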

## Behavior

- The OpenAI integration will connect Sentry with all supported OpenAI methods automatically.

- All exceptions leading to an `OpenAIException` are reported.

- The supported methods are currently `responses.create`, `chat.completions.create`, and `embeddings.create` (see the sketch after this list).

- Sentry considers LLM and tokenizer inputs/outputs as PII (personally identifiable information) and doesn't include PII data by default. If you want to include the data, set `send_default_pii=True` in the `sentry_sdk.init()` call. To explicitly exclude prompts and outputs despite `send_default_pii=True`, configure the integration with `include_prompts=False` as shown in the [Options section](#options) below.
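
As an illustration of the instrumented methods, here is a sketch of an `embeddings.create` call; the model name and input text are illustrative, and the span is captured the same way as for chat completions:

```python
import sentry_sdk
from openai import OpenAI

sentry_sdk.init(...)  # same as above

client = OpenAI(api_key="(your OpenAI key)")

with sentry_sdk.start_transaction(name="Embed a document", op="ai-inference"):
    response = client.embeddings.create(
        model="text-embedding-3-small",  # illustrative model name
        input="The quick brown fox jumps over the lazy dog.",
    )
    # One embedding vector per input; print its dimensionality.
    print(len(response.data[0].embedding))
```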

## Options

By adding `OpenAIIntegration` to your `sentry_sdk.init()` call explicitly, you can set options for `OpenAIIntegration` to change its behavior:

```python
import sentry_sdk
from sentry_sdk.integrations.openai import OpenAIIntegration

sentry_sdk.init(
    # ...
    # Add data like inputs and responses;
    # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
    send_default_pii=True,
    integrations=[
        OpenAIIntegration(
            include_prompts=False,  # LLM/tokenizer inputs/outputs will not be sent to Sentry, despite send_default_pii=True
            tiktoken_encoding_name="cl100k_base",
        ),
    ],
)
```

You can pass the following keyword arguments to `OpenAIIntegration()`:

- `include_prompts`:

  Whether LLM and tokenizer inputs and outputs should be sent to Sentry. Sentry considers this data personally identifiable information (PII) by default. If you want to include the data, set `send_default_pii=True` in the `sentry_sdk.init()` call. To explicitly exclude prompts and outputs despite `send_default_pii=True`, configure the integration with `include_prompts=False`.

  The default is `True`.

- `tiktoken_encoding_name`:

  If you want to calculate token usage for streaming chat responses, you need to install an additional dependency, [tiktoken](https://pypi.org/project/tiktoken/), and specify the `tiktoken_encoding_name` that you use for tokenization. See the [OpenAI Cookbook](https://cookbook.openai.com/examples/how_to_count_tokens_with_tiktoken) for possible values.

  The default is `None`.
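
To make the role of the encoding name concrete, here is a sketch of counting tokens with `tiktoken` directly, which is roughly what token-usage calculation for streamed responses amounts to (the sample text is illustrative):

```python
import tiktoken

# Look up the encoding by the same name passed to tiktoken_encoding_name.
encoding = tiktoken.get_encoding("cl100k_base")

tokens = encoding.encode("say hello")
print(len(tokens))  # token count for the sample text
```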

## Supported Versions

- OpenAI: 1.0+
- tiktoken: 0.6.0+
- Python: 3.9+
