
Commit f296930

update with platform specific instructions
1 parent 2a0435b commit f296930

File tree

1 file changed: +48 -3 lines changed
  • docs/platforms/javascript/common/configuration/integrations


docs/platforms/javascript/common/configuration/integrations/openai.mdx

Lines changed: 48 additions & 3 deletions
@@ -36,6 +36,7 @@ _Import name: `Sentry.openAIIntegration`_

The `openAIIntegration` adds instrumentation for the `openai` API to capture spans by automatically wrapping OpenAI client calls and recording LLM interactions with configurable input/output recording.

+<PlatformSection notSupported={["javascript.cloudflare", "javascript.nextjs"]}>
It is enabled by default and will automatically capture spans for OpenAI API method calls. You can opt-in to capture inputs and outputs by setting `recordInputs` and `recordOutputs` in the integration config:

```javascript
@@ -51,7 +52,53 @@ Sentry.init({
});
```
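For reference while reading this diff: the configuration referred to above is a standard `Sentry.init` call whose body is elided by the hunk boundary. Below is a minimal sketch, not the file's actual content; the `@sentry/node` import, DSN placeholder, and `tracesSampleRate` are illustrative assumptions, while `openAIIntegration`, `recordInputs`, and `recordOutputs` come from the surrounding documentation.

```javascript
// Sketch only, not the elided file content: register the OpenAI
// integration and opt in to recording prompts and responses.
import * as Sentry from "@sentry/node"; // assumed server-side SDK

Sentry.init({
  dsn: "__YOUR_DSN__", // placeholder
  tracesSampleRate: 1.0, // illustrative; tracing must be enabled for spans to be sent
  integrations: [
    Sentry.openAIIntegration({
      recordInputs: true, // capture request messages/prompts
      recordOutputs: true, // capture model responses
    }),
  ],
});
```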

-<PlatformSection>
+</PlatformSection>
+
+<PlatformSection supported={["javascript.cloudflare"]}>
+For Cloudflare Workers, you need to manually instrument the OpenAI client using the `instrumentOpenAiClient` helper:
+
+```javascript
+import * as Sentry from "@sentry/cloudflare";
+import OpenAI from "openai";
+
+const openai = new OpenAI();
+const client = Sentry.instrumentOpenAiClient(openai, {
+  recordInputs: true,
+  recordOutputs: true,
+});
+
+// Use the wrapped client instead of the original openai instance
+const response = await client.chat.completions.create({
+  model: "gpt-4o",
+  messages: [{ role: "user", content: "Hello!" }],
+});
+```
+
+</PlatformSection>
+
+<PlatformSection supported={['javascript.nextjs']}>
+
+This integration is automatically instrumented in the Node.js runtime. For Next.js applications using the Edge runtime, you need to manually instrument the OpenAI client:
+
+```javascript
+import * as Sentry from "@sentry/nextjs";
+import OpenAI from "openai";
+
+const openai = new OpenAI();
+const client = Sentry.instrumentOpenAiClient(openai, {
+  recordInputs: true,
+  recordOutputs: true,
+});
+
+// Use the wrapped client instead of the original openai instance
+const response = await client.chat.completions.create({
+  model: "gpt-4o",
+  messages: [{ role: "user", content: "Hello!" }],
+});
+```
+
+</PlatformSection>
+
## Options

### `recordInputs`
@@ -82,8 +129,6 @@ Sentry.init({
});
```

-</PlatformSection>
-
## Configuration

By default this integration adds tracing support to OpenAI API method calls including:
