75 changes: 75 additions & 0 deletions docs/platforms/javascript/common/configuration/vercelai.mdx
Member:

m: Shouldn't this page be part of the "Integrations" section rather than directly in "Configuration"?

Otherwise, if we have already settled on putting this directly under "Configuration", can we move it down a bit? I don't think this should be directly under "Basic Options".


Member Author:

yup wrong path 🤦 - will fix!

@@ -0,0 +1,75 @@
---
title: Vercel AI
description: "Adds instrumentation for Vercel AI SDK."
supported:
- javascript.node
- javascript.aws-lambda
- javascript.azure-functions
- javascript.connect
- javascript.express
- javascript.fastify
- javascript.gcp-functions
- javascript.hapi
- javascript.koa
- javascript.nestjs
- javascript.electron
- javascript.nextjs
- javascript.nuxt
- javascript.sveltekit
- javascript.remix
- javascript.astro
- javascript.bun
---

<Alert level="info">

This integration only works in the Node.js and Bun runtimes and requires SDK version `8.43.0` or higher.

</Alert>

_Import name: `Sentry.vercelAIIntegration`_

The `vercelAIIntegration` adds instrumentation for the [`ai`](https://www.npmjs.com/package/ai) library by Vercel to capture spans using the AI SDK's [built-in telemetry](https://sdk.vercel.ai/docs/ai-sdk-core/telemetry).

```javascript
Sentry.init({
  integrations: [Sentry.vercelAIIntegration()],
});
```

To enhance the spans collected by this integration, we recommend providing a `functionId` to identify the function that the telemetry data is for. For more details, see the [AI SDK Telemetry Metadata docs](https://sdk.vercel.ai/docs/ai-sdk-core/telemetry#telemetry-metadata).

```javascript
const result = await generateText({
model: openai("gpt-4-turbo"),
experimental_telemetry: { functionId: "my-awesome-function" },
});
```

## Configuration

By default, this integration adds tracing support to all `ai` function call sites. If you need to disable collecting spans for a specific call, you can do so by setting `experimental_telemetry.isEnabled` to `false` in the first argument of the function call.

```javascript
const result = await generateText({
model: openai("gpt-4-turbo"),
experimental_telemetry: { isEnabled: false },
});
```

If you want to collect inputs and outputs for a specific call, you must explicitly opt in for each function call by setting `experimental_telemetry.recordInputs` and `experimental_telemetry.recordOutputs` to `true`.

```javascript
const result = await generateText({
model: openai("gpt-4-turbo"),
experimental_telemetry: {
isEnabled: true,
recordInputs: true,
recordOutputs: true,
},
});
```
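The options above can also be combined in a single `experimental_telemetry` object. For example, a sketch that names the call via `functionId` (the identifier here is purely illustrative) while opting into input and output recording:

```javascript
const result = await generateText({
  model: openai("gpt-4-turbo"),
  experimental_telemetry: {
    isEnabled: true,
    // Illustrative identifier; shows up on the captured spans
    functionId: "my-awesome-function",
    recordInputs: true,
    recordOutputs: true,
  },
});
```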

## Supported Versions

- `dataloader`: `>=3.0.0 <5`
Member:
l: Maybe I'm missing something here, but is dataloader a transitive dependency of ai? It might be worth adding a sentence or two to explain what it is. Also, does this translate to an ai version range we could mention instead?

Actually, I saw that we mention dataloader on other pages as well. Any chance this was a copy/paste leftover? 😅

Member Author:

copy-paste mistake! will fix
