Replies: 2 comments
Based on the available documentation, Langfuse can capture detailed observability data from the Vercel AI SDK through its OpenTelemetry integration(1). Since you're using generateObject, raw provider requests and retry attempts should surface as spans emitted by the SDK's telemetry. For input/output visibility, Langfuse maps the SDK's telemetry attributes onto its generation fields, and it also captures model parameters and token usage. Each retry attempt should create a separate observation (span) within your trace, so you can see the individual attempts. If you're not seeing this level of detail currently, make sure telemetry is enabled on your AI SDK calls and that the resulting spans are actually exported to Langfuse.
The documentation doesn't specifically mention Vercel AI SDK retry behavior, but the general OpenTelemetry integration should capture all spans emitted by any OTEL-instrumented library(1). Have another question? Just tag @inkeep.
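The telemetry mentioned above is opt-in per call. A minimal configuration sketch of what that looks like, assuming the AI SDK's documented `experimental_telemetry` setting (the schema, prompt, and `functionId` here are illustrative placeholders, not from this thread):

```typescript
import { generateObject } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

// Sketch: enable the AI SDK's experimental OpenTelemetry output so that
// generateObject (and its internal provider calls) emit spans that an
// OTEL exporter such as Langfuse's can pick up.
const result = await generateObject({
  model: openai("gpt-4o-mini"),
  schema: z.object({ name: z.string(), age: z.number() }),
  prompt: "Extract the user from: 'Ada, 36 years old'",
  experimental_telemetry: {
    isEnabled: true,
    functionId: "extract-user", // illustrative label for grouping spans
  },
});
```

With this enabled, each underlying provider call should appear as its own span in the exported trace.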
Hi @leo91000, thanks for bringing this up! As far as I am aware, in the current integration, structured output retries show up as multiple provider-call spans inside a single generateObject call. The telemetry does not label a span as "retry #2" or attach a "retry reason = schema validation failed" field; you infer retries by counting doGenerate spans or by their timing. Which version of the AI SDK are you using? Do you also have a trace where you know that a retry happened but only one generation is shown? If yes, it would be great if you could share a link to the trace.
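The "counting doGenerate spans" approach described above can be sketched as a small helper. This is a sketch under the assumption (from this thread, not an official guarantee) that provider-level spans are named with a `.doGenerate` suffix, e.g. `ai.generateObject.doGenerate`; the `SpanLike` shape is a simplified stand-in for exported OTEL span data:

```typescript
// Simplified stand-in for an exported OTEL span.
interface SpanLike {
  name: string;
  startTimeMs: number;
}

// Count provider-level calls ("doGenerate" spans) inside one trace.
function countProviderCalls(spans: SpanLike[]): number {
  return spans.filter((s) => s.name.endsWith(".doGenerate")).length;
}

// The first doGenerate span is the initial attempt; any beyond that
// are inferred retries.
function inferredRetries(spans: SpanLike[]): number {
  return Math.max(0, countProviderCalls(spans) - 1);
}
```

For example, a trace containing three `.doGenerate` spans under one `generateObject` call would imply two retries.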
Describe your question
I'm currently using Langfuse with the Vercel AI SDK. It works well, but I want to debug structured output retries from the Vercel AI SDK's generateObject. I know the AI SDK is retrying because it never succeeds on the first attempt. However, I'd like the model to succeed on the first try if possible (for latency reasons).
In Langfuse, it would be really helpful if I could see each retry attempt and the reason it was retried (e.g. schema validation failure).
Is this something that's already possible?
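For context, the Langfuse v4 packages listed below are typically wired up roughly like this. This is a sketch, not a definitive setup; exact processor options may differ by version, and credentials are assumed to come from the standard `LANGFUSE_*` environment variables:

```typescript
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";

// Register Langfuse's span processor so that spans emitted by the
// Vercel AI SDK (with experimental_telemetry enabled) are exported
// to Langfuse Cloud.
const sdk = new NodeSDK({
  spanProcessors: [new LangfuseSpanProcessor()],
});
sdk.start();
```

If the processor isn't registered before the AI SDK runs, its spans are emitted but never reach Langfuse, which would also explain missing retry detail.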
Langfuse Cloud or Self-Hosted?
Langfuse Cloud
If Self-Hosted
No response
If Langfuse Cloud
cmipvwlwx004gad07z7dpdsij
SDK and integration versions
Langfuse packages
@langfuse/client: 4.4.9
@langfuse/otel: 4.4.9
@langfuse/tracing: 4.4.9
Vercel AI SDK
ai: 6.0.39
@ai-sdk/anthropic: ^3.0.15
@ai-sdk/google: ^3.0.10
@ai-sdk/openai: ^3.0.12
@ai-sdk/amazon-bedrock: ^3.0.73
@ai-sdk/mistral: ^3.0.9
@ai-sdk/xai: ^3.0.26
@ai-sdk/deepseek: ^1.0.0
@ai-sdk/provider: ^3.0.4
@ai-sdk/devtools: ^0.0.6 (devDependency)
OpenTelemetry
@opentelemetry/sdk-node: 0.208.0
Pre-Submission Checklist