🚀 Langfuse TypeScript SDK v4 is here #8403
Replies: 10 comments 36 replies
-
Hi, I'm trying to upgrade from the previous version (which was working but had some issues) to v4 beta in my Next.js app. I can get the standalone SDK working perfectly (spans appear in the dashboard), but the Next.js setup via `@vercel/otel` produces no spans.

Working (standalone):

```ts
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";

const sdk = new NodeSDK({
  spanProcessors: [new LangfuseSpanProcessor()],
});

sdk.start();
```

Not working (Next.js):

```ts
// instrumentation.ts
import { registerOTel } from "@vercel/otel";
import { LangfuseSpanProcessor } from "@langfuse/otel";

export async function register() {
  registerOTel({
    serviceName: "my-app",
    spanProcessors: [new LangfuseSpanProcessor()],
  });
}
```

Do you have a confirmed working example with Next.js + Vercel OTEL? My packages: `@langfuse/otel@4.0.0-beta.2`, `@vercel/otel@1.13.0`, Next.js 14.2.30
-
The switch from automatically filtering out all non-AI spans in the previous (exporter-based) version to having to set this up manually in each span processor is pretty significant. As we already have an OTel infrastructure (we use it for things other than our AI endpoints), a naive upgrade basically switches on the firehose of every trace and every span -- something we absolutely can't have with the kind of information we have in our traces. The span-processor-based approach does make integration easier (we previously had to hack together an exporter wrapper that sent things to multiple sub-exporters), but it also makes things riskier, especially given the warning that filtering spans can break the structure of traces.
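If the `shouldExportSpan` hook described in the v4 docs is available on `LangfuseSpanProcessor`, the old allow-list behavior can be approximated with a predicate over the instrumentation scope. This is a sketch, not a drop-in config: the scope prefixes below are illustrative assumptions, and the commented wiring assumes the docs' option shape.

```typescript
// Illustrative allow-list of instrumentation scope name prefixes to treat
// as "AI" spans; spans from any other scope would be dropped before export.
const AI_SCOPE_PREFIXES = ["ai", "langfuse", "openai", "langchain"];

function isAiScope(scopeName: string): boolean {
  return AI_SCOPE_PREFIXES.some((prefix) => scopeName.startsWith(prefix));
}

// Assumed wiring (per the v4 docs' shouldExportSpan option):
// new LangfuseSpanProcessor({
//   shouldExportSpan: ({ otelSpan }) =>
//     isAiScope(otelSpan.instrumentationScope.name),
// });
```

As the warning you mention notes, exporting a child whose parent was filtered out breaks the trace tree, so filtering at whole-scope granularity is safer than per-span attribute filtering.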
-
This is working for us, but I do notice a runtime warning for media files in Next.js. Would be great if you could update https://github.com/langfuse/langfuse-vercel-ai-nextjs-example. I'm guessing it'll be pretty easy to put up a repro.
-
@hassiebp Is there an example of using the Vercel AI SDK with the generateText / generateObject methods (rather than the streamText method)? I'm guessing we should be able to get OpenTelemetry spans directly in Langfuse without having to call updateActiveObservation() everywhere?
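For what it's worth, the AI SDK's non-streaming calls emit OpenTelemetry spans too when per-call telemetry is switched on, which a registered span processor should then pick up. A minimal sketch, assuming the AI SDK's documented `experimental_telemetry` option; the `telemetryConfig` helper name is mine, not a library API:

```typescript
// Builds the per-call telemetry options for generateText / generateObject.
// 'functionId' labels the resulting trace; 'metadata' is attached to spans.
function telemetryConfig(
  functionId: string,
  metadata: Record<string, string> = {},
) {
  return { isEnabled: true, functionId, metadata };
}

// Assumed usage shape (per the Vercel AI SDK telemetry docs):
// const { text } = await generateText({
//   model: openai("gpt-4o-mini"),
//   prompt: "Hello",
//   experimental_telemetry: telemetryConfig("greeting"),
// });
```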
-
Thanks everybody for your feedback! The JS SDK v4 is now generally available 🚀
-
I'm a bit confused by the new tracing approach's lack of a "root" trace, which is usually where the user/session IDs and the root trace name are set.
-
Hi, are there known compatibility issues with Sentry? I recall seeing something about that in the old docs. When Sentry is enabled, I don't see traces in Langfuse.
-
@hassiebp we followed the steps to set up the new
-
Hi team, we have an existing OTel backend setup with SigNoz. Our current simplified setup looks like this:

```ts
diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.DEBUG);

const resource = detectResources({
  detectors: [awsEcsDetector, containerDetector, envDetector, processDetector],
});

const metricExporter = new OTLPMetricExporter({
  url: `${endpoint}/v1/metrics`,
  headers: {
    // ...
  },
});

const meterProvider = new MeterProvider({
  resource,
  readers: [
    new PeriodicExportingMetricReader({
      exporter: metricExporter,
      exportIntervalMillis: 10000,
    }),
  ],
});

metrics.setGlobalMeterProvider(meterProvider);

const bullMQInstrumentation = new BullMQProInstrumentation({
  useProducerSpanAsConsumerParent: true,
  emitCreateSpansForBulk: false,
});

const sdk = new NodeSDK({
  resource,
  traceExporter: new OTLPTraceExporter({
    url: `${endpoint}/v1/traces`,
    // ...
  }),
  instrumentations: [
    new HttpInstrumentation({
      requireParentforOutgoingSpans: true,
      ignoreIncomingRequestHook: (req) => {
        return req.url?.includes('/health') || false;
      },
    }),
    new UndiciInstrumentation({
      requireParentforSpans: true,
      responseHook: (span, { request }) => {
        if (request.headers) {
          // ...
          if (
            headers.some(([key]) => key && key.toLowerCase() === 'x-webhook-id')
          ) {
            span.setStatus({ code: SpanStatusCode.OK });
          }
        }
      },
    }),
    bullMQInstrumentation,
    new IORedisInstrumentation(),
    new RuntimeNodeInstrumentation({
      monitoringPrecision: 5000,
    }),
  ],
});

sdk.start();
```

We are currently looking for a good way to make this version upgrade while keeping things working on the Langfuse side and keeping traces between Langfuse and SigNoz isolated. I have read the isolated TraceProcessor guide in the docs, but the big warning that the setup can result in orphaned child spans is what's keeping us from making the move.
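One way to think about the isolation is a small delegating processor that forwards only AI-related spans to the Langfuse side while the existing SigNoz exporter stays untouched. The sketch below uses simplified stand-in types rather than the real OTel interfaces, and the class and predicate names are mine:

```typescript
// Simplified stand-ins for the OTel ReadableSpan / SpanProcessor interfaces.
type ReadableSpan = { instrumentationScopeName: string };

interface SpanProcessorLike {
  onEnd(span: ReadableSpan): void;
}

// Forwards a finished span to the inner processor (e.g. a
// LangfuseSpanProcessor) only when the predicate matches.
class FilteringSpanProcessor implements SpanProcessorLike {
  constructor(
    private inner: SpanProcessorLike,
    private predicate: (span: ReadableSpan) => boolean,
  ) {}

  onEnd(span: ReadableSpan): void {
    if (this.predicate(span)) this.inner.onEnd(span);
  }
}
```

The orphan warning you mention applies here too: if a parent span fails the predicate while its children pass, the exported children dangle, so filtering by whole instrumentation scope (all-or-nothing per library) is the safer predicate.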
-
@hassiebp why does the new `LangfuseCallbackHandler` no longer have an `enabled` option?

```ts
new LangfuseCallbackHandler({
  // 'enabled' does not exist anymore
  enabled: this.configService.get<boolean>(
    "LANGFUSE_TRACING_ENABLED",
  ),
  sessionId: ...,
  userId: ...,
  environment: this.configService.get("ENVIRONMENT")!,
  tags: ...,
  metadata: ...,
}),
```

That property was very important; what happened?
-
🚀 Langfuse TypeScript SDK v4 is here!
We’ve rebuilt the JS SDK on top of OpenTelemetry — unlocking better developer experience, robust context management, and seamless integrations with tools like OpenAI, LangChain, and more.
📚 Docs
📖 SDK Reference
Highlights:
- Modular packages (@langfuse/core, @langfuse/client, @langfuse/tracing, etc.)

Installation
We’d love your feedback — please share thoughts, issues, and ideas in this discussion thread.