
Conversation

quanru (Collaborator) commented on Jan 16, 2026

Summary

This PR fixes issue #1677, where Langfuse LLM call tracing was not working.

Root Causes Fixed

1. Incorrect Import Path

The code was importing observeOpenAI from the wrong package:

  • Incorrect: import { observeOpenAI } from 'langfuse'
  • Correct: import { observeOpenAI } from '@langfuse/openai'

According to Langfuse's official documentation, the OpenAI SDK integration should use the @langfuse/openai package, not the base langfuse package.
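
For reference, wrapping an OpenAI client with the correctly imported helper looks like this (a minimal sketch; the client construction is illustrative, not the PR's exact code):

  // Correct import: observeOpenAI lives in the dedicated integration package
  import { observeOpenAI } from '@langfuse/openai';
  import OpenAI from 'openai';

  // Wrap the client so its OpenAI calls are traced by Langfuse
  const openai = observeOpenAI(new OpenAI());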

2. Missing OpenTelemetry Initialization

Langfuse's tracing is built on top of OpenTelemetry, which must be initialized before any traces are recorded. Previously, users had to add this initialization code to their application entry file themselves, a step that was easy to miss.
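
For context, the setup that users previously had to add by hand follows Langfuse's documented OpenTelemetry pattern, roughly:

  import { NodeSDK } from '@opentelemetry/sdk-node';
  import { LangfuseSpanProcessor } from '@langfuse/otel';

  // Register Langfuse's span processor so OpenTelemetry spans reach Langfuse
  const sdk = new NodeSDK({
    spanProcessors: [new LangfuseSpanProcessor()],
  });
  sdk.start();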

Changes

1. Fixed Import Path (packages/core/src/ai-model/service-caller/index.ts:188)

  • Changed import path from 'langfuse' to '@langfuse/openai'

2. Auto-Initialize OpenTelemetry (packages/core/src/ai-model/service-caller/index.ts:190-214)

  • Automatically initialize OpenTelemetry SDK when Langfuse is enabled
  • Add error handling with clear installation instructions
  • Track initialization state to prevent duplicate initialization
  • Users no longer need to manually add initialization code
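
A simplified sketch of what this guarded initialization can look like (the function and flag names here are illustrative, not the literal code at index.ts:190-214):

  let otelInitialized = false; // illustrative flag name

  async function ensureLangfuseOtel(): Promise<void> {
    if (otelInitialized) return; // skip duplicate initialization
    try {
      const { NodeSDK } = await import('@opentelemetry/sdk-node');
      const { LangfuseSpanProcessor } = await import('@langfuse/otel');
      new NodeSDK({ spanProcessors: [new LangfuseSpanProcessor()] }).start();
      otelInitialized = true;
    } catch {
      // Fail with actionable guidance instead of a bare module-not-found error
      throw new Error(
        'Langfuse tracing requires additional packages. Install them with:\n' +
          '  npm install @langfuse/openai @langfuse/otel @opentelemetry/sdk-node',
      );
    }
  }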

3. Documentation Updates

  • Updated Chinese docs: apps/site/docs/zh/model-config.mdx
  • Updated English docs: apps/site/docs/en/model-config.mdx
  • Changed installation command to include all required packages: npm install @langfuse/openai @langfuse/otel @opentelemetry/sdk-node
  • Removed manual OpenTelemetry initialization steps (now automatic)
  • Updated log output examples

Benefits

  • Simpler Setup: No more manual OpenTelemetry initialization required
  • Better Error Messages: Clear instructions if required packages are missing
  • Works Out of the Box: Just install packages and set environment variables
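
The environment variables referred to here are Langfuse's standard credentials (placeholder values shown; a base-URL variable is additionally needed for self-hosted or region-specific instances):

  export LANGFUSE_PUBLIC_KEY="pk-lf-..."
  export LANGFUSE_SECRET_KEY="sk-lf-..."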

Testing

  • ✅ Lint checks passed
  • ✅ Build successful
  • ✅ No type errors

For Users Affected by #1677

If you're experiencing Langfuse tracing issues:

  1. Update packages:

    npm uninstall langfuse
    npm install @langfuse/openai @langfuse/otel @opentelemetry/sdk-node
  2. Remove manual initialization (if you added it):

    • Remove any NodeSDK or LangfuseSpanProcessor initialization code from your application
    • Midscene now handles this automatically
  3. Restart your application

You should see these logs:

DEBUGGING MODE: OpenTelemetry SDK initialized for Langfuse
DEBUGGING MODE: langfuse wrapper enabled

Langfuse tracing should now work correctly!

Fixes #1677 ("can not trace LLM-calls in Langfuse using 'qwen3-vl-plus' model, but in Langsmith it worked")

…/openai'

- Change import from 'langfuse' to '@langfuse/openai' package
- Update documentation to reflect correct package installation
- Fix issue #1677 where Langfuse tracing was not working

The observeOpenAI function is exported from @langfuse/openai package,
not the base langfuse package. This fixes the integration issue where
users couldn't see traces in Langfuse dashboard.
netlify bot commented on Jan 16, 2026

Deploy Preview for midscene ready!

  • 🔨 Latest commit: 68d7605
  • 🔍 Latest deploy log: https://app.netlify.com/projects/midscene/deploys/696edcb345b751000815e8d2
  • 😎 Deploy Preview: https://deploy-preview-1805--midscene.netlify.app

- Automatically initialize OpenTelemetry SDK when Langfuse is enabled
- Remove manual initialization requirement from documentation
- Add error handling with clear installation instructions
- Track initialization state to prevent duplicate initialization

This simplifies the Langfuse setup process - users no longer need to
manually initialize OpenTelemetry in their application entry file.
@quanru quanru marked this pull request as draft January 16, 2026 09:40
@quanru quanru marked this pull request as ready for review January 20, 2026 01:28
@quanru quanru merged commit 06ae1a0 into main Jan 20, 2026
9 checks passed
@quanru quanru deleted the fix/langfuse-import-path branch January 20, 2026 01:44
