
@opentelemetry/instrumentation-openai *crashes* the app when client.chat.completions.create({stream: true, ...}).withResponse() is used #3286

@lucasvieirasilva

Description

What version of OpenTelemetry are you using?

  • @opentelemetry/auto-instrumentations-node: 0.67.2
  • @opentelemetry/instrumentation-openai: 0.7.0

What version of Node are you using?

v24.2.0

What did you do?

When registering getNodeAutoInstrumentations() and calling the OpenAI SDK as follows (requestBody includes stream: true):

const response = await client.chat.completions
        .create(requestBody)
        .withResponse();

I got the following error:

client.chat.completions.create(...).withResponse is not a function
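
For completeness, a minimal standalone reproduction sketch. It assumes the NodeSDK is started before the OpenAI client is created; the model name and prompt are placeholders and OPENAI_API_KEY is read from the environment:

// repro.js - minimal sketch; model and prompt are placeholders.
const { NodeSDK } = require('@opentelemetry/sdk-node');
const { getNodeAutoInstrumentations } = require('@opentelemetry/auto-instrumentations-node');

const sdk = new NodeSDK({ instrumentations: [getNodeAutoInstrumentations()] });
sdk.start();

const { OpenAI } = require('openai');
const client = new OpenAI();

async function main() {
  // Without the instrumentation, create() returns the SDK's APIPromise and
  // withResponse() resolves to { data, response }. With the patch applied,
  // the call below throws "withResponse is not a function".
  const { data: stream, response } = await client.chat.completions
    .create({
      model: 'gpt-4o-mini',
      messages: [{ role: 'user', content: 'Hello' }],
      stream: true,
    })
    .withResponse();

  console.log('status:', response.status);
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
  }
}

main().catch(console.error);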

What did you expect to see?

The @opentelemetry/instrumentation-openai package should also patch the withResponse function, so the call behaves as it does without instrumentation.
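
For context, a simplified sketch of the likely failure mode (hypothetical illustration only, not the instrumentation's actual code): if the patched create() re-wraps the call in a plain async function, the OpenAI SDK's APIPromise is replaced by a native Promise, which has no withResponse() method.

// Hypothetical illustration only; not the instrumentation's source.
const original = client.chat.completions.create.bind(client.chat.completions);

client.chat.completions.create = function patchedCreate(...args) {
  // An async wrapper returns a native Promise, so chained helpers
  // defined on the SDK's APIPromise (such as withResponse) are lost.
  return (async () => {
    // span creation / attribute capture would happen around here
    return original(...args);
  })();
};

// client.chat.completions.create(requestBody).withResponse is now undefined,
// which produces the TypeError reported above.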

What did you see instead?

I got the following error:

client.chat.completions.create(...).withResponse is not a function

Labels

  • bug: Something isn't working
  • has:reproducer: This bug/feature has a minimal reproduction provided
  • pkg:instrumentation-openai
  • priority:p1: Bugs which cause problems in end-user applications such as crashes, data inconsistencies
