fix(openai): fix tracing exception in model response getting method when tracing is disabled #227
Conversation
seratch left a comment
This looks good to me; @dkundel-openai any concern?
This seems odd because if tracing is disabled there should still be a NoopTrace that this should get attached to?
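The idea above can be sketched as follows. This is an illustrative stand-in only: `Trace`, `NoopTrace`, and `currentTrace` are hypothetical names for this sketch, not the actual `@openai/agents-core` API.

```typescript
// Illustrative sketch only: how a no-op trace could let span creation
// succeed when tracing is disabled. All names here are hypothetical.

interface Trace {
  startSpan(name: string): void;
}

// A trace that accepts spans but records nothing.
class NoopTrace implements Trace {
  startSpan(_name: string): void {
    // intentionally does nothing
  }
}

// Resolve the trace a new span should attach to. With tracing disabled,
// fall back to a NoopTrace instead of throwing "No existing trace found".
function currentTrace(tracingDisabled: boolean, active?: Trace): Trace {
  if (active) return active;
  if (tracingDisabled) return new NoopTrace();
  throw new Error('No existing trace found');
}
```

Under a scheme like this, span helpers would always find a trace to attach to, and the exception in the report could not occur.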
The main thing is that not using
@dkundel-openai thanks for the good point. If we figure out the root cause, the fix can be much simpler than rewriting a bunch of code.
seratch left a comment
to avoid confusion, let me revert the approval for now
Hello 👋🏻 @seratch @dkundel-openai, thank you for the reply. The issue can be verified by running the unit test that was added against the original code. When tracing is set to disabled, the code throws a `No existing trace found` error.

The overall usage context was using the (Azure OpenAI Service) GitHub Models API, which doesn't seem to work with the Agents JS SDK as far as I tried, but does work with the Agents Python one. Because this API doesn't support tracing, it must be disabled, but that seems to fail somehow. I tried many ways to disable tracing: via the env var flag, via RunConfig, via globalTraceProvider directly, and via the ModelRequest tracing property. None seemed to have the desired effect, and this seemed like one of the causes of failure (I also might have tried too hard to disable it 😄).
@seratch @dkundel-openai Here is also a sample stack trace when attempting to call the API:

```json
{
  "message": "No existing trace found",
  "name": "Error",
  "stack": "Error: No existing trace found\n    at withNewSpanContext (/Users/xxx-redacted-xxx/node_modules/@openai/agents-openai/node_modules/@openai/agents-core/dist/tracing/context.js:141:15)\n    at <anonymous> (/Users/xxx-redacted-xxx/node_modules/@openai/agents-openai/node_modules/@openai/agents-core/dist/tracing/createSpans.js:6:16)\n    at OpenAIResponsesModel.getResponse (/Users/xxx-redacted-xxx/node_modules/@openai/agents-openai/dist/openaiResponsesModel.js:725:32)\n    at <anonymous> (/Users/xxx-redacted-xxx/node_modules/@openai/agents/node_modules/@openai/agents-core/dist/run.js:151:63)"
}
```

Overall the code looks like this:

```js
// config
const customClient = new OpenAI({ baseURL: _config.github.modelEndpoint, apiKey: _config.github.token })
setTracingDisabled(true)
setDefaultOpenAIClient(customClient)

// usage
const agent = new Agent({
  name: 'Assistant',
  instructions: 'You are a helpful assistant',
})
const result = await run(agent, 'Write a haiku about recursion in programming.', {})
```

The versions used are:

```json
"@openai/agents": "0.0.14",
"openai": "5.10.2"
```
This PR is stale because it has been open for 10 days with no activity.
Any news on this bug?
It's something inside the tracing logic that creates the noop one, but we haven't figured out a viable solution yet.
@seratch On my side I didn't have time to look into it more either; I can't say I fully understand the argument regarding the
@upphiminn Thanks for taking the time to check this issue. I've managed to reproduce the situation with 0.0.16, but it no longer occurs with 0.0.17, which was just released with a fix for a CJS/ES module build issue. Perhaps the library source resolution was the cause. Let me close this one, but please feel free to write in and/or create a new issue if necessary.
This pull request addresses a possible issue in `OpenAIResponsesModel` where an exception could occur while getting a response when tracing is disabled. When tracing is disabled there is no parent trace, so `withResponseSpan` fails because it cannot find a context trace for the span it attempts to create, which causes the whole `getResponse` call to fail. The fix below copies the same style of span creation, creating the span only when tracing is enabled, as already seen in the `getStreamedResponse` method.
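The conditional-span pattern described above can be sketched as follows. This is a simplified illustration under stated assumptions: `TracingConfig`, `withSpan`, and `fetchModelResponse` are hypothetical stand-ins for the SDK internals, not the real `@openai/agents` API.

```typescript
// Simplified stand-ins; not the actual @openai/agents internals.
type TracingConfig = { disabled: boolean };

// Stand-in for withResponseSpan: in the real SDK this requires an
// active trace context and throws "No existing trace found" otherwise.
function withSpan<T>(activeTrace: boolean, fn: () => Promise<T>): Promise<T> {
  if (!activeTrace) throw new Error('No existing trace found');
  return fn();
}

// Stand-in for the underlying model request.
async function fetchModelResponse(): Promise<string> {
  return 'model response';
}

// The fix pattern: wrap the request in a span only when tracing is
// enabled, mirroring what getStreamedResponse already does.
async function getResponse(
  tracing: TracingConfig,
  activeTrace: boolean,
): Promise<string> {
  if (tracing.disabled) {
    // No span is created, so no trace context is required.
    return fetchModelResponse();
  }
  return withSpan(activeTrace, () => fetchModelResponse());
}
```

With this shape, the disabled-tracing path never touches span creation, so the missing trace context can no longer make `getResponse` fail.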