Describe the bug

When `stream: true`, `onTraceEnd` is called right after `onTraceStart`; it does not wait for the stream to be consumed. This causes issues with third-party tracing connectors (e.g. Braintrust) that rely on `onTraceEnd` coming after all spans in the run have finished.

When `stream: false`, this works as expected: `onTraceEnd` is called when the whole `run` call has finished.

The `run` call, when streaming, does not block, so the awaited `fn` returns almost immediately, right after the first `onSpanStart`:
openai-agents-js/packages/agents-core/src/tracing/context.ts, lines 59 to 61 in 34a9528:

```ts
await trace.start();
const result = await fn(trace);
await trace.end();
```
A workaround seems to be wrapping our own `withTrace` around the `run` call, but I'm not sure how that interacts with the internal `withTrace` calls. Workarounds welcome!
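The early-`onTraceEnd` ordering can be reproduced without the SDK. Below is a minimal self-contained sketch (mock names, not the real SDK API) in which a `withTrace` helper mirrors the quoted context.ts logic — await `fn`, then end the trace — while the "run" returns an unconsumed async generator:

```typescript
// Event log so the ordering is observable.
const events: string[] = [];

// Mirrors the context.ts pattern: start trace, await fn, end trace.
async function withTrace<T>(fn: () => Promise<T>): Promise<T> {
  events.push('onTraceStart');
  const result = await fn(); // resolves as soon as the stream object exists
  events.push('onTraceEnd'); // fires before any chunk has been consumed
  return result;
}

// A "streaming run": returns immediately; span events only fire on iteration.
async function streamingRun(): Promise<AsyncGenerator<string>> {
  async function* stream() {
    events.push('onSpanStart');
    yield 'chunk';
    events.push('onSpanEnd');
  }
  return stream();
}

const stream = await withTrace(() => streamingRun());
for await (const chunk of stream) {
  void chunk; // consumption happens after withTrace has already returned
}

console.log(events.join(' -> '));
// -> onTraceStart -> onTraceEnd -> onSpanStart -> onSpanEnd
```

Because the generator body does not run until it is iterated, both span events land after `onTraceEnd` — the same inversion the connectors trip over.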
Debug information

- Agents SDK version: v0.1.8
- Runtime environment: Node.js 22.15.0
Repro steps
class MyCustomProcessor implements TracingProcessor {
async onTraceStart(trace: Trace) {
console.log(`Trace started: ${trace.name}`);
}
async onSpanStart(span: Span) {
const spanData = span.spanData;
if (spanData?.type === 'generation') {
console.log(` LLM Generation started for prompt: "${spanData.input}"`);
} else if (spanData?.type === 'function') {
console.log(` Tool call started for function: ${spanData.name}`);
}
}
async onSpanEnd(span: Span) {
const spanData = span.spanData;
if (spanData?.type === 'generation' && spanData.output) {
console.log(` LLM Generation ended with output: "${spanData.output}"`);
}
}
async onTraceEnd(trace: Trace) {
console.log(`Trace ended: ${trace.name}`);
console.log(`Trace ID: ${trace.traceId}`);
}
async shutdown(timeout?: number) {}
async forceFlush() {}
}
// use it
setTraceProcessors([new MyCustomProcessor()]);
Expected behavior

My expectation is that streaming follows the `stream: false` behavior: `onTraceEnd` is called after the whole stream has been consumed and the `run` call is over.
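For comparison, here is a sketch of the expected ordering using the same self-contained mock (hypothetical names, not the SDK API): the trace is ended only after the stream has been fully drained.

```typescript
const events: string[] = [];

// Variant that defers onTraceEnd until the stream is fully consumed.
async function withTraceConsumed<T>(
  fn: () => Promise<AsyncIterable<T>>,
): Promise<T[]> {
  events.push('onTraceStart');
  const chunks: T[] = [];
  for await (const chunk of await fn()) {
    chunks.push(chunk); // consume inside the traced scope
  }
  events.push('onTraceEnd'); // now fires after all span events
  return chunks;
}

// Fake streaming run whose span events fire lazily on iteration.
async function* fakeStream() {
  events.push('onSpanStart');
  yield 'chunk';
  events.push('onSpanEnd');
}

await withTraceConsumed(async () => fakeStream());
console.log(events.join(' -> '));
// -> onTraceStart -> onSpanStart -> onSpanEnd -> onTraceEnd
```

This is the ordering third-party connectors assume; with `stream: false` the SDK already produces it, since `fn` only resolves once the run is complete.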