Bug description
When using Azure.AI.Agents.Persistent through Microsoft Agent Framework streaming APIs, a chat run can complete successfully, return the full assistant answer, and then crash during stream finalization/disposal with a System.NullReferenceException.
The exception originates inside the Azure SDK telemetry/finalization path:
Azure.AI.Agents.Persistent.RunStepOpenAPIToolCall.JsonModelWriteCore(...)
Package and environment
- Package: Azure.AI.Agents.Persistent
- Package version: 1.0.0-preview.260311.1
- App shape: .NET 10 console app using Microsoft Agent Framework with Azure Foundry persistent agents
- OS: Windows
- Auth: DefaultAzureCredential
- Streaming path: AIAgent.RunStreamingAsync(...)
Expected behavior
After the streamed answer completes, enumeration/disposal should finish cleanly.
Actual behavior
The full assistant answer is returned to the caller, but the process crashes afterward during stream cleanup/final telemetry processing.
Reproduction notes
- Create a persistent Foundry agent and connect with PersistentAgentsClient/ChatClientAgent.
- Wrap the agent with Agent Framework OpenTelemetry instrumentation.
- Call RunStreamingAsync(...) and enumerate the response stream to completion.
- Observe that, after the final assistant output is emitted, the SDK throws NullReferenceException during finalization.
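A minimal repro sketch of the steps above. The endpoint, agent wiring, and Agent Framework namespaces are placeholders/assumptions, not verbatim from a working project:

```csharp
// Hypothetical repro sketch; endpoint and agent setup are placeholders.
using Azure.Identity;
using Azure.AI.Agents.Persistent;

var client = new PersistentAgentsClient(
    "https://<your-foundry-endpoint>",          // placeholder endpoint
    new DefaultAzureCredential());

// Wrap the persistent agent with Agent Framework (e.g. ChatClientAgent)
// plus OpenTelemetry instrumentation; details elided here.
AIAgent agent = CreateInstrumentedAgent(client); // hypothetical helper

await foreach (var update in agent.RunStreamingAsync("Hello"))
{
    Console.Write(update); // the full answer streams correctly
}
// Crash occurs here, after enumeration completes, during stream
// disposal / telemetry finalization inside the SDK.
```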
Minimal observed output
assistant(stream)> <full answer omitted>
[update] role=assistant, messageId=..., author=..., responseId=...
[update] role=assistant, messageId=..., author=..., responseId=...
[content:UsageContent] Microsoft.Extensions.AI.UsageContent
Unhandled exception. System.NullReferenceException: Object reference not set to an instance of an object.
at Azure.AI.Agents.Persistent.RunStepOpenAPIToolCall.JsonModelWriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options)
at Azure.AI.Agents.Persistent.RunStepOpenAPIToolCall.System.ClientModel.Primitives.IJsonModel<Azure.AI.Agents.Persistent.RunStepOpenAPIToolCall>.Write(...)
at Azure.AI.Agents.Persistent.ModelSerializationExtensions.WriteObjectValue[T](...)
at Azure.AI.Agents.Persistent.RunStepOpenAPIToolCall.ToRequestContent()
at Azure.AI.Agents.Persistent.Telemetry.OpenTelemetryScope.GetToolCallAttributes(RunStepToolCall toolCall)
at Azure.AI.Agents.Persistent.Telemetry.OpenTelemetryScope.ProcessToolCalls(RunStepToolCallDetails toolCallDetails)
at Azure.AI.Agents.Persistent.Telemetry.OpenTelemetryScope.RecordRunStepEventAttributes(RunStep runStep, Boolean stream)
at Azure.AI.Agents.Persistent.Telemetry.OpenTelemetryScope.End()
at Azure.AI.Agents.Persistent.Telemetry.OpenTelemetryScope.Dispose()
Suspected root cause
The generated serializer for RunStepOpenAPIToolCall appears to iterate its OpenAPI property without first checking it for null, which would explain the NullReferenceException when the tool-call details are only partially populated at telemetry-finalization time.
Relevant file: sdk/ai/Azure.AI.Agents.Persistent/src/Generated/RunStepOpenAPIToolCall.Serialization.cs
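A sketch of the kind of null-safe guard being requested. The property and JSON member names below are assumptions based on the type name, not the actual generated code:

```csharp
// Hypothetical null-safe guard in the generated serializer
// (RunStepOpenAPIToolCall.JsonModelWriteCore); member names are assumed.
protected override void JsonModelWriteCore(
    Utf8JsonWriter writer, ModelReaderWriterOptions options)
{
    base.JsonModelWriteCore(writer, options);
    writer.WritePropertyName("open_api"u8);
    writer.WriteStartObject();
    if (OpenAPI != null) // guard added: can be null during streamed runs
    {
        foreach (var item in OpenAPI)
        {
            writer.WritePropertyName(item.Key);
            writer.WriteStringValue(item.Value);
        }
    }
    writer.WriteEndObject();
}
```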
Impact
- The user-visible answer is already complete, but the client process still terminates.
- This makes streamed chat runs unreliable in production.
- The issue appears tied to SDK finalization/telemetry, not application rendering.
Workaround currently used
Application-side workaround: after the stream completes, catch this specific NullReferenceException and preserve the already-completed answer.
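The workaround can be sketched as follows. Filtering on the stack trace to match only this SDK failure is an application-level heuristic, not an SDK-sanctioned pattern:

```csharp
// Workaround sketch: keep the completed answer and swallow only this
// specific SDK-finalization failure. Filter logic is a heuristic.
using System.Text;

var answer = new StringBuilder();
try
{
    await foreach (var update in agent.RunStreamingAsync(prompt))
    {
        answer.Append(update);
    }
}
catch (NullReferenceException ex)
    when (ex.StackTrace?.Contains("RunStepOpenAPIToolCall") == true)
{
    // The user-visible answer is already complete; the NRE originates in
    // SDK telemetry finalization, so log and continue with the output.
}
Console.WriteLine(answer);
```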
Request
Please confirm whether this is a known bug in the Azure.AI.Agents.Persistent preview, and whether a null-safe guard should be added around RunStepOpenAPIToolCall serialization / telemetry attribute generation.