
Conversation

@joslat (Contributor) commented Jan 17, 2026

Motivation and Context

Please help reviewers and future users by providing the following information:

  1. Why is this change required?
    Fixes .NET: Issue with Foundational Sample #3269
  2. What problem does it solve?
    .NET: Issue with Foundational Sample #3269
  3. If it fixes an open issue, please link to the issue here.
    .NET: Issue with Foundational Sample #3269

Description

Just some changes and adjustments to fix the sample.

Contribution Checklist

  • [x] The code builds clean without any errors or warnings
  • [x] The PR follows the Contribution Guidelines
  • [x] All unit tests pass, and I have added new tests where possible
  • Is this a breaking change? If yes, add "[BREAKING]" prefix to the title of the PR.

Copilot AI review requested due to automatic review settings January 17, 2026 19:16
@markwallace-microsoft added the .NET and workflows (Related to Workflows in agent-framework) labels Jan 17, 2026
@github-actions bot changed the title from "Joslat fix sample issue" to ".NET: Joslat fix sample issue" Jan 17, 2026
Copilot AI (Contributor) left a comment

Pull request overview

This PR fixes issue #3269 in the MixedWorkflowAgentsAndExecutors sample by correcting a type mismatch between executors and the AIAgentHostExecutor. The AIAgentHostExecutor sends messages as List<ChatMessage>, but the downstream executors were expecting a single ChatMessage, causing runtime routing failures.

Changes:

  • Fixed type signatures in JailbreakSyncExecutor and FinalOutputExecutor to accept List<ChatMessage> instead of ChatMessage
  • Added XML remarks documenting the type matching requirement
  • Changed ShowAgentThinking from false to true for better debugging visibility
  • Modified deployment name configuration to use a hardcoded constant
Comments suppressed due to low confidence (2)

dotnet/samples/GettingStarted/Workflows/_Foundational/07_MixedWorkflowAgentsAndExecutors/Program.cs:45

  • The deployment name should be read from environment variables as per the sample code guidelines. All other samples in this repository use the pattern: var deploymentName = Environment.GetEnvironmentVariable("AZURE_OPENAI_DEPLOYMENT_NAME") ?? "gpt-4o-mini"; This ensures consistency across samples and allows users to configure their deployment name without modifying code. The hardcoded constant with a commented-out environment variable read deviates from established patterns.
        var deploymentName = Environment.GetEnvironmentVariable("AZURE_OPENAI_DEPLOYMENT_NAME") ?? "gpt-4o-mini";

dotnet/samples/GettingStarted/Workflows/_Foundational/07_MixedWorkflowAgentsAndExecutors/Program.cs:46

  • According to the sample code guidelines, "Prefer defining variables using types rather than var, to help users understand the types involved." The variable should be declared as IChatClient chatClient instead of using var to make the type explicit for users learning from this sample.
        var chatClient = new AzureOpenAIClient(new Uri(endpoint), new AzureCliCredential()).GetChatClient(deploymentName).AsIChatClient();
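Taken together, the two suggestions above would look roughly like the following sketch. It is illustrative rather than the sample's actual code, and it assumes the sample's existing endpoint variable and the same AzureOpenAIClient/AzureCliCredential setup shown in the quoted line.

// Read the deployment name from the environment (with a fallback) and declare the
// chat client with its explicit interface type instead of var.
string deploymentName = Environment.GetEnvironmentVariable("AZURE_OPENAI_DEPLOYMENT_NAME") ?? "gpt-4o-mini";
IChatClient chatClient = new AzureOpenAIClient(new Uri(endpoint), new AzureCliCredential())
    .GetChatClient(deploymentName)
    .AsIChatClient();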

@joslat (Contributor, Author) commented Jan 17, 2026

Root Cause Analysis

Breaking Change Assessment: AIAgentHostExecutor Message Output

Summary

The JailbreakSyncExecutor (and similar executors expecting ChatMessage from agents) stopped working due to a framework change in how AIAgentHostExecutor sends agent responses to downstream executors.

Root Cause

Commit Details

What Changed

Before (old behavior):

// Individual ChatMessage objects were sent one by one as they were built from streaming updates
ChatMessage? currentStreamingMessage = null;
// ... streaming logic ...
await context.SendMessageAsync(currentStreamingMessage, cancellationToken: cancellationToken).ConfigureAwait(false);

After (new behavior):

// The entire Messages collection is sent at once
await context.SendMessageAsync(updates.ToAgentRunResponse().Messages, cancellationToken: cancellationToken).ConfigureAwait(false);
// or in non-streaming mode:
await context.SendMessageAsync(response.Messages, cancellationToken: cancellationToken).ConfigureAwait(false);

Technical Details

Message Router Behavior

The MessageRouter in the workflow framework uses exact runtime type matching:

// From MessageRouter.RouteMessageAsync
if (this._typedHandlers.TryGetValue(message.GetType(), out MessageHandlerF? handler))
{
    result = await handler(message, context, cancellationToken).ConfigureAwait(false);
}

This means:

  • Executor<ChatMessage> registers a handler for type ChatMessage
  • Executor<List<ChatMessage>> registers a handler for type List<ChatMessage>
  • The router checks message.GetType(), which returns the runtime type, not the declared interface type
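A tiny stand-alone illustration of that exact-type lookup follows. This mimics the router's handler table; it is not framework code, and it assumes a plain .NET console project with Microsoft.Extensions.AI available for ChatMessage.

using System;
using System.Collections.Generic;
using Microsoft.Extensions.AI;

// Handlers keyed by exact runtime type, as in MessageRouter's _typedHandlers.
Dictionary<Type, string> typedHandlers = new()
{
    [typeof(ChatMessage)] = "handler registered by Executor<ChatMessage>",
    [typeof(List<ChatMessage>)] = "handler registered by Executor<List<ChatMessage>>",
};

// The agent host now sends a whole list, so the lookup key is List<ChatMessage>.
object message = new List<ChatMessage> { new ChatMessage(ChatRole.Assistant, "hello") };
Console.WriteLine(typedHandlers.TryGetValue(message.GetType(), out string? handler)
    ? handler                                   // prints the List<ChatMessage> handler
    : "no matching handler -> message is not routed");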

Type Mismatch

  • AgentRunResponse.Messages is declared as IList<ChatMessage>
  • The actual runtime type is List<ChatMessage> (from new List<ChatMessage>(1))
  • An Executor<ChatMessage> cannot handle List<ChatMessage> - they are different types
  • An Executor<IList<ChatMessage>> also won't work because the router uses exact type matching
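The runtime-versus-declared-type point can be checked directly with a few lines. This is again just an illustration, not framework code; it mirrors how AgentRunResponse.Messages is declared as IList<ChatMessage> but constructed as List<ChatMessage>.

using System;
using System.Collections.Generic;
using Microsoft.Extensions.AI;

// Declared as the interface, constructed as the concrete list.
IList<ChatMessage> messages = new List<ChatMessage>(1) { new ChatMessage(ChatRole.Assistant, "hello") };

Console.WriteLine(messages.GetType() == typeof(List<ChatMessage>));   // True: the runtime type is the concrete list
Console.WriteLine(messages.GetType() == typeof(IList<ChatMessage>));  // False: the declared interface is not the runtime type
Console.WriteLine(messages.GetType() == typeof(ChatMessage));         // False: a list is not a single ChatMessage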

Impact on Sample Code

Original Code (broken)

internal sealed class JailbreakSyncExecutor() : Executor<ChatMessage>("JailbreakSync")
{
    public override async ValueTask HandleAsync(ChatMessage message, ...)

Fixed Code (working)

internal sealed class JailbreakSyncExecutor() : Executor<List<ChatMessage>>("JailbreakSync")
{
    public override async ValueTask HandleAsync(List<ChatMessage> message, ...)
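A fuller sketch of the fixed pattern is shown below. It is not the sample's exact code: the IWorkflowContext parameter and the CancellationToken are assumptions inferred from the snippets above, so verify them against the Executor<TInput> base class in Microsoft.Agents.AI.Workflows.

// Sketch only: the handler parameter types are assumed, not verified against the framework.
internal sealed class JailbreakSyncExecutorSketch() : Executor<List<ChatMessage>>("JailbreakSync")
{
    public override async ValueTask HandleAsync(List<ChatMessage> messages, IWorkflowContext context, CancellationToken cancellationToken)
    {
        // The agent's full response now arrives as a list; the last entry is the final assistant message.
        ChatMessage lastMessage = messages[messages.Count - 1];

        // ... run the jailbreak check against lastMessage.Text here ...

        // Forward the messages to the next node in the workflow.
        await context.SendMessageAsync(messages, cancellationToken: cancellationToken).ConfigureAwait(false);
    }
}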

Affected Patterns

Any executor that:

  1. Follows an AIAgent in a workflow edge (.AddEdge(agent, executor))
  2. Expects to receive ChatMessage as input

Must now be changed to accept List<ChatMessage> instead.

Recommendation

When connecting a custom executor after an AI agent in a workflow, use:

  • Executor<List<ChatMessage>> for void handlers
  • Executor<List<ChatMessage>, TOutput> for handlers that return a value

Related Files Changed

  • dotnet/src/Microsoft.Agents.AI.Workflows/Specialized/AIAgentHostExecutor.cs

Workaround Alternatives

  1. Use List<ChatMessage> (recommended): Change executor input type to match the runtime type
  2. Create an adapter executor: Build a ChatProtocolExecutor-based adapter that handles List<ChatMessage> and forwards individual messages (see the sketch after this list)
  3. Framework enhancement: The framework could potentially be enhanced to support interface-based type matching or automatic collection unwrapping
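For option 2, a minimal adapter could look like the sketch below. It is hypothetical: it uses the plain Executor<List<ChatMessage>> base rather than ChatProtocolExecutor, the executor name is made up, and the IWorkflowContext and CancellationToken parameters are the same unverified signature assumptions as in the sketch above. It accepts the List<ChatMessage> the agent host now sends and re-emits each ChatMessage individually, so existing Executor<ChatMessage> nodes downstream keep receiving single messages.

// Hypothetical adapter: unpacks the list sent by AIAgentHostExecutor and forwards messages one by one.
internal sealed class MessageUnbatchingExecutor() : Executor<List<ChatMessage>>("MessageUnbatch")
{
    public override async ValueTask HandleAsync(List<ChatMessage> messages, IWorkflowContext context, CancellationToken cancellationToken)
    {
        foreach (ChatMessage message in messages)
        {
            // Each message goes out individually, restoring the pre-change routing shape.
            await context.SendMessageAsync(message, cancellationToken: cancellationToken).ConfigureAwait(false);
        }
    }
}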

