
DefaultChatClient returns null content when using Bedrock Converse API with openai.gpt-oss models #4522

@DrSatyr

Description


Bug description
When using Spring AI with the Bedrock Converse API and models such as 'openai.gpt-oss-20b-1:0' or 'openai.gpt-oss-120b-1:0', DefaultChatClient does not correctly extract the assistant response.

The model returns multiple Generation objects in the ChatResponse:

  • The first Generation has textContent = null and finishReason = end_turn. (It corresponds to the reasoning output returned by the Bedrock client, but its textContent is missing.)
  • The second Generation contains the actual assistant message, with finishReason = end_turn and textContent = "Hello! How can I help you today?".

However, DefaultCallResponseSpec.content() and DefaultCallResponseSpec.entity() only read the textContent of the first Generation, so null is returned instead of the real model output.
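
For illustration, the failing path is roughly equivalent to reading only the first Generation of the ChatResponse (a minimal sketch, assuming the Spring AI 1.x accessors ChatResponse#getResult(), Generation#getOutput() and AssistantMessage#getText(); 'chatClient' is a placeholder for any ChatClient backed by Bedrock Converse):

var chatResponse = chatClient.prompt("Hello").call().chatResponse();

// content() effectively consults only the first Generation, whose text is
// null for these models because it carries the reasoning output.
var text = chatResponse.getResult()   // Generation[0] only
    .getOutput()                      // AssistantMessage
    .getText();                       // null, while Generation[1] holds the real answer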

Environment
Bedrock Converse API + openai.gpt-oss-20b-1:0 or openai.gpt-oss-120b-1:0

Steps to reproduce

var content = chatClientRegistry
    .get(getType())
    .prompt("Hello")
    .call()
    .content();

System.out.println(content); // returns null

Inspecting chatResponse() shows two generations:

  • Generation[0]: textContent = null, finishReason = end_turn
  • Generation[1]: textContent = "Hello! How can I help you today?", finishReason = end_turn
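
As a temporary workaround (a sketch only, assuming ChatResponse#getResults() and AssistantMessage#getText() from Spring AI 1.x), the non-null text can be picked out of the full ChatResponse by hand:

var chatResponse = chatClientRegistry
    .get(getType())
    .prompt("Hello")
    .call()
    .chatResponse();

// Take the first generation that actually carries text, skipping the
// reasoning generation whose textContent is null.
var content = chatResponse.getResults().stream()
    .map(generation -> generation.getOutput().getText())
    .filter(java.util.Objects::nonNull)
    .findFirst()
    .orElse(null);

System.out.println(content); // "Hello! How can I help you today?"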

Expected behavior
When using the Bedrock Converse API with openai.gpt-oss-20b-1:0 (or openai.gpt-oss-120b-1:0), the content() method should return the actual model output instead of null.
