
Conversation

Contributor

@rafaelrddc rafaelrddc commented Aug 25, 2025

Description
This PR adds support for the OpenAI service_tier parameter in Spring AI's OpenAiChatOptions, allowing users to specify the processing tier OpenAI uses to serve their requests.

Changes

  • Added serviceTier field to OpenAiChatOptions class
  • Updated builder pattern to include serviceTier() method
  • Added appropriate getter/setter methods
  • Updated documentation to reflect the new option

Resolves #4235
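The changes listed above can be sketched roughly as follows. This is an illustrative simplification, not the actual Spring AI source: the class name `OpenAiChatOptionsSketch` is hypothetical, and the real `OpenAiChatOptions` carries many other fields and Jackson annotations.

```java
// Illustrative sketch only -- not the actual Spring AI class.
// Shows the shape of the change: a serviceTier field, getter/setter,
// and a builder method, mirroring the PR description.
public class OpenAiChatOptionsSketch {

    // Serialized as "service_tier" in the request body;
    // null means "use the account/API default".
    private String serviceTier;

    public String getServiceTier() {
        return this.serviceTier;
    }

    public void setServiceTier(String serviceTier) {
        this.serviceTier = serviceTier;
    }

    public static Builder builder() {
        return new Builder();
    }

    public static final class Builder {

        private final OpenAiChatOptionsSketch options = new OpenAiChatOptionsSketch();

        // Fluent builder method added by the PR.
        public Builder serviceTier(String serviceTier) {
            this.options.serviceTier = serviceTier;
            return this;
        }

        public OpenAiChatOptionsSketch build() {
            return this.options;
        }
    }
}
```

Usage would then look like `OpenAiChatOptionsSketch.builder().serviceTier("default").build()`, analogous to the other builder options on the class.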

@rafaelrddc rafaelrddc marked this pull request as ready for review August 25, 2025 16:55
@sobychacko
Contributor

@rafaelrddc Can we use an enum for service_tier if there are advertised values for it? That way, this will be more type-safe. Additionally, could you please add a few tests that validate the feature in the chat response? Thanks!

@rafaelrddc
Contributor Author

@sobychacko I have just added tests for the implementation. Although the service_tier field already has some predefined values, I believe it is more appropriate to keep it open, similar to how it's done for the model field. This approach provides greater flexibility and prevents integration breaks if OpenAI changes or adds new values in the future.

To aid use and validation, I also created an enum with the currently supported values, so users can opt into it when they want stricter checking of the values they pass.
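The "open string plus convenience enum" approach described above can be sketched like this. The enum constants are an assumption based on OpenAI's advertised tiers at the time, and the class/member names are illustrative, not the actual merged code:

```java
// Sketch of keeping service_tier an open string (like the model field)
// while offering an enum for type-safe use of the advertised values.
public class ServiceTierSketch {

    // Assumed advertised values; the exact list may differ from OpenAI's docs.
    public enum ServiceTier {

        AUTO("auto"),
        DEFAULT("default"),
        FLEX("flex");

        private final String value;

        ServiceTier(String value) {
            this.value = value;
        }

        public String getValue() {
            return this.value;
        }
    }

    // Stored as a plain string so new tiers from OpenAI don't break callers.
    private String serviceTier;

    // Open setter: accepts any value, including tiers added after release.
    public void setServiceTier(String serviceTier) {
        this.serviceTier = serviceTier;
    }

    // Type-safe overload: delegates to the open string representation.
    public void setServiceTier(ServiceTier tier) {
        this.serviceTier = tier.getValue();
    }

    public String getServiceTier() {
        return this.serviceTier;
    }
}
```

The design trade-off matches how the model field is handled: the enum documents the known values without making them a hard constraint, so a new tier on OpenAI's side requires no library change.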

Closes gh-4235

Signed-off-by: Rafael Cunha <[email protected]>
@sobychacko
Contributor

Thanks for the updates. Merged via ad2e1bc.

@sobychacko sobychacko closed this Aug 27, 2025
