
Conversation

eycjur
Contributor

@eycjur eycjur commented Oct 4, 2025

Support temperature for GPT-5 Chat model

GPT-5 Chat is not a reasoning model, and it supports parameters like temperature and tool_choice that are unavailable on reasoning models, so I modified gpt_5_transformation to apply only when the model is a reasoning model.

Note: since this PR changes the public interface, it is also possible, if that change is undesirable, to keep the AzureOpenAIGPT5Config.is_model_gpt_5_model interface and have it return False for GPT-5 Chat models.
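The idea can be sketched as follows. This is a minimal illustration, not LiteLLM's actual code: the function names, the parameter set, and the model-name check are assumptions made for the example.

```python
# Hypothetical sketch of the guard this PR describes: apply the GPT-5
# parameter-dropping transformation only to reasoning variants, so chat
# variants keep temperature / tool_choice support.

# Illustrative set of params that reasoning models reject (assumption).
REASONING_UNSUPPORTED_PARAMS = {"temperature", "top_p", "tool_choice"}


def is_gpt_5_reasoning_model(model: str) -> bool:
    """Chat variants (e.g. 'gpt-5-chat-latest') are not reasoning models."""
    return model.startswith("gpt-5") and "chat" not in model


def transform_params(model: str, params: dict) -> dict:
    """Drop reasoning-unsupported params only for reasoning variants."""
    if not is_gpt_5_reasoning_model(model):
        # Chat model: pass the params through unchanged.
        return dict(params)
    return {k: v for k, v in params.items()
            if k not in REASONING_UNSUPPORTED_PARAMS}
```

With this guard, a request to a chat variant retains its temperature, while the same request to a reasoning variant has it stripped.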

Relevant issues

Fixes #13781 and #14704

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory. Adding at least 1 test is a hard requirement - see details
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

🐛 Bug Fix

Changes


vercel bot commented Oct 4, 2025

@eycjur is attempting to deploy a commit to the CLERKIEAI Team on Vercel.

A member of the Team first needs to authorize it.

@eycjur eycjur changed the title [Fix] Support temperature for GPT-5 Chat model [Fix] Support non-reasoning parameter for GPT-5 Chat model Oct 4, 2025

Successfully merging this pull request may close these issues.

[Bug]: OpenAI GPT-5 Chat model does not support "temperature" parameter