fix: remove temperature parameter for GPT-5 models #6966
Conversation
GPT-5 no longer supports the temperature parameter in its API. This commit removes the temperature field from GPT-5 requests to prevent API errors when using LiteLLM or other proxies. Fixes #6965
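The fix described above amounts to building the request body conditionally. Here is a minimal sketch of that idea; the names (`buildRequestBody`, `RequestBody`, `supportsTemperature`) are illustrative, not the PR's actual code:

```typescript
// Illustrative sketch only: omit `temperature` from the request body for
// model families whose APIs reject it, per this PR (o1, o3-mini, gpt-5).
interface RequestBody {
	model: string;
	stream: boolean;
	temperature?: number;
}

// Hypothetical helper: string-prefix check mirroring the PR's diff.
function supportsTemperature(modelId: string): boolean {
	return !(
		modelId.startsWith("o1") ||
		modelId.startsWith("o3-mini") ||
		modelId.startsWith("gpt-5")
	);
}

function buildRequestBody(modelId: string, temperature?: number): RequestBody {
	const body: RequestBody = { model: modelId, stream: true };
	// Attach temperature only when the model supports it; otherwise leave the
	// field out entirely so strict proxies (e.g. LiteLLM) don't reject the call.
	if (supportsTemperature(modelId) && temperature !== undefined) {
		body.temperature = temperature;
	}
	return body;
}
```

The key detail is omitting the field rather than sending `temperature: undefined` or a default value, since a strict proxy may reject any request that carries the key at all.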
Reviewing my own code is like looking in a mirror and realizing I'm just a recursive function with no base case.
```diff
@@ -273,7 +273,7 @@ export class OpenAiNativeHandler extends BaseProvider implements SingleCompletio
 	stream: boolean
 	reasoning?: { effort: ReasoningEffortWithMinimal; summary?: "auto" }
 	text?: { verbosity: VerbosityLevel }
-	temperature?: number
+	// temperature parameter removed - GPT-5 no longer supports it
```
Minor style inconsistency: This comment uses a different format than the line comment on line 292. Could we standardize on one comment style for consistency?
```diff
 // TODO: Add a `supportsTemperature` field to the model info.
-if (modelId.startsWith("o1") || modelId.startsWith("o3-mini")) {
+if (modelId.startsWith("o1") || modelId.startsWith("o3-mini") || modelId.startsWith("gpt-5")) {
```
The existing TODO mentions adding a `supportsTemperature` field. Since we're now handling GPT-5 alongside o1 and o3-mini, would it be worth prioritizing this refactor to make temperature support more explicit in the model definitions?
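The refactor suggested by the TODO could be sketched roughly as follows. The `ModelInfo` shape and registry entries here are hypothetical, not the repo's actual model definitions:

```typescript
// Hypothetical sketch: declare temperature support on each model's info
// instead of accumulating string-prefix checks.
interface ModelInfo {
	supportsTemperature: boolean;
}

// Illustrative registry; real model definitions live elsewhere in the repo.
const models: Record<string, ModelInfo> = {
	"gpt-4o": { supportsTemperature: true },
	"o1": { supportsTemperature: false },
	"o3-mini": { supportsTemperature: false },
	"gpt-5": { supportsTemperature: false },
};

function applyTemperature(modelId: string, params: { temperature?: number }) {
	const info = models[modelId];
	// Drop temperature whenever the model declares it unsupported.
	if (info && !info.supportsTemperature) {
		delete params.temperature;
	}
	return params;
}
```

This keeps the capability data next to the model definition, so adding the next temperature-less model is a one-line registry change rather than another `startsWith` clause.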
Closing in favor of #6969
Summary
This PR fixes an issue where GPT-5 models were sending the temperature parameter, which is no longer supported by the GPT-5 API. This was causing errors when using LiteLLM or other proxy services.
Changes
- Removed the `temperature` parameter from GPT-5 Responses API requests in `openai-native.ts`
- Updated `model-params.ts` to exclude temperature for GPT-5 models (similar to o1 and o3-mini models)

Testing
- Updated `openai-native.spec.ts` and `model-params.spec.ts`

Related Issue
Fixes #6965

Impact
This change ensures that GPT-5 models work correctly with LiteLLM and other proxy services that strictly enforce the GPT-5 API specification.
Important

Remove unsupported `temperature` parameter for GPT-5 models in API requests and tests.

- Removed `temperature` parameter from GPT-5 API requests in `openai-native.ts`.
- Updated `model-params.ts` to exclude `temperature` for GPT-5 models.
- Updated `openai-native.spec.ts` to reflect removal of `temperature` parameter for GPT-5 models.

This description was generated automatically for 10eeece and will update as commits are pushed.