Closed · Changes from 1 commit
10 changes: 8 additions & 2 deletions src/api/providers/openai-native.ts
@@ -147,10 +147,13 @@ export class OpenAiNativeHandler extends BaseProvider implements SingleCompletio
): ApiStream {
const { reasoning, verbosity } = this.getModel()

+ // For GPT-5 models, use temperature 1.0 as default when user hasn't set a temperature
Contributor Author
Is this intentional? We're setting up temperature logic for GPT-5 models in handleDefaultModelMessage, but GPT-5 models are actually routed to handleGpt5Message (line 85), which doesn't use this temperature setting at all, so the logic here would never apply to them.

Collaborator

@daniel-lxs is it right?
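A minimal sketch of the routing concern raised above (a hypothetical simplification; the real handler methods take full message parameters, and the id check here is an assumption standing in for the codebase's isGpt5Model):

```typescript
// Hypothetical simplification of the handler dispatch described in the comment
// above: GPT-5 ids branch to a dedicated handler before the default path's
// temperature logic ever runs.
function isGpt5Model(modelId: string): boolean {
  // Assumed id convention; the real check lives in OpenAiNativeHandler.
  return modelId.startsWith("gpt-5");
}

function routeMessage(modelId: string): "handleGpt5Message" | "handleDefaultModelMessage" {
  if (isGpt5Model(modelId)) {
    // This branch never reads the defaultTemp computed in handleDefaultModelMessage.
    return "handleGpt5Message";
  }
  return "handleDefaultModelMessage";
}
```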

+ const defaultTemp = this.isGpt5Model(model.id) ? 1.0 : OPENAI_NATIVE_DEFAULT_TEMPERATURE
+
// Prepare the request parameters
const params: any = {
model: model.id,
- temperature: this.options.modelTemperature ?? OPENAI_NATIVE_DEFAULT_TEMPERATURE,
+ temperature: this.options.modelTemperature ?? defaultTemp,
Contributor Author
The temperature is being set here, but could we add a comment explaining why GPT-5 models specifically need a temperature of 1.0? This would help future maintainers understand the reasoning behind this special case.
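One possible shape for that comment, as a hedged sketch (the stated rationale — GPT-5 models only accepting their default temperature of 1 — is an assumption and should be verified against the API docs; the 0 default and resolveTemperature name are placeholders):

```typescript
const OPENAI_NATIVE_DEFAULT_TEMPERATURE = 0; // placeholder value for this sketch

function resolveTemperature(userTemperature: number | undefined, isGpt5: boolean): number {
  // GPT-5 models are assumed to accept only their default temperature of 1.0,
  // so fall back to 1.0 rather than the provider-wide default when the user
  // has not set a temperature explicitly.
  const defaultTemp = isGpt5 ? 1.0 : OPENAI_NATIVE_DEFAULT_TEMPERATURE;
  return userTemperature ?? defaultTemp;
}
```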

messages: [{ role: "system", content: systemPrompt }, ...convertToOpenAiMessages(messages)],
stream: true,
stream_options: { include_usage: true },
@@ -401,12 +404,15 @@ export class OpenAiNativeHandler extends BaseProvider implements SingleCompletio

const info: ModelInfo = openAiNativeModels[id]

+ // For GPT-5 models, use temperature 1.0 as default when user hasn't set a temperature
Contributor Author
We're using the same temperature logic here and on line 150. Could we extract this into a helper method like getDefaultTemperature(modelId: string) to avoid duplication? This would make it easier to maintain and ensure consistency.
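A sketch of the suggested extraction (getDefaultTemperature is the hypothetical name proposed above; the isGpt5Model check and the 0 default are assumptions standing in for the real codebase):

```typescript
const OPENAI_NATIVE_DEFAULT_TEMPERATURE = 0; // stand-in for the real constant

class TemperatureDefaults {
  // Assumed id convention, standing in for OpenAiNativeHandler.isGpt5Model.
  private isGpt5Model(modelId: string): boolean {
    return modelId.startsWith("gpt-5");
  }

  // Proposed helper: a single source of truth for the default temperature,
  // replacing the duplicated ternary at both call sites.
  getDefaultTemperature(modelId: string): number {
    return this.isGpt5Model(modelId) ? 1.0 : OPENAI_NATIVE_DEFAULT_TEMPERATURE;
  }
}
```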

+ const defaultTemp = this.isGpt5Model(id) ? 1.0 : OPENAI_NATIVE_DEFAULT_TEMPERATURE
+
const params = getModelParams({
format: "openai",
modelId: id,
model: info,
settings: this.options,
- defaultTemperature: OPENAI_NATIVE_DEFAULT_TEMPERATURE,
+ defaultTemperature: defaultTemp,
})

// For GPT-5 models, ensure we support minimal reasoning effort