fix(openrouter): pass GPT-5 reasoning effort and include_reasoning to OpenRouter #7319
@@ -121,6 +121,8 @@ export class OpenRouterHandler extends BaseProvider implements SingleCompletionHandler
     messages: openAiMessages,
     stream: true,
     stream_options: { include_usage: true },
+    // For GPT-5 via OpenRouter, request reasoning content in the stream explicitly
+    ...(modelId.startsWith("openai/gpt-5") && { include_reasoning: true }),
Contributor
Author
Consider extracting this GPT-5 check to a helper function like …
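A minimal sketch of the kind of helper being suggested; the suggestion above is truncated, so the exact name proposed isn't visible, and `isGpt5Model` here is purely illustrative:

```typescript
// Illustrative helper only; the name and placement are assumptions, not repo code.
// Centralizes the "GPT-5 on OpenRouter" check used in both createMessage() and completePrompt().
function isGpt5Model(modelId: string): boolean {
	return modelId.startsWith("openai/gpt-5")
}

// Usage at the call sites shown in this diff:
// ...(isGpt5Model(modelId) && { include_reasoning: true }),
```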
     // Only include provider if openRouterSpecificProvider is not "[default]".
     ...(this.options.openRouterSpecificProvider &&
       this.options.openRouterSpecificProvider !== OPENROUTER_DEFAULT_PROVIDER_NAME && {

@@ -208,7 +210,14 @@ export class OpenRouterHandler extends BaseProvider implements SingleCompletionHandler
     defaultTemperature: isDeepSeekR1 ? DEEP_SEEK_DEFAULT_TEMPERATURE : 0,
   })

-  return { id, info, topP: isDeepSeekR1 ? 0.95 : undefined, ...params }
+  // Apply GPT-5 defaults for OpenRouter: default reasoning effort to "medium" when enabled
+  let adjustedParams = params
+  if (id.startsWith("openai/gpt-5") && !params.reasoning && this.options.enableReasoningEffort !== false) {
+    const effort = (this.options.reasoningEffort as any) ?? "medium"
+    adjustedParams = { ...params, reasoning: { effort } as OpenRouterReasoningParams }
Contributor
Author
Is this logic potentially redundant? I notice we're adding GPT-5-specific reasoning handling here, but …
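The comment is cut off, so the overlap it points to isn't visible here. Purely as an illustration of the concern, the inline default-effort block above could be pulled into one place; the function name and types below are placeholders, not existing code in this repo:

```typescript
// Minimal placeholder types for illustration; the real repo types differ.
type ReasoningEffort = "low" | "medium" | "high"
interface ReasoningParamsSketch { effort: ReasoningEffort }
interface ModelParamsSketch { reasoning?: ReasoningParamsSketch; [key: string]: unknown }

// Hypothetical extraction of the inline block shown in this hunk.
function withGpt5ReasoningDefault(
	modelId: string,
	params: ModelParamsSketch,
	opts: { reasoningEffort?: ReasoningEffort; enableReasoningEffort?: boolean },
): ModelParamsSketch {
	// Default to "medium" effort only for GPT-5 models, only when no reasoning
	// params were already resolved, and only if reasoning effort isn't disabled.
	if (!modelId.startsWith("openai/gpt-5") || params.reasoning || opts.enableReasoningEffort === false) {
		return params
	}
	return { ...params, reasoning: { effort: opts.reasoningEffort ?? "medium" } }
}
```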
+  }
+
+  return { id, info, topP: isDeepSeekR1 ? 0.95 : undefined, ...adjustedParams }
   }

   async completePrompt(prompt: string) {

@@ -220,6 +229,8 @@ export class OpenRouterHandler extends BaseProvider implements SingleCompletionHandler
     temperature,
     messages: [{ role: "user", content: prompt }],
     stream: false,
+    // For GPT-5 via OpenRouter, request reasoning details explicitly as well
+    ...(modelId.startsWith("openai/gpt-5") && { include_reasoning: true }),
     // Only include provider if openRouterSpecificProvider is not "[default]".
     ...(this.options.openRouterSpecificProvider &&
       this.options.openRouterSpecificProvider !== OPENROUTER_DEFAULT_PROVIDER_NAME && {
Good test coverage for createMessage()! Should we add a similar test for completePrompt() to verify that include_reasoning: true is also passed for GPT-5 models in that method?
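A rough sketch of what such a test could look like, assuming the suite's existing mocking of the OpenAI client used by OpenRouterHandler (the mockCreate name and the option fields passed to the handler are illustrative, taken from typical fixtures rather than this PR):

```typescript
// Illustrative only: assumes the existing test file already mocks the OpenAI SDK
// client and exposes the captured chat.completions.create mock as mockCreate.
it("passes include_reasoning: true to completePrompt for GPT-5 models", async () => {
	const handler = new OpenRouterHandler({
		openRouterApiKey: "test-key",
		openRouterModelId: "openai/gpt-5",
	})

	await handler.completePrompt("Hello")

	// The request body should carry the GPT-5-specific flag alongside stream: false.
	expect(mockCreate).toHaveBeenCalledWith(
		expect.objectContaining({
			model: "openai/gpt-5",
			include_reasoning: true,
			stream: false,
		}),
	)
})
```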