Add max tokens checkbox option for OpenAI compatible provider #4467
Conversation
- Add checkbox control to enable/disable max tokens in API requests
- Update OpenAI compatible provider UI with max tokens option
- Add test coverage for the new max tokens functionality
- Update localization files across all supported languages
daniel-lxs
left a comment
Hey @AlexandruSmirnov, thank you for working on this issue.
I took a look at your PR and left a couple of questions and suggestions.
Let me know what you think!
Also I noticed there's a test failing regarding the translations, can you take a look?
- Fixed max_tokens support for O3 models in OpenAI provider
- Refactored OpenAI provider to eliminate code duplication with addMaxTokensIfNeeded helper
- Made Azure AI Inference Service respect the includeMaxTokens checkbox setting
- Applied code optimizations to reduce redundancy
- Added missing translations for includeMaxTokens in Catalan and German locales
- Updated tests to cover new functionality

- O3 family models (o3-mini, o3) do not support the max_tokens parameter
- All other models use max_completion_tokens instead of the deprecated max_tokens
- Remove unused isAzureAiInference parameter from addMaxTokensIfNeeded
- Update tests to reflect correct behavior for each model type

Per OpenAI docs: max_tokens is deprecated and not compatible with o-series models

- O3 models now include max_completion_tokens when includeMaxTokens is true
- Updated tests to reflect that O3 models support max_completion_tokens
- This addresses PR feedback that O3 models should use addMaxTokensIfNeeded()

- Added more detailed comments explaining that O3 models support the modern max_completion_tokens parameter
- Clarified that this allows O3 models to limit response length when includeMaxTokens is enabled
- Emphasized that max_tokens is deprecated in favor of max_completion_tokens
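The helper described in these commit notes might look roughly like the following sketch. The interface shapes and the exact signature of `addMaxTokensIfNeeded` are assumptions based on the discussion above, not the PR's actual code:

```typescript
// Hypothetical sketch: RequestBody and HandlerOptions shapes are assumed,
// not taken from the PR's actual openai.ts.
interface RequestBody {
  model: string;
  max_tokens?: number;
  max_completion_tokens?: number;
}

interface HandlerOptions {
  includeMaxTokens: boolean;
  modelMaxTokens?: number;
}

// Per the OpenAI docs, max_tokens is deprecated and rejected by o-series
// models, so every model gets the modern max_completion_tokens parameter,
// and only when the checkbox setting is enabled.
function addMaxTokensIfNeeded(body: RequestBody, opts: HandlerOptions): void {
  if (opts.includeMaxTokens && opts.modelMaxTokens) {
    body.max_completion_tokens = opts.modelMaxTokens;
  }
}
```

Centralizing the conditional in one helper is what lets the provider drop the duplicated per-model branches mentioned in the refactor note.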
@daniel-lxs Should look good now.
daniel-lxs
left a comment
Hey @AlexandruSmirnov, thank you for fixing this issue!
LGTM
Co-authored-by: AlexandruSmirnov <[email protected]>
Co-authored-by: Matt Rubens <[email protected]>
Related GitHub Issue
Closes: #4036
Description
Add max tokens checkbox option for OpenAI compatible provider
Test Procedure
Type of Change
`src` or test files.

Pre-Submission Checklist
- Linting passes (`npm run lint`).
- Debug code (e.g., `console.log`) has been removed.
- Tests pass (`npm test`).
- Branch is up to date with the `main` branch.
- `npm run changeset` has been run if this PR includes user-facing changes or dependency updates.

Screenshots / Videos
Documentation Updates
Additional Notes
Get in Touch
sandruarmy
Important
Add a checkbox in the UI to enable/disable max tokens in API requests for OpenAI compatible providers, with corresponding tests and localization updates.
- Adds checkbox to `OpenAICompatible.tsx` for enabling/disabling max tokens in API requests.
- Updates localization files (`settings.json`) in multiple languages to include `includeMaxOutputTokens` and `includeMaxOutputTokensDescription`.
- Updates `OpenAICompatible.spec.tsx` to verify checkbox behavior and state changes.
- Updates `openai.spec.ts` to ensure API requests include/exclude max tokens based on checkbox state.
- Modifies `openai.ts` to conditionally include `max_tokens` in API requests based on checkbox state and Azure AI Inference usage.

This description was created by
for 508c245. You can customize this summary. It will automatically update as commits are pushed.
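A minimal sketch of the conditional inclusion the summary describes, assuming hypothetical option names (`includeMaxOutputTokens`, `modelMaxTokens`) and a hypothetical `isAzureAiInference` flag; the real logic lives in `openai.ts`:

```typescript
// Hypothetical sketch: option and field names are assumptions, not the
// PR's actual implementation.
interface ProviderOptions {
  includeMaxOutputTokens?: boolean;
  modelMaxTokens?: number;
}

// Build the token-limit portion of the request body. Standard
// OpenAI-compatible endpoints get max_completion_tokens (max_tokens is
// deprecated); the Azure branch illustrates that the checkbox is
// respected for Azure AI Inference requests as well.
function tokenLimitFields(
  opts: ProviderOptions,
  isAzureAiInference: boolean
): Record<string, number> {
  if (!opts.includeMaxOutputTokens || !opts.modelMaxTokens) {
    return {}; // checkbox unchecked: omit any token limit
  }
  return isAzureAiInference
    ? { max_tokens: opts.modelMaxTokens }
    : { max_completion_tokens: opts.modelMaxTokens };
}

// The result can be spread into the request body:
// const body = { model, messages, ...tokenLimitFields(opts, false) };
```

Returning an empty object when the checkbox is off keeps the request body untouched, matching the tests that verify the limit is excluded in that case.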