feat: add custom context window support for AI providers #7210
Conversation
- Add customModelInfo fields to Groq and Vertex provider schemas
- Create reusable ContextWindow component for UI
- Update Groq and Vertex provider UIs to include context window input
- Update API handlers to use custom context window when provided
- Allow users to override default context window size for better control

Fixes #7209
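The new schema fields described above can be sketched roughly as follows (field names follow the PR description; the surrounding option shapes are assumptions, not the repo's actual types):

```typescript
// Sketch of the settings additions; GroqSettings/VertexSettings shapes are illustrative.
interface ModelInfo {
	contextWindow: number
	maxTokens?: number
}

interface GroqSettings {
	groqApiKey?: string
	groqCustomModelInfo?: ModelInfo // new field
}

interface VertexSettings {
	vertexProjectId?: string
	vertexCustomModelInfo?: ModelInfo // new field
}

const example: GroqSettings = {
	groqCustomModelInfo: { contextWindow: 32768 },
}
console.log(example.groqCustomModelInfo?.contextWindow) // 32768
```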
```typescript
const currentValue = customModelInfo?.contextWindow?.toString() || ""
const placeholderText = placeholder || defaultContextWindow?.toString() || "128000"
const labelText = label || "Context Window Size"
```
Use localization instead of hardcoded English strings. Consider passing a translated label and helper text (or use t()) rather than "Context Window Size" and the helper text below.
This comment was generated because it violated a code review rule: irule_C0ez7Rji6ANcGkkX.
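A minimal sketch of the suggested fix, resolving the label through a translation function instead of a hardcoded English fallback. The key name `settings:contextWindow.label` and the `t()` shape are assumptions here, not the repo's actual i18n API:

```typescript
// Stand-in translation function type; the real app would use its i18n library's t().
type TFunction = (key: string) => string

// Prefer an explicit label prop, otherwise look up the translated default.
const resolveLabel = (t: TFunction, label?: string): string =>
	label || t("settings:contextWindow.label")

// Example translation table standing in for the real translation files.
const messages: Record<string, string> = {
	"settings:contextWindow.label": "Context Window Size",
}
const t: TFunction = (key) => messages[key] ?? key

console.log(resolveLabel(t)) // translated default
console.log(resolveLabel(t, "Custom label")) // explicit prop still wins
```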
Reviewing my own code is like grading my own homework - I already know where I cut corners.
```typescript
	return { id, info }
}

protected getCustomModelInfo(): ModelInfo | undefined {
```
Is this the best approach? I notice that OpenAI provider directly accesses this.options.openAiCustomModelInfo while here we're introducing a getCustomModelInfo() method pattern. Should we standardize on one approach across all providers for consistency?
Also, this method lacks JSDoc documentation explaining its purpose and when subclasses should override it.
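For illustration, a sketch of the hook-method pattern with the requested JSDoc, using simplified stand-in types (the real `ModelInfo` and handler classes live in the repo; `DemoGroqHandler` is hypothetical):

```typescript
interface ModelInfo {
	contextWindow: number
	maxTokens?: number
}

class BaseHandler {
	/**
	 * Returns user-supplied model info that should override the built-in
	 * defaults for the selected model. The base implementation returns
	 * undefined ("use defaults"); subclasses override it to surface their
	 * provider-specific setting, e.g. groqCustomModelInfo.
	 */
	protected getCustomModelInfo(): ModelInfo | undefined {
		return undefined
	}

	getModel(defaults: ModelInfo): ModelInfo {
		return this.getCustomModelInfo() ?? defaults
	}
}

class DemoGroqHandler extends BaseHandler {
	constructor(private custom?: ModelInfo) {
		super()
	}
	protected override getCustomModelInfo(): ModelInfo | undefined {
		return this.custom
	}
}

const demoDefaults: ModelInfo = { contextWindow: 128000 }
console.log(new DemoGroqHandler().getModel(demoDefaults).contextWindow) // 128000
console.log(new DemoGroqHandler({ contextWindow: 32768 }).getModel(demoDefaults).contextWindow) // 32768
```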
```typescript
}

protected override getCustomModelInfo(): ModelInfo | undefined {
	return this.options.groqCustomModelInfo ?? undefined
```
Missing test coverage for this new functionality. Could we add tests to verify that custom context window values properly override the defaults?
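A sketch of the kind of test being requested, reduced to the core merge rule so it runs standalone; a real test would construct the handler with provider options, and `applyCustomInfo` here is a stand-in for the handler's override step:

```typescript
interface ModelInfo {
	contextWindow: number
	maxTokens: number
}

// Stand-in for the handler's merge step: custom fields win over defaults.
const applyCustomInfo = (defaults: ModelInfo, custom?: Partial<ModelInfo>): ModelInfo => ({
	...defaults,
	...custom,
})

const baseInfo: ModelInfo = { contextWindow: 131072, maxTokens: 8192 }

// Custom context window overrides the default...
const merged = applyCustomInfo(baseInfo, { contextWindow: 32768 })
// ...while other parameters stay unchanged, and no custom info means defaults.
const fallback = applyCustomInfo(baseInfo)
console.log(merged.contextWindow, merged.maxTokens) // 32768 8192
console.log(fallback.contextWindow) // 131072
```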
```typescript
const defaultInfo: ModelInfo = vertexModels[id]

// Apply custom model info if provided
const customModelInfo = this.options.vertexCustomModelInfo
```
I notice this implementation directly accesses this.options.vertexCustomModelInfo instead of using the getCustomModelInfo() pattern like GroqHandler. Should we be consistent?
Also missing test coverage here.
```typescript
	onContextWindowChange(undefined)
} else {
	const numValue = parseInt(value, 10)
	if (!isNaN(numValue) && numValue > 0) {
```
The validation only checks if the number is positive, but doesn't validate reasonable upper bounds. Could we add a warning for very large values that might impact performance? The issue description mentions performance degradation with large context windows.
Also consider adding a max value constraint or at least a warning when values exceed certain thresholds (e.g., > 1M tokens).
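One way the suggested bounds check could look; the 1M-token threshold and the warning text are illustrative values taken from this comment, not the PR's code:

```typescript
const MAX_RECOMMENDED_CONTEXT = 1_000_000

type ParseResult =
	| { kind: "cleared" } // empty input: remove the override
	| { kind: "invalid" } // not a positive integer: ignore
	| { kind: "ok"; value: number; warning?: string }

const parseContextWindow = (raw: string): ParseResult => {
	if (raw.trim() === "") return { kind: "cleared" }
	const n = parseInt(raw, 10)
	if (isNaN(n) || n <= 0) return { kind: "invalid" }
	if (n > MAX_RECOMMENDED_CONTEXT) {
		// accept the value but surface a soft warning instead of a hard cap
		return { kind: "ok", value: n, warning: "Very large context windows may degrade performance" }
	}
	return { kind: "ok", value: n }
}

console.log(parseContextWindow("32768")) // ok, no warning
console.log(parseContextWindow("2000000")) // ok, with warning
console.log(parseContextWindow("-5")) // invalid
```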
```typescript
const currentValue = customModelInfo?.contextWindow?.toString() || ""
const placeholderText = placeholder || defaultContextWindow?.toString() || "128000"
const labelText = label || "Context Window Size"
```
These hardcoded English strings should use i18n keys for consistency with the rest of the application. The label and helper text should come from translation files.
```typescript
const inputEventTransform = (event: any) => (event as { target: HTMLInputElement })?.target?.value

export const ContextWindow = ({
```
Missing test coverage for this new component. We should add tests for:
- Rendering with different props
- Input validation behavior
- Callback invocation with correct values
- Edge cases (empty input, invalid numbers, etc.)
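A sketch of the callback-invocation case from the list above, exercising the parse-and-notify logic directly; real component tests would render `<ContextWindow />` with a library like `@testing-library/react` instead, and `makeChangeHandler` is a stand-in name:

```typescript
const makeChangeHandler =
	(onContextWindowChange: (value: number | undefined) => void) =>
	(value: string) => {
		if (value === "") {
			onContextWindowChange(undefined) // clearing the input removes the override
			return
		}
		const numValue = parseInt(value, 10)
		if (!isNaN(numValue) && numValue > 0) {
			onContextWindowChange(numValue)
		}
		// invalid input: callback is not invoked
	}

const calls: Array<number | undefined> = []
const handler = makeChangeHandler((v) => calls.push(v))

handler("32768") // valid number → onContextWindowChange(32768)
handler("") // empty → onContextWindowChange(undefined)
handler("abc") // invalid → no call
console.log(calls) // [ 32768, undefined ]
```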
```typescript
	[setApiConfigurationField],
)

const handleContextWindowChange = useCallback(
```
This entire block (lines 31-65) is duplicated in Vertex.tsx. Could we extract this logic into a shared utility function like createCustomModelInfo(contextWindow, currentModelInfo) to avoid duplication and make maintenance easier?
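A sketch of the proposed shared helper; the name `createCustomModelInfo` comes from this comment, and the `ModelInfo` shape is an assumption:

```typescript
interface ModelInfo {
	contextWindow: number
	maxTokens?: number
}

const createCustomModelInfo = (
	contextWindow: number | undefined,
	currentModelInfo?: Partial<ModelInfo>,
): ModelInfo | undefined => {
	// clearing the field removes the override entirely
	if (contextWindow === undefined) return undefined
	// otherwise keep any other custom fields and set the new window
	return { ...currentModelInfo, contextWindow }
}

console.log(createCustomModelInfo(undefined)) // undefined
console.log(createCustomModelInfo(32768, { maxTokens: 8192 })) // { maxTokens: 8192, contextWindow: 32768 }
```

Both `Groq.tsx` and `Vertex.tsx` could then call this helper inside their `handleContextWindowChange` callbacks instead of duplicating the merge logic.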
```typescript
	[setApiConfigurationField],
)

const handleContextWindowChange = useCallback(
```
This is duplicate code from Groq.tsx. Should be extracted into a shared utility to maintain DRY principles.
Closing, the issue needs scoping.
This PR implements custom context window size support for AI providers as requested in #7209.
Changes Made
- Add customModelInfo fields to Groq and Vertex provider schemas in packages/types/src/provider-settings.ts
- Create reusable ContextWindow component in webview-ui/src/components/common/ContextWindow.tsx for consistent UI across providers
- Update BaseOpenAiCompatibleProvider to support custom model info override

How It Works
Users can now:
Testing
Acceptance Criteria Met
✅ Users can edit model provider settings and enter custom context window size
✅ The custom value is saved to CustomModelInfo when settings are submitted
✅ The custom context window is used by the provider when making API calls
✅ No changes are made to other model parameters
✅ Default values are used as fallback when no custom information is provided
Fixes #7209
Important
Adds custom context window support for Groq and Vertex AI providers, updating schemas, UI components, and API handlers.
- Adds customModelInfo fields to Groq and Vertex schemas in provider-settings.ts.
- Updates BaseOpenAiCompatibleProvider to support custom model info override.
- Adds ContextWindow component in ContextWindow.tsx for consistent UI.
- Updates Groq.tsx and Vertex.tsx.

This description was created by for 9c16b8a.