
Conversation


@roomote roomote bot commented Aug 23, 2025

Description

This PR fixes an issue where Roo Code was ignoring custom context window settings for Ollama models. When users created models with specific context windows using Ollama modelfiles, Roo Code would override these settings with its own defaults.

Problem

As reported in #7343, when a user created an Ollama model with a 48k context window (e.g., glm45-48k), Roo Code would load it with 131k context instead of respecting the configured limit.

Solution

  • Added an optional ollamaContextWindow field to the ProviderSettings schema
  • Updated the NativeOllamaHandler to use the custom context window when provided
  • Falls back to the model's default context window when no custom value is specified (see the sketch below)
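
A minimal sketch of that fallback, assuming the ollama npm client and a simplified options shape (the real NativeOllamaHandler carries more configuration and error handling than shown here):

import { Ollama } from "ollama"

// Simplified sketch of the fallback; field names mirror the ProviderSettings additions.
interface OllamaOptionsSketch {
    ollamaBaseUrl?: string
    ollamaModelId?: string
    ollamaContextWindow?: number // user-configured window, e.g. 49152 for a 48k model
}

class NativeOllamaHandlerSketch {
    constructor(private options: OllamaOptionsSketch) {}

    async completePrompt(prompt: string, defaultContextWindow: number): Promise<string> {
        const client = new Ollama({ host: this.options.ollamaBaseUrl ?? "http://localhost:11434" })
        const response = await client.chat({
            model: this.options.ollamaModelId ?? "llama3", // placeholder default model id
            messages: [{ role: "user", content: prompt }],
            stream: false,
            // Prefer the user-configured context window; otherwise use the model's default.
            options: { num_ctx: this.options.ollamaContextWindow || defaultContextWindow },
        })
        return response.message.content
    }
}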

Changes

  • packages/types/src/provider-settings.ts: Added ollamaContextWindow field to Ollama schema
  • src/api/providers/native-ollama.ts: Modified to use custom context window in both createMessage and completePrompt methods
  • src/api/providers/__tests__/native-ollama.spec.ts: Added comprehensive tests for the new functionality

Testing

  • ✅ All existing tests pass
  • ✅ Added new tests to verify custom context window is respected
  • ✅ Added tests to verify fallback to default when custom not provided
  • ✅ Tests cover both streaming and non-streaming methods

Fixes #7343


Important

Fixes an issue with Ollama models by respecting user-configured context windows in NativeOllamaHandler.

  • Behavior:
    • NativeOllamaHandler now respects ollamaContextWindow from ProviderSettings for context window configuration.
    • Falls back to model's default context window if ollamaContextWindow is not provided.
  • Schema:
    • Added ollamaContextWindow field to ollamaSchema in provider-settings.ts.
  • Tests:
    • Added tests in native-ollama.spec.ts to verify custom context window usage in createMessage and completePrompt methods.
    • Tests ensure fallback to default context window when custom value is not provided.

This description was created by Ellipsis for 2f94833.

@roomote roomote bot requested review from cte, jr and mrubens as code owners August 23, 2025 00:25
@dosubot dosubot bot added the size:L (This PR changes 100-499 lines, ignoring generated files) and bug (Something isn't working) labels Aug 23, 2025

@roomote roomote bot left a comment


Reviewing my own code is like debugging in a mirror - everything looks backward but the bugs are still mine.

const ollamaSchema = baseProviderSettingsSchema.extend({
    ollamaModelId: z.string().optional(),
    ollamaBaseUrl: z.string().optional(),
    ollamaContextWindow: z.number().optional(),

Should we add validation constraints here? Something like:

Suggested change
-   ollamaContextWindow: z.number().optional(),
+   ollamaContextWindow: z.number().positive().optional(),

This would prevent invalid values like 0 or negative numbers from being set.
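
To illustrate what that constraint changes, here is a standalone zod sketch (a hypothetical field in isolation, not the project's actual schema):

import { z } from "zod"

// Hypothetical field with the suggested .positive() constraint applied.
const ollamaContextWindow = z.number().positive().optional()

ollamaContextWindow.parse(49152)              // ok: custom 48k window
ollamaContextWindow.parse(undefined)          // ok: handler falls back to the model default
ollamaContextWindow.safeParse(0).success      // false: rejected
ollamaContextWindow.safeParse(-4096).success  // false: rejected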

    stream: true,
    options: {
-       num_ctx: modelInfo.contextWindow,
+       num_ctx: this.options.ollamaContextWindow || modelInfo.contextWindow,

Nice implementation using the fallback pattern! The || fallback to modelInfo.contextWindow is clean and follows the existing patterns in the codebase.

    messages: [{ role: "user", content: prompt }],
    stream: false,
    options: {
        num_ctx: this.options.ollamaContextWindow || modelInfo.contextWindow,

Good consistency - same pattern applied to the non-streaming method. Though I wonder if we should add more specific error handling here to distinguish between model fetch failures and context window configuration issues?
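
One way to separate those failure modes, as a purely illustrative sketch (the assumed fetchModelInfo callback stands in for however the handler actually loads model metadata):

// Hypothetical helper: distinguish a model-info fetch failure from a bad context window value.
async function resolveNumCtx(
    fetchModelInfo: () => Promise<{ contextWindow: number }>,
    configuredContextWindow?: number,
): Promise<number> {
    let defaultContextWindow: number
    try {
        defaultContextWindow = (await fetchModelInfo()).contextWindow
    } catch (err) {
        throw new Error(`Failed to fetch Ollama model info: ${err instanceof Error ? err.message : String(err)}`)
    }

    const numCtx = configuredContextWindow || defaultContextWindow
    if (!Number.isFinite(numCtx) || numCtx <= 0) {
        throw new Error(`Invalid context window configuration: ${numCtx}`)
    }
    return numCtx
}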

        expect(results.some((r) => r.type === "text")).toBe(true)
    })

    describe("context window configuration", () => {

Excellent test coverage! These tests thoroughly verify both the custom context window usage and the fallback behavior. Consider adding an edge case test for invalid values (0 or negative) once we add schema validation.
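
For reference, a rough vitest-style sketch of what such configuration tests could assert (using a stubbed chat call rather than the repo's actual mocks, and an assumed 131072 default):

import { describe, it, expect, vi } from "vitest"

// Stand-in for the mocked Ollama client call; only the assertion shape matters here.
const chat = vi.fn().mockResolvedValue({ message: { content: "ok" } })

async function completePrompt(prompt: string, opts: { ollamaContextWindow?: number }, defaultCtx = 131072) {
    return chat({
        messages: [{ role: "user", content: prompt }],
        stream: false,
        options: { num_ctx: opts.ollamaContextWindow || defaultCtx },
    })
}

describe("context window configuration", () => {
    it("passes the custom context window to the client", async () => {
        await completePrompt("hi", { ollamaContextWindow: 49152 })
        expect(chat).toHaveBeenCalledWith(expect.objectContaining({ options: { num_ctx: 49152 } }))
    })

    it("falls back to the model default when no custom value is provided", async () => {
        await completePrompt("hi", {})
        expect(chat).toHaveBeenCalledWith(expect.objectContaining({ options: { num_ctx: 131072 } }))
    })
})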

@hannesrudolph hannesrudolph added the Issue/PR - Triage (New issue. Needs quick review to confirm validity and assign labels.) label Aug 23, 2025
@daniel-lxs daniel-lxs moved this from Triage to PR [Needs Prelim Review] in Roo Code Roadmap Aug 23, 2025
@hannesrudolph hannesrudolph added the PR - Needs Preliminary Review label and removed the Issue/PR - Triage label Aug 23, 2025
@daniel-lxs

This implementation is incomplete

@daniel-lxs daniel-lxs closed this Aug 25, 2025
@github-project-automation github-project-automation bot moved this from PR [Needs Prelim Review] to Done in Roo Code Roadmap Aug 25, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Aug 25, 2025
@daniel-lxs daniel-lxs deleted the fix/ollama-context-window-7343 branch August 25, 2025 18:32

Labels

bug (Something isn't working), PR - Needs Preliminary Review, size:L (This PR changes 100-499 lines, ignoring generated files)

Projects

Archived in project

Development

Successfully merging this pull request may close these issues.

Roo Code ignores the Context Window Size setting for Ollama

4 participants