Skip to content

Conversation

@roomote
Copy link
Contributor

@roomote roomote bot commented Aug 27, 2025

This PR removes the num_ctx parameter from the Ollama chat API options in the native-ollama provider.

Changes

  • Removed num_ctx: modelInfo.contextWindow from the Ollama chat options object

Rationale

The num_ctx parameter was being explicitly set to the model's context window size, but this appears to be redundant and potentially problematic:

  • Ollama handles context window sizing internally based on the model
  • Forcing a specific context window via num_ctx may conflict with model defaults
  • Other providers in the codebase don't explicitly set context windows in API calls

Testing

  • All existing tests pass without modification (6 tests in native-ollama.spec.ts)
  • Type checking passes
  • Linting passes

Impact

This is a minimal, surgical change that only removes the problematic parameter. The temperature setting logic remains intact with proper fallback for DeepSeek R1 models.

Resolves the issue mentioned in the Slack request.


Important

Remove num_ctx parameter from Ollama chat options to rely on internal context handling and update related tests and error messages.

  • Behavior:
    • Removed num_ctx parameter from Ollama chat options in native-ollama.ts.
    • Ollama now relies on internal context window handling.
    • Updated error handling for model and service errors in native-ollama.ts.
  • Testing:
    • Removed test case for num_ctx in native-ollama.spec.ts.
    • Updated context window tests in ollama.test.ts to reflect removal.
  • i18n:
    • Added new error messages for Ollama in various language files.

This description was created by Ellipsis for 8bcf415. You can customize this summary. It will automatically update as commits are pushed.

@roomote roomote bot requested review from cte, jr and mrubens as code owners August 27, 2025 13:49
@dosubot dosubot bot added size:XS This PR changes 0-9 lines, ignoring generated files. bug Something isn't working labels Aug 27, 2025
Copy link
Contributor Author

@roomote roomote bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Reviewing my own code because apparently I trust no one, not even myself.

This change looks good. Removing the parameter is the right call:

  • Consistency: Aligns with other Ollama providers that don't explicitly set context windows
  • Simplicity: Lets Ollama handle context sizing internally based on the model
  • No breaking changes: All tests pass without modification
  • Unified behavior: Both and methods now have consistent parameter handling

The removal of this redundant parameter should prevent potential conflicts with model defaults while maintaining full functionality.

@hannesrudolph hannesrudolph added the Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. label Aug 27, 2025
@hannesrudolph hannesrudolph moved this from Triage to PR [Needs Prelim Review] in Roo Code Roadmap Aug 27, 2025
@hannesrudolph hannesrudolph added PR - Needs Preliminary Review and removed Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. labels Aug 27, 2025
@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. and removed size:XS This PR changes 0-9 lines, ignoring generated files. labels Sep 1, 2025
@daniel-lxs daniel-lxs force-pushed the feature/remove-ollama-num-ctx branch from 7589e53 to 29d4012 Compare September 1, 2025 17:07
@daniel-lxs daniel-lxs moved this from PR [Needs Prelim Review] to PR [Needs Review] in Roo Code Roadmap Sep 1, 2025
@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Sep 1, 2025
roomote and others added 4 commits September 16, 2025 10:51
The num_ctx parameter was being passed to the Ollama chat API but is no longer needed. This removes it from the options object to simplify the configuration.
- Add token estimation to prevent exceeding model limits
- Implement async model initialization with proper error handling
- Fix context window handling with environment variable support
- Improve error messages with internationalization
- Update tests to reflect context window fixes
- Add changeset documentation

Matches and exceeds improvements from Kilo-Org PR #2170
- Added t() function import from i18n module
- Replaced all 8 hardcoded English error messages with translation keys
- Updated tests to expect translation keys instead of English text
- All translations already exist in 18 language files
- Tests passing: 5/5 in native-ollama.spec.ts
@daniel-lxs daniel-lxs force-pushed the feature/remove-ollama-num-ctx branch from 29d4012 to 16265fc Compare September 16, 2025 16:05
@daniel-lxs daniel-lxs closed this Sep 16, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Sep 16, 2025
@github-project-automation github-project-automation bot moved this from PR [Needs Review] to Done in Roo Code Roadmap Sep 16, 2025
@daniel-lxs daniel-lxs deleted the feature/remove-ollama-num-ctx branch September 16, 2025 18:15
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

bug Something isn't working lgtm This PR has been approved by a maintainer PR - Needs Review size:L This PR changes 100-499 lines, ignoring generated files.

Projects

Archived in project

Development

Successfully merging this pull request may close these issues.

4 participants