Ollama Context Viewing / Manual Settings #6692
RufiS started this conversation in Feature Requests
Replies: 0 comments
It would be nice to display the maximum context length accurately instead of just showing "1" for the Ollama provider, since the existing Roo-Code code already determines the context length of the chosen Ollama model. Relatedly, the new Qwen3-Coder supports a 256k context, but with 24GB of VRAM I'm forced to run it at 170k. It would be helpful to be able to choose the context window size when selecting Ollama as the provider, so you can use less than what the Ollama server reports the model supports.