Closed
Labels
Issue - In Progress, bug, enhancement
Description
App Version
3.25.15
API Provider
Ollama
Model Used
gpt-oss:120b
Roo Code Task Links (Optional)
No response
Steps to Reproduce
Implement support for Ollama Turbo mode within the Roo Code Ollama integration. This lets users run on Turbo's datacenter-grade hardware from Roo Code, the CLI, and the API, following the Ollama Turbo documentation.
The base URL field can already be customized to https://ollama.com; the only missing piece is an Ollama API Key field whose value is sent as a Bearer token in the Authorization header (a minimal sketch follows the Cline link below).
A short description below the field would be a plus, for example:
Ollama instances or cloud services. Leave empty for local installations.
Here is the equivalent work on Cline: https://github.com/cline/cline/pull/5400/files
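As a rough illustration of the requested behavior (not Roo Code's actual implementation; the `OllamaSettings` and `buildOllamaHeaders` names are hypothetical), the provider would attach the key only when it is set, so local installations keep working unchanged:

```typescript
// Hypothetical sketch of the requested behavior: attach a Bearer Authorization
// header only when an Ollama API key is configured.

interface OllamaSettings {
  baseUrl: string // e.g. "http://localhost:11434" or "https://ollama.com"
  apiKey?: string // optional; empty/undefined for local installations
}

function buildOllamaHeaders(settings: OllamaSettings): Record<string, string> {
  const headers: Record<string, string> = {
    "Content-Type": "application/json",
  }
  // Only send Authorization when a key is present, so local setups are unaffected.
  if (settings.apiKey && settings.apiKey.trim() !== "") {
    headers["Authorization"] = `Bearer ${settings.apiKey.trim()}`
  }
  return headers
}

// Example: calling Ollama's /api/chat endpoint with the headers above.
async function chat(settings: OllamaSettings, model: string, prompt: string): Promise<string> {
  const response = await fetch(`${settings.baseUrl}/api/chat`, {
    method: "POST",
    headers: buildOllamaHeaders(settings),
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: false,
    }),
  })
  if (!response.ok) {
    throw new Error(`Ollama request failed: ${response.status} ${response.statusText}`)
  }
  const data = await response.json()
  return data.message?.content ?? ""
}
```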
Outcome Summary
Expected: a field to configure an Ollama API key that is sent as a Bearer token in the request headers, enabling Ollama Turbo support.
Relevant Logs or Errors (Optional)