
Support Ollama Turbo #7147

@LivioGama

Description


App Version

3.25.15

API Provider

Ollama

Model Used

gpt-oss:120b

Roo Code Task Links (Optional)

No response

πŸ” Steps to Reproduce

Implement support for Ollama Turbo mode within the Roo Code Ollama integration. This would let users run models on Turbo's datacenter-grade hardware from Roo Code, the CLI, and the API, following the Ollama Turbo documentation.

The base URL field can already be customized to https://ollama.com; what is missing is an Ollama API Key field whose value is sent as a bearer token in the Authorization header.

A small description below the field would be a plus, e.g.:

Ollama instances or cloud services. Leave empty for local installations.

Here is the equivalent work on Cline: https://github.com/cline/cline/pull/5400/files
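
For reference, a minimal sketch of what the request wiring could look like once such a key is configured. The option names (`baseUrl`, `apiKey`) and the `chat` helper are hypothetical, not actual Roo Code settings; the endpoint shape follows the standard Ollama `/api/chat` API:

```ts
// Minimal sketch, assuming a hypothetical `apiKey` setting next to the existing
// base URL. When the key is set, it is forwarded as a Bearer token; when it is
// empty (local installs), no Authorization header is sent.
interface OllamaOptions {
  baseUrl: string // e.g. "http://localhost:11434" or "https://ollama.com"
  apiKey?: string // new field requested by this issue
}

async function chat(options: OllamaOptions, model: string, prompt: string) {
  const headers: Record<string, string> = { "Content-Type": "application/json" }
  if (options.apiKey) {
    headers["Authorization"] = `Bearer ${options.apiKey}`
  }

  const response = await fetch(`${options.baseUrl}/api/chat`, {
    method: "POST",
    headers,
    body: JSON.stringify({
      model, // e.g. "gpt-oss:120b"
      messages: [{ role: "user", content: prompt }],
      stream: false,
    }),
  })
  return response.json()
}
```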

πŸ’₯ Outcome Summary

Expect a field to configure an Ollama API key that is sent as a bearer token in the Authorization header, enabling Ollama Turbo support.
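
Concretely, the expected provider configuration would look roughly like this (key names are illustrative only, not the real Roo Code option names):

```ts
// Illustrative settings shape only; actual Roo Code option names may differ.
const ollamaSettings = {
  ollamaBaseUrl: "https://ollama.com",   // already configurable today
  ollamaApiKey: "<your Ollama API key>", // requested: sent as "Authorization: Bearer <key>"
}
```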

πŸ“„ Relevant Logs or Errors (Optional)

Metadata



Labels

Enhancement (New feature or request)
Issue - In Progress (Someone is actively working on this. Should link to a PR soon.)
bug (Something isn't working)

Status: Done
