Wrong routes for Ollama models #7070

@LivioGama

Description

App Version

3.25.14

API Provider

Ollama

Model Used

gpt-oss:120b

Roo Code Task Links (Optional)

No response

πŸ” Steps to Reproduce

When using gpt-oss:120b, the plugin tries to get completions from the classic OpenAI routes instead of Ollama's /generate endpoint. This does not happen in Kilo Code.
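For context, the two routes differ in both path and payload shape. A minimal sketch of the contrast, assuming Ollama's default local port (11434) and a hypothetical prompt; the request bodies follow Ollama's documented native API and its OpenAI-compatible layer:

```python
import json

OLLAMA_BASE = "http://localhost:11434"  # Ollama's default port (assumption: local install)

# Native Ollama generation route -- what the reporter expects the plugin to call.
native_url = f"{OLLAMA_BASE}/api/generate"
native_payload = {
    "model": "gpt-oss:120b",
    "prompt": "Hello",   # hypothetical prompt, for illustration only
    "stream": False,
}

# OpenAI-compatible route -- what the plugin reportedly calls instead.
openai_url = f"{OLLAMA_BASE}/v1/chat/completions"
openai_payload = {
    "model": "gpt-oss:120b",
    "messages": [{"role": "user", "content": "Hello"}],
}

print(native_url)
print(json.dumps(native_payload))
print(openai_url)
print(json.dumps(openai_payload))
```

Note that Ollama does serve both routes; the bug is about which one the plugin selects for this model.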

πŸ’₯ Outcome Summary

The plugin should use the /completion route.

πŸ“„ Relevant Logs or Errors (Optional)

Metadata

Assignees

No one assigned

Labels

Issue - In Progress (Someone is actively working on this. Should link to a PR soon.), bug (Something isn't working)

Type

No type

Projects

Status: Done

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
