Wrong routes for Ollama models #7070

@LivioGama

Description

App Version

3.25.14

API Provider

Ollama

Model Used

gpt-oss:120b

Roo Code Task Links (Optional)

No response

🔁 Steps to Reproduce

When using gpt-oss:120b, the plugin requests completions from the standard OpenAI routes instead of Ollama's native /generate endpoint. This does not happen in Kilo Code.
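A minimal sketch of the routing the report expects (the helper name is hypothetical; Ollama's native completion endpoint is /api/generate, while OpenAI-compatible clients call /v1/chat/completions):

```python
def endpoint_for(provider: str, base_url: str) -> str:
    """Pick the completion route for a provider (illustrative only)."""
    base = base_url.rstrip("/")
    if provider == "ollama":
        # Ollama's native completion route
        return f"{base}/api/generate"
    # OpenAI-compatible servers expose chat completions here
    return f"{base}/v1/chat/completions"

# Expected routing for a local Ollama instance on its default port:
print(endpoint_for("ollama", "http://localhost:11434"))
# http://localhost:11434/api/generate
```

The bug described above corresponds to the Ollama branch falling through to the OpenAI-style route.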

💥 Outcome Summary

The plugin should use Ollama's native completion route instead of the OpenAI-style one.

📄 Relevant Logs or Errors (Optional)

Metadata

Assignees

No one assigned

    Labels

    Issue/PR - Triage: New issue. Needs quick review to confirm validity and assign labels.
    bug: Something isn't working

    Type

    No type

    Projects

    Status

    Triage

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
