Copilot premium requests via LM API #7010

@grapexy

Description

What specific problem does this solve?

By design, the LM API counts only a single premium request per user session.

Roo-Code, however, consumes 1 premium request per message, even within a single session. This causes the LM API to burn through premium requests very quickly.

Tools like OpenCode solve this by passing the X-Initiator header. If I understand correctly, the first message in a chat session is sent with a value of user and subsequent messages with a value of agent, which results in only 1 premium request being consumed. A minimal sketch of the idea follows the links below.

sst/opencode#595
BerriAI/litellm#12859
olimorris/codecompanion.nvim@ed96fe3
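
As a rough illustration (not Roo-Code's actual provider code), here is a minimal sketch of how such a header could be attached to an outgoing Copilot request, assuming the request is built with plain fetch and the caller already tracks whether the current turn is the first user message in a task. The function name, endpoint/body shape, and the isFirstUserTurn flag are all hypothetical:

```typescript
// Hypothetical sketch only: names and request shape are assumptions,
// not Roo-Code's real implementation.
async function sendCopilotRequest(
  endpoint: string,
  apiKey: string,
  body: unknown,
  isFirstUserTurn: boolean
): Promise<Response> {
  return fetch(endpoint, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
      // The first turn of a session is attributed to the user; follow-up
      // agentic turns are marked "agent", so the session should be billed
      // as a single premium request.
      "X-Initiator": isFirstUserTurn ? "user" : "agent",
    },
    body: JSON.stringify(body),
  });
}
```

The linked OpenCode, LiteLLM, and codecompanion.nvim changes take a similar approach; where exactly this would hook into Roo-Code's request path is left open here.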

Additional context (optional)

No response

Roo Code Task Links (Optional)

No response

Request checklist

  • I've searched existing Issues and Discussions for duplicates
  • This describes a specific problem with clear impact and context

Interested in implementing this?

  • Yes, I'd like to help implement this feature

Implementation requirements

  • I understand this needs approval before implementation begins

How should this be solved? (REQUIRED if contributing, optional otherwise)

No response

How will we know it works? (Acceptance Criteria - REQUIRED if contributing, optional otherwise)

No response

Technical considerations (REQUIRED if contributing, optional otherwise)

No response

Trade-offs and risks (REQUIRED if contributing, optional otherwise)

No response

Metadata

Assignees

Labels

  • Issue - In Progress: Someone is actively working on this. Should link to a PR soon.
  • enhancement: New feature or request
  • proposal

Type

No type

Projects

Status

Issue [In Progress]

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
