Description
What specific problem does this solve?
Adds DeepInfra as a model provider so users can access its hosted models directly, benefiting from its competitive pricing and prompt caching.
Additional context (optional)
No response
Roo Code Task Links (Optional)
No response
Request checklist
- I've searched existing Issues and Discussions for duplicates
- This describes a specific problem with clear impact and context
Interested in implementing this?
- Yes, I'd like to help implement this feature
Implementation requirements
- I understand this needs approval before implementation begins
How should this be solved? (REQUIRED if contributing, optional otherwise)
Implement DeepInfra as a new provider by following the same pattern used for existing providers. Add API key support, fetch models dynamically from DeepInfra’s API, and route completions through its OpenAI-compatible endpoint with prompt caching support.
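Roughly, the completion path could look like the sketch below. It assumes DeepInfra's OpenAI-compatible base URL (`https://api.deepinfra.com/v1/openai`) and wraps the official `openai` SDK; the class name, default model, and streaming shape are placeholders for illustration, not Roo Code's actual handler interface.

```ts
// Minimal sketch of the completion path, assuming DeepInfra's OpenAI-compatible
// endpoint at https://api.deepinfra.com/v1/openai. The real handler would
// implement the existing provider handler interface used by other providers.
import OpenAI from "openai"

export class DeepInfraHandler {
	private client: OpenAI

	constructor(apiKey: string) {
		this.client = new OpenAI({
			baseURL: "https://api.deepinfra.com/v1/openai",
			apiKey,
		})
	}

	// Stream a chat completion through the OpenAI-compatible endpoint.
	async *createMessage(
		systemPrompt: string,
		messages: OpenAI.Chat.ChatCompletionMessageParam[],
	): AsyncGenerator<string> {
		const stream = await this.client.chat.completions.create({
			model: "meta-llama/Meta-Llama-3.1-70B-Instruct", // placeholder default model
			messages: [{ role: "system", content: systemPrompt }, ...messages],
			stream: true,
		})
		for await (const chunk of stream) {
			const delta = chunk.choices[0]?.delta?.content
			if (delta) yield delta
		}
	}
}
```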
How will we know it works? (Acceptance Criteria - REQUIRED if contributing, optional otherwise)
- Given DeepInfra is selected as the provider and an API key is set,
- When I open the model picker,
- Then I see DeepInfra's models loaded dynamically,
- And I can select one and run a chat without errors,
- But I don't see the model picker if no API key is set.
Technical considerations (REQUIRED if contributing, optional otherwise)
Use the existing provider interface, fetcher interface, and model cache to avoid performance issues; map chat messages to DeepInfra's OpenAI-compatible endpoint (see the sketch below). No blockers.
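For the dynamic model list, something like the following could back the fetcher, with a small in-memory cache standing in for the shared model cache. The `/v1/openai/models` endpoint shape and the `DeepInfraModel` type are assumptions for illustration; the real implementation would plug into the existing fetcher interface instead of an ad-hoc module-level cache.

```ts
// Hypothetical model fetcher with a simple in-memory cache, assuming an
// OpenAI-compatible model listing at https://api.deepinfra.com/v1/openai/models.
interface DeepInfraModel {
	id: string
}

let cachedModels: DeepInfraModel[] | null = null
let cachedAt = 0
const CACHE_TTL_MS = 5 * 60 * 1000 // refresh at most every five minutes

export async function getDeepInfraModels(apiKey: string): Promise<DeepInfraModel[]> {
	if (cachedModels && Date.now() - cachedAt < CACHE_TTL_MS) {
		return cachedModels
	}
	const res = await fetch("https://api.deepinfra.com/v1/openai/models", {
		headers: { Authorization: `Bearer ${apiKey}` },
	})
	if (!res.ok) {
		throw new Error(`DeepInfra model fetch failed: ${res.status}`)
	}
	const body = (await res.json()) as { data: { id: string }[] }
	cachedModels = body.data.map((m) => ({ id: m.id }))
	cachedAt = Date.now()
	return cachedModels
}
```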
Trade-offs and risks (REQUIRED if contributing, optional otherwise)
None