
[ENHANCEMENT] Add SiliconCloud provider support #8550

@AnotiaWang

Description

Problem (one or two sentences)

SiliconCloud is an LLM inference platform that offers up-to-date SOTA models at competitive prices.

However, when using SiliconCloud as an OpenAI compatible provider, I noticed three issues:

  1. SiliconCloud disables reasoning by default for some models, such as GLM-4.6 and DeepSeek V3.1 Terminus. This can degrade model performance on complex tasks.
  2. Its reasoning params (enable_thinking and thinking_budget) differ from OpenAI's (see the sketch after this list).
  3. When trying out new models, I also have to update the context length and pricing in settings manually, which is tedious.
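
For reference, here is a minimal sketch of what a reasoning-enabled SiliconCloud request looks like. The enable_thinking and thinking_budget fields are the ones mentioned above; the endpoint URL, model ID, and budget value are assumptions for illustration only.

```ts
// Minimal sketch: enabling reasoning on a SiliconCloud chat completion request.
// Assumptions: the https://api.siliconflow.cn/v1 base URL, the model ID, and the
// budget value are illustrative; only the field names come from the issue above.
async function chatWithReasoning(apiKey: string, prompt: string): Promise<unknown> {
  const response = await fetch("https://api.siliconflow.cn/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "zai-org/GLM-4.6", // illustrative model ID
      messages: [{ role: "user", content: prompt }],
      enable_thinking: true, // SiliconCloud-specific; reasoning is off by default for some models
      thinking_budget: 4096, // SiliconCloud-specific reasoning budget (assumed units)
      // An OpenAI-style provider would send a different field (e.g. reasoning_effort),
      // which SiliconCloud does not understand.
    }),
  });
  return response.json();
}
```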

Context (who is affected and when)

Anyone using SiliconCloud through the OpenAI compatible provider is affected; the generic provider does not fit SiliconCloud well.

Desired behavior (conceptual, not technical)

Implement a dedicated provider for SiliconCloud.

Constraints / preferences (optional)

No response

Request checklist

  • I've searched existing Issues and Discussions for duplicates
  • This describes a specific problem with clear context and impact

Roo Code Task Links (optional)

No response

Acceptance criteria (optional)

No response

Proposed approach (optional)

I've opened a PR that implements a dedicated SiliconCloud provider, based on the OpenAI compatible provider. For details, please see #8551.
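
As a rough illustration of the shape such a provider could take (this is not the code in #8551; every name, price, and context window below is hypothetical), a dedicated handler could bundle per-model metadata and translate generic reasoning settings into SiliconCloud's fields:

```ts
// Hypothetical sketch only; the real implementation is in PR #8551.
// All model IDs, prices, and context windows below are placeholders.

interface SiliconCloudModelInfo {
  contextWindow: number;
  inputPrice: number;   // USD per million input tokens (placeholder values)
  outputPrice: number;  // USD per million output tokens (placeholder values)
  supportsReasoning: boolean;
}

// Bundled model metadata, so context length and pricing no longer need
// to be entered manually when trying a new model (issue 3 above).
const siliconCloudModels: Record<string, SiliconCloudModelInfo> = {
  "zai-org/GLM-4.6": { contextWindow: 200_000, inputPrice: 0.6, outputPrice: 2.2, supportsReasoning: true },
  "deepseek-ai/DeepSeek-V3.1-Terminus": { contextWindow: 128_000, inputPrice: 0.27, outputPrice: 1.1, supportsReasoning: true },
};

// Translate a generic OpenAI-style request into SiliconCloud's dialect by
// enabling reasoning where the model supports it (issues 1 and 2 above).
function toSiliconCloudBody(
  modelId: string,
  messages: Array<{ role: string; content: string }>,
  reasoningBudget = 4096,
): Record<string, unknown> {
  const info = siliconCloudModels[modelId];
  const body: Record<string, unknown> = { model: modelId, messages };
  if (info?.supportsReasoning) {
    body.enable_thinking = true; // reasoning is off by default for some models
    body.thinking_budget = reasoningBudget;
  }
  return body;
}
```

Bundling the metadata addresses point 3 above, while the request translation covers points 1 and 2.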

Trade-offs / risks (optional)

No response

Metadata

Assignees

No one assigned

    Labels

    Issue/PR - Triage (New issue. Needs quick review to confirm validity and assign labels.), enhancement (New feature or request)

    Type

    No type

    Projects

    Status

    Done

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
