Add OpenRouter support as an LLM provider #1

@moodmosaic

Description

It may be useful to add support for OpenRouter, which acts as a unified API gateway for many of the already supported models and more.

The docs state that OpenRouter “will select the least expensive and best GPUs available to serve the request” and “fall back … if you are rate-limited,” which I think is interesting for both resilience and potential cost efficiency.
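
Since OpenRouter exposes an OpenAI-compatible API, a first pass could probably reuse an existing OpenAI-style client and only override the base URL. A minimal sketch (the environment variable name and model id below are just examples, not tied to this project):

```python
# Minimal sketch: point an OpenAI-compatible client at OpenRouter by
# overriding the base URL. Model id and env var are illustrative only.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

response = client.chat.completions.create(
    model="anthropic/claude-3.5-sonnet",  # example OpenRouter model id
    messages=[{"role": "user", "content": "Hello from OpenRouter"}],
)
print(response.choices[0].message.content)
```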
