
feat: Add Forge LLM provider support #3659

Open

Yiiii0 wants to merge 1 commit into QuivrHQ:main from Yiiii0:feature/add-forge-provider

Conversation

Yiiii0 commented on Feb 11, 2026

Description

  • Add FORGE to DefaultModelSuppliers enum in config.py
  • Add Forge provider branch in LLMEndpoint.from_config() using ChatOpenAI
  • Default base URL: https://api.forge.tensorblock.co/v1
  • Environment variable: FORGE_API_KEY
  • Model format: Provider/model-name (e.g. OpenAI/gpt-4o)
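The changes above can be sketched as follows. This is a hypothetical, simplified sketch of the described additions, not the actual Quivr code: the enum shows only two suppliers, and `forge_client_kwargs` is an illustrative helper (the real `LLMEndpoint.from_config()` would pass these arguments to `ChatOpenAI` directly).

```python
import os
from enum import Enum


class DefaultModelSuppliers(str, Enum):
    """Subset of the supplier enum in config.py; FORGE is the value this PR adds."""
    OPENAI = "openai"
    FORGE = "forge"


FORGE_BASE_URL = "https://api.forge.tensorblock.co/v1"


def forge_client_kwargs(model: str = "OpenAI/gpt-4o") -> dict:
    """Keyword arguments for the existing ChatOpenAI client (hypothetical helper).

    Forge is OpenAI API compatible, so the Forge branch in
    LLMEndpoint.from_config() needs no new client class; it can call
    ChatOpenAI(**forge_client_kwargs(model)) with only the base URL
    and API key changed.
    """
    return {
        "model": model,  # Forge routes on the "Provider/model-name" format
        "base_url": FORGE_BASE_URL,
        "api_key": os.environ.get("FORGE_API_KEY", ""),
    }
```

Because only `base_url` and `api_key` differ from the OpenAI defaults, the Forge branch stays a few lines long and reuses all existing `ChatOpenAI` behavior (streaming, retries, token accounting).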

About Forge

Forge (https://github.com/TensorBlock/forge) is an open-source middleware that routes inference across 40+ upstream providers (including OpenAI, Anthropic, Gemini, DeepSeek, and OpenRouter). It is OpenAI API compatible: it works with the standard OpenAI SDK by changing only the base_url and api_key.
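To make "OpenAI API compatible" concrete, the sketch below builds (without sending) a standard /v1/chat/completions request against the Forge base URL using only the Python standard library; `build_chat_request` is an illustrative helper, not part of Forge or this PR. With the official OpenAI SDK the same idea reduces to `OpenAI(base_url=..., api_key=...)`.

```python
import json
import urllib.request


def build_chat_request(
    api_key: str,
    model: str,
    prompt: str,
    base_url: str = "https://api.forge.tensorblock.co/v1",
) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at Forge.

    Only the host and the bearer token differ from a request to OpenAI;
    the path, headers, and JSON body shape are identical.
    """
    body = json.dumps({
        "model": model,  # e.g. "OpenAI/gpt-4o" in Forge's Provider/model-name format
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```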

Motivation

We have seen growing interest from users who standardize on Forge for their model management and want to use it natively with Quivr. This integration bridges that gap.

Key Benefits

  • Self-Hosted & Privacy-First: Forge is open-source and designed to be self-hosted, which is critical for users who require data sovereignty
  • Future-Proofing: Forge acts as a decoupling layer; instead of the project maintaining individual adapters for every new provider, Forge users can access new providers immediately
  • Compatibility: Forge supports established aggregators (such as OpenRouter) as well as direct provider connections (bring your own key, BYOK)

References

Checklist before requesting a review

Please delete options that are not relevant.

  • My code follows the style guidelines of this project
  • I have performed a self-review of my code
  • I have commented hard-to-understand areas
  • I have ideally added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • Any dependent changes have been merged

Screenshots (if appropriate):

@dosubot added the size:S (This PR changes 10-29 lines, ignoring generated files) and area: backend (Related to backend functionality or under the /backend directory) labels on Feb 11, 2026