Built-in and custom model providers #8180
I see a list here: https://docs.continue.dev/customize/model-providers/overview, but it is not possible to use a custom provider by specifying the host, for example. I see this page for using a custom host: https://docs.continue.dev/customize/model-providers/top-level/openai
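For context, a minimal sketch of what that docs page describes, assuming the config.yaml format; the model name, host, and API key below are placeholders:

```yaml
# Sketch: pointing the built-in "openai" provider at a custom,
# OpenAI-compatible host by overriding apiBase (placeholder values).
models:
  - name: My Custom Model
    provider: openai
    model: my-model-name
    apiBase: https://llm.my-host.example.com/v1
    apiKey: <MY_API_KEY>
```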
@ksze we did this for Tetrate recently. There are many providers and it is not possible for us to add all of them, but we welcome contributions when someone needs a provider we don't already support. My suggestion is to open a PR and we can go from there.
@ksze is this your primary concern?
If so, then I think that's definitely something we can solve on the Hub. As for the actual codebase, we try to be permissive in allowing providers to add support in the extension, but we generally leave it up to the provider to update our docs for their service.
Continue.dev has a built-in list of supported model providers (deepseek, openai, etc.) for which it knows some properties by default.
However, there are two issues: I can set the provider as "openai" and then override the apiBase with Z.ai's API base URL, but that's dirty, because if I then publish the model config to hub.continue.dev, it shows the model as coming from openai when it really doesn't. I wish there were a way to explicitly specify OpenAI-compatible or Anthropic-compatible as a separate property from the provider's actual name.
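For illustration, the workaround described above might look like the sketch below in config.yaml; the Z.ai apiBase URL and model identifier are placeholders, not confirmed values:

```yaml
# Sketch of the workaround: an OpenAI-compatible service (Z.ai) configured
# under provider "openai" by overriding apiBase (placeholder values).
models:
  - name: Z.ai Model
    provider: openai               # not actually OpenAI, only OpenAI-compatible
    model: some-zai-model          # placeholder model identifier
    apiBase: https://api.z.ai/v1   # placeholder; use Z.ai's real base URL
    apiKey: <ZAI_API_KEY>
```

A separate compatibility property, as suggested, would let the published config name the real provider while still reusing the OpenAI-compatible request format.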