Note: The values in the `defaultModels` section should match the `modelRef` of one of the available models (found in the `"models"` field of the `.api/modelconfig/supported-models.json` response).
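This matching constraint can be sketched as a quick validation check (a minimal illustration, not part of Sourcegraph; the `modelRef` values below are hypothetical):

```python
# Minimal sketch: verify that each "defaultModels" entry names a modelRef
# that appears in the supported-models response. The modelRefs shown are
# hypothetical examples.
supported = {  # shape mirroring the "models" field of the response
    "models": [
        {"modelRef": "anthropic::2023-06-01::claude-3.5-sonnet"},
        {"modelRef": "fireworks::v1::starcoder"},
    ]
}
default_models = {"chat": "anthropic::2023-06-01::claude-3.5-sonnet"}

known_refs = {m["modelRef"] for m in supported["models"]}
invalid = {k: v for k, v in default_models.items() if v not in known_refs}
print(invalid)  # an empty dict means every default is a valid modelRef
```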
## Concepts
The LLM models available from a Sourcegraph Enterprise instance are the union of "Sourcegraph-supplied models" and any custom model providers that you explicitly add to your instance's site configuration. For most administrators, relying on Sourcegraph-supplied models alone ensures access to quality models without needing to worry about the specifics.
### Sourcegraph-supplied Models
The Sourcegraph-supplied models are those that are available from [Cody Gateway](/cody/core-concepts/cody-gateway), and your site configuration controls which of those models can be used.
If you do not wish to use _any_ Sourcegraph-supplied models, and instead want to rely _only_ on those you have explicitly defined in your site configuration, set the `"sourcegraph"` field to `null`.
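For example (a minimal sketch of this setting in site configuration):

```json
"cody.enabled": true,
"modelConfiguration": {
  // Disable all Sourcegraph-supplied models; only models you define
  // explicitly in your site configuration will be available.
  "sourcegraph": null
}
```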
There are three top-level settings for configuring Sourcegraph-supplied LLM models:

| Setting | Description |
| ------- | ----------- |
| `endpoint` (optional) | The URL for connecting to Cody Gateway; defaults to the production instance. |
| `accessToken` (optional) | The access token used to connect to Cody Gateway; defaults to the current license key. |
| `modelFilters` (optional) | Filters controlling which models from Cody Gateway are made available. |
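Taken together, these settings look like the following sketch (the values shown are illustrative; both `endpoint` and `accessToken` can usually be omitted to use the defaults):

```json
"modelConfiguration": {
  "sourcegraph": {
    // Both fields below are optional and shown only for illustration.
    "endpoint": "https://cody-gateway.sourcegraph.com",
    "accessToken": "<custom access token>",
    "modelFilters": {
      "statusFilter": ["stable"]
    }
  }
}
```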
**Model Filters**
The `"modelFilters"` section is how you restrict which Cody Gateway models are made available to your Sourcegraph Enterprise instance's users.
The first field is `"statusFilter"`. Each LLM model is labeled by Sourcegraph according to its release status, such as `"stable"`, `"beta"`, or `"experimental"`. By default, all models available on Cody Gateway are exposed. The status filter ensures that only models with a particular release status are made available to your users.
The `"allow"` and `"deny"` fields are arrays of [model references](#model-configuration) specifying which models should or should not be included. These values accept wildcards.
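Conceptually, the allow/deny semantics resemble glob matching over model references. A rough Python sketch of those semantics (an illustration only, not Sourcegraph's implementation; `is_model_allowed` is a hypothetical helper):

```python
from fnmatch import fnmatchcase


def is_model_allowed(model_ref, allow=None, deny=None):
    """Sketch of allow/deny wildcard filtering over a model reference."""
    # A model must match some "allow" pattern (when an allow list is set)...
    if allow and not any(fnmatchcase(model_ref, p) for p in allow):
        return False
    # ...and must not match any "deny" pattern.
    if deny and any(fnmatchcase(model_ref, p) for p in deny):
        return False
    return True


# "*turbo*" in the deny list wins even though "openai::*" allows the model.
print(is_model_allowed("openai::unknown::gpt-4-turbo",
                       allow=["openai::*"], deny=["*turbo*"]))  # False
print(is_model_allowed("anthropic::2023-06-01::claude-3.5-sonnet",
                       allow=["anthropic::*"], deny=["*turbo*"]))  # True
```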
The following example illustrates how to use all of these settings in conjunction:
```json
"cody.enabled": true,
"modelConfiguration": {
  "sourcegraph": {
    "modelFilters": {
      // Only allow "beta" and "stable" models.
      // Not "experimental" or "deprecated".
      "statusFilter": ["beta", "stable"],

      // Allow any models provided by Anthropic, OpenAI, Google, and Fireworks.
      "allow": [
        "anthropic::*", // Anthropic models
        "openai::*", // OpenAI models
        "google::*", // Google Gemini models
        "fireworks::*" // Autocomplete models like StarCoder and DeepSeek-V2-Coder hosted on Fireworks
      ],

      // Do not include any models with the Model ID containing "turbo",
      // or any from AcmeCo.
      "deny": [
        "*turbo*",
        "acmeco::*"
      ]
    }
  }
}
```
## Default Models
The `"modelConfiguration"` setting also contains a `"defaultModels"` field that lets you specify which LLM model is used depending on the situation. If no default is specified, or the specified default refers to a model that isn't found, Cody silently falls back to a suitable alternative.
The format of these strings is a "Model Reference", a convention for uniquely identifying each LLM model exposed by your Sourcegraph instance.
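A sketch of what this might look like in site configuration (the keys and model references shown are illustrative; use `modelRef` values from your own instance's `.api/modelconfig/supported-models.json` response):

```json
"modelConfiguration": {
  "defaultModels": {
    // Illustrative modelRefs only; substitute values reported by your
    // instance's .api/modelconfig/supported-models.json response.
    "chat": "anthropic::2023-06-01::claude-3.5-sonnet",
    "fastChat": "anthropic::2023-06-01::claude-3-haiku",
    "codeCompletion": "fireworks::v1::deepseek-coder-v2-lite-base"
  }
}
```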
## Advanced Configuration
For most administrators, relying on the LLM models made available by Cody Gateway is sufficient. However, if even more customization is required, you can configure your own LLM providers and models.