Merged
14 changes: 7 additions & 7 deletions docs/cody/clients/model-configuration.mdx
@@ -111,21 +111,21 @@ The `"modelConfiguration"` section exposes two fields `"providerOverrides"` and

### Provider Configuration

A "provider" a way to organize LLM models. Typically a provider would be referring to the company that produced the model. Or the specific API/service being used to access the model. But conceptually, it's just a namespace.
A "provider" is a way to organize LLM models. Typically a provider would be referring to the company that produced the model. Or the specific API/service being used to access the model. But conceptually, it's just a namespace.

By defining a provider override in your Sourcegraph site configuration, you either introduce a new namespace to contain models or customize an existing provider namespace supplied by Sourcegraph (e.g. all `"anthropic"` models).

- The following configuration shippet defines a single provider override with the ID `"anthropic-direct"`.
+ The following configuration snippet defines a single provider override with the ID `"anthropic"`.

```json
"modelConfiguration": {
// Do not use any Sourcegraph-supplied models.
"sourcegraph": null,

// Introduce our own, custom provider "anthropic-direct".
// Define a provider for "anthropic".
"providerOverrides": [
{
"id": "anthropic-direct",
"id": "anthropic",
"displayName": "Anthropic models, sent directly to anthropic.com",

// The server-side config section defines how this provider operates.
      ...
```

@@ -159,7 +159,7 @@ The following configuration shippet defines a single provider override with the

The most important part of a provider's configuration is the `"serverSideConfig"` field. It defines how the LLM models should be invoked, i.e. which external service or API will be called to serve LLM requests.

- In the example, the `"type"` field was `"anthropic"`. Meaning that any interactions using the `"anthropic-direct"` provider would be sent directly to Anthropic, at the supplied `endpoint` URL using the given `accessToken`.
+ In the example, the `"type"` field was `"anthropic"`, meaning that any interactions using the `"anthropic"` provider are sent directly to Anthropic, at the supplied `endpoint` URL and using the given `accessToken`.
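
To make this concrete, here is a minimal sketch of what such a `serverSideConfig` might look like inside the provider override. The field names `type`, `endpoint`, and `accessToken` come from the text above; the specific values are placeholders, not verified defaults:

```json
"providerOverrides": [
  {
    "id": "anthropic",
    "displayName": "Anthropic models, sent directly to anthropic.com",
    "serverSideConfig": {
      // Selects the native Anthropic API integration.
      "type": "anthropic",
      // Placeholder values; substitute your own endpoint and token.
      "endpoint": "https://api.anthropic.com/v1/messages",
      "accessToken": "sk-ant-REDACTED"
    }
  }
]
```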

However, Sourcegraph natively supports several different types of LLM API providers. The current set is:

@@ -181,14 +181,14 @@ With a provider defined, we can now specify custom models using that provider by

**Model Reference**

- The following configuration snippet defines a custom model, using the `"anthropic-direct"` provider from the previous example.
+ The following configuration snippet defines a custom model, using the `"anthropic"` provider from the previous example.

```json
"modelConfiguration": {
...
"modelOverrides": [
{
"modelRef": "anthropic-direct::2023-06-01::claude-3-sonnet",
"modelRef": "anthropic::2023-06-01::claude-3-sonnet",
      "displayName": "Claude 3 Sonnet",
      "modelName": "claude-3-sonnet-20240229",
      "contextWindow": {
        ...
```

Contributor commented on the `"modelRef"` line:

> Should we just change this to 3.5 Sonnet in case people are copy pasting this?

Contributor (author) replied:

> This is a good suggestion. I updated the model in the example to Claude Sonnet 3.5 and also bumped up the `maxInputTokens` value to 45k based on our previous conversations about the max supported context window.
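
Going by the reply above, the merged example presumably ended up looking something like the following sketch. Only the 45k input-token limit is stated in the thread; the exact `modelRef`, the `modelName` string, and the output-token limit are assumptions:

```json
"modelOverrides": [
  {
    // Assumed identifiers for Claude 3.5 Sonnet; check Anthropic's
    // model list for the exact modelName string.
    "modelRef": "anthropic::2023-06-01::claude-3.5-sonnet",
    "displayName": "Claude 3.5 Sonnet",
    "modelName": "claude-3-5-sonnet-20240620",
    "contextWindow": {
      // 45k input tokens, per the review discussion above.
      "maxInputTokens": 45000,
      // Output limit is a placeholder, not taken from the thread.
      "maxOutputTokens": 4000
    }
  }
]
```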