26 changes: 7 additions & 19 deletions docs/cody/core-concepts/cody-gateway.mdx

Sourcegraph Cody Gateway powers the default `"provider": "sourcegraph"` and Cody completions for Sourcegraph Enterprise users. It supports a variety of upstream LLM providers, such as [Anthropic](https://www.anthropic.com/) and [OpenAI](https://openai.com/), with rate limits, quotas, and model availability tied to your Sourcegraph Enterprise subscription.

## Supported Models

See our page on [Supported LLMs](/cody/capabilities/supported-models) for a current list of supported models and providers.

## Using Cody Gateway in Sourcegraph Enterprise

To enable completions provided by Cody Gateway on your Sourcegraph Enterprise instance, make sure your license key is set, and Cody is enabled in your [site configuration](/admin/config/site_config):
```json
{
"licenseKey": "<...>",
"cody.enabled": true,
"completions": {
"provider": "sourcegraph"
}
}
```

Expand All @@ -27,25 +34,6 @@ Cody Gateway is hosted at `cody-gateway.sourcegraph.com`. To use Cody Gateway, y

<Callout type="warning">Sourcegraph Cody Gateway access must be included in your Sourcegraph Enterprise subscription. You can verify this with your account manager. If you are a [Sourcegraph Cloud](/cloud/) user, Cody is enabled by default on your instance starting with Sourcegraph 5.1.</Callout>

## Configuring custom models

To configure custom models for various Cody configurations (for example, `"completions"`), prefix the model name with its upstream provider. For example, to use the `claude-2.0` model from Anthropic, you would configure:

```json
{
  "completions": { "chatModel": "anthropic/claude-2.0" }
}
```

The currently supported upstream providers for models are:

- [`anthropic/`](https://www.anthropic.com/)
- [`openai/`](https://openai.com/)

For Sourcegraph Enterprise customers, model availability depends on your Sourcegraph Enterprise subscription.
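
As an illustration of the provider-prefix convention across several fields, a fuller `"completions"` block might look like the sketch below. The `fastChatModel` and `completionModel` fields and the specific model versions shown are assumptions; check the site configuration reference for your Sourcegraph version for the exact fields and models available under your subscription:

```json
{
  "completions": {
    "provider": "sourcegraph",
    "chatModel": "anthropic/claude-2.0",
    "fastChatModel": "anthropic/claude-instant-1.2",
    "completionModel": "anthropic/claude-instant-1.2"
  }
}
```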

<Callout type="warning">When using OpenAI models for completions, only chat completions will work; code completions are currently unsupported.</Callout>

## Rate limits and quotas

For Sourcegraph Enterprise instances, rate limits, quotas, and model availability are tied to your Sourcegraph Enterprise license.