
Commit f6433e8

Author: Chris Concannon (committed)

remove incomplete information about LLM providers

1 parent 88d9e51 · commit f6433e8

File tree

1 file changed: +7 -19 lines changed


docs/cody/core-concepts/cody-gateway.mdx

Lines changed: 7 additions & 19 deletions
@@ -6,6 +6,10 @@
 
 Sourcegraph Cody Gateway powers the default `"provider": "sourcegraph"` and Cody completions for Sourcegraph Enterprise users. It supports a variety of upstream LLM providers, such as [Anthropic](https://www.anthropic.com/) and [OpenAI](https://openai.com/), with rate limits, quotas, and model availability tied to your Sourcegraph Enterprise subscription.
 
+## Supported Models
+
+See our page on [Supported LLMs](/cody/capabilities/supported-models) for a current list of supported models and providers.
+
 ## Using Cody Gateway in Sourcegraph Enterprise
 
 To enable completions provided by Cody Gateway on your Sourcegraph Enterprise instance, make sure your license key is set, and Cody is enabled in your [site configuration](/admin/config/site_config):
@@ -14,6 +18,9 @@ To enable completions provided by Cody Gateway on your Sourcegraph Enterprise in
 {
   "licenseKey": "<...>",
   "cody.enabled": true,
+  "completions": {
+    "provider": "sourcegraph"
+  }
 }
 ```
 
@@ -27,25 +34,6 @@ Cody Gateway is hosted at `cody-gateway.sourcegraph.com`. To use Cody Gateway, y
 
 <Callout type="warning">Sourcegraph Cody Gateway access must be included in your Sourcegraph Enterprise subscription. You can verify it by checking it with your account manager. If you are a [Sourcegraph Cloud](/cloud/) user, Cody is enabled by default on your instance starting with Sourcegraph 5.1.</Callout>
 
-## Configuring custom models
-
-To configure custom models for various Cody configurations (for example, `"completions"`), specify the desired model with the upstream provider as a prefix to the name of the model. For example, to use the `claude-2` model from Anthropic, you would configure:
-
-```json
-{
-  "completions": { "chatModel": "anthropic/claude-2.0" },
-}
-```
-
-The currently supported upstream providers for models are:
-
-- [`anthropic/`](https://www.anthropic.com/)
-- [`openai/`](https://openai.com/)
-
-For Sourcegraph Enterprise customers, model availability depends on your Sourcegraph Enterprise subscription.
-
-<Callout type="warning">When using OpenAI models for completions, only chat completions will work - code completions are currently unsupported.</Callout>
-
 ## Rate limits and quotas
 
 Rate limits, quotas, and model availability are tied to your Sourcegraph Enterprise license for Sourcegraph Enterprise instances.
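
For reference, this is the site configuration block as it reads after this change, assembled as a minimal sketch from the added lines in the diff above (the `licenseKey` value is a placeholder):

```json
{
  "licenseKey": "<...>",
  "cody.enabled": true,
  "completions": {
    "provider": "sourcegraph"
  }
}
```

With `"provider": "sourcegraph"`, completions are served through Cody Gateway, with model availability, rate limits, and quotas tied to the Enterprise subscription as described in the doc.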
