Commit cb0fa6a (parent 0b6962c): wip
2 files changed: +120 -0

# Examples

## Sourcegraph-supplied models only

This section includes examples of how to configure Cody to use Sourcegraph-supplied models:

- [Minimal configuration](/cody/model-configuration#configure-sourcegraph-supplied-models)
- [Using model filters](/cody/model-configuration#model-filters)
- [Change default models](/cody/model-configuration#default-models)
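
The minimal configuration linked above amounts to enabling Sourcegraph-supplied models without any overrides, as the later examples on this page also do. A minimal sketch:

```json
{
  "cody.enabled": true,
  "modelConfiguration": {
    "sourcegraph": {}
  }
}
```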

## Sourcegraph-supplied models and BYOK (Bring Your Own Key)

Sourcegraph-supplied models come with preconfigured providers, identified by the following IDs (namespaces):

- `"anthropic"`
- `"google"`
- `"fireworks"`
- `"mistral"`
- `"openai"`

### Override provider config for all models in the namespace

When Sourcegraph-supplied models are used and a provider override for a Sourcegraph-supported provider (same ID) is specified, the override applies to all Sourcegraph-supplied models within that provider.
For example, if you specify an override for a provider with ID `"anthropic"`, it will apply to all models from the `"anthropic"` provider.

Example configuration:

```json
{
  "cody.enabled": true,
  "modelConfiguration": {
    "sourcegraph": {},
    "providerOverrides": [
      {
        "id": "anthropic",
        "displayName": "Anthropic BYOK",
        "serverSideConfig": {
          "type": "anthropic",
          "accessToken": "sk-ant-token",
          "endpoint": "https://api.anthropic.com/v1/messages"
        }
      }
    ],
    "defaultModels": {
      "chat": "anthropic::2024-10-22::claude-3.5-sonnet",
      "fastChat": "anthropic::2023-06-01::claude-3-haiku",
      "autocomplete": "fireworks::v1::deepseek-coder-v2-lite-base"
    }
  }
}
```

In the configuration above, we:

- Enable Sourcegraph-supplied models and do not set any overrides (note that `"modelConfiguration.modelOverrides"` is not specified).
- Route requests for Anthropic models directly to the Anthropic API (via the provider override specified for `"anthropic"`).
- Route requests for other models (such as the Fireworks model for `"autocomplete"`) through Cody Gateway.

### Override provider config for some models and use the Sourcegraph-configured provider config for the rest

It's possible to route requests directly to the LLM provider (bypassing the Cody Gateway) for some models while using the Sourcegraph-configured provider config for the rest.

Example configuration:
64+
```json
65+
{
66+
"cody.enabled": true,
67+
"modelConfiguration": {
68+
"sourcegraph": {},
69+
"providerOverrides": [
70+
{
71+
"id": "anthropic-byok",
72+
"displayName": "Anthropic BYOK",
73+
"serverSideConfig": {
74+
"type": "anthropic",
75+
"accessToken": "sk-ant-token",
76+
"endpoint": "https://api.anthropic.com/v1/messages"
77+
}
78+
}
79+
],
80+
"modelOverrides": [
81+
{
82+
"modelRef": "anthropic-byok::2024-10-22::claude-3.5-sonnet",
83+
"displayName": "Claude 3.5 Sonnet",
84+
"modelName": "claude-3-5-sonnet-latest",
85+
"capabilities": ["edit", "chat", "vision"],
86+
"category": "accuracy",
87+
"status": "stable",
88+
"tier": "free",
89+
"contextWindow": {
90+
"maxInputTokens": 45000,
91+
"maxOutputTokens": 4000
92+
}
93+
}
94+
],
95+
"defaultModels": {
96+
"chat": "anthropic-byok::2024-10-22::claude-3.5-sonnet",
97+
"fastChat": "anthropic::2023-06-01::claude-3-haiku",
98+
"autocomplete": "fireworks::v1::deepseek-coder-v2-lite-base"
99+
}
100+
}
101+
```

In the configuration above, we:

- Enable Sourcegraph-supplied models (the `sourcegraph` field is set).
- Define a new provider with the ID `"anthropic-byok"` and configure it to use the Anthropic API.
- Since this provider is unknown to Sourcegraph, no Sourcegraph-supplied models are available for it. Therefore, we add a custom model in the `"modelOverrides"` section.
- Use the custom model configured in the previous step (`"anthropic-byok::2024-10-22::claude-3.5-sonnet"`) for `"chat"`. Requests are sent directly to the Anthropic API as set in the provider override.
- For `"fastChat"` and `"autocomplete"`, we use Sourcegraph-supplied models via Cody Gateway.

## Config examples for various LLM providers

Below are configuration examples for setting up various LLM providers using BYOK.
These examples are applicable whether or not you are using Sourcegraph-supplied models.

<Accordion title="Anthropic">
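A BYOK provider override for Anthropic follows the same shape as the examples earlier on this page; the access token and endpoint below are placeholders to replace with your own values:

```json
{
  "cody.enabled": true,
  "modelConfiguration": {
    "sourcegraph": {},
    "providerOverrides": [
      {
        "id": "anthropic",
        "displayName": "Anthropic",
        "serverSideConfig": {
          "type": "anthropic",
          "accessToken": "sk-ant-token",
          "endpoint": "https://api.anthropic.com/v1/messages"
        }
      }
    ]
  }
}
```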
</Accordion>

docs/cody/model-configuration/index.mdx

- `"speed"` - Ideal for low-parameter models that may not be suited for general-purpose chat but are beneficial for specialized tasks, such as query rewriting.
- `"accuracy"` - Reserved for models, like OpenAI o1, that use advanced reasoning techniques to improve response accuracy, though with slower latency.
- `"other"` - Used for older models without distinct advantages in reasoning or speed. Select this category if uncertain about which category to choose.
- `"deprecated"` - For models that are no longer supported by the provider and are filtered out on the client side (not available for use).

- `contextWindow` - An object that defines the number of "tokens" (units of text) that can be sent to the LLM.
This setting influences response time and request cost, and may vary according to the limits set by each LLM model or provider.
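
For example, the model override shown earlier in this commit limits the context window like so:

```json
"contextWindow": {
  "maxInputTokens": 45000,
  "maxOutputTokens": 4000
}
```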
