# Examples

## Sourcegraph-supplied models only

This section includes examples of how to configure Cody to use Sourcegraph-supplied models:

-   [Minimal configuration](/cody/model-configuration#configure-sourcegraph-supplied-models)
-   [Using model filters](/cody/model-configuration#model-filters)
-   [Change default models](/cody/model-configuration#default-models)

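For reference, a minimal setup that enables Sourcegraph-supplied models (a sketch distilled from the fuller examples later on this page) looks like this:

```json
{
  "cody.enabled": true,
  "modelConfiguration": {
    "sourcegraph": {}
  }
}
```

With an empty `"sourcegraph"` object and no overrides, requests are served by Sourcegraph-supplied models routed through Cody Gateway.
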
## Sourcegraph-supplied models and BYOK (Bring Your Own Key)

Sourcegraph-supplied models come with preconfigured providers, identified by the following IDs (namespaces):

-   "anthropic"
-   "google"
-   "fireworks"
-   "mistral"
-   "openai"

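These IDs appear as the first segment (the namespace) of the model references used elsewhere in the configuration, for example in `"defaultModels"` (values taken from the examples below):

```json
{
  "defaultModels": {
    "chat": "anthropic::2024-10-22::claude-3.5-sonnet",
    "autocomplete": "fireworks::v1::deepseek-coder-v2-lite-base"
  }
}
```
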
### Override provider config for all models in the namespace

When Sourcegraph-supplied models are used and you specify a provider override with the same ID as a Sourcegraph-supported provider, the override applies to all Sourcegraph-supplied models within that provider.
For example, an override for the provider with ID `"anthropic"` applies to all models from the `"anthropic"` provider.

Example configuration:

```json
{
  "cody.enabled": true,
  "modelConfiguration": {
    "sourcegraph": {},
    "providerOverrides": [
      {
        "id": "anthropic",
        "displayName": "Anthropic BYOK",
        "serverSideConfig": {
          "type": "anthropic",
          "accessToken": "sk-ant-token",
          "endpoint": "https://api.anthropic.com/v1/messages"
        }
      }
    ],
    "defaultModels": {
      "chat": "anthropic::2024-10-22::claude-3.5-sonnet",
      "fastChat": "anthropic::2023-06-01::claude-3-haiku",
      "autocomplete": "fireworks::v1::deepseek-coder-v2-lite-base"
    }
  }
}
```

In the configuration above, we:

-   Enable Sourcegraph-supplied models and do not set any model overrides (note that `"modelConfiguration.modelOverrides"` is not specified).
-   Route requests for Anthropic models directly to the Anthropic API (via the provider override specified for `"anthropic"`).
-   Route requests for other models (such as the Fireworks model for `"autocomplete"`) through Cody Gateway.

### Override provider config for some models and use the Sourcegraph-configured provider config for the rest

It's possible to route requests directly to the LLM provider (bypassing Cody Gateway) for some models while using the Sourcegraph-configured provider config for the rest.

Example configuration:

```json
{
  "cody.enabled": true,
  "modelConfiguration": {
    "sourcegraph": {},
    "providerOverrides": [
      {
        "id": "anthropic-byok",
        "displayName": "Anthropic BYOK",
        "serverSideConfig": {
          "type": "anthropic",
          "accessToken": "sk-ant-token",
          "endpoint": "https://api.anthropic.com/v1/messages"
        }
      }
    ],
    "modelOverrides": [
      {
        "modelRef": "anthropic-byok::2024-10-22::claude-3.5-sonnet",
        "displayName": "Claude 3.5 Sonnet",
        "modelName": "claude-3-5-sonnet-latest",
        "capabilities": ["edit", "chat", "vision"],
        "category": "accuracy",
        "status": "stable",
        "tier": "free",
        "contextWindow": {
          "maxInputTokens": 45000,
          "maxOutputTokens": 4000
        }
      }
    ],
    "defaultModels": {
      "chat": "anthropic-byok::2024-10-22::claude-3.5-sonnet",
      "fastChat": "anthropic::2023-06-01::claude-3-haiku",
      "autocomplete": "fireworks::v1::deepseek-coder-v2-lite-base"
    }
  }
}
```

In the configuration above, we:

-   Enable Sourcegraph-supplied models (the `sourcegraph` field is set).
-   Define a new provider with the ID `"anthropic-byok"` and configure it to use the Anthropic API.
-   Since this provider is unknown to Sourcegraph, no Sourcegraph-supplied models are available for it. Therefore, we add a custom model in the `"modelOverrides"` section.
-   Use the custom model configured in the previous step (`"anthropic-byok::2024-10-22::claude-3.5-sonnet"`) for `"chat"`. Requests are sent directly to the Anthropic API as set in the provider override.
-   For `"fastChat"` and `"autocomplete"`, we use Sourcegraph-supplied models via Cody Gateway.

## Config examples for various LLM providers

Below are configuration examples for setting up various LLM providers using BYOK.
These examples are applicable whether or not you are using Sourcegraph-supplied models.

<Accordion title="Anthropic">

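A possible configuration for routing Anthropic models directly to the Anthropic API, sketched from the provider override example earlier on this page (the access token is a placeholder):

```json
{
  "cody.enabled": true,
  "modelConfiguration": {
    "sourcegraph": {},
    "providerOverrides": [
      {
        "id": "anthropic",
        "displayName": "Anthropic",
        "serverSideConfig": {
          "type": "anthropic",
          "accessToken": "sk-ant-token",
          "endpoint": "https://api.anthropic.com/v1/messages"
        }
      }
    ],
    "defaultModels": {
      "chat": "anthropic::2024-10-22::claude-3.5-sonnet",
      "fastChat": "anthropic::2023-06-01::claude-3-haiku",
      "autocomplete": "fireworks::v1::deepseek-coder-v2-lite-base"
    }
  }
}
```
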
</Accordion>