Commit 4d31ee2

feat(docs): add additional configuration options
The changes introduce support for the OpenAI provider in the Cody VS Code extension configuration, along with new options for input and output token sizes and provider-specific parameters. This gives users more flexibility when configuring their Cody integration.
1 parent: 0cf0b63 · commit: 4d31ee2

File tree

1 file changed: +14 −10 lines changed

docs/cody/clients/install-vscode.mdx

Lines changed: 14 additions & 10 deletions
@@ -410,16 +410,20 @@ Example VS Code user settings JSON configuration:
 
 ### Provider configuration options
 
-- `provider`: `"google"`, `"groq"` or `"ollama"`
-  - The LLM provider type.
-- `model`: `string`
-  - The ID of the model, e.g. `"gemini-1.5-pro-latest"`
-- `tokens`: `number` - optional
-  - The context window size of the model. Default: `7000`.
-- `apiKey`: `string` - optional
-  - The API key for the endpoint. Required if the provider is `"google"` or `"groq"`.
-- `apiEndpoint`: `string` - optional
-  - The endpoint URL, if you don't want to use the provider's default endpoint.
+- `"provider"`: `"google"`, `"groq"`, `"ollama"` or `"openai"`
+  - The LLM provider type.
+- `"model"`: `string`
+  - The ID of the model, e.g. `"gemini-2.0-flash-exp"`
+- `"inputTokens"`: `number` - optional
+  - The context window size of the model's input. Default: `7000`.
+- `"outputTokens"`: `number` - optional
+  - The context window size of the model's output. Default: `4000`.
+- `"apiKey"`: `string` - optional
+  - The API key for the endpoint. Required if the provider is `"google"`, `"groq"` or `"openai"`.
+- `"apiEndpoint"`: `string` - optional
+  - The endpoint URL, if you don't want to use the provider's default endpoint.
+- `"options"`: `object` - optional
+  - Additional parameters like `temperature`, `topK` and `topP`, based on the provider's documentation.
 
 ### Debugging experimental models
 
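For reference, here is a minimal sketch of how the new options might be combined in a VS Code user settings entry. The `cody.dev.models` settings key, the model ID, and the endpoint value below are illustrative assumptions, not taken from this diff; the authoritative shape is the example JSON configuration referenced at the top of the hunk.

```jsonc
// Hypothetical sketch of VS Code user settings (settings.json accepts comments).
// The "cody.dev.models" key, model ID, and endpoint are assumptions for illustration.
{
  "cody.dev.models": [
    {
      "provider": "openai",
      "model": "gpt-4o",                // illustrative model ID
      "inputTokens": 7000,              // input context window; default 7000
      "outputTokens": 4000,             // output context window; default 4000
      "apiKey": "<YOUR_OPENAI_API_KEY>",
      "apiEndpoint": "https://api.openai.com/v1",   // optional; omit to use the provider's default
      "options": { "temperature": 0.2, "topP": 0.95 }  // optional provider-specific parameters
    }
  ]
}
```

Per the documented options, `apiKey` is required when the provider is `"google"`, `"groq"` or `"openai"`, while the token sizes, `apiEndpoint`, and `options` fall back to their defaults when omitted.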