diff --git a/docs/cody/capabilities/chat.mdx b/docs/cody/capabilities/chat.mdx
index 9a77351e4..8a22999c9 100644
--- a/docs/cody/capabilities/chat.mdx
+++ b/docs/cody/capabilities/chat.mdx
@@ -4,7 +4,7 @@
 
 You can **chat** with Cody to ask questions about your code, generate code, and edit code. By default, Cody has the context of your open file and entire repository, and you can use `@` to add context for specific files, symbols, remote repositories, or other non-code artifacts.
 
-You can do it from the **chat** panel of the supported editor extensions ([VS Code](/clients/install-vscode), [JetBrains](/clients/install-jetbrains), [Visual Studio](/clients/install-visual-studio)) or in the [web](/clients/cody-with-sourcegraph) app.
+You can do it from the **chat** panel of the supported editor extensions ([VS Code](/cody/clients/install-vscode), [JetBrains](/cody/clients/install-jetbrains), [Visual Studio](/cody/clients/install-visual-studio)) or in the [web](/cody/clients/cody-with-sourcegraph) app.
 
 ## Prerequisites
 
diff --git a/docs/cody/clients/cody-with-sourcegraph.mdx b/docs/cody/clients/cody-with-sourcegraph.mdx
index 68b834160..20d480003 100644
--- a/docs/cody/clients/cody-with-sourcegraph.mdx
+++ b/docs/cody/clients/cody-with-sourcegraph.mdx
@@ -29,11 +29,11 @@ The chat interface with your Code Search queries is operated per chat. You canno
 
 ## LLM Selection
 
-Sourcegraph.com users with Cody **Free** and **Pro** can choose from a list of supported LLM models for a chat. Claude Sonnet 3.5 is the default LLM model, but users can select the LLM of their choice from the drop-down menu.
+Cody Free and Pro users can select multiple models. Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model.
 
-![llm-select-web](https://storage.googleapis.com/sourcegraph-assets/Docs/llm-select-web-0724.jpg)
+You can read about these supported LLM models [here](/cody/capabilities/supported-models#chat-and-commands).
 
-Users on an Enterprise Sourcegraph instance do not have the option to choose an LLM model. Their site admin will configure the default LLM model for chat. However, Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model.
+![LLM-models-for-cody-free](https://storage.googleapis.com/sourcegraph-assets/Docs/llm-dropdown-options-102024.png)
 
 ## Selecting Context with @-mentions
 