2 changes: 1 addition & 1 deletion docs/cody/capabilities/chat.mdx
@@ -4,7 +4,7 @@

You can **chat** with Cody to ask questions about your code, generate code, and edit code. By default, Cody has the context of your open file and entire repository, and you can use `@` to add context for specific files, symbols, remote repositories, or other non-code artifacts.

You can do it from the **chat** panel of the supported editor extensions ([VS Code](/clients/install-vscode), [JetBrains](/clients/install-jetbrains), [Visual Studio](/clients/install-visual-studio)) or in the [web](/clients/cody-with-sourcegraph) app.
You can do it from the **chat** panel of the supported editor extensions ([VS Code](/cody/clients/install-vscode), [JetBrains](/cody/clients/install-jetbrains), [Visual Studio](/cody/clients/install-visual-studio)) or in the [web](/cody/clients/cody-with-sourcegraph) app.

## Prerequisites

6 changes: 3 additions & 3 deletions docs/cody/clients/cody-with-sourcegraph.mdx
@@ -29,11 +29,11 @@ The chat interface with your Code Search queries is operated per chat. You canno

## LLM Selection

Sourcegraph.com users with Cody **Free** and **Pro** can choose from a list of supported LLM models for a chat. Claude Sonnet 3.5 is the default LLM model, but users can select the LLM of their choice from the drop-down menu.
Cody Free and Pro users can choose from multiple LLM models. Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model.

![llm-select-web](https://storage.googleapis.com/sourcegraph-assets/Docs/llm-select-web-0724.jpg)
You can read more about the supported LLM models [here](/cody/capabilities/supported-models#chat-and-commands).

Users on an Enterprise Sourcegraph instance do not have the option to choose an LLM model. Their site admin will configure the default LLM model for chat. However, Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model.
![LLM-models-for-cody-free](https://storage.googleapis.com/sourcegraph-assets/Docs/llm-dropdown-options-102024.png)

## Selecting Context with @-mentions
