diff --git a/docs/cody/faq.mdx b/docs/cody/faq.mdx
index 8abb5f999..35d51d87f 100644
--- a/docs/cody/faq.mdx
+++ b/docs/cody/faq.mdx
@@ -114,3 +114,13 @@ Yes, Cody supports the following cloud development environments:
 - vscode.dev and GitHub Codespaces (install from the VS Code extension marketplace)
 - Any editor supporting the [Open VSX Registry](https://open-vsx.org/extension/sourcegraph/cody-ai), including [Gitpod](https://www.gitpod.io/blog/boosting-developer-productivity-unleashing-the-power-of-sourcegraph-cody-in-gitpod), Coder, and `code-server` (install from the [Open VSX Registry](https://open-vsx.org/extension/sourcegraph/cody-ai))
+
+### Can I use my preferred LLM to chat with Cody in the CLI?
+
+Yes, you can. Use the following command to get started, replacing `$name_of_the_model` with the model of your choice:
+
+```
+cody chat --model '$name_of_the_model' -m 'Hi Cody!'
+```
+
+For example, to use Claude 3.5 Sonnet, you'd run `cody chat --model 'claude-3.5-sonnet' -m 'Hi Cody!'` in your terminal.