From 2edc6d8f23711df993f7c2d43c937de567d4d1b4 Mon Sep 17 00:00:00 2001 From: Ado Kukic Date: Tue, 5 Nov 2024 11:24:21 -0800 Subject: [PATCH 01/15] Hackathon --- docs/cody/clients/install-vscode.mdx | 1 + 1 file changed, 1 insertion(+) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index a38eb4556..15900e6f1 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -1,3 +1,4 @@ + # Installing Cody in VS Code

Learn how to use Cody and its features with the VS Code editor.

From 16007c6340dd2abe9343d359420be3af6861e5b2 Mon Sep 17 00:00:00 2001 From: Maedah Batool Date: Tue, 5 Nov 2024 11:53:18 -0800 Subject: [PATCH 02/15] Edit comment format --- docs/cody/clients/install-vscode.mdx | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index 15900e6f1..5841c11b4 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -1,4 +1,5 @@ - +{/* Hackathon */} + # Installing Cody in VS Code

Learn how to use Cody and its features with the VS Code editor.

From 9aaca294e208f5689e972ce3b50522a7c886f5f4 Mon Sep 17 00:00:00 2001 From: Chris Sev Date: Wed, 6 Nov 2024 10:07:40 -0700 Subject: [PATCH 03/15] install from website link --- docs/cody/clients/install-vscode.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index 5841c11b4..b7b64032f 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -17,7 +17,7 @@ The Cody extension by Sourcegraph enhances your coding experience in VS Code by ## Install the VS Code extension -Follow these steps to install the Cody AI extension for VS Code: +You can install Cody directly from the [VS Code Marketplace listing](https://marketplace.visualstudio.com/items?itemName=sourcegraph.cody-ai) or by following these steps within VS Code: - Open VS Code editor on your local machine - Click the **Extensions** icon in the Activity Bar on the side of VS Code, or use the keyboard shortcut `Cmd+Shift+X` (macOS) or `Ctrl+Shift+X` (Windows/Linux) From 3d266f5ebebde751448b326cb6d74d3c7ed3ce0d Mon Sep 17 00:00:00 2001 From: Chris Sev Date: Wed, 6 Nov 2024 10:09:14 -0700 Subject: [PATCH 04/15] supported llms link --- docs/cody/clients/install-vscode.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index b7b64032f..e81863b9c 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -138,7 +138,7 @@ A chat history icon at the top of your chat input window allows you to navigate ### Changing LLM model for chat - You need to be a Cody Free or Pro user to have multi-model selection capability. Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. 
+ You need to be a Cody Free or Pro user to have multi-model selection capability. You can view which LLMs you have access to on our [supported LLMs page](/cody/capabilities/supported-models). Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. For Chat: From eab3772c5cda4e7740c635d35043c7122624f020 Mon Sep 17 00:00:00 2001 From: Chris Sev Date: Wed, 6 Nov 2024 10:12:49 -0700 Subject: [PATCH 05/15] wording on file too large errors --- docs/cody/clients/install-vscode.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index e81863b9c..a4006645f 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -164,7 +164,7 @@ The `@-file` also supports line numbers to query the context of large files. You When you `@-mention` files to add to Cody’s context window, the file lookup takes `files.exclude`, `search.exclude`, and `.gitgnore` files into account. This makes the file search faster as a result up to 100ms. -Moreover, when you `@-mention` files, Cody will track the number of characters in those files against the context window limit of the selected chat model. As you `@-mention` multiple files, Cody will calculate how many tokens of the context window remain. When the remaining context window size becomes too small, you get **File too large** errors for further more `@-mention` files. +Moreover, when you `@-mention` files, Cody will track the number of characters in those files against the context window limit of the selected chat model. As you `@-mention` multiple files, Cody will calculate how many tokens of the context window remain. When the remaining context window size becomes too small, you'll receive **File too large** errors when attempting to `@-mention` additional files. 
Cody defaults to showing @-mention context chips for all the context it intends to use. When you open a new chat, Cody will show context chips for your current repository and current file (or file selection if you have code highlighted). From a170f20c34bde646261a38958780e4d732540925 Mon Sep 17 00:00:00 2001 From: Chris Sev Date: Wed, 6 Nov 2024 10:30:55 -0700 Subject: [PATCH 06/15] verbiage --- docs/cody/clients/install-vscode.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index a4006645f..573742319 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -180,7 +180,7 @@ When you have both a repository and files @-mentioned, Cody will search the repo ### @-mention context providers with OpenCtx -OpenCtx context providers is in Experimental stage for all Cody users. Enterprise users can also use this but with limited support. If you have feedback or questions, please visit our [support forum](https://community.sourcegraph.com/c/openctx/10). +OpenCtx context providers are in the Experimental stage for all Cody users. Enterprise users can also use them, but with limited support. If you have feedback or questions, please visit our [support forum](https://community.sourcegraph.com/c/openctx/10). [OpenCtx](https://openctx.org/) is an open standard for bringing contextual info about code into your dev tools. 
Cody Free and Pro users can use OpenCtx providers to fetch and use context from the following sources: From 3a8063a9d32ba1d5e8459ce97070b963dc7e81ec Mon Sep 17 00:00:00 2001 From: Chris Sev Date: Wed, 6 Nov 2024 10:53:36 -0700 Subject: [PATCH 07/15] re-run with context image --- docs/cody/clients/install-vscode.mdx | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index 573742319..497bd1a92 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -211,6 +211,8 @@ If Cody's answer isn't helpful, you can try asking again with different context: - Current file only: Re-run the prompt again using just the current file as context. - Add context: Provides @-mention context options to improve the response by explicitly including files, symbols, remote repositories, or even web pages (by URL). +![re-run-with-context](https://storage.googleapis.com/sourcegraph-assets/Docs/re-run-with-context.png) + ## Context fetching mechanism VS Code users on the Free or Pro plan use [local context](/cody/core-concepts/context#context-selection). From 527566ed30cd6e9d0f881bedfcdb781b627afa0a Mon Sep 17 00:00:00 2001 From: Chris Sev Date: Wed, 6 Nov 2024 11:01:53 -0700 Subject: [PATCH 08/15] smart apply image --- docs/cody/clients/install-vscode.mdx | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index 497bd1a92..713292ee0 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -269,9 +269,11 @@ For customization and advanced use cases, you can create **Custom Commands** tai Cody lets you dynamically insert code from chat into your files with **Smart Apply**. Every time Cody provides you with a code suggestion, you can click the **Apply** button. 
Cody will then analyze your open code file, find where that relevant code should live, and add a diff. For chat messages where Cody provides multiple code suggestions, you can apply each in sequence to go from chat suggestions to written code. -Smart Apply also supports the executing of commands in the terminal. When you ask Cody a question related to terminal commands, you can now execute the suggestion in your terminal by clicking the `Execute` button in the chat window. +![smart-apply-code](https://storage.googleapis.com/sourcegraph-assets/Docs/smart-apply-102024.png) -![smart-apply](https://storage.googleapis.com/sourcegraph-assets/Docs/smart-apply-102024.png) +Smart Apply also supports executing commands in the terminal. When you ask Cody a question related to terminal commands, you can execute the suggestion in your terminal by clicking the `Execute` button in the chat window. + +![smart-apply-execute](https://storage.googleapis.com/sourcegraph-assets/Docs/smart-apply-102024.png) ## Keyboard shortcuts From 591a3e3f7a8b2337408dc012b0e2b41840d88e68 Mon Sep 17 00:00:00 2001 From: Chris Sev Date: Wed, 6 Nov 2024 11:05:13 -0700 Subject: [PATCH 09/15] supported llm section --- docs/cody/clients/install-vscode.mdx | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index 713292ee0..e62fb791c 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -299,9 +299,9 @@ Cody also works with Cursor, Gitpod, IDX, and other similar VS Code forks. To ac ## Supported LLM models -Claude Sonnet 3.5 is the default LLM model for inline edits and commands. If you've used Claude 3 Sonnet for inline edit or commands before, remember to manually update the model. The default model change only affects new users. +Claude 3.5 Sonnet is the default LLM model for inline edits and commands. 
If you've used a different or older LLM model for inline edits or commands before, remember to manually change your model to Claude 3.5 Sonnet. Default model changes only affect new users. -Users on Cody **Free** and **Pro** can choose from a list of supported LLM models for Chat and Commands. +Users on Cody **Free** and **Pro** can choose from a list of [supported LLM models](/cody/capabilities/supported-models) for Chat and Commands. ![LLM-models-for-cody-free](https://storage.googleapis.com/sourcegraph-assets/Docs/llm-dropdown-options-102024.png) From fe31bbce2dc05274ea0506f79e7b26393d8020d0 Mon Sep 17 00:00:00 2001 From: Chris Sev Date: Wed, 6 Nov 2024 11:19:55 -0700 Subject: [PATCH 10/15] ollama updates --- docs/cody/clients/install-vscode.mdx | 21 +++++++++------------ 1 file changed, 9 insertions(+), 12 deletions(-) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index e62fb791c..226bd9dcf 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -322,21 +322,18 @@ You also get additional capabilities like BYOLLM (Bring Your Own LLM), supportin To get autocomplete suggestions from Ollama locally, follow these steps: - Install and run [Ollama](https://ollama.ai/) -- Download one of the supported local models: - - `ollama pull deepseek-coder:6.7b-base-q4_K_M` for [deepseek-coder](https://ollama.ai/library/deepseek-coder) - - `ollama pull codellama:7b-code` for [codellama](https://ollama.ai/library/codellama) - - `ollama pull starcoder2:7b` for [codellama](https://ollama.ai/library/starcoder2) +- Download one of the supported local models with the `ollama pull` command, which downloads a model from the Ollama library to your local machine: 
+ - `ollama pull deepseek-coder-v2` for [deepseek-coder-v2](https://ollama.com/library/deepseek-coder-v2) - `ollama pull codellama:13b` for [codellama](https://ollama.ai/library/codellama) - `ollama pull starcoder2:7b` for [starcoder2](https://ollama.ai/library/starcoder2) - Update Cody's VS Code settings to use the `experimental-ollama` autocomplete provider and configure the right model: ```json - - { - "cody.autocomplete.advanced.provider": "experimental-ollama", - "cody.autocomplete.experimental.ollamaOptions": { - "url": "http://localhost:11434", - "model": "deepseek-coder:6.7b-base-q4_K_M" - } - } +"cody.autocomplete.advanced.provider": "experimental-ollama", +"cody.autocomplete.experimental.ollamaOptions": { + "url": "http://localhost:11434", + "model": "deepseek-coder-v2" +} ``` - Confirm Cody uses Ollama by looking at the Cody output channel or the autocomplete trace view (in the command palette) From 5e4ae3a9fcd7d85cc52b63550a5ff8dd9fab98ab Mon Sep 17 00:00:00 2001 From: Chris Sev Date: Wed, 6 Nov 2024 11:20:01 -0700 Subject: [PATCH 11/15] ollama updates --- docs/cody/clients/install-vscode.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index 226bd9dcf..ba8cda115 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -351,7 +351,7 @@ To generate chat and commands with Ollama locally, follow these steps: - Download [Ollama](https://ollama.com/download) - Start Ollama (make sure the Ollama logo is showing up in your menu bar) - Select a chat model (model that includes instruct or chat, for example, [gemma:7b-instruct-q4_K_M](https://ollama.com/library/gemma:7b-instruct-q4_K_M)) from the [Ollama Library](https://ollama.com/library) -- Pull the chat model locally (for example, `ollama pull gemma:7b-instruct-q4_K_M`) +- Pull (download) the chat model locally (for example, `ollama pull gemma:7b-instruct-q4_K_M`) - Once the
chat model is downloaded successfully, open Cody in VS Code - Open a new Cody chat - In the new chat panel, you should see the chat model you've pulled in the dropdown list From f228d6a504093c1ef3e13bc386ba3f7d6db3fc26 Mon Sep 17 00:00:00 2001 From: Maedah Batool Date: Wed, 6 Nov 2024 16:09:38 -0800 Subject: [PATCH 12/15] Remove label --- docs/cody/clients/install-vscode.mdx | 4 +--- 1 file changed, 1 insertion(+), 3 deletions(-) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index ba8cda115..3725e9d4b 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -1,5 +1,3 @@ -{/* Hackathon */} - # Installing Cody in VS Code

Learn how to use Cody and its features with the VS Code editor.

@@ -333,7 +331,7 @@ To get autocomplete suggestions from Ollama locally, follow these steps: "cody.autocomplete.experimental.ollamaOptions": { "url": "http://localhost:11434", "model": "deepseek-coder-v2" -} +} ``` - Confirm Cody uses Ollama by looking at the Cody output channel or the autocomplete trace view (in the command palette) From 25193210b902a6383eb592f15504e314cc805c89 Mon Sep 17 00:00:00 2001 From: Maedah Batool Date: Wed, 6 Nov 2024 16:12:47 -0800 Subject: [PATCH 13/15] Replace prompts --- docs/cody/clients/install-vscode.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index 3725e9d4b..74760a3f9 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -299,7 +299,7 @@ Cody also works with Cursor, Gitpod, IDX, and other similar VS Code forks. To ac Claude 3.5 Sonnet is the default LLM model for inline edits and commands. If you've used a different or older LLM model for inline edits or commands before, remember to manually change your model to Claude 3.5 Sonnet. Default model changes only affect new users. -Users on Cody **Free** and **Pro** can choose from a list of [supported LLM models](/cody/capabilities/supported-models) for Chat and Commands. +Users on Cody **Free** and **Pro** can choose from a list of [supported LLM models](/cody/capabilities/supported-models) for Chat and Prompts. 
![LLM-models-for-cody-free](https://storage.googleapis.com/sourcegraph-assets/Docs/llm-dropdown-options-102024.png) From 7a35a73df18c9072d10d8a5c3f394de71c221123 Mon Sep 17 00:00:00 2001 From: Maedah Batool Date: Wed, 6 Nov 2024 16:14:15 -0800 Subject: [PATCH 14/15] Replace prompts --- docs/cody/clients/install-vscode.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index 74760a3f9..2389a4e6f 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -299,7 +299,7 @@ Cody also works with Cursor, Gitpod, IDX, and other similar VS Code forks. To ac Claude 3.5 Sonnet is the default LLM model for inline edits and commands. If you've used a different or older LLM model for inline edits or commands before, remember to manually change your model to Claude 3.5 Sonnet. Default model changes only affect new users. -Users on Cody **Free** and **Pro** can choose from a list of [supported LLM models](/cody/capabilities/supported-models) for Chat and Prompts. +Users on Cody **Free** and **Pro** can choose from a list of [supported LLM models](/cody/capabilities/supported-models) for chat. ![LLM-models-for-cody-free](https://storage.googleapis.com/sourcegraph-assets/Docs/llm-dropdown-options-102024.png) From f39de0e33862700b4559e0b4627505b73f346baf Mon Sep 17 00:00:00 2001 From: Maedah Batool Date: Wed, 6 Nov 2024 16:14:32 -0800 Subject: [PATCH 15/15] Replace prompts in supported LLM docs --- docs/cody/clients/install-vscode.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx index 2389a4e6f..662aa0c14 100644 --- a/docs/cody/clients/install-vscode.mdx +++ b/docs/cody/clients/install-vscode.mdx @@ -297,7 +297,7 @@ Cody also works with Cursor, Gitpod, IDX, and other similar VS Code forks. 
To ac ## Supported LLM models -Claude 3.5 Sonnet is the default LLM model for inline edits and commands. If you've used a different or older LLM model for inline edits or commands before, remember to manually change your model to Claude 3.5 Sonnet. Default model changes only affect new users. +Claude 3.5 Sonnet is the default LLM model for inline edits and prompts. If you've used a different or older LLM model for inline edits or prompts before, remember to manually change your model to Claude 3.5 Sonnet. Default model changes only affect new users. Users on Cody **Free** and **Pro** can choose from a list of [supported LLM models](/cody/capabilities/supported-models) for chat.