From 25f4ea2badfc9061eab0a0641028641896688fad Mon Sep 17 00:00:00 2001 From: Ado Kukic Date: Tue, 5 Nov 2024 11:25:34 -0800 Subject: [PATCH 1/7] Hackathon --- docs/cody/clients/install-jetbrains.mdx | 1 + 1 file changed, 1 insertion(+) diff --git a/docs/cody/clients/install-jetbrains.mdx b/docs/cody/clients/install-jetbrains.mdx index 79a8e73df..d7c0bf48a 100644 --- a/docs/cody/clients/install-jetbrains.mdx +++ b/docs/cody/clients/install-jetbrains.mdx @@ -1,3 +1,4 @@ + # Installing Cody for JetBrains

Learn how to use Cody and its features with the JetBrains IntelliJ editor.

From 9c8397b49c520f6cd709083057a83a8e851f2d26 Mon Sep 17 00:00:00 2001 From: Maedah Batool Date: Tue, 5 Nov 2024 11:54:31 -0800 Subject: [PATCH 2/7] Edit comment format --- docs/cody/clients/install-jetbrains.mdx | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/docs/cody/clients/install-jetbrains.mdx b/docs/cody/clients/install-jetbrains.mdx index d7c0bf48a..95ea093e3 100644 --- a/docs/cody/clients/install-jetbrains.mdx +++ b/docs/cody/clients/install-jetbrains.mdx @@ -1,4 +1,5 @@ - +{/* Hackathon */} + # Installing Cody for JetBrains

Learn how to use Cody and its features with the JetBrains IntelliJ editor.

From cf73d9f185dcbaf0324b6d79d2f7ab37c6fc8842 Mon Sep 17 00:00:00 2001 From: morgangauth <113058716+morgangauth@users.noreply.github.com> Date: Thu, 7 Nov 2024 16:42:03 -0800 Subject: [PATCH 3/7] Update Install for Jetbrains page --- docs/cody/clients/install-jetbrains.mdx | 57 +++++++------------------ 1 file changed, 16 insertions(+), 41 deletions(-) diff --git a/docs/cody/clients/install-jetbrains.mdx b/docs/cody/clients/install-jetbrains.mdx index 95ea093e3..9471df0a3 100644 --- a/docs/cody/clients/install-jetbrains.mdx +++ b/docs/cody/clients/install-jetbrains.mdx @@ -1,8 +1,6 @@ -{/* Hackathon */} - # Installing Cody for JetBrains -

Learn how to use Cody and its features with the JetBrains IntelliJ editor.

+

Learn how to use Cody and its features with JetBrains editors.

The Cody extension by Sourcegraph enhances your coding experience in your IDE by providing intelligent code suggestions, context-aware completions, and advanced code analysis. This guide will walk you through the steps to install and set up Cody within your JetBrains environment. @@ -41,9 +39,9 @@ Alternatively, you can [Download and install the extension from the Jetbrains ma ## Connect the extension to Sourcegraph -After a successful installation, Cody's icon appears in the sidebar. Clicking it prompts you to start with codehosts like GitHub, GitLab, and your Google login. This allows Cody to access your Sourcegraph.com account. +After a successful installation, Cody's icon appears in the sidebar. Cody Free and Cody Pro users can sign in to their Sourcegraph.com accounts using SSO through GitHub, GitLab, or Google. -Alternatively, you can also click the **Sign in with an Enterprise Instance** to connect to your enterprise instance. +Sourcegraph Enterprise users should connect Cody to their Enterprise instance by clicking **Sign in with an Enterprise Instance**. ![cody-for-intellij-login](https://storage.googleapis.com/sourcegraph-assets/Docs/Media/sign-in-cody-jb.png) @@ -53,13 +51,9 @@ To connect the extension with your Enterprise instance, - Click **Sign in with an Enterprise Instance** - Enter the server for your enterprise instance (e.g. `https://.sourcegraph.com`) -- Select **Generate new token**. You'll be directed to the **Access tokens** page on your instance in the browser -- Generate a new token, copy it, and paste it into the **Token** field in your editor -- Click **Sign in** - -### For Sourcegraph.com users +- Select **Authorize in Browser**. You'll be directed to an authorization page on your instance in the browser -For Cody Free and Cody Pro users, you can Log in through SSO to authenticate the IntelliJ extension with your Sourcegraph.com account. +Alternatively, you can access advanced authorization settings by clicking **Show Advanced**. 
From here, you can manually enter a token generated from your User Settings in your Sourcegraph Enterprise instance, or add optional custom request headers. ## Verifying the installation @@ -77,7 +71,7 @@ Cody provides intelligent code suggestions and context-aware autocompletions for ## Chat -Cody chat in JetBrains is available in a unified interface opened right next to your code. Once connected to Sourcegraph, a new chat input field is opened with a default `@-mention` [context chips](#context-retrieval). +Cody chat in JetBrains is available in a unified interface opened right next to your code. Once connected to Sourcegraph, a new chat input field is opened with a default `@-mention` [context chip](#context-retrieval) for the current file. All your previous and existing chats are stored for later use and can be accessed via the **History** icon from the top menu. You can download them to share or use later in a `.json` file or delete them altogether. @@ -89,11 +83,11 @@ Since your first message to Cody anchors the conversation, you can return to the ![chat-interface](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-chat-interface-102024.png) - Users should be on the Cody for JetBrains extension v2023.2 or more to get this new and improved chat UI. + Users must be on JetBrains v2023.2 and Cody plugin v7.0.0 or above to get the new and improved chat UI. ### Changing LLM model for chat - You need to be a Cody Free or Pro user to have multi-model selection capability. Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. +Multi-model selection capability is available for all Cody Free and Pro users. Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. 
For Chat: @@ -103,22 +97,19 @@ For Chat: For Edit: -- On any file, select some code and a right-click +- On any file, select some code and right-click - Select **Cody > Edit Code** - Select the default model available (this is Claude Sonnet 3.5) - See the selection of models and click the model you desire. This model will now be the default model going forward on any new edits ## Supported LLM models -Users on Cody **Free** and **Pro** can choose from a list of supported LLM models for Chat and Commands. Both Cody Free and Pro users can choose from a list of supported LLM models for Chat and Commands. +Both Cody Free and Pro users can choose from a list of supported LLM models for Chat and Commands. ![llm-selection-cody](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-llm-select-0824.jpg) -Enterprise users get Claude 3 (Opus and Sonnet) as the default LLM models without extra cost. Moreover, Enterprise users can use Claude 3.5 models through Cody Gateway, Anthropic BYOK, AWS Bedrock (limited availability), and GCP Vertex. +Enterprise users who have [Model Configuration](/cody/clients/model-configuration#model-configuration) configured will also be able to select from the available models for their instance. On instances with the ["completions" configuration](/cody/clients/model-configuration#completions-configuration), a site administrator determines the LLM, and it cannot be changed within the editor. -For enterprise users on AWS Bedrock: 3.5 Sonnet is unavailable in `us-west-2` but available in `us-east-1`. Check the current model availability on AWS and your customer's instance location before switching. Provisioned throughput via AWS is not supported for 3.5 Sonnet. - -You also get additional capabilities like BYOLLM (Bring Your Own LLM), supporting Single-Tenant and Self Hosted setups for flexible coding environments. Your site administrator determines the LLM, and cannot be changed within the editor. 
However, Cody Enterprise users when using Cody Gateway have the ability to [configure custom models](/cody/core-concepts/cody-gateway#configuring-custom-models) from Anthropic (like Claude 3 Opus and Claude Haiku), OpenAI (GPT 3.5 Turbo and GPT 4 Turbo) and Google Gemini 1.5 models (Flash and Pro). Read and learn more about the [supported LLMs](/cody/capabilities/supported-models) and [token limits](/cody/core-concepts/token-limits) on Cody Free, Pro and Enterprise. @@ -140,7 +131,7 @@ When you start a new Cody chat, the chat input window opens with a default `@-me ![jb-context-retrieval](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-context-retrieval-102024.png) -At any point in time, you can edit these context chips or remove them completely if you do not want to use these as context. Any chat without a context chip will instruct Cody to use no codebase context. However, you can always provide an alternate `@-mention` file to let Cody use it as a new source of context. +At any point in time, you can edit these context chips, add additional context chips, or remove them completely if you do not want to use these as context. Any chat without a context chip will instruct Cody to use no codebase context. However, you can always provide an alternate `@-mention` file to let Cody use it as a new source of context. When you have both a repository and files @-mentioned, Cody will search the repository for context while prioritizing the mentioned files. @@ -150,11 +141,10 @@ Cody's chat allows you to add files as context in your messages. Type `@-file` a The `@-file` also supports line numbers to query the context of large files. You can add ranges of large files to your context by @-mentioning a large file and appending a number range to the filename, for example, `@filepath/filename:1-10`. -When you `@-mention` files to add to Cody’s context window, the file lookup takes `files.exclude`, `search.exclude`, and `.gitgnore` files into account. 
This makes the file search faster as a result up to 100ms. - Moreover, when you `@-mention` files, Cody will track the number of characters in those files against the context window limit of the selected chat model. As you `@-mention` multiple files, Cody will calculate how many tokens of the context window remain. When the remaining context window size becomes too small, you get **File too large** errors for further more `@-mention` files. -Cody defaults to showing @-mention context chips for all the context it intends to use. When you open a new chat, Cody will show context chips for your current repository and current file (or file selection if you have code highlighted). +You can read more about context limits when selecting context [here](/cody/core-concepts/token-limits). + ### Rerun prompts with different context @@ -174,21 +164,6 @@ Enterprise users can leverage the full power of the Sourcegraph search engine as Read more about [Context fetching mechanism](/cody/core-concepts/context/#context-fetching-mechanism) in detail. -## Context scope - -JetBrains users on the Free or Pro plan get single-repo support in chat and can use one repo for context fetching. Enterprise users get multi-repo support in chat and can explicitly specify **up to 10 additional repos** they would like Cody to use for context. - -## Context Selection - -Cody automatically understands the context of your codebase for all Cody Free, Pro, and Enterprise users based on the project opened in your workspace. Enterprise users can add up to **9 additional repos (10 total, including the default project)** to use as context. The multi-repo context for Enterprise is powered by Sourcegraph code search and allows Cody to use the selected codebase to answer your questions. - -Moreover, Cody's chat allows you to add files as context in your messages. Type `@-file` in the Cody chat window and then a filename to include a file as context. 
- -### Chat with multi-repo context - -For Cody Free and Cody Pro users, if you delete all the context from the chat input, Cody will not use any local context. When this happens, Cody doesn't search your local project for context and sends your prompt to the selected LLM. - -Cody Enterprise users can add remote repositories from your Sourcegraph instance. You can type the name of your repositories into this interface and select up to **9 additional repos (10 total, including the default project)**. Cody will then search against those repositories and retrieve relevant files to answer your chat prompts. ### Cody Context Filters @@ -254,7 +229,7 @@ Like the Edit Code and Generate Unit Test commands, you can generate inline docu ## Autocomplete -Cody provides multi-line autocomplete as you type. Autocomplete suggestions appear as inlay suggestions and are enabled by default in your JetBrains IntelliJ editor. With this setting, there is a list of programming languages supported and enabled by default. +Cody provides multi-line autocomplete as you type. Autocomplete suggestions appear as inlay suggestions and are enabled by default in your JetBrains IDE. With this setting, there is a list of programming languages supported and enabled by default. To manually configure the Autocomplete feature, @@ -278,7 +253,7 @@ In addition, you can use the following keyboard shortcuts to interact with Cody' To add or remove an account you can do the following: -1. Open your IntelliJ settings by selecting **IntelliJ IDEA | Settings** on macOS or **File | Settings** on Windows and Linux from the main menu. +1. Open your IDE settings by selecting **IDE | Settings** on macOS or **File | Settings** on Windows and Linux from the main menu. 1. Get to the Cody Settings by navigating to `Tools -> Sourcegraph & Cody` 1. Under authentication see the accounts that are currently logged in 1. To remove, select your account and hit `-`. 
To add click `+` and choose the appropriate login method From 67532fd2bdae889d5f9de764ea57c06543824c85 Mon Sep 17 00:00:00 2001 From: morgangauth <113058716+morgangauth@users.noreply.github.com> Date: Mon, 11 Nov 2024 14:39:02 -0800 Subject: [PATCH 4/7] Remove mentions of IntelliJ to make docs general to any jetbrains IDE --- docs/cody/clients/install-jetbrains.mdx | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/docs/cody/clients/install-jetbrains.mdx b/docs/cody/clients/install-jetbrains.mdx index 9471df0a3..d259604cd 100644 --- a/docs/cody/clients/install-jetbrains.mdx +++ b/docs/cody/clients/install-jetbrains.mdx @@ -25,17 +25,17 @@ The Cody extension by Sourcegraph enhances your coding experience in your IDE by - [RubyMine](https://www.jetbrains.com/ruby/) - [WebStorm](https://www.jetbrains.com/webstorm/) -## Install the JetBrains IntelliJ Cody extension +## Install the JetBrains Cody extension Follow these steps to install the Cody plugin: -- Open JetBrains IntelliJ editor on your local machine +- Open a supported JetBrains editor on your local machine - Open **Settings** (Mac: `⌘+,` Windows: `Ctrl+Alt+S`) and select **Plugins** - Type and search **Cody: AI Coding Assistant with Autocomplete & Chat** extension and click **Install** ![cody-for-intellij](https://storage.googleapis.com/sourcegraph-assets/Docs/cody-for-intellij-062024.png) -Alternatively, you can [Download and install the extension from the Jetbrains marketplace](https://plugins.jetbrains.com/plugin/9682-sourcegraph). +Alternatively, you can [Download and install the extension from the JetBrains marketplace](https://plugins.jetbrains.com/plugin/9682-sourcegraph). ## Connect the extension to Sourcegraph @@ -61,7 +61,7 @@ Once Cody is successfully connected, you'll see that the sign-in panel has been Cody provides intelligent code suggestions and context-aware autocompletions for numerous programming languages like JavaScript, Python, TypeScript, Go, etc. 
-- Create a new file in IntelliJ, for example, `code.js` +- Create a new file in JetBrains IDE, for example, `code.js` - As you start typing, Cody will automatically provide suggestions and context-aware completions based on your coding patterns and the code context - These autocomplete suggestions appear as grayed text. To accept the suggestion, press the `Tab` key @@ -263,7 +263,7 @@ To add or remove an account you can do the following: Our nightly release channel gets updated much more frequently and might be helpful to verify bug fixes that will come in the next stable release. To update your update channel you can do the following: -1. Open your IntelliJ settings by selecting **IntelliJ IDEA | Settings** on macOS or **File | Settings** on Windows and Linux from the main menu. +1. Open your JetBrains IDE settings by selecting **IDE Name | Settings** on macOS or **File | Settings** on Windows and Linux from the main menu. 1. Get to the Cody Settings by navigating to `Tools -> Sourcegraph & Cody` 1. Under update channel select `Stable` or `Nightly` From 6a8276ac6039a2b5da4eb010fd72d5cd6a97f4d7 Mon Sep 17 00:00:00 2001 From: morgangauth <113058716+morgangauth@users.noreply.github.com> Date: Mon, 11 Nov 2024 15:54:28 -0800 Subject: [PATCH 5/7] Reorder to match updated VSCode page --- docs/cody/clients/install-jetbrains.mdx | 192 ++++++++++++++---------- 1 file changed, 111 insertions(+), 81 deletions(-) diff --git a/docs/cody/clients/install-jetbrains.mdx b/docs/cody/clients/install-jetbrains.mdx index d259604cd..0ea0eb481 100644 --- a/docs/cody/clients/install-jetbrains.mdx +++ b/docs/cody/clients/install-jetbrains.mdx @@ -2,7 +2,7 @@

Learn how to use Cody and its features with JetBrains editors.

-The Cody extension by Sourcegraph enhances your coding experience in your IDE by providing intelligent code suggestions, context-aware completions, and advanced code analysis. This guide will walk you through the steps to install and set up Cody within your JetBrains environment. +The Cody plugin by Sourcegraph enhances your coding experience in your IDE by providing intelligent code suggestions, context-aware completions, and advanced code analysis. This guide will walk you through the steps to install and set up Cody within your JetBrains environment. @@ -25,53 +25,64 @@ The Cody extension by Sourcegraph enhances your coding experience in your IDE by - [RubyMine](https://www.jetbrains.com/ruby/) - [WebStorm](https://www.jetbrains.com/webstorm/) -## Install the JetBrains Cody extension +## Install the JetBrains Cody plugin Follow these steps to install the Cody plugin: - Open a supported JetBrains editor on your local machine -- Open **Settings** (Mac: `⌘+,` Windows: `Ctrl+Alt+S`) and select **Plugins** -- Type and search **Cody: AI Coding Assistant with Autocomplete & Chat** extension and click **Install** +- Open **Settings** (MacOS: `⌘+,` Windows: `Ctrl+Alt+S`) and select **Plugins** +- Search for **Cody: AI Coding Assistant with Autocomplete & Chat** in the marketplace and click **Install** ![cody-for-intellij](https://storage.googleapis.com/sourcegraph-assets/Docs/cody-for-intellij-062024.png) -Alternatively, you can [Download and install the extension from the JetBrains marketplace](https://plugins.jetbrains.com/plugin/9682-sourcegraph). +Alternatively, you can also [download and install the plugin from the JetBrains marketplace](https://plugins.jetbrains.com/plugin/9682-sourcegraph) directly. -## Connect the extension to Sourcegraph +## Connect the plugin to Sourcegraph -After a successful installation, Cody's icon appears in the sidebar. Cody Free and Cody Pro users can sign in to their Sourcegraph.com accounts using SSO through GitHub, GitLab, or Google. 
+After a successful installation, the Cody icon appears in the Tool Windows Bar. -Sourcegraph Enterprise users should connect Cody to their Enterprise instance by clicking **Sign in with an Enterprise Instance**. +### Cody Free or Cody Pro Users + +Cody Free and Pro users can sign in to their Sourcegraph.com accounts using SSO through GitHub, GitLab, or Google. ![cody-for-intellij-login](https://storage.googleapis.com/sourcegraph-assets/Docs/Media/sign-in-cody-jb.png) -### For Sourcegraph Enterprise users +### Sourcegraph Enterprise Cody Users + +Sourcegraph Enterprise users should connect Cody to their Enterprise instance by clicking **Sign in with an Enterprise Instance**. -To connect the extension with your Enterprise instance, +To connect the plugin with your Enterprise instance, - Click **Sign in with an Enterprise Instance** -- Enter the server for your enterprise instance (e.g. `https://.sourcegraph.com`) -- Select **Authorize in Browser**. You'll be directed to an authorization page on your instance in the browser +- Enter the URL of your Enterprise instance. If you are unsure, please contact your administrator. +- Select **Authorize in Browser**. You'll be directed to an authorization page on your instance in the browser. Alternatively, you can access advanced authorization settings by clicking **Show Advanced**. From here, you can manually enter a token generated from your User Settings in your Sourcegraph Enterprise instance, or add optional custom request headers. ## Verifying the installation -Once Cody is successfully connected, you'll see that the sign-in panel has been replaced by a welcome message from Cody. Let's create an autocomplete suggestion to verify that the Cody extension has been successfully installed and is working as expected. +Once connected, click the Cody icon from the sidebar again. The Cody plugin will open in a configurable side panel. 
+ +Let's create an autocomplete suggestion to verify that the Cody plugin has been successfully installed and is working as expected. Cody provides intelligent code suggestions and context-aware autocompletions for numerous programming languages like JavaScript, Python, TypeScript, Go, etc. -- Create a new file in JetBrains IDE, for example, `code.js` +- Create a new file in your JetBrains IDE, for example, `code.js` +- Next, type the following algorithm function to sort an array of numbers +```js +function bubbleSort(array){ +} +``` - As you start typing, Cody will automatically provide suggestions and context-aware completions based on your coding patterns and the code context - These autocomplete suggestions appear as grayed text. To accept the suggestion, press the `Tab` key + - ## Chat -Cody chat in JetBrains is available in a unified interface opened right next to your code. Once connected to Sourcegraph, a new chat input field is opened with a default `@-mention` [context chip](#context-retrieval) for the current file. +Cody chat in JetBrains is available in a unified interface opened right next to your code. Once connected to Sourcegraph, a new chat input field is opened with default `@-mention` [context chips](#context-retrieval). All your previous and existing chats are stored for later use and can be accessed via the **History** icon from the top menu. You can download them to share or use later in a `.json` file or delete them altogether. @@ -85,9 +96,12 @@ Since your first message to Cody anchors the conversation, you can return to the Users must be on JetBrains v2023.2 and Cody plugin v7.0.0 or above to get the new and improved chat UI. +### Chat History +A chat history icon at the top of your chat input window allows you to navigate between chats (and search chats) without opening the Cody sidebar. + ### Changing LLM model for chat -Multi-model selection capability is available for all Cody Free and Pro users. 
Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. + You need to be a Cody Free or Pro user to have multi-model selection capability. You can view which LLMs you have access to on our [supported LLMs page](/cody/capabilities/supported-models). Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. For Chat: @@ -98,36 +112,27 @@ For Chat: For Edit: - On any file, select some code and right-click -- Select **Cody > Edit Code** -- Select the default model available (this is Claude Sonnet 3.5) +- Select **Cody > Edit Code** (optionally, you can do this with `Opt+K`/`Alt+K`) +- Select the default model available - See the selection of models and click the model you desire. This model will now be the default model going forward on any new edits -## Supported LLM models - -Both Cody Free and Pro users can choose from a list of supported LLM models for Chat and Commands. - -![llm-selection-cody](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-llm-select-0824.jpg) - -Enterprise users who have [Model Configuration](/cody/clients/model-configuration#model-configuration) configured will also be able to select from the available models for their instance. On instances with the ["completions" configuration](/cody/clients/model-configuration#completions-configuration), a site administrator determines the LLM, and it cannot be changed within the editor. - - -Read and learn more about the [supported LLMs](/cody/capabilities/supported-models) and [token limits](/cody/core-concepts/token-limits) on Cody Free, Pro and Enterprise. +### Selecting Context with @-mentions -## Ollama model support +Cody's chat allows you to add files as context in your messages. -Ollama support for JetBrains is in the Experimental stage and is available on Cody Free and Pro. 
+- Type `@-file` and then a filename to include a file as context. -You can use Ollama models locally for Cody’s chat and commands. This lets you chat without sending messages over the internet to an LLM provider so that you can use Cody offline. To use Ollama locally, you’ll need to install Ollama and download a chat model such as CodeGemma or Llama3. [Read here for detailed instructions](https://sourcegraph.com/github.com/sourcegraph/jetbrains/-/blob/README.md#use-ollama-models-for-chat--commands). +The `@-file` also supports line numbers to query the context of large files. You can add ranges of large files to your context by @-mentioning a large file and appending a number range to the filename, for example, `@filepath/filename:1-10`. -## Smart Apply code suggestions +When you `@-mention` files to add to Cody’s context window, the file lookup takes `files.exclude`, `search.exclude`, and `.gitignore` files into account. This also makes the file search faster, returning results in up to 100ms. -Cody lets you dynamically insert code from chat into your files with **Smart Apply**. Every time Cody provides you with a code suggestion, you can click the **Apply** button. Cody will then analyze your open code file, find where that relevant code should live, and add a diff. +Moreover, when you `@-mention` files, Cody will track the number of characters in those files against the context window limit of the selected chat model. As you `@-mention` multiple files, Cody will calculate how many tokens of the context window remain. When the remaining context window size becomes too small, you'll receive **File too large** errors when attempting to `@-mention` additional files. -For chat messages where Cody provides multiple code suggestions, you can apply each in sequence to go from chat suggestions to written code. +You can read more about context limits when selecting context [here](/cody/core-concepts/token-limits). 
### Context retrieval -When you start a new Cody chat, the chat input window opens with a default `@-mention` context chips for all the context it intends to use. This context is based on your current repository and current file (or a file selection if you have code highlighted). +When you start a new Cody chat, the chat input window opens with default `@-mention` context chips for all the context it intends to use. This context is based on your current repository and current file (or a file selection if you have code highlighted). ![jb-context-retrieval](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-context-retrieval-102024.png) @@ -135,17 +140,6 @@ At any point in time, you can edit these context chips, add additional context c When you have both a repository and files @-mentioned, Cody will search the repository for context while prioritizing the mentioned files. -### Selecting Context with @-mentions - -Cody's chat allows you to add files as context in your messages. Type `@-file` and then a filename to include a file as a context. - -The `@-file` also supports line numbers to query the context of large files. You can add ranges of large files to your context by @-mentioning a large file and appending a number range to the filename, for example, `@filepath/filename:1-10`. - -Moreover, when you `@-mention` files, Cody will track the number of characters in those files against the context window limit of the selected chat model. As you `@-mention` multiple files, Cody will calculate how many tokens of the context window remain. When the remaining context window size becomes too small, you get **File too large** errors for further more `@-mention` files. - -You can read more about context limits when selecting context [here](/cody/core-concepts/token-limits). 
- - ### Rerun prompts with different context If Cody's answer isn't helpful, you can try asking again with different context: @@ -158,26 +152,52 @@ If Cody's answer isn't helpful, you can try asking again with different context: ## Context fetching mechanism -JetBrains users on the Free or Pro plan can leverage [local search](/cody/core-concepts/context#context-selection) as the primary context source for Cody chat. +JetBrains users on the Free or Pro plan use [local context](/cody/core-concepts/context#context-selection). -Enterprise users can leverage the full power of the Sourcegraph search engine as the primary context provider to Cody. +Enterprise users can leverage the full power of the Sourcegraph search engine as Cody's primary context provider. - Read more about [Context fetching mechanism](/cody/core-concepts/context/#context-fetching-mechanism) in detail. + Read more about [Context fetching mechanisms](/cody/core-concepts/context/#context-fetching-mechanism) in detail. +## Context sources +You can @-mention files and web pages in Cody. Cody Enterprise also supports @-mentioning repositories to search for context in a broader scope. +Cody Free and Pro offer single-repo context, and Cody Enterprise supports multi-repo context. ### Cody Context Filters -Context Filters is available for all Cody Enterprise users running Cody JetBrains extension version `>=6.0.0`. +Context Filters is available for all Cody Enterprise users running Cody JetBrains plugin version `>=6.0.0`. -Admins on the Sourcegraph Enterprise instance can use the Cody Context Filters to determine which repositories Cody can use as the context in its requests to third-party LLMs. Inside your site configuration, you can define a set of `include` and `exclude` rules that will be used to filter the list of repositories Cody can access. 
+Admins on the Sourcegraph Enterprise instance can use Cody Context Filters to determine which repositories Cody can use as the context in its requests to third-party LLMs. Inside your site configuration, you can define a set of `include` and `exclude` rules that will be used to filter the list of repositories Cody can access. For repos mentioned in the `exclude` field, Cody's commands are disabled, and you cannot use them for context fetching. If you try running any of these, you'll be prompted with an error message. However, Cody chat will still work, and you can use it to ask questions. -[Read more about the Cody Context Filters here →](/cody/capabilities/ignore-context) +[Read more about Cody Context Filters here →](/cody/capabilities/ignore-context) + +## Autocomplete + +Cody provides multi-line autocomplete as you type. Autocomplete suggestions appear as inlay suggestions and are enabled by default in your JetBrains IDE. With this setting, there is a list of programming languages supported and enabled by default. + +To manually configure the Autocomplete feature, + +- Go to the **Cody Settings...** from the Cody icon in the sidebar +- Next, click the **Sourcegraph & Cody** dropdown and select **Cody** +- The **Autocomplete** settings will appear with the list of **Enabled Languages** + +Autocomplete suggestions use the same color as inline parameter hints according to your configured editor theme. However, you can optionally enable the **Custom color for completions** checkbox to customize the color of your choice. 
+ +In addition, you can use the following keyboard shortcuts to interact with Cody's autocomplete suggestions: + +- `Tab` to accept a suggestion +- `Alt + [` (Windows) or `Opt + [` (macOS) to cycle suggestions +- `Alt + \` (Windows) or `Opt + \` (macOS) to manually trigger autocomplete if no suggestions have been returned + + + ## Prompts and Commands -Cody with JetBrains offers quick, ready-to-use [prompts and commands](/cody/capabilities/commands) for common actions to write, describe, fix, and smell code. These allow you to run predefined actions with smart context-fetching anywhere in the editor. These allow you to run predefined actions with smart context-fetching anywhere in the editor, like: +Cody offers quick, ready-to-use [prompts and commands](/cody/capabilities/commands) for common actions to write, describe, fix, and smell code. These allow you to run predefined actions with smart context-fetching anywhere in the editor, like: - **Edit Code**: Makes inline code edits. You also get the option to select LLM for edit suggestions - **Document Code**: Create inline documentation for your code @@ -227,46 +247,56 @@ Like the Edit Code and Generate Unit Test commands, you can generate inline docu -## Autocomplete +## Smart Apply code suggestions -Cody provides multi-line autocomplete as you type. Autocomplete suggestions appear as inlay suggestions and are enabled by default in your JetBrains IDE. With this setting, there is a list of programming languages supported and enabled by default. +Cody lets you dynamically insert code from chat into your files with **Smart Apply**. Every time Cody provides you with a code suggestion, you can click the **Apply** button. Cody will then analyze your open code file, find where that relevant code should live, and add a diff. -To manually configure the Autocomplete feature, +For chat messages where Cody provides multiple code suggestions, you can apply each in sequence to go from chat suggestions to written code. 
-- Go to the **Cody Settings...** from the Cody icon in the sidebar
-- Next, click the **Sourcegraph & Cody** dropdown and select **Cody**
-- The **Autocomplete** settings will appear with the list of **Enabled Languages**
+## Updating the plugin
 
-Autocomplete suggestions use the same color as inline parameter hints according to your configured editor theme. However, you can optionally enable the **Custom color for completions** checkbox to customize the color of your choice.
+JetBrains IDEs will typically notify you when updates are available for installed plugins. Follow the prompts to update the Cody AI plugin to the latest version.
 
-In addition, you can use the following keyboard shortcuts to interact with Cody's autocomplete suggestions:
+## Change update channel for stable or nightly releases
 
-- `Tab` to accept a suggestion
-- `Alt + [` (Windows) or `Opt + [` (macOS) to cycle suggestions
-- `Alt + \` (Windows) or `Opt + \` (macOS) to manually trigger autocomplete if no suggestions have been returned
+Our nightly release channel gets updated much more frequently and might be helpful to verify bug fixes that will come in the next stable release.
+To change your update channel, do the following:
 
- 
+1. Open your JetBrains IDE settings by selecting **IDE Name | Settings** on macOS or **File | Settings** on Windows and Linux from the main menu.
+1. Go to the Cody Settings by navigating to `Tools -> Sourcegraph & Cody`
+1. Under **Update channel**, select `Stable` or `Nightly`
 
-## Add or remove account
+## Supported LLM models
+
+Both Cody Free and Pro users can choose from a list of supported LLM models for Chat and Commands.
+
+![llm-selection-cody](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-llm-select-0824.jpg)
+
+Enterprise users who have [model configuration](/cody/clients/model-configuration#model-configuration) configured will also be able to select from the available models for their instance. 
On instances with the ["completions" configuration](/cody/clients/model-configuration#completions-configuration), a site administrator determines the LLM, and it cannot be changed within the editor.
+
+Read and learn more about the [supported LLMs](/cody/capabilities/supported-models) and [token limits](/cody/core-concepts/token-limits) on Cody Free, Pro, and Enterprise.
+
+## Ollama model support
+
+Ollama support for JetBrains is in the Experimental stage and is available for Cody Free and Pro plans.
+
+You can use Ollama models locally for Cody’s chat and commands. This lets you chat without sending messages over the internet to an LLM provider so that you can use Cody offline. To use Ollama locally, you’ll need to install Ollama and download a chat model such as CodeGemma or Llama3. [Read here for detailed instructions](https://sourcegraph.com/github.com/sourcegraph/jetbrains/-/blob/README.md#use-ollama-models-for-chat--commands).
+
+## Add/remove account
 
 To add or remove an account you can do the following:
 
+1. Open Cody by clicking the Cody icon on the toolbar
+1. On the open sidebar, select the Account icon
+1. Select `Sign Out` to remove an account
+
+Alternatively, you can also manage multiple accounts in Cody Settings:
+
 1. Open your IDE settings by selecting **IDE | Settings** on macOS or **File | Settings** on Windows and Linux from the main menu.
-1. Get to the Cody Settings by navigating to `Tools -> Sourcegraph & Cody`
+1. Go to the Cody Settings by navigating to `Tools -> Sourcegraph & Cody`
 1. Under authentication see the accounts that are currently logged in
 1. To remove, select your account and hit `-`. To add click `+` and choose the appropriate login method
 
-## Change update channel for stable or nightly releases
-
-Our nightly release channel gets updated much more frequently and might be helpful to verify bug fixes that will come in the next stable release.
-To update your update channel you can do the following:
-
-1. 
Open your JetBrains IDE settings by selecting **IDE Name | Settings** on macOS or **File | Settings** on Windows and Linux from the main menu. -1. Get to the Cody Settings by navigating to `Tools -> Sourcegraph & Cody` -1. Under update channel select `Stable` or `Nightly` - ## Find Cody features Using the **Search Everywhere** option in JetBrains IDEs you can find and discover all Cody features and actions. Press `Shift` twice to open the `Search Everywhere` window. Then, type in the `Cody:` prefix to get a list of all supported Cody actions. From 6a32247734ac40e13edaa220a760dc718a76d013 Mon Sep 17 00:00:00 2001 From: Maedah Batool Date: Thu, 14 Nov 2024 11:54:45 -0800 Subject: [PATCH 6/7] Add final improvements --- docs/cody/clients/install-jetbrains.mdx | 89 +++++++++++++------------ 1 file changed, 45 insertions(+), 44 deletions(-) diff --git a/docs/cody/clients/install-jetbrains.mdx b/docs/cody/clients/install-jetbrains.mdx index 0ea0eb481..f1f3b3d47 100644 --- a/docs/cody/clients/install-jetbrains.mdx +++ b/docs/cody/clients/install-jetbrains.mdx @@ -30,7 +30,7 @@ The Cody plugin by Sourcegraph enhances your coding experience in your IDE by pr Follow these steps to install the Cody plugin: - Open a supported JetBrains editor on your local machine -- Open **Settings** (MacOS: `⌘+,` Windows: `Ctrl+Alt+S`) and select **Plugins** +- Open **Settings** (macOS: `⌘+,` Windows: `Ctrl+Alt+S`) and select **Plugins** - Search for **Cody: AI Coding Assistant with Autocomplete & Chat** in the marketplace and click **Install** ![cody-for-intellij](https://storage.googleapis.com/sourcegraph-assets/Docs/cody-for-intellij-062024.png) @@ -39,11 +39,11 @@ Alternatively, you can also [download and install the plugin from the JetBrains ## Connect the plugin to Sourcegraph -After a successful installation, the Cody icon appears in the Tool Windows Bar. +After a successful installation, the Cody icon appears in the Tool Windows Bar. 
### Cody Free or Cody Pro Users -Cody Free and Pro users can sign in to their Sourcegraph.com accounts using SSO through GitHub, GitLab, or Google. +Cody Free and Pro users can sign in to their Sourcegraph.com accounts using SSO through GitHub, GitLab, or Google. ![cody-for-intellij-login](https://storage.googleapis.com/sourcegraph-assets/Docs/Media/sign-in-cody-jb.png) @@ -54,21 +54,22 @@ Sourcegraph Enterprise users should connect Cody to their Enterprise instance by To connect the plugin with your Enterprise instance, - Click **Sign in with an Enterprise Instance** -- Enter the URL of your enterprise instance. Enter the URL of your Enterprise instance. If you are unsure, please contact your administrator. -- Select **Authorize in Browser**. You'll be directed to an authorization page on your instance in the browser. +- Enter the URL of your Enterprise instance. If you are unsure, please get in touch with your administrator +- Select **Authorize in Browser**. You'll be directed to an authorization page on your instance in the browser -Alternatively, you can access advanced authorization settings by clicking **Show Advanced**. From here, you can manually enter a token generated from your User Settings in your Sourcegraph Enterprise instance, or add optional custom request headers. +Alternatively, you can access advanced authorization settings by clicking **Show Advanced**. You can manually enter a token generated from your User Settings in your Sourcegraph Enterprise instance or add optional custom request headers. ## Verifying the installation Once connected, click the Cody icon from the sidebar again. The Cody plugin will open in a configurable side panel. -Let's create an autocomplete suggestion to verify that the Cody plugin has been successfully installed and is working as expected. +Let's create an autocomplete suggestion to verify that the Cody plugin has been installed and works as expected. 
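For reference, the kind of multi-line completion Cody typically produces for the sort-function check in this walkthrough fills in a standard bubble sort along these lines (a sketch of plausible output, not Cody's exact suggestion):

```javascript
// A plain bubble sort: repeatedly swap adjacent out-of-order elements
// until a full pass makes no more swaps.
function bubbleSort(array) {
  for (let i = 0; i < array.length - 1; i++) {
    for (let j = 0; j < array.length - 1 - i; j++) {
      if (array[j] > array[j + 1]) {
        [array[j], array[j + 1]] = [array[j + 1], array[j]];
      }
    }
  }
  return array;
}

console.log(bubbleSort([5, 1, 4, 2, 8])); // [1, 2, 4, 5, 8]
```

Running the file with `node` prints the sorted array, which confirms the completion is syntactically valid.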
-Cody provides intelligent code suggestions and context-aware autocompletions for numerous programming languages like JavaScript, Python, TypeScript, Go, etc. +Cody provides intelligent code suggestions and context-aware autocompletion for numerous programming languages, such as JavaScript, Python, TypeScript, Go, etc. - Create a new file in your JetBrains IDE, for example, `code.js` - Next, type the following algorithm function to sort an array of numbers + ```js function bubbleSort(array){ } @@ -82,15 +83,15 @@ function bubbleSort(array){ ## Chat -Cody chat in JetBrains is available in a unified interface opened right next to your code. Once connected to Sourcegraph, a new chat input field is opened with default `@-mention` [context chips](#context-retrieval). +Cody chat in JetBrains is available in a unified interface that opens right next to your code. Once connected to Sourcegraph, a new chat input field opens with the default `@-mention` [context chips](#context-retrieval). -All your previous and existing chats are stored for later use and can be accessed via the **History** icon from the top menu. You can download them to share or use later in a `.json` file or delete them altogether. +All your previous and existing chats are stored for later use and can be accessed via the **History** icon from the top menu. You can download them to share or use later in a `.json` file or delete them. ### Chat interface The chat interface is designed intuitively. Your very first chat input lives at the top of the panel, and the first message in any chat log will stay pinned to the top of the chat. After your first message, the chat input window moves to the bottom of the sidebar. -Since your first message to Cody anchors the conversation, you can return to the top chat box anytime, edit your prompt, or re-run it using a different LLM model. 
+Since your first message to Cody anchors the conversation, you can return to the top chat box anytime, edit your prompt, or rerun it using a different LLM model. ![chat-interface](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-chat-interface-102024.png) @@ -101,24 +102,24 @@ A chat history icon at the top of your chat input window allows you to navigate ### Changing LLM model for chat - You need to be a Cody Free or Pro user to have multi-model selection capability. You can view which LLMs you have access to on our [supported LLMs page](/cody/capabilities/supported-models). Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. + You need to be a Cody Free or Pro user to have multi-model selection capability. You can view which LLMs you can access on our [supported LLMs page](/cody/capabilities/supported-models). Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. For Chat: - Open chat or toggle between editor and chat - Click on the model selector (which by default indicates Claude 3.5 Sonnet) -- See the selection of models and click the model you desire. This model will now be the default model going forward on any new chats +- See the selection of models and click the model you desire. This model will now be the default model for any new chats For Edit: - On any file, select some code and right-click - Select **Cody > Edit Code** (optionally, you can do this with `Opt+K`/`Alt+K`) - Select the default model available -- See the selection of models and click the model you desire. This model will now be the default model going forward on any new edits +- See the selection of models and click the model you desire. 
This model will now be the default model for any new edits ### Selecting Context with @-mentions -Cody's chat allows you to add files as context in your messages. +Cody's chat allows you to add files as context in your messages. - Type `@-file` and then a filename to include a file as a context. @@ -128,25 +129,25 @@ When you `@-mention` files to add to Cody’s context window, the file lookup ta Moreover, when you `@-mention` files, Cody will track the number of characters in those files against the context window limit of the selected chat model. As you `@-mention` multiple files, Cody will calculate how many tokens of the context window remain. When the remaining context window size becomes too small, you'll receive **File too large** errors when attempting to `@-mention` additional files. -You can read more about context limits when selecting context [here](/cody/core-concepts/token-limits). +You can read more about context limits when selecting context [here](/cody/core-concepts/token-limits). ### Context retrieval -When you start a new Cody chat, the chat input window opens with default `@-mention` context chips for all the context it intends to use. This context is based on your current repository and current file (or a file selection if you have code highlighted). +When you start a new Cody chat, the input window opens with default `@-mention` context chips for all the context it intends to use. This context is based on your current repository and current file (or a file selection if you have code highlighted). ![jb-context-retrieval](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-context-retrieval-102024.png) -At any point in time, you can edit these context chips, add additional context chips, or remove them completely if you do not want to use these as context. Any chat without a context chip will instruct Cody to use no codebase context. However, you can always provide an alternate `@-mention` file to let Cody use it as a new source of context. 
+At any point in time, you can edit these context chips, add additional context chips, or remove them completely if you do not want to use these as context. Any chat without a context chip will instruct Cody to use no codebase context. However, you can always provide an alternate `@-mention` file to let Cody use it as a new context source. When you have both a repository and files @-mentioned, Cody will search the repository for context while prioritizing the mentioned files. ### Rerun prompts with different context -If Cody's answer isn't helpful, you can try asking again with different context: +If Cody's answer isn't helpful, you can try asking again with a different context: -- **Public knowledge only**: Cody will not use your own code files as context; it’ll only use knowledge trained into the base model. -- **Current file only**: Re-run the prompt again using just the current file as context. -- **Add context**: Provides @-mention context options to improve the response by explicitly including files, remote repositories, or even web pages (by URL). +- **Public knowledge only**: Cody will not use your code files as context; it’ll only use knowledge trained into the base model. +- **Current file only**: Rerun the prompt using just the current file as context. +- **Add context**: Provides @-mention context options to improve the response by explicitly including files, remote repositories, or web pages (URL). ![jb-rerun-context](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-rerun-context-0824.jpg) @@ -159,7 +160,7 @@ Enterprise users can leverage the full power of the Sourcegraph search engine as Read more about [Context fetching mechanisms](/cody/core-concepts/context/#context-fetching-mechanism) in detail. ## Context sources -You can @-mention files and web pages in Cody. Cody Enterprise also supports @-mentioning repositories to search for context in a broader scope. +You can @-mention files and web pages in Cody. 
Cody Enterprise also supports @-mentioning repositories to search for context in a broader scope.
 Cody Free and Pro offer single-repo context, and Cody Enterprise supports multi-repo context.
 
 ### Cody Context Filters
 
@@ -174,7 +175,7 @@ For repos mentioned in the `exclude` field, Cody's commands are disabled, and yo
 
 ## Autocomplete
 
-Cody provides multi-line autocomplete as you type. Autocomplete suggestions appear as inlay suggestions and are enabled by default in your JetBrains IDE. With this setting, there is a list of programming languages supported and enabled by default.
+Cody provides multi-line autocomplete as you type. Autocomplete suggestions appear as inlay suggestions and are enabled by default in your JetBrains IDE. This setting lists the programming languages supported and enabled by default.
 
 To manually configure the Autocomplete feature,
 
@@ -200,7 +201,7 @@ In addition, you can use the following keyboard shortcuts to interact with Cody'
 
 Cody offers quick, ready-to-use [prompts and commands](/cody/capabilities/commands) for common actions to write, describe, fix, and smell code. These allow you to run predefined actions with smart context-fetching anywhere in the editor, like:
 
 - **Edit Code**: Makes inline code edits. You also get the option to select LLM for edit suggestions
-- **Document Code**: Create inline documentation for your code
+- **Document Code**: Create inline docs for your code
 - **Generate Unit Test**: Creates inline unit tests for your code
 - **Smell Code**: Finds code smells in your file
 - **Explain Code**: Explains code in your file
@@ -211,11 +212,11 @@ Let's learn about how to use some of these commands:
 
 ### Inline code edits
 
-You can make edits to your code directly in your file without opening the chat window. The **Edit Code** command makes direct code edits, refactors, or bug fixes.
+You can edit your code directly in your file without opening the chat window. 
The **Edit Code** command makes direct code edits, refactors, or bug fixes. -You can run the inline edit command on a selected code snippet, an entire file, or to generate code on a new line. To do so, use the **Edit Code** command in the Cody sidebar or context menu, or the `Shift + Ctrl + Enter` shortcut. This opens a floating editor where you can describe the change you want to make. +You can run the inline edit command on a selected code snippet or an entire file or generate code on a new line. Use the **Edit Code** command in the Cody sidebar or context menu or the `Shift + Ctrl + Enter` shortcut. This opens a floating editor where you can describe the change you want to make. -Once you enter your prompt, Cody will perform inline edits that you can **Accept**, **Undo**, or **Show diff** for the change. You can also click **Edit & Retry** to iterate your prompt and get alternate suggestions. +Once you enter your prompt, Cody will perform inline edits that you can **Accept**, **Undo**, or **Show diff** for the change. Click **Edit & Retry** to iterate your prompt and get alternate suggestions. -## Smart Apply code suggestions - -Cody lets you dynamically insert code from chat into your files with **Smart Apply**. Whenever Cody provides a code suggestion, you can click the **Apply** button. Cody will then analyze your open code file, find where that relevant code should live, and add a diff. - -For chat messages where Cody provides multiple code suggestions, you can apply each in sequence to go from chat suggestions to written code. - ## Updating the plugin JetBrains IDEs will typically notify you when updates are available for installed plugins. Follow the prompts to update the Cody AI plugin to the latest version. diff --git a/docs/cody/quickstart.mdx b/docs/cody/quickstart.mdx index 597221dd9..46c4c0a32 100644 --- a/docs/cody/quickstart.mdx +++ b/docs/cody/quickstart.mdx @@ -67,9 +67,10 @@ Type your refactoring suggestion in the input field and press `Enter`. 
Cody will
 
 ## Use Cody to debug code
 
-You can ask Cody to debug your code and fix bugs. Cody chat and inline edits are both quite powerful in debugging. If you are running into a bug, you can ask Cody to debug and fix the code. If you are a VS Code user and have the Cody extension installed, you can use **code actions** to debug your code.
+You can ask Cody to debug your code and fix bugs. Cody chat and inline edits are both quite powerful in debugging. If there is a bug, you can ask Cody to debug and fix the code. VS Code and JetBrains IDEs offer the **Ask Cody to fix** option.
+
+When a mistake occurs, a red warning is triggered. Along with this, you get a lightbulb icon. If you click on this lightbulb icon, there is an **Ask Cody to fix** option. Click this, and Cody will try to fix the bug with a **Cody is working** notice.
 
-When there is a mistake, a red warning is triggered. Along with this, you get a lightbulb icon. If you click on this lightbulb icon, there is an **Ask Cody to fix** option. Click this, and Cody will try to fix the bug with **Cody is working** notice.
 ![code-actions](https://storage.googleapis.com/sourcegraph-assets/Docs/cody-code-actions-vscode-1124.png)
 
 That's it for this quickstart guide! Feel free to continue chatting with Cody to learn more about its [capabilities](/cody/capabilities).
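To see the code action in practice, you can paste a snippet with a deliberate mistake. The example below is hypothetical (function name and bug invented for illustration): the commented-out version misspells `total` as `totla`, which triggers the editor's warning and the lightbulb menu, and the corrected version is the kind of change Cody applies:

```javascript
// Buggy version: `totla` is an undeclared name, so the IDE flags it
// and the lightbulb offers "Ask Cody to fix".
// function cartTotal(items) {
//   let total = 0;
//   for (const item of items) totla += item.price; // ReferenceError at runtime
//   return total;
// }

// Fixed version after the code action: the typo is corrected.
function cartTotal(items) {
  let total = 0;
  for (const item of items) total += item.price;
  return total;
}

console.log(cartTotal([{ price: 2 }, { price: 3 }])); // 5
```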