diff --git a/docs/cody/clients/install-jetbrains.mdx b/docs/cody/clients/install-jetbrains.mdx
index 79a8e73df..08c901d3f 100644
--- a/docs/cody/clients/install-jetbrains.mdx
+++ b/docs/cody/clients/install-jetbrains.mdx
@@ -1,8 +1,8 @@
 # Installing Cody for JetBrains
-

Learn how to use Cody and its features with the JetBrains IntelliJ editor.

+

Learn how to use Cody and its features with JetBrains editors.

-The Cody extension by Sourcegraph enhances your coding experience in your IDE by providing intelligent code suggestions, context-aware completions, and advanced code analysis. This guide will walk you through the steps to install and set up Cody within your JetBrains environment.
+The Cody plugin by Sourcegraph enhances your coding experience in your IDE by providing intelligent code suggestions, context-aware completions, and advanced code analysis. This guide will walk you through the steps to install and set up Cody within your JetBrains environment.
@@ -25,185 +25,182 @@ The Cody extension by Sourcegraph enhances your coding experience in your IDE by
 - [RubyMine](https://www.jetbrains.com/ruby/)
 - [WebStorm](https://www.jetbrains.com/webstorm/)
-## Install the JetBrains IntelliJ Cody extension
+## Install the JetBrains Cody plugin
 Follow these steps to install the Cody plugin:
-- Open JetBrains IntelliJ editor on your local machine
-- Open **Settings** (Mac: `⌘+,` Windows: `Ctrl+Alt+S`) and select **Plugins**
-- Type and search **Cody: AI Coding Assistant with Autocomplete & Chat** extension and click **Install**
+- Open a supported JetBrains editor on your local machine
+- Open **Settings** (macOS: `⌘+,` Windows: `Ctrl+Alt+S`) and select **Plugins**
+- Search for **Cody: AI Coding Assistant with Autocomplete & Chat** in the marketplace and click **Install**
 ![cody-for-intellij](https://storage.googleapis.com/sourcegraph-assets/Docs/cody-for-intellij-062024.png)
-Alternatively, you can [Download and install the extension from the Jetbrains marketplace](https://plugins.jetbrains.com/plugin/9682-sourcegraph).
+Alternatively, you can [download and install the plugin from the JetBrains marketplace](https://plugins.jetbrains.com/plugin/9682-sourcegraph) directly.
-## Connect the extension to Sourcegraph
+## Connect the plugin to Sourcegraph
-After a successful installation, Cody's icon appears in the sidebar.
Clicking it prompts you to start with codehosts like GitHub, GitLab, and your Google login. This allows Cody to access your Sourcegraph.com account. +After a successful installation, the Cody icon appears in the Tool Windows Bar. -Alternatively, you can also click the **Sign in with an Enterprise Instance** to connect to your enterprise instance. +### Cody Free or Cody Pro Users + +Cody Free and Pro users can sign in to their Sourcegraph.com accounts using SSO through GitHub, GitLab, or Google. ![cody-for-intellij-login](https://storage.googleapis.com/sourcegraph-assets/Docs/Media/sign-in-cody-jb.png) -### For Sourcegraph Enterprise users +### Sourcegraph Enterprise Cody Users -To connect the extension with your Enterprise instance, +Sourcegraph Enterprise users should connect Cody to their Enterprise instance by clicking **Sign in with an Enterprise Instance**. -- Click **Sign in with an Enterprise Instance** -- Enter the server for your enterprise instance (e.g. `https://.sourcegraph.com`) -- Select **Generate new token**. You'll be directed to the **Access tokens** page on your instance in the browser -- Generate a new token, copy it, and paste it into the **Token** field in your editor -- Click **Sign in** +To connect the plugin with your Enterprise instance, -### For Sourcegraph.com users +- Click **Sign in with an Enterprise Instance** +- Enter the URL of your Enterprise instance. If you are unsure, please get in touch with your administrator +- Select **Authorize in Browser**. You'll be directed to an authorization page on your instance in the browser -For Cody Free and Cody Pro users, you can Log in through SSO to authenticate the IntelliJ extension with your Sourcegraph.com account. +Alternatively, you can access advanced authorization settings by clicking **Show Advanced**. You can manually enter a token generated from your User Settings in your Sourcegraph Enterprise instance or add optional custom request headers. 
## Verifying the installation
-Once Cody is successfully connected, you'll see that the sign-in panel has been replaced by a welcome message from Cody. Let's create an autocomplete suggestion to verify that the Cody extension has been successfully installed and is working as expected.
+Once connected, click the Cody icon from the sidebar again. The Cody plugin will open in a configurable side panel.
+
+Let's create an autocomplete suggestion to verify that the Cody plugin has been installed and works as expected.
+
+Cody provides intelligent code suggestions and context-aware autocompletion for numerous programming languages, such as JavaScript, Python, TypeScript, Go, etc.
-Cody provides intelligent code suggestions and context-aware autocompletions for numerous programming languages like JavaScript, Python, TypeScript, Go, etc.
+- Create a new file in your JetBrains IDE, for example, `code.js`
+- Next, type the following algorithm function to sort an array of numbers
-- Create a new file in IntelliJ, for example, `code.js`
+```js
+function bubbleSort(array) {
+}
+```
- As you start typing, Cody will automatically provide suggestions and context-aware completions based on your coding patterns and the code context
- These autocomplete suggestions appear as grayed text. To accept the suggestion, press the `Tab` key
+
-
 ## Chat
-Cody chat in JetBrains is available in a unified interface opened right next to your code. Once connected to Sourcegraph, a new chat input field is opened with a default `@-mention` [context chips](#context-retrieval).
+Cody chat in JetBrains is available in a unified interface that opens right next to your code. Once connected to Sourcegraph, a new chat input field opens with the default `@-mention` [context chips](#context-retrieval).
-All your previous and existing chats are stored for later use and can be accessed via the **History** icon from the top menu.
You can download them to share or use later in a `.json` file or delete them altogether. +All your previous and existing chats are stored for later use and can be accessed via the **History** icon from the top menu. You can download them to share or use later in a `.json` file or delete them. ### Chat interface The chat interface is designed intuitively. Your very first chat input lives at the top of the panel, and the first message in any chat log will stay pinned to the top of the chat. After your first message, the chat input window moves to the bottom of the sidebar. -Since your first message to Cody anchors the conversation, you can return to the top chat box anytime, edit your prompt, or re-run it using a different LLM model. +Since your first message to Cody anchors the conversation, you can return to the top chat box anytime, edit your prompt, or rerun it using a different LLM model. ![chat-interface](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-chat-interface-102024.png) - Users should be on the Cody for JetBrains extension v2023.2 or more to get this new and improved chat UI. + Users must be on JetBrains v2023.2 and Cody plugin v7.0.0 or above to get the new and improved chat UI. + +### Chat History +A chat history icon at the top of your chat input window allows you to navigate between chats (and search chats) without opening the Cody sidebar. ### Changing LLM model for chat - You need to be a Cody Free or Pro user to have multi-model selection capability. Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. + You need to be a Cody Free or Pro user to have multi-model selection capability. You can view which LLMs you can access on our [supported LLMs page](/cody/capabilities/supported-models). Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. 
For Chat: - Open chat or toggle between editor and chat - Click on the model selector (which by default indicates Claude 3.5 Sonnet) -- See the selection of models and click the model you desire. This model will now be the default model going forward on any new chats +- See the selection of models and click the model you desire. This model will now be the default model for any new chats For Edit: -- On any file, select some code and a right-click -- Select **Cody > Edit Code** -- Select the default model available (this is Claude Sonnet 3.5) -- See the selection of models and click the model you desire. This model will now be the default model going forward on any new edits - -## Supported LLM models - -Users on Cody **Free** and **Pro** can choose from a list of supported LLM models for Chat and Commands. Both Cody Free and Pro users can choose from a list of supported LLM models for Chat and Commands. - -![llm-selection-cody](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-llm-select-0824.jpg) - -Enterprise users get Claude 3 (Opus and Sonnet) as the default LLM models without extra cost. Moreover, Enterprise users can use Claude 3.5 models through Cody Gateway, Anthropic BYOK, AWS Bedrock (limited availability), and GCP Vertex. +- On any file, select some code and right-click +- Select **Cody > Edit Code** (optionally, you can do this with `Opt+K`/`Alt+K`) +- Select the default model available +- See the selection of models and click the model you desire. This model will now be the default model for any new edits -For enterprise users on AWS Bedrock: 3.5 Sonnet is unavailable in `us-west-2` but available in `us-east-1`. Check the current model availability on AWS and your customer's instance location before switching. Provisioned throughput via AWS is not supported for 3.5 Sonnet. - -You also get additional capabilities like BYOLLM (Bring Your Own LLM), supporting Single-Tenant and Self Hosted setups for flexible coding environments. 
Your site administrator determines the LLM, and cannot be changed within the editor. However, Cody Enterprise users when using Cody Gateway have the ability to [configure custom models](/cody/core-concepts/cody-gateway#configuring-custom-models) Anthropic (like Claude 3 Opus and Claude Haiku), OpenAI (GPT 3.5 Turbo and GPT 4 Turbo) and Google Gemini 1.5 models (Flash and Pro).
-
-Read and learn more about the [supported LLMs](/cody/capabilities/supported-models) and [token limits](/cody/core-concepts/token-limits) on Cody Free, Pro and Enterprise.
+### Selecting Context with @-mentions
-## Ollama model support
+Cody's chat allows you to add files as context in your messages.
-Ollama support for JetBrains is in the Experimental stage and is available on Cody Free and Pro.
+- Type `@-file` and then a filename to include a file as context.
-You can use Ollama models locally for Cody’s chat and commands. This lets you chat without sending messages over the internet to an LLM provider so that you can use Cody offline. To use Ollama locally, you’ll need to install Ollama and download a chat model such as CodeGemma or Llama3. [Read here for detailed instructions](https://sourcegraph.com/github.com/sourcegraph/jetbrains/-/blob/README.md#use-ollama-models-for-chat--commands).
+The `@-file` mention also supports line numbers to query the context of large files. You can add ranges of large files to your context by @-mentioning a large file and appending a line range to the filename, for example, `@filepath/filename:1-10`.
-## Smart Apply code suggestions
+When you `@-mention` files to add to Cody’s context window, the file lookup takes `files.exclude`, `search.exclude`, and `.gitignore` files into account. This also makes file search faster, with results returned in up to 100ms.
-Cody lets you dynamically insert code from chat into your files with **Smart Apply**. Every time Cody provides you with a code suggestion, you can click the **Apply** button.
Cody will then analyze your open code file, find where that relevant code should live, and add a diff. +Moreover, when you `@-mention` files, Cody will track the number of characters in those files against the context window limit of the selected chat model. As you `@-mention` multiple files, Cody will calculate how many tokens of the context window remain. When the remaining context window size becomes too small, you'll receive **File too large** errors when attempting to `@-mention` additional files. -For chat messages where Cody provides multiple code suggestions, you can apply each in sequence to go from chat suggestions to written code. +You can read more about context limits when selecting context [here](/cody/core-concepts/token-limits). ### Context retrieval -When you start a new Cody chat, the chat input window opens with a default `@-mention` context chips for all the context it intends to use. This context is based on your current repository and current file (or a file selection if you have code highlighted). +When you start a new Cody chat, the input window opens with default `@-mention` context chips for all the context it intends to use. This context is based on your current repository and current file (or a file selection if you have code highlighted). ![jb-context-retrieval](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-context-retrieval-102024.png) -At any point in time, you can edit these context chips or remove them completely if you do not want to use these as context. Any chat without a context chip will instruct Cody to use no codebase context. However, you can always provide an alternate `@-mention` file to let Cody use it as a new source of context. +At any point in time, you can edit these context chips, add additional context chips, or remove them completely if you do not want to use these as context. Any chat without a context chip will instruct Cody to use no codebase context. 
However, you can always provide an alternate `@-mention` file to let Cody use it as a new context source. When you have both a repository and files @-mentioned, Cody will search the repository for context while prioritizing the mentioned files. -### Selecting Context with @-mentions - -Cody's chat allows you to add files as context in your messages. Type `@-file` and then a filename to include a file as a context. - -The `@-file` also supports line numbers to query the context of large files. You can add ranges of large files to your context by @-mentioning a large file and appending a number range to the filename, for example, `@filepath/filename:1-10`. - -When you `@-mention` files to add to Cody’s context window, the file lookup takes `files.exclude`, `search.exclude`, and `.gitgnore` files into account. This makes the file search faster as a result up to 100ms. - -Moreover, when you `@-mention` files, Cody will track the number of characters in those files against the context window limit of the selected chat model. As you `@-mention` multiple files, Cody will calculate how many tokens of the context window remain. When the remaining context window size becomes too small, you get **File too large** errors for further more `@-mention` files. - -Cody defaults to showing @-mention context chips for all the context it intends to use. When you open a new chat, Cody will show context chips for your current repository and current file (or file selection if you have code highlighted). - ### Rerun prompts with different context -If Cody's answer isn't helpful, you can try asking again with different context: +If Cody's answer isn't helpful, you can try asking again with a different context: -- **Public knowledge only**: Cody will not use your own code files as context; it’ll only use knowledge trained into the base model. -- **Current file only**: Re-run the prompt again using just the current file as context. 
-- **Add context**: Provides @-mention context options to improve the response by explicitly including files, remote repositories, or even web pages (by URL). +- **Public knowledge only**: Cody will not use your code files as context; it’ll only use knowledge trained into the base model. +- **Current file only**: Rerun the prompt using just the current file as context. +- **Add context**: Provides @-mention context options to improve the response by explicitly including files, remote repositories, or web pages (URL). ![jb-rerun-context](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-rerun-context-0824.jpg) ## Context fetching mechanism -JetBrains users on the Free or Pro plan can leverage [local search](/cody/core-concepts/context#context-selection) as the primary context source for Cody chat. +JetBrains users on the Free or Pro plan use [local context](/cody/core-concepts/context#context-selection). -Enterprise users can leverage the full power of the Sourcegraph search engine as the primary context provider to Cody. +Enterprise users can leverage the full power of the Sourcegraph search engine as Cody's primary context provider. - Read more about [Context fetching mechanism](/cody/core-concepts/context/#context-fetching-mechanism) in detail. + Read more about [Context fetching mechanisms](/cody/core-concepts/context/#context-fetching-mechanism) in detail. -## Context scope +## Context sources +You can @-mention files and web pages in Cody. Cody Enterprise also supports @-mentioning repositories to search for context in a broader scope. +Cody Free and Pro offer single-repo context, and Cody Enterprise supports multi-repo context. -JetBrains users on the Free or Pro plan get single-repo support in chat and can use one repo for context fetching. Enterprise users get multi-repo support in chat and can explicitly specify **up to 10 additional repos** they would like Cody to use for context. 
+### Cody Context Filters -## Context Selection +Context Filters is available for all Cody Enterprise users running Cody JetBrains plugin version `>=6.0.0`. -Cody automatically understands the context of your codebase for all Cody Free, Pro, and Enterprise users based on the project opened in your workspace. Enterprise users can add up to **9 additional repos (10 total, including the default project)** to use as context. The multi-repo context for Enterprise is powered by Sourcegraph code search and allows Cody to use the selected codebase to answer your questions. +Admins on the Sourcegraph Enterprise instance can use Cody Context Filters to determine which repositories Cody can use as the context in its requests to third-party LLMs. Inside your site configuration, you can define a set of `include` and `exclude` rules that will be used to filter the list of repositories Cody can access. -Moreover, Cody's chat allows you to add files as context in your messages. Type `@-file` in the Cody chat window and then a filename to include a file as context. +For repos mentioned in the `exclude` field, Cody's commands are disabled, and you cannot use them for context fetching. If you try running any of these, you'll be prompted with an error message. However, Cody chat will still work, and you can use it to ask questions. -### Chat with multi-repo context +[Read more about Cody Context Filters here →](/cody/capabilities/ignore-context) -For Cody Free and Cody Pro users, if you delete all the context from the chat input, Cody will not use any local context. When this happens, Cody doesn't search your local project for context and sends your prompt to the selected LLM. +## Autocomplete -Cody Enterprise users can add remote repositories from your Sourcegraph instance. You can type the name of your repositories into this interface and select up to **9 additional repos (10 total, including the default project)**. 
Cody will then search against those repositories and retrieve relevant files to answer your chat prompts.
-### Cody Context Filters
+Cody provides multi-line autocomplete as you type. Autocomplete suggestions appear as inlay suggestions and are enabled by default in your JetBrains IDE. The plugin settings list the programming languages for which autocomplete is supported and enabled by default.
-Context Filters is available for all Cody Enterprise users running Cody JetBrains extension version `>=6.0.0`.
+To manually configure the Autocomplete feature,
-Admins on the Sourcegraph Enterprise instance can use the Cody Context Filters to determine which repositories Cody can use as the context in its requests to third-party LLMs. Inside your site configuration, you can define a set of `include` and `exclude` rules that will be used to filter the list of repositories Cody can access.
+- Go to the **Cody Settings...** from the Cody icon in the sidebar
+- Next, click the **Sourcegraph & Cody** dropdown and select **Cody**
+- The **Autocomplete** settings will appear with the list of **Enabled Languages**
-For repos mentioned in the `exclude` field, Cody's commands are disabled, and you cannot use them for context fetching. If you try running any of these, you'll be prompted with an error message. However, Cody chat will still work, and you can use it to ask questions.
+Autocomplete suggestions use the same color as inline parameter hints according to your configured editor theme. However, you can enable the **Custom color for completions** checkbox to pick a custom color of your choice.
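To see autocomplete in action, type a function header and pause. Below is a hypothetical example of what a multi-line suggestion might look like; the body shown is illustrative only, and the exact completion Cody proposes depends on your code and context:

```js
// After typing the header `function isPrime(n) {`, Cody may suggest a body
// resembling the following as grayed-out inlay text; press Tab to accept it.
function isPrime(n) {
  if (n < 2) return false;
  for (let i = 2; i * i <= n; i++) {
    if (n % i === 0) return false;
  }
  return true;
}

console.log(isPrime(29)); // quick sanity check after accepting the suggestion
```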
In addition, you can use the following keyboard shortcuts to interact with Cody's autocomplete suggestions:
-[Read more about the Cody Context Filters here →](/cody/capabilities/ignore-context)
+- `Tab` to accept a suggestion
+- `Alt + [` (Windows) or `Opt + [` (macOS) to cycle suggestions
+- `Alt + \` (Windows) or `Opt + \` (macOS) to manually trigger autocomplete if no suggestions have been returned
+
+
 ## Prompts and Commands
-Cody with JetBrains offers quick, ready-to-use [prompts and commands](/cody/capabilities/commands) for common actions to write, describe, fix, and smell code. These allow you to run predefined actions with smart context-fetching anywhere in the editor. These allow you to run predefined actions with smart context-fetching anywhere in the editor, like:
+Cody offers quick, ready-to-use [prompts and commands](/cody/capabilities/commands) for common actions to write, describe, fix, and smell code. These allow you to run predefined actions with smart context-fetching anywhere in the editor, like:
 - **Edit Code**: Makes inline code edits. You also get the option to select LLM for edit suggestions
-- **Document Code**: Create inline documentation for your code
+- **Document Code**: Creates inline docs for your code
 - **Generate Unit Test**: Creates inline unit tests for your code
 - **Smell Code**: Finds code smells in your file
 - **Explain Code**: Explains code in your file
@@ -214,11 +211,11 @@ Let's learn about how to use some of these commands:
 ### Inline code edits
-You can make edits to your code directly in your file without opening the chat window. The **Edit Code** command makes direct code edits, refactors, or bug fixes.
+You can edit your code directly in your file without opening the chat window. The **Edit Code** command makes direct code edits, refactors, or bug fixes.
-You can run the inline edit command on a selected code snippet, an entire file, or to generate code on a new line.
To do so, use the **Edit Code** command in the Cody sidebar or context menu, or the `Shift + Ctrl + Enter` shortcut. This opens a floating editor where you can describe the change you want to make.
+You can run the inline edit command on a selected code snippet, on an entire file, or to generate code on a new line. Use the **Edit Code** command in the Cody sidebar or context menu, or press the `Shift + Ctrl + Enter` shortcut. This opens a floating editor where you can describe the change you want to make.
-Once you enter your prompt, Cody will perform inline edits that you can **Accept**, **Undo**, or **Show diff** for the change. You can also click **Edit & Retry** to iterate your prompt and get alternate suggestions.
+Once you enter your prompt, Cody will perform inline edits that you can **Accept**, **Undo**, or **Show diff** for the change. Click **Edit & Retry** to iterate your prompt and get alternate suggestions.
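As an illustration, selecting the first snippet below and running **Edit Code** with a prompt like "guard against division by zero" might produce something similar to the second function. This is a hypothetical sketch; Cody's actual edit will vary, and the function is renamed here only so the before and after versions can coexist:

```js
// Before: the selected code
function divide(a, b) {
  return a / b;
}

// After an inline edit with the prompt "guard against division by zero"
// (illustrative result only; Cody's actual edit may differ):
function divideSafe(a, b) {
  if (b === 0) throw new RangeError("Division by zero");
  return a / b;
}
```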