diff --git a/docs/batch-changes/configuring-credentials.mdx b/docs/batch-changes/configuring-credentials.mdx
index 05c001b9e..e455ce5bc 100644
--- a/docs/batch-changes/configuring-credentials.mdx
+++ b/docs/batch-changes/configuring-credentials.mdx
@@ -201,6 +201,14 @@ GitHub apps follow the same concepts as [personal and global access tokens](#typ
 - GitHub apps can only be used with GitHub code hosts.
 - The forking mechanism (`fork:true`) is only supported if the GitHub app has access to all repositories of the installation, and if the origin repository and the fork are in the same organization that the GitHub app is installed to ([GitHub Docs](https://docs.github.com/en/rest/repos/forks?apiVersion=2022-11-28#create-a-fork)).
 
+### Migrating from PATs to GitHub Apps
+
+You can migrate your credentials from PATs to GitHub Apps by deleting the PAT credential and creating a GitHub app credential.
+
+Batch Changes will look at the available credentials and pick one that matches the targeted namespace (e.g., an organization).
+
+You can continue to use existing batch changes without modifications.
+
 ### Adding a GitHub app
 
 Adding a GitHub app is done through the Batch Changes section of your user settings:
diff --git a/docs/cody/clients/cody-with-sourcegraph.mdx b/docs/cody/clients/cody-with-sourcegraph.mdx
index 11bc54870..17b46f521 100644
--- a/docs/cody/clients/cody-with-sourcegraph.mdx
+++ b/docs/cody/clients/cody-with-sourcegraph.mdx
@@ -2,12 +2,10 @@

Learn how to use Cody in the web interface with your Sourcegraph.com instance.

- The new chat UI for Cody for web is currently in Beta and is available to users on Sourcegraph versions >=5.5.
-
 In addition to the Cody extensions for [VS Code](/cody/clients/install-vscode), [JetBrains](/cody/clients/install-jetbrains) IDEs, and [Neovim](/cody/clients/install-neovim), Cody is also available in the Sourcegraph web app. Community users can use Cody for free by logging into their accounts on Sourcegraph.com, and enterprise users can use Cody within their Sourcegraph instance.
-
+
 ## Initial setup
@@ -27,13 +25,15 @@ The chat interface for Cody on the web is similar to the one you get with the [V
 
 The chat interface with your Code Search queries is operated per chat. You cannot run multiple chats and store them in parallel. A new chat window opens whenever you click the Cody button from the query editor or the top header.
 
+ The new and improved chat UI for Cody for web is currently available to users on Sourcegraph versions >=5.5. It's recommended to update your Sourcegraph instance to the latest version to use this new chat interface.
+
 ## LLM Selection
 
-Sourcegraph.com users with Cody **Free** and **Pro** can choose from a list of supported LLM models for a chat. Claude Sonnet 3 is the default LLM model, but users can select the LLM of their choice from the drop-down menu.
+Sourcegraph.com users with Cody **Free** and **Pro** can choose from a list of supported LLM models for a chat. Claude Sonnet 3.5 is the default LLM model, but users can select the LLM of their choice from the drop-down menu.
 
 ![llm-select-web](https://storage.googleapis.com/sourcegraph-assets/Docs/llm-select-web-0724.jpg)
 
-Users on an Enterprise Sourcegraph instance do not have the option to choose an LLM model. Their site admin will configure the default LLM model for chat.
+Users on an Enterprise Sourcegraph instance do not have the option to choose an LLM model. Their site admin will configure the default LLM model for chat. However, Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model.
 
 ## Selecting Context with @-mentions
diff --git a/docs/cody/clients/feature-reference.mdx b/docs/cody/clients/feature-reference.mdx
index 4b2855ea4..ad7206622 100644
--- a/docs/cody/clients/feature-reference.mdx
+++ b/docs/cody/clients/feature-reference.mdx
@@ -21,7 +21,7 @@
 | Local context | ✅ | ✅ | ❌ |
 | OpenCtx context providers (experimental) | ✅ | ❌ | ❌ |
 | **Prompts and Commands** | | | |
-| Access to prompts and Prompt library | ✅ | ❌ | ✅ |
+| Access to prompts and Prompt library | ✅ | ✅ | ✅ |
 | Custom commands | ✅ | ❌ | ❌ |
 | Edit code | ✅ | ✅ | ❌ |
 | Generate unit test | ✅ | ✅ | ❌ |
diff --git a/docs/cody/clients/install-jetbrains.mdx b/docs/cody/clients/install-jetbrains.mdx
index 72c3452f5..5dc584737 100644
--- a/docs/cody/clients/install-jetbrains.mdx
+++ b/docs/cody/clients/install-jetbrains.mdx
@@ -66,122 +66,105 @@ Once Cody is successfully connected, you'll see that the sign-in panel has been
 
 Cody provides intelligent code suggestions and context-aware autocompletions for numerous programming languages like JavaScript, Python, TypeScript, Go, etc.
 - Create a new file in IntelliJ, for example, `code.js`
-- Next, type the following algorithm function to sort an array of numbers
-
-```js
-function bubbleSort(array)
-```
-
 - As you start typing, Cody will automatically provide suggestions and context-aware completions based on your coding patterns and the code context
 - These autocomplete suggestions appear as grayed text. To accept the suggestion, press the `Tab` key
 
-## Autocomplete
+## Chat
 
-Cody provides multi-line autocomplete as you type. Autocomplete suggestions appear as inlay suggestions and are enabled by default in your JetBrains IntelliJ editor. With this setting, there is a list of programming languages supported and enabled by default.
+Cody chat in JetBrains is available in a unified interface opened right next to your code. Once connected to Sourcegraph, a new chat input field is opened with a default `@-mention` [context chips](#context-retrieval).
 
-To manually configure the Autocomplete feature,
+All your previous and existing chats are stored for later use and can be accessed via the **History** icon from the top menu. You can download them to share or use later in a `.json` file or delete them altogether.
 
-- Go to the **Cody Settings...** from the Cody icon in the sidebar
-- Next, click the **Sourcegraph & Cody** dropdown and select **Cody**
-- The **Autocomplete** settings will appear with the list of **Enabled Languages**
+### Chat interface
 
-Autocomplete suggestions use the same color as inline parameter hints according to your configured editor theme. However, you can optionally enable the **Custom color for completions** checkbox to customize the color of your choice.
-
-In addition, you can use the following keyboard shortcuts to interact with Cody's autocomplete suggestions:
+The chat interface is designed intuitively. Your very first chat input lives at the top of the panel, and the first message in any chat log will stay pinned to the top of the chat. After your first message, the chat input window moves to the bottom of the sidebar.
 
-- `Tab` to accept a suggestion
-- `Alt + [` (Windows) or `Opt + [` (macOS) to cycle suggestions
-- `Alt + \` (Windows) or `Opt + \` (macOS) to manually trigger autocomplete if no suggestions have been returned
+Since your first message to Cody anchors the conversation, you can return to the top chat box anytime, edit your prompt, or re-run it using a different LLM model.
-
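
For context on the autocomplete walkthrough removed in the install-jetbrains.mdx hunk above (typing `function bubbleSort(array)` in `code.js` and accepting the grayed-out suggestion with `Tab`), a completed function could look like the minimal sketch below. This is only an illustration of the kind of multi-line completion Cody can produce; the exact suggestion depends on your surrounding code and is not guaranteed, and the sample input array is arbitrary.

```js
// Illustrative sketch: one possible completion after typing
// `function bubbleSort(array)` and accepting the suggestion with Tab.
function bubbleSort(array) {
  const result = [...array]; // copy so the caller's array is not mutated
  for (let i = 0; i < result.length - 1; i++) {
    for (let j = 0; j < result.length - 1 - i; j++) {
      if (result[j] > result[j + 1]) {
        // swap adjacent elements that are out of order
        [result[j], result[j + 1]] = [result[j + 1], result[j]];
      }
    }
  }
  return result;
}

console.log(bubbleSort([5, 3, 8, 1, 2])); // [1, 2, 3, 5, 8]
```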