8 changes: 8 additions & 0 deletions docs/batch-changes/configuring-credentials.mdx
@@ -201,6 +201,14 @@ GitHub apps follow the same concepts as [personal and global access tokens](#typ
- GitHub apps can only be used with GitHub code hosts.
- The forking mechanism (`fork:true`) is only supported if the GitHub app has access to all repositories of the installation, and if the origin repository and the fork are in the same organization that the GitHub app is installed to ([GitHub Docs](https://docs.github.com/en/rest/repos/forks?apiVersion=2022-11-28#create-a-fork)).

### Migrating from PATs to GitHub Apps

You can migrate your credentials from PATs to GitHub Apps by deleting the PAT credential and creating a GitHub app credential.

Batch Changes looks at the available credentials and picks the one that matches the targeted namespace (for example, an organization).

You can continue to use existing batch changes without modifications.
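
If you prefer to script the PAT removal, Sourcegraph's GraphQL API exposes a mutation for deleting Batch Changes credentials. The sketch below is a minimal, non-authoritative example; the credential ID is a placeholder, and the replacement GitHub app credential is then added through the UI as described in the next section.

```graphql
# Hedged sketch: remove an existing PAT-based Batch Changes credential by ID.
# The ID is a placeholder; look up the credential ID for your user or site first.
mutation RemovePATCredential {
  deleteBatchChangesCredential(batchChangesCredential: "QmF0Y2hDaGFuZ2VzQ3JlZGVudGlhbDox") {
    alwaysNil
  }
}
```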

### Adding a GitHub app

Adding a GitHub app is done through the Batch Changes section of your user settings:
10 changes: 5 additions & 5 deletions docs/cody/clients/cody-with-sourcegraph.mdx
@@ -2,12 +2,10 @@

<p className="subtitle">Learn how to use Cody in the web interface with your Sourcegraph.com instance.</p>

<Callout type="info"> The new chat UI for Cody for web is currently in Beta and is available to users on Sourcegraph versions >=5.5.</Callout>

In addition to the Cody extensions for [VS Code](/cody/clients/install-vscode), [JetBrains](/cody/clients/install-jetbrains) IDEs, and [Neovim](/cody/clients/install-neovim), Cody is also available in the Sourcegraph web app. Community users can use Cody for free by logging into their accounts on Sourcegraph.com, and enterprise users can use Cody within their Sourcegraph instance.

<LinkCards>
<LinkCard href="https://sourcegraph.com/cody/chat" imgSrc="https://sourcegraph.com/.assets/img/sourcegraph-mark.svg" imgAlt="Cody for Web" title="Cody for Web (Beta)" description="Use Cody in the Sourcegraph Web App." />
<LinkCard href="https://sourcegraph.com/cody/chat" imgSrc="https://sourcegraph.com/.assets/img/sourcegraph-mark.svg" imgAlt="Cody for Web" title="Cody for Web" description="Use Cody in the Sourcegraph Web App." />
</LinkCards>

## Initial setup
@@ -27,13 +25,15 @@ The chat interface for Cody on the web is similar to the one you get with the [V

The chat interface alongside your Code Search queries operates on a per-chat basis: you cannot run and store multiple chats in parallel. A new chat window opens whenever you click the Cody button from the query editor or the top header.

<Callout type="info"> The new and improved chat UI for Cody for web is currently available to users on Sourcegraph versions >=5.5. It's recommended to update your Sourcegraph instance to the latest version to use this new chat interface. </Callout>

## LLM Selection

Sourcegraph.com users with Cody **Free** and **Pro** can choose from a list of supported LLM models for a chat. Claude Sonnet 3 is the default LLM model, but users can select the LLM of their choice from the drop-down menu.
Sourcegraph.com users with Cody **Free** and **Pro** can choose from a list of supported LLM models for a chat. Claude Sonnet 3.5 is the default LLM model, but users can select the LLM of their choice from the drop-down menu.

![llm-select-web](https://storage.googleapis.com/sourcegraph-assets/Docs/llm-select-web-0724.jpg)

Users on an Enterprise Sourcegraph instance do not have the option to choose an LLM model. Their site admin will configure the default LLM model for chat.
Users on an Enterprise Sourcegraph instance do not have the option to choose an LLM model. Their site admin will configure the default LLM model for chat. However, Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model.
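
For reference, below is a minimal sketch of what such a site configuration might look like. The exact keys and model references are documented on the [model configuration](/cody/clients/model-configuration) page; the model identifiers shown here are placeholders and depend on the providers available to your instance.

```json
{
  "cody.enabled": true,
  "modelConfiguration": {
    // Placeholder sketch: use Sourcegraph-provided models as the baseline.
    "sourcegraph": {},
    // Default models surfaced to users; Enterprise users with model
    // configuration enabled can still pick other configured chat models
    // from the dropdown.
    "defaultModels": {
      "chat": "anthropic::2024-10-22::claude-3-5-sonnet-latest",
      "fastChat": "anthropic::2023-06-01::claude-3-haiku",
      "codeCompletion": "fireworks::v1::deepseek-coder-v2-lite-base"
    }
  }
}
```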

## Selecting Context with @-mentions

2 changes: 1 addition & 1 deletion docs/cody/clients/feature-reference.mdx
@@ -21,7 +21,7 @@
| Local context | ✅ | ✅ | ❌ |
| OpenCtx context providers (experimental) | ✅ | ❌ | ❌ |
| **Prompts and Commands** | | | |
| Access to prompts and Prompt library | ✅ | | ✅ |
| Custom commands | ✅ | ❌ | ❌ |
| Edit code | ✅ | ✅ | ❌ |
| Generate unit test | ✅ | ✅ | ❌ |