Commit f648588

Add Cody docs for VS Code v1.32 release (#586)
This PR adds docs for the v1.32 Cody VS Code extension release. It adds the following:

- Cody's autocomplete capabilities, with mention of DeepSeek as the default model
- Smart Apply code additions
1 parent 1c5800b commit f648588

File tree

3 files changed: +38 −21 lines changed


docs/cody/capabilities/autocomplete.mdx

Lines changed: 19 additions & 9 deletions
@@ -1,14 +1,24 @@
 # Autocomplete
 
-<p className="subtitle">Learn how Cody helps you get contextually-aware autocompletions for your codebase.</p>
+<p className="subtitle">Learn how Cody helps you get contextually-aware autocompletions for your codebase.</p>
 
 Cody provides intelligent **autocomplete** suggestions as you type using context from your code, such as your open files and file history. Cody autocompletes single lines or whole functions in any programming language, configuration file, or documentation. It’s powered by the latest instant LLM models for accuracy and performance.
 
 Autocomplete supports any programming language because it uses LLMs trained on broad data. It works exceptionally well for Python, Go, JavaScript, and TypeScript.
 
-<video width="1920" height="1080" loop playsInline controls style={{ width: '100%', height: 'auto' }}>
-<source src="https://storage.googleapis.com/sourcegraph-assets/Docs/Media/cody-in-action.mp4" type="video/mp4" />
-</video>
+<video width="1920" height="1080" loop playsInline controls style={{ width: '100%', height: 'auto' }}>
+<source src="https://storage.googleapis.com/sourcegraph-assets/Docs/Media/cody-in-action.mp4" type="video/mp4" />
+</video>
+
+## Cody's autocomplete capabilities
+
+Cody's autocompletion model is designed to improve speed, accuracy, and the overall user experience. Both Cody Free and Pro users can expect the following from Cody's autocomplete:
+
+- **Increased speed and reduced latency**: P75 latency is reduced by 350 ms, making autocomplete noticeably faster
+- **Improved accuracy for multi-line completions**: Completions across multiple lines are more relevant and better aligned with the surrounding code context
+- **Higher completion acceptance rates**: The average completion acceptance rate (CAR) is improved by more than 4%, providing a more intuitive user interaction
+
+On the technical side, Cody's autocomplete is optimized for both server-side and client-side performance, ensuring seamless integration into your coding workflow. The **default** autocomplete model for Cody Free and Pro users is **[DeepSeek V2](https://huggingface.co/deepseek-ai/DeepSeek-V2)**, which significantly boosts both the responsiveness and accuracy of autocomplete. Cody Enterprise users get **StarCoder** as the default autocomplete model.
 
 ## Prerequisites
 

@@ -32,17 +42,17 @@ By default, a fully configured Sourcegraph instance picks a default LLM to gener
 - Here, edit the `completionModel` option inside the `completions`
 - Click the **Save** button to save the changes
 
-<Callout type="note">Cody autocomplete works only with Anthropic's Claude Instant model. Support for other models will be coming later.</Callout>
+<Callout type="note">Cody autocomplete works only with Anthropic's Claude Instant model. Support for other models will be coming later.</Callout>
 
-<Callout type="info">Self-hosted customers must update to version 5.0.4 or later to use autocomplete.</Callout>
+<Callout type="info">Self-hosted customers must update to version 5.0.4 or later to use autocomplete.</Callout>
 
 Before configuring the autocomplete feature, it's recommended to read the [Enabling Cody on Sourcegraph Enterprise](/cody/clients/enable-cody-enterprise) guide.
 
 Cody Autocomplete goes beyond basic suggestions. It understands your code context, offering tailored recommendations based on your current project, language, and coding patterns. Let's view a quick demo using the VS Code extension.
 
-<video width="1920" height="1080" loop playsInline controls style={{ width: '100%', height: 'auto' }}>
-<source src="https://storage.googleapis.com/sourcegraph-assets/Docs/Media/contexual-autocpmplete.mp4" type="video/mp4" />
-</video>
+<video width="1920" height="1080" loop playsInline controls style={{ width: '100%', height: 'auto' }}>
+<source src="https://storage.googleapis.com/sourcegraph-assets/Docs/Media/contexual-autocpmplete.mp4" type="video/mp4" />
+</video>
 
 Here, Cody provides suggestions based on your current project, language, and coding patterns. Initially, the `code.js` file is empty. Start writing a function for `bubbleSort`. As you type, Cody suggests the function name and the function parameters.

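For readers following the configuration steps above, the `completions` block in the Sourcegraph site configuration might look roughly like the following. This is a hedged sketch: the exact provider name, model identifiers, and available keys depend on your provider and Sourcegraph version, so treat every value here as a placeholder (site configuration is JSONC, so comments are allowed):

```jsonc
{
  "completions": {
    // Placeholder provider; use the provider your instance is set up with
    "provider": "anthropic",
    // The chat model is configured separately from autocomplete
    "chatModel": "claude-2",
    // The option the step above tells you to edit; the valid values
    // depend on the chosen provider and your Sourcegraph version
    "completionModel": "claude-instant-1",
    // Never commit real tokens; this is a placeholder
    "accessToken": "<YOUR_PROVIDER_ACCESS_TOKEN>"
  }
}
```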
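The demo described above starts from an empty `code.js` and writes a `bubbleSort` function. A completion in that spirit might look like the following, a plain JavaScript sketch for illustration rather than Cody's literal output:

```javascript
// Classic bubble sort: repeatedly swap adjacent out-of-order
// elements until a full pass makes no swaps.
function bubbleSort(arr) {
  const a = [...arr]; // copy so the input array is not mutated
  for (let i = 0; i < a.length - 1; i++) {
    for (let j = 0; j < a.length - 1 - i; j++) {
      if (a[j] > a[j + 1]) {
        [a[j], a[j + 1]] = [a[j + 1], a[j]]; // swap neighbors
      }
    }
  }
  return a;
}

console.log(bubbleSort([5, 1, 4, 2, 8])); // → [1, 2, 4, 5, 8]
```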
docs/cody/capabilities/supported-models.mdx

Lines changed: 12 additions & 11 deletions
@@ -19,8 +19,8 @@ Cody supports a variety of cutting-edge large language models for use in Chat an
 | Mistral | [mixtral 8x7b](https://mistral.ai/technology/#models:~:text=of%20use%20cases.-,Mixtral%208x7B,-Currently%20the%20best) | ✅ | ✅ | - | | | | |
 | Mistral | [mixtral 8x22b](https://mistral.ai/technology/#models:~:text=of%20use%20cases.-,Mixtral%208x7B,-Currently%20the%20best) | ✅ | ✅ | - | | | | |
 | Ollama | [variety](https://ollama.com/) | experimental | experimental | - | | | | |
-| Google Gemini | [1.5 Pro](https://deepmind.google/technologies/gemini/pro/) | ✅ | ✅ | ✅ (Beta) | | | | |
-| Google Gemini | [1.5 Flash](https://deepmind.google/technologies/gemini/flash/) | ✅ | ✅ | ✅ (Beta) | | | | |
+| Google Gemini | [1.5 Pro](https://deepmind.google/technologies/gemini/pro/) | ✅ | ✅ | ✅ (Beta) | | | | |
+| Google Gemini | [1.5 Flash](https://deepmind.google/technologies/gemini/flash/) | ✅ | ✅ | ✅ (Beta) | | | | |
 | | | | | | | | | |
 
 <Callout type="note">To use Claude 3 (Opus and Sonnet) models with Cody Enterprise, make sure you've upgraded your Sourcegraph instance to the latest version.</Callout>
@@ -29,14 +29,15 @@ Cody supports a variety of cutting-edge large language models for use in Chat an
 
 Cody uses a set of models for autocomplete which are suited to the low-latency use case.
 
-| **Provider** | **Model** | **Free** | **Pro** | **Enterprise** |
-| :----------- | :---------------------------------------------------------------------------------------- | :------------- | :------------- | :------------- |
-| Fireworks.ai | [StarCoder](https://arxiv.org/abs/2305.06161) | ✅ | ✅ | ✅ |
-| Anthropic | [claude Instant](https://docs.anthropic.com/claude/docs/models-overview#model-comparison) | - | - | ✅ |
-| Google Gemini (Beta) | [1.5 Flash](https://deepmind.google/technologies/gemini/flash/) | - | - | ✅ |
-| Ollama (Experimental) | [variety](https://ollama.com/) | ✅ | ✅ | - |
-| | | | | |
+| **Provider** | **Model** | **Free** | **Pro** | **Enterprise** |
+| :-------------------- | :---------------------------------------------------------------------------------------- | :------- | :------ | :------------- |
+| Fireworks.ai | [DeepSeek-V2](https://huggingface.co/deepseek-ai/DeepSeek-V2) | ✅ | ✅ | - |
+| Fireworks.ai | [StarCoder](https://arxiv.org/abs/2305.06161) | - | - | ✅ |
+| Anthropic | [Claude Instant](https://docs.anthropic.com/claude/docs/models-overview#model-comparison) | - | - | ✅ |
+| Google Gemini (Beta) | [1.5 Flash](https://deepmind.google/technologies/gemini/flash/) | - | - | ✅ |
+| Ollama (Experimental) | [variety](https://ollama.com/) | ✅ | ✅ | - |
+| | | | | |
 
-<Callout type="note">[See here for Ollama setup instructions](https://sourcegraph.com/docs/cody/clients/install-vscode#supported-local-ollama-models-with-cody)</Callout>
+<Callout type="note">The default autocomplete model for Cody Free and Pro users is DeepSeek-V2. Enterprise users get StarCoder as the default model.</Callout>
 
-For information on context token limits, see our [documentation here](/cody/core-concepts/token-limits).
+Read here for [Ollama setup instructions](https://sourcegraph.com/docs/cody/clients/install-vscode#supported-local-ollama-models-with-cody). For information on context token limits, see our [documentation here](/cody/core-concepts/token-limits).

docs/cody/clients/install-vscode.mdx

Lines changed: 7 additions & 1 deletion
@@ -153,7 +153,6 @@ For Edit:
 - Select the default model available (this is Claude 3 Opus)
 - See the selection of models and click the model you desire. This model will now be the default model going forward on any new edits
 
-
 ### Selecting Context with @-mentions
 
 Cody's chat allows you to add files and symbols as context in your messages.
@@ -272,7 +271,14 @@ For customization and advanced use cases, you can create **Custom Commands** tai
 
 <Callout type="info">Learn more about Custom Commands [here](/cody/capabilities/commands#custom-commands)</Callout>
 
+## Smart Apply code suggestions
+
+Cody lets you dynamically insert code from chat into your files with **Smart Apply**. Every time Cody provides you with a code suggestion, you can click the **Apply** button. Cody will then analyze your open code file, find where that relevant code should live, and add a diff.
+
+For chat messages where Cody provides multiple code suggestions, you can apply each in sequence to go from chat suggestions to written code.
+
 ## Keyboard shortcuts
+
 Cody provides a set of powerful keyboard shortcuts to streamline your workflow and boost productivity. These shortcuts allow you to quickly access Cody's features without leaving your keyboard.
 
 * `Opt+L` (macOS) or `Alt+L` (Windows/Linux): Toggles between the chat view and the last active text editor. If a chat view doesn't exist, it opens a new one. When used with an active selection in a text editor, it adds the selected code to the chat for context.
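If `Opt+L`/`Alt+L` conflicts with another extension, VS Code lets you rebind it in `keybindings.json`. The command ID below is a hypothetical placeholder, not confirmed from the source; look up the real ID in the Keyboard Shortcuts editor by searching for "Cody":

```jsonc
// keybindings.json (VS Code user keybindings)
[
  {
    // Example rebinding; "cody.chat.toggle" is a hypothetical
    // command ID used purely for illustration
    "key": "ctrl+alt+c",
    "command": "cody.chat.toggle"
  }
]
```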
