This PR adds docs for the v1.32 Cody VS Code extension release. It adds the following:

- Cody's Autocomplete capabilities, with mentions of DeepSeek as the default model
- Smart Apply code additions
docs/cody/capabilities/autocomplete.mdx (+19 −9)
# Autocomplete

<p className="subtitle">Learn how Cody helps you get contextually-aware autocompletions for your codebase.</p>
Cody provides intelligent **autocomplete** suggestions as you type using context from your code, such as your open files and file history. Cody autocompletes single lines or whole functions in any programming language, configuration file, or documentation. It’s powered by the latest instant LLM models for accuracy and performance.
Autocomplete supports any programming language because it uses LLMs trained on broad data. It works exceptionally well for Python, Go, JavaScript, and TypeScript.
Cody's autocompletion model has been designed to enhance speed, accuracy, and the overall user experience. Both Cody Free and Pro users can expect the following with Cody's autocomplete:
- **Increased speed and reduced latency**: The P75 latency is reduced by 350 ms, making autocomplete faster
- **Improved accuracy for multi-line completions**: Completions across multiple lines are more relevant and accurately aligned with the surrounding code context
- **Higher completion acceptance rates**: The average completion acceptance rate (CAR) is improved by more than 4%, providing a more intuitive user interaction
On the technical side, Cody's autocomplete is optimized for both server-side and client-side performance, ensuring seamless integration into your coding workflow. The **default** autocomplete model for Cody Free and Pro users is **[DeepSeek V2](https://huggingface.co/deepseek-ai/DeepSeek-V2)**, which significantly helps boost both the responsiveness and accuracy of autocomplete. Cody Enterprise users get **StarCoder** as the default autocomplete model.
## Prerequisites
By default, a fully configured Sourcegraph instance picks a default LLM to generate code autocomplete. To use a different model:

- Here, edit the `completionModel` option inside the `completions` object (a minimal configuration sketch follows this list)
- Click the **Save** button to save the changes
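
For orientation, a minimal sketch of what the `completions` block in the site configuration might look like is shown below. The provider, model names, and access token are illustrative placeholders, not prescribed values; the exact settings depend on your Sourcegraph version and LLM provider. Sourcegraph site configuration accepts JSONC, so comments are allowed.

```jsonc
{
  // Only the completions-related part of the site configuration is shown.
  // All values below are placeholders — adjust them for your deployment.
  "completions": {
    "provider": "anthropic",                // LLM provider used by Cody
    "chatModel": "claude-2",                // model used for chat
    "completionModel": "claude-instant-1",  // model used for autocomplete
    "accessToken": "<PROVIDER_ACCESS_TOKEN>"
  }
}
```

Per the note below, autocomplete in this setup currently works only with Anthropic's Claude Instant model, which is why `completionModel` points at it in this sketch.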
<Callouttype="note">Cody autocomplete works only with Anthropic's Claude Instant model. Support for other models will be coming later.</Callout>
45
+
<Callouttype="note">Cody autocomplete works only with Anthropic's Claude Instant model. Support for other models will be coming later.</Callout>
36
46
37
-
<Callouttype="info">Self-hosted customers must update to version 5.0.4 or more to use autocomplete.</Callout>
47
+
<Callouttype="info">Self-hosted customers must update to version 5.0.4 or more to use autocomplete.</Callout>
38
48
39
49
Before configuring the autocomplete feature, it's recommended to read the [Enabling Cody on Sourcegraph Enterprise](/cody/clients/enable-cody-enterprise) guide.
Cody Autocomplete goes beyond basic suggestions. It understands your code context, offering tailored recommendations based on your current project, language, and coding patterns. Let's view a quick demo using the VS Code extension.

In this demo, the `code.js` file is initially empty. Start writing a function for `bubbleSort`; as you type, Cody suggests the function name and the function parameters, as sketched below.
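
For illustration only, a multi-line suggestion of the kind Cody might offer for the empty `code.js` could look like the following; the actual completion varies with your project context and the model in use.

```js
// code.js — illustrative autocomplete result, not a guaranteed suggestion.
// bubbleSort sorts an array in ascending order by repeatedly
// swapping adjacent elements that are out of order.
function bubbleSort(arr) {
  for (let i = 0; i < arr.length - 1; i++) {
    for (let j = 0; j < arr.length - i - 1; j++) {
      if (arr[j] > arr[j + 1]) {
        [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
      }
    }
  }
  return arr;
}

console.log(bubbleSort([5, 1, 4, 2])); // [1, 2, 4, 5]
```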
<Callouttype="note">To use Claude 3 (Opus and Sonnets) models with Cody Enterprise, make sure you've upgraded your Sourcegraph instance to the latest version.</Callout>
Cody supports a variety of cutting-edge large language models for use in Chat and Commands.
Cody uses a set of models for autocomplete which are suited for the low latency use case.
<Callouttype="note">[See here for Ollama setup instructions](https://sourcegraph.com/docs/cody/clients/install-vscode#supported-local-ollama-models-with-cody)</Callout>
41
+
<Callouttype="note">The default autocomplete model for Cody Free and Pro user is DeepSeek-V2. Enterprise users get StarCoder as the default model.</Callout>
41
42
42
-
For information on context token limits, see our [documentation here](/cody/core-concepts/token-limits).
43
+
Read here for [Ollama setup instructions](https://sourcegraph.com/docs/cody/clients/install-vscode#supported-local-ollama-models-with-cody). For information on context token limits, see our [documentation here](/cody/core-concepts/token-limits).
docs/cody/clients/install-vscode.mdx (+7 −1)

For Edit:
- Select the default model available (this is Claude 3 Opus)
- See the selection of models and click the model you desire. This model will now be the default going forward on any new edits
### Selecting Context with @-mentions
Cody's chat allows you to add files and symbols as context in your messages.
For customization and advanced use cases, you can create **Custom Commands** tailored to your specific needs.
<Callouttype="info">Learn more about Custom Commands [here](/cody/capabilities/commands#custom-commands)</Callout>
## Smart Apply code suggestions
Cody lets you dynamically insert code from chat into your files with **Smart Apply**. Every time Cody provides you with a code suggestion, you can click the **Apply** button. Cody will then analyze your open code file, find where that relevant code should live, and add a diff.
For chat messages where Cody provides multiple code suggestions, you can apply each in sequence to go from chat suggestions to written code.
## Keyboard shortcuts
Cody provides a set of powerful keyboard shortcuts to streamline your workflow and boost productivity. These shortcuts allow you to quickly access Cody's features without leaving your keyboard.
* `Opt+L` (macOS) or `Alt+L` (Windows/Linux): Toggles between the chat view and the last active text editor. If a chat view doesn't exist, it opens a new one. When used with an active selection in a text editor, it adds the selected code to the chat for context.