To ensure low latency, Cody uses a more targeted Qwen 2.5 Coder model for Smart Apply. This model improves the responsiveness of the Smart Apply feature in both VS Code and JetBrains while preserving edit quality. Users on Cody Free, Pro, Enterprise Starter, and Enterprise plans get this default Qwen 2.5 Coder model for Smart Apply suggestions.
Enterprise users not using Cody Gateway get a Claude Sonnet-based model for Smart Apply.
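Purely as an illustration, the routing described above can be restated as a small sketch. The plan names, the `usesCodyGateway` flag, and the model labels below are stand-ins for readability, not part of any Cody API.

```ts
// Illustrative sketch only: restates the Smart Apply model routing described
// above. The Plan type, the usesCodyGateway flag, and the model labels are
// hypothetical stand-ins, not a Cody API.
type Plan = 'free' | 'pro' | 'enterprise-starter' | 'enterprise';

function smartApplyModel(plan: Plan, usesCodyGateway: boolean): string {
  // Enterprise instances that do not route through Cody Gateway get a
  // Claude Sonnet-based model.
  if (plan === 'enterprise' && !usesCodyGateway) {
    return 'claude-sonnet';
  }
  // All other plans get the low-latency Qwen 2.5 Coder default.
  return 'qwen-2.5-coder';
}

console.log(smartApplyModel('pro', true));         // "qwen-2.5-coder"
console.log(smartApplyModel('enterprise', false)); // "claude-sonnet"
```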
## Chat history
Cody keeps a history of your chat sessions. You can view it by clicking the **History** button in the chat panel. You can **Export** it to a JSON file for later use or click the **Delete all** button to clear the chat history.
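Because the export is plain JSON, it is easy to inspect or post-process outside the editor. The sketch below is one way to list exported sessions in Node.js; the file name and the `chats`, `title`, and `messages` fields are assumptions about the export shape rather than a documented schema, so check an actual export for the real structure.

```ts
// Minimal sketch: list the sessions in an exported Cody chat history file.
// The file name and the field names (chats, title, messages) are assumed for
// illustration; inspect your own export to confirm the actual structure.
import { readFileSync } from 'node:fs';

interface ExportedChat {
  title?: string;
  messages?: unknown[];
}

const raw = readFileSync('cody-chat-history.json', 'utf8');
const history = JSON.parse(raw) as { chats?: ExportedChat[] };

for (const chat of history.chats ?? []) {
  console.log(`${chat.title ?? '(untitled)'}: ${chat.messages?.length ?? 0} messages`);
}
```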
docs/cody/capabilities/supported-models.mdx
Cody supports a variety of cutting-edge large language models for use in chat and autocomplete.
<Callouttype="note">Newer versions of Sourcegraph Enterprise, starting from v5.6, it will be even easier to add support for new models and providers, see [Model Configuration](/cody/enterprise/model-configuration) for more information.</Callout>
<Callouttype="note">To use Claude 3 Sonnet models with Cody Enterprise, make sure you've upgraded your Sourcegraph instance to the latest version. </Callout>
Cody uses a set of autocomplete models that are suited to low-latency use cases.
<Callouttype="note">The default autocomplete model for Cody Free, Pro and Enterprise users is DeepSeek-Coder-V2.</Callout>
<Callouttype="note">The DeepSeek model used by Sourcegraph is hosted by Fireworks.ai, and is hosted as a single-tenant service in a US-based data center. For more information see our [Cody FAQ](https://sourcegraph.com/docs/cody/faq#is-any-of-my-data-sent-to-deepseek).</Callout>