Docs working branch for the Sourcegraph v6.1 release (#971)
Docs for v6.1
- Autocomplete for Visual Studio
- Updated UI for @-mention prompts
- Deprecating old LLM modes
- Updating outdated images
---------
Co-authored-by: Kalan <[email protected]>
README.md (+1 -1)
# Sourcegraph Docs
- <!-- Working branch for JAN 2025 Release -->
+ <!-- Working branch for FEB 2025 Release -->
Welcome to the Sourcegraph documentation! We're excited to have you contribute to our docs. We've recently rearchitected our docs tech stack, which is powered by Next.js and TailwindCSS and deployed on Vercel. This guide will walk you through the process of contributing to our documentation using the new tech stack.
- You can still switch to your Sourcegraph account whenever you want to use Claude, OpenAI, Gemini, Mixtral, etc.
+ You can still switch to your Sourcegraph account whenever you want to use Claude, OpenAI, Gemini, etc.
## LLM selection
The enhanced chat experience includes everything in the Free plan, plus the following:
- ## Intent detection
-
- Intent detection automatically analyzes user queries and determines whether to provide an AI chat or code search response. This functionality helps simplify developer workflows by providing the most appropriate type of response without requiring explicit mode switching.
-
- ### How it works
-
- When a user submits a query in the chat panel, the intent detection component:
-
- - Analyzes the query content and structure
- - Determines the most appropriate response type (search or chat)
- - Returns results in the optimal format
- - Provides the ability to toggle between response types manually
The smart search integration enhances Sourcegraph's chat experience by providing lightweight code search capabilities directly within the chat interface. This feature simplifies developer workflows by offering quick access to code search without leaving the chat environment.
The following is a general walkthrough of the chat experience:
1. The user enters a query in the chat interface
- 2. The system analyzes the query through intent detection
- 3. If it's a search query:
+ 2. By default, the user gets a chat response for the query
+ 3. To get integrated search results, toggle to **Run as search** from the drop-down selector, or use `Cmd+Opt+Enter` (macOS)
+ 4. For search:
   - Displays ranked results with code snippets
   - Shows personalized repository ordering
   - Provides checkboxes to select context for follow-ups
- 4. If it's a chat query:
+ 5. For chat:
   - Delivers AI-powered responses
   - Can incorporate previous search results as context
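The default-chat, opt-in-search flow described above can be sketched as a small routing function. This is a hypothetical illustration of the behavior only; the type and function names are invented and are not Cody's actual implementation.

```typescript
// Hypothetical sketch of the chat/search mode selection described above.
// Names and types are illustrative, not Cody's real code.

type ResponseMode = "chat" | "search";

interface QuerySubmission {
  text: string;
  // true when the user picks "Run as search" from the drop-down
  // (or presses Cmd+Opt+Enter on macOS)
  runAsSearch?: boolean;
}

// Chat is the default; search is an explicit per-query opt-in.
function selectResponseMode(query: QuerySubmission): ResponseMode {
  return query.runAsSearch ? "search" : "chat";
}

console.log(selectResponseMode({ text: "How does auth work here?" })); // → "chat"
console.log(selectResponseMode({ text: "repo:myrepo auth", runAsSearch: true })); // → "search"
```

The design point the walkthrough makes is that no automatic intent detection runs anymore: the response type is driven entirely by the user's explicit toggle.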
- <Callout type="note">To use Claude 3 (Opus and Sonnet) models with Cody Enterprise, make sure you've upgraded your Sourcegraph instance to the latest version.</Callout>
+ <Callout type="note">To use Claude 3 Sonnet models with Cody Enterprise, make sure you've upgraded your Sourcegraph instance to the latest version.</Callout>
## Autocomplete
docs/cody/clients/install-eclipse.mdx (+1 -1)
## LLM selection
- Cody offers a variety of large language models (LLMs) to power your chat experience. Cody Free users can access the latest base models from Anthropic, OpenAI, Google, and Mixtral. At the same time, Cody Pro and Enterprise users can access more extended models.
+ Cody offers a variety of large language models (LLMs) to power your chat experience. Cody Free users can access the latest base models from Anthropic, OpenAI, and Google. At the same time, Cody Pro and Enterprise users can access more extended models.
Local models are also available through Ollama to Cody Free and Cody Pro users. To use a model in Cody chat, simply download it and run it in Ollama.
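Once a model is running in Ollama, it is reachable on the local REST endpoint that Ollama serves by default. As a purely illustrative sketch (this is not Cody's integration code, and the model name is an example you would first fetch with `ollama pull`):

```typescript
// Minimal sketch of calling a locally running Ollama server over its REST API
// (default address http://localhost:11434). Illustrative only — not Cody's code.

interface OllamaGenerateRequest {
  model: string;
  prompt: string;
  stream: boolean; // false = return one complete JSON response
}

function buildGenerateRequest(model: string, prompt: string): OllamaGenerateRequest {
  return { model, prompt, stream: false };
}

async function generate(req: OllamaGenerateRequest): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  const data = (await res.json()) as { response: string };
  return data.response;
}

// Example usage (requires Ollama running locally with the model pulled):
// generate(buildGenerateRequest("llama3", "Explain this function")).then(console.log);
```

Because everything runs on localhost, no code or prompts leave the machine, which is the main appeal of the Ollama path for Free and Pro users.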
docs/cody/clients/install-visual-studio.mdx (+11 -1)
## LLM selection
- Cody offers a variety of large language models (LLMs) to power your chat experience. Cody Free users can access the latest base models from Anthropic, OpenAI, Google, and Mixtral. At the same time, Cody Pro and Enterprise users can access more extended models.
+ Cody offers a variety of large language models (LLMs) to power your chat experience. Cody Free users can access the latest base models from Anthropic, OpenAI, and Google. At the same time, Cody Pro and Enterprise users can access more extended models.
Local models are also available through Ollama to Cody Free and Cody Pro users. To use a model in Cody chat, download it and run it in Ollama.
Cody for Visual Studio supports single-line and multi-line autocompletions. The autocomplete feature is available in extension `v0.2.0` and above and is enabled by default. Advanced features like [auto-edit](/cody/capabilities/auto-edit) are not yet supported. You can disable autocomplete from the Cody settings section.
- Enterprise users get Claude 3 (Opus and Sonnet) as the default LLM models without extra cost. Moreover, Enterprise users can use Claude 3.5 models through Cody Gateway, Anthropic BYOK, Amazon Bedrock (limited availability), and GCP Vertex.
+ Enterprise users get Claude 3.5 Sonnet as the default LLM model without extra cost. Moreover, Enterprise users can use Claude 3.5 models through Cody Gateway, Anthropic BYOK, Amazon Bedrock (limited availability), and GCP Vertex.
<Callout type="info">For enterprise users on Amazon Bedrock: 3.5 Sonnet is unavailable in `us-west-2` but available in `us-east-1`. Check the current model availability on AWS and your customer's instance location before switching. Provisioned throughput via AWS is not supported for 3.5 Sonnet.</Callout>
- You also get additional capabilities like BYOLLM (Bring Your Own LLM), supporting Single-Tenant and Self-Hosted setups for flexible coding environments. Your site administrator determines the LLM, which cannot be changed within the editor. However, Cody Enterprise users using Cody Gateway can [configure custom models](/cody/core-concepts/cody-gateway#configuring-custom-models) from Anthropic (like Claude 2.0 and Claude Instant), OpenAI (GPT 3.5 and GPT 4), and Google Gemini 1.5 (Flash and Pro).
+ You also get additional capabilities like BYOLLM (Bring Your Own LLM), supporting Single-Tenant and Self-Hosted setups for flexible coding environments. Your site administrator determines the LLM, which cannot be changed within the editor. However, Cody Enterprise users using Cody Gateway can [configure custom models](/cody/core-concepts/cody-gateway#configuring-custom-models) from Anthropic, OpenAI, and Google Gemini.
<Callout type="note">Read more about all the supported LLM models [here](/cody/capabilities/supported-models).</Callout>
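For site administrators, swapping the default model as described above is a site configuration change rather than an editor setting. The fragment below is purely illustrative — the key names and model identifier are assumptions for the sketch; consult the supported-models page linked above and your Sourcegraph admin documentation for the exact schema your version uses.

```json
{
  "completions": {
    "provider": "anthropic",
    "chatModel": "claude-3-5-sonnet-latest",
    "accessToken": "<your-provider-access-token>"
  }
}
```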