diff --git a/docs/cody/capabilities/agentic-context-fetching.mdx b/docs/cody/capabilities/agentic-context-fetching.mdx index 73fac727a..cb7d703f0 100644 --- a/docs/cody/capabilities/agentic-context-fetching.mdx +++ b/docs/cody/capabilities/agentic-context-fetching.mdx @@ -54,7 +54,7 @@ Agentic context fetching can be helpful to assist you with a wide range of tasks ## Enable agentic context fetching -Agentic context fetching is enabled by default for all Cody users. It uses LLM reflection and basic tool use steps to gather and refine context before sending it in the final model query. The review step in agentic context fetching experience defaults to Gemini 2.5 Flash and falls back to Claude Haiku or GPT 4.1 mini if Flash is unavailable. +Agentic context fetching is enabled by default. It uses LLM reflection and basic tool-use steps to gather and refine context before sending it in the final model query. The review step in the agentic context fetching experience defaults to Gemini 2.5 Flash and falls back to Claude Haiku or GPT 4.1 mini if Flash is unavailable. You can disable agentic context in your extension settings using `cody.agenticContext`. diff --git a/docs/cody/capabilities/auto-edit.mdx b/docs/cody/capabilities/auto-edit.mdx index 0f58fba34..bff568ff5 100644 --- a/docs/cody/capabilities/auto-edit.mdx +++ b/docs/cody/capabilities/auto-edit.mdx @@ -2,7 +2,7 @@
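A minimal `settings.json` sketch of the `cody.agenticContext` opt-out mentioned above (the boolean form shown here is an assumption — check the setting's description in your extension version):

```jsonc
{
  // Assumed boolean toggle; set to false to turn off agentic context fetching.
  "cody.agenticContext": false
}
```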

Auto-edit suggests code changes by analyzing cursor movements and typing. After you've made at least one character edit in your codebase, it begins proposing contextual modifications based on your cursor position and recent changes.

-Auto-edit is currently supported with Sourcegraph v6.0+ for Pro, Enterprise Starter, and Enterprise accounts on Cody Gateway. Auto-edit requires Fireworks to be enabled as a provider. Enterprise customers without Fireworks enabled can disable the feature flag. +Auto-edit is currently supported with Sourcegraph v6.0+ for Enterprise accounts on Cody Gateway. Auto-edit requires Fireworks to be enabled as a provider. Enterprise customers without Fireworks enabled can disable the feature flag. ## Capabilities of auto-edit @@ -46,11 +46,11 @@ Auto-edit is supported by Cody VS Code, JetBrains, and Visual Studio plugins. -Auto-edit is available for Pro, Enterprise Starter, and Enterprise users on Cody Gateway. Auto-edit requires Fireworks to be enabled as a provider. Enterprise customers without Fireworks enabled can disable the feature flag. +Auto-edit is available for Enterprise users on Cody Gateway. Auto-edit requires Fireworks to be enabled as a provider. Enterprise customers without Fireworks enabled can disable the feature flag. ## Enabling auto-edit in VS Code -Auto-edit is enabled by default for Cody Pro Enterprise Starter and Enterprise users. You can opt out and switch back to autocomplete by selecting it from the suggestion mode in the Cody VS Code extension settings. +Auto-edit is enabled by default for Cody Enterprise users. You can opt out and switch back to autocomplete by selecting it from the suggestion mode in the Cody VS Code extension settings. Site admins can opt their organization out of the auto-edit feature by disabling it from their config settings. @@ -88,11 +88,11 @@ The following example demonstrates how to add Fireworks as an allowed LLM provid -JetBrains IDEs support auto-edit for versions 7.84.0+. It's available for Pro, Enterprise Starter, and Enterprise users on Cody Gateway. Auto-edit requires Fireworks to be enabled as a provider. Enterprise customers without Fireworks enabled can disable the feature flag. 
+JetBrains IDEs support auto-edit for versions 7.84.0+. It's available for Enterprise users on Cody Gateway. Auto-edit requires Fireworks to be enabled as a provider. Enterprise customers without Fireworks enabled can disable the feature flag. ## Enabling auto-edit in JetBrains -Auto-edit is enabled by default for Cody Pro Enterprise Starter and Enterprise users. You can opt out and switch back to autocomplete by selecting it from the suggestion mode in the Cody JetBrains extension settings. +Auto-edit is enabled by default for Cody Enterprise users. You can opt out and switch back to autocomplete by selecting it from the suggestion mode in the Cody JetBrains extension settings. Site admins can opt their organization out of the auto-edit feature by disabling it from their config settings. @@ -136,11 +136,11 @@ The following example demonstrates how to add Fireworks as an allowed LLM provid -Visual Studio supports auto-edit for versions 17.6 and above. It's available for Pro, Enterprise Starter, and Enterprise users on Cody Gateway. Auto-edit requires Fireworks to be enabled as a provider. Enterprise customers without Fireworks enabled can disable the feature flag. +Visual Studio supports auto-edit for versions 17.6 and above. It's available for Enterprise users on Cody Gateway. Auto-edit requires Fireworks to be enabled as a provider. Enterprise customers without Fireworks enabled can disable the feature flag. ## Enabling auto-edit in Visual Studio -Auto-edit is enabled by default for Cody Pro Enterprise Starter and Enterprise users. Two settings must be enabled by default in the Visual Studio Cody extension settings to make the auto-edit feature work. +Auto-edit is enabled by default for Cody Enterprise users. Two settings must be enabled in the Visual Studio Cody extension settings to make the auto-edit feature work. 1. Automatically trigger completions 2.
Enable Cody Auto-edit diff --git a/docs/cody/capabilities/autocomplete.mdx b/docs/cody/capabilities/autocomplete.mdx index 22d16e674..2f2e48e17 100644 --- a/docs/cody/capabilities/autocomplete.mdx +++ b/docs/cody/capabilities/autocomplete.mdx @@ -4,7 +4,7 @@ Cody predicts what you're trying to write before you even type it. It offers single-line and multi-line suggestions based on the provided code context, ensuring accurate autocomplete suggestions. Cody autocomplete supports a [wide range of programming languages](/cody/faq#what-programming-languages-does-cody-support) because it uses LLMs trained on broad data. -Code autocompletions are optimized for both server-side and client-side performance, ensuring seamless integration into your coding workflow. The **default** autocomplete model for Cody Free, Pro, and Enterprise users is **[DeepSeek V2](https://huggingface.co/deepseek-ai/DeepSeek-V2)**, which significantly helps boost both the responsiveness and accuracy of autocomplete. +Code autocompletions are optimized for both server-side and client-side performance, ensuring seamless integration into your coding workflow. The **default** autocomplete model for Cody Enterprise users is **[DeepSeek V2](https://huggingface.co/deepseek-ai/DeepSeek-V2)**, which significantly helps boost both the responsiveness and accuracy of autocomplete. ## Cody's autocomplete capabilities @@ -18,10 +18,10 @@ The autocompletion model is designed to enhance speed, accuracy, and the overall First, you'll need the following setup: -- A Free or Pro account via Sourcegraph.com or a Sourcegraph Enterprise instance +- A Sourcegraph Enterprise account with Cody enabled - A supported editor extension (VS Code, JetBrains, Visual Studio) -The autocomplete feature is enabled by default on all IDE extensions, i.e., VS Code and JetBrains. Generally, there's a checkbox in the extension settings that confirms whether the autocomplete feature is enabled or not. 
In addition, some autocomplete settings are optionally and explicitly supported by some IDEs. For example, JetBrains IDEs have settings that allow you to customize colors and styles of the autocomplete suggestions. +The autocomplete feature is available on all IDE extensions, i.e., VS Code, JetBrains, and Visual Studio. Generally, there's a checkbox in the extension settings that indicates whether the autocomplete feature is enabled. In addition, some autocomplete settings are optionally and explicitly supported by some IDEs. For example, JetBrains IDEs have settings that allow you to customize colors and styles of the autocomplete suggestions. When you start typing, Cody will automatically provide suggestions and context-aware completions based on your coding patterns and the code context. These autocomplete suggestions appear as grayed text. Press `Enter` or `Tab` to accept the suggestion. @@ -37,8 +37,8 @@ By default, a fully configured Sourcegraph instance picks a default LLM to gener - Go to the **Site admin** of your Sourcegraph instance - Navigate to **Configuration > Site configuration** -- Here, edit the `completionModel` option inside the `completions` -- Click the **Save** button to save the changes +- Here, edit the `modelConfiguration` [section](/cody/enterprise/model-configuration) to include the autocomplete model you want to use +- Click **Save** to save the changes Cody supports and uses a set of models for autocomplete. Learn more about these [here](/cody/capabilities/supported-models#autocomplete). It's also recommended to read the [Enabling Cody on Sourcegraph Enterprise](/cody/clients/enable-cody-enterprise) docs. diff --git a/docs/cody/capabilities/chat.mdx b/docs/cody/capabilities/chat.mdx index 8f418d58e..c07d6ce65 100644 --- a/docs/cody/capabilities/chat.mdx +++ b/docs/cody/capabilities/chat.mdx @@ -1,6 +1,6 @@ # Chat -
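As an illustration of the site-configuration step above, a hedged sketch of a `modelConfiguration` block (the key names and the model reference below are assumptions — follow the Model Configuration docs for the exact schema and the model references your instance actually exposes):

```jsonc
{
  "modelConfiguration": {
    "defaultModels": {
      // Illustrative model reference only; pick an autocomplete model
      // listed by your Sourcegraph instance.
      "codeCompletion": "fireworks::v1::deepseek-coder-v2-lite-base"
    }
  }
}
```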

Chat with the AI assistant in your code editor or via the Sourcegraph web app to get intelligent suggestions, code autocompletions, and contextually aware answers.

+

Chat with Cody in your code editor or via the Sourcegraph web app to get intelligent suggestions, code autocompletions, and contextually aware answers.

You can **chat** with Cody to ask questions about your code, generate code, and edit code. By default, Cody has the context of your open file and entire repository, and you can use `@` to add context for specific files, symbols, remote repositories, or other non-code artifacts. @@ -10,8 +10,8 @@ You can do it from the **chat** panel of the supported editor extensions ([VS Co To use Cody's chat, you'll need the following: -- [Sourcegraph Enterprise Starter](https://sourcegraph.com/pricing) or [Enterprise account](https://sourcegraph.com/pricing) -- A supported editor extension [VS Code](https://marketplace.visualstudio.com/items?itemName=sourcegraph.cody-ai), [JetBrains](https://plugins.jetbrains.com/plugin/9682-cody-ai-coding-assistant-with-autocomplete--chat) installed or use via Web app +- A Sourcegraph Enterprise account with Cody enabled +- A supported editor extension installed or use via the Sourcegraph Web app ## How does chat work? @@ -41,29 +41,12 @@ When you have both a repository and files @-mentioned, Cody will search the repo You can add new custom context by adding `@-mention` context chips to the chat. At any point, you can use `@-mention` a repository, file, line range, or symbol, to ask questions about your codebase. Cody will use this new context to generate contextually relevant code. -## OpenCtx context providers - -OpenCtx context providers are in the Experimental stage for all Cody VS Code users. Enterprise users can also use this but with limited support. If you have feedback or questions, please visit our [support forum](https://community.sourcegraph.com/c/openctx/10). - -[OpenCtx](https://openctx.org/) is an open standard for bringing contextual info about code into your dev tools. 
Cody Free and Pro users can use OpenCtx providers to fetch and use context from the following sources: - -- [Webpages](https://openctx.org/docs/providers/web) (via URL) -- [Jira tickets](https://openctx.org/docs/providers/jira) -- [Linear issues](https://openctx.org/docs/providers/linear-issues) -- [Notion pages](https://openctx.org/docs/providers/notion) -- [Google Docs](https://openctx.org/docs/providers/google-docs) -- [Sourcegraph code search](https://openctx.org/docs/providers/sourcegraph-search) - -You can use `@-mention` web URLs to pull live information like docs. You can connect Cody to OpenCtx to `@-mention` non-code artifacts like Google Docs, Notion pages, Jira tickets, and Linear issues. - ## LLM selection -Cody allows you to select the LLM you want to use for your chat, which is optimized for speed versus accuracy. Cody Free and Pro users can select multiple models. Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. +Cody allows you to select the LLM you want to use for your chat, which is optimized for speed versus accuracy. Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. You can read about these supported LLM models [here](/cody/capabilities/supported-models#chat-and-commands). -![LLM-models-for-cody-free](https://storage.googleapis.com/sourcegraph-assets/Docs/llm-dropdown-options-free-pro-2025.png) - ## Smart Apply and Execute code suggestions Cody lets you dynamically insert code from chat into your files with **Smart Apply**. Whenever Cody provides a code suggestion, you can click the **Apply** button. Cody will then analyze your open code file, find where that relevant code should live, and add a diff. For chat messages where Cody provides multiple code suggestions, you can apply each in sequence to go from chat suggestions to written code. 
@@ -74,9 +57,9 @@ Smart Apply also supports executing commands in the terminal. When you as ### Model used for Smart Apply -To ensure low latency, Cody uses a more targeted Qwen 2.5 Coder model for Smart Apply. This model improves the responsiveness of the Smart Apply feature in both VS Code and JetBrains while preserving edit quality. Users on Cody Free, Pro, Enterprise Starter, and Enterprise plans get this default Qwen 2.5 Coder model for Smart Apply suggestions. +To ensure low latency, Cody uses a more targeted Qwen 2.5 Coder model for Smart Apply. This model improves the responsiveness of the Smart Apply feature in both VS Code and JetBrains while preserving edit quality. -Enterprise users not using Cody Gateway get a Claude Sonnet-based model for Smart Apply. +Enterprise users on Cody Gateway get this default Qwen 2.5 Coder model for Smart Apply suggestions. Enterprise users not using Cody Gateway get a Claude Sonnet-based model for Smart Apply. ## Chat history diff --git a/docs/cody/capabilities/debug-code.mdx b/docs/cody/capabilities/debug-code.mdx index f637003ec..b74113148 100644 --- a/docs/cody/capabilities/debug-code.mdx +++ b/docs/cody/capabilities/debug-code.mdx @@ -2,7 +2,7 @@

Learn how Cody helps you identify errors in your code and provides code fixes.

-Cody is optimized to identify and fix errors in your code. Its debugging capability and autocomplete suggestions can significantly accelerate your debugging process, increasing developer productivity. All Cody IDE extensions (VS Code, JetBrains) support code debugging and fixes capabilities. +Cody is optimized to identify and fix errors in your code. Its debugging capability and autocomplete suggestions can significantly accelerate your debugging process, increasing developer productivity. Cody IDE extensions (VS Code, JetBrains) support code debugging and fix capabilities. ## Use chat for code fixes @@ -33,8 +33,6 @@ You can detect code smells by the **find-code-smells** prompt from the Prompts d ## Code Actions -Code Actions are available only in Cody VS Code extension. - When you make a mistake while writing code, Cody's **Code Actions** come into play and a red warning triggers. Along with this, you get a lightbulb icon. If you click on this lightbulb icon, there is an **Ask Cody to fix** option. - Click the lightbulb icon in the project file diff --git a/docs/cody/capabilities/ignore-context.mdx b/docs/cody/capabilities/ignore-context.mdx index b17d5965f..9416a9784 100644 --- a/docs/cody/capabilities/ignore-context.mdx +++ b/docs/cody/capabilities/ignore-context.mdx @@ -1,8 +1,6 @@ # Manage Cody Context -

You can control and manage what context from your codebase is used by Cody. You can do this by using Cody Context Filters.

- -Cody Context Filters is available only for Enterprise users on all supported [clients](/cody/clients). +

You can control and manage what context from your codebase Cody uses with Cody Context Filters, which are supported on all Cody [clients](/cody/clients).
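For illustration, a hedged sketch of a `cody.contextFilters` site-configuration block (the key name and `repoNamePattern` fields follow the Context Filters docs, but verify the exact schema for your Sourcegraph version):

```jsonc
{
  "cody.contextFilters": {
    // Only repositories matching an include pattern are used as context...
    "include": [{ "repoNamePattern": "^github\\.com/sourcegraph/.*" }],
    // ...unless they also match an exclude pattern.
    "exclude": [{ "repoNamePattern": ".*-sensitive$" }]
  }
}
```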

## Context Filters diff --git a/docs/cody/capabilities/index.mdx b/docs/cody/capabilities/index.mdx index 6216177f4..20f83a17c 100644 --- a/docs/cody/capabilities/index.mdx +++ b/docs/cody/capabilities/index.mdx @@ -8,11 +8,11 @@ - + - + diff --git a/docs/cody/capabilities/openctx.mdx b/docs/cody/capabilities/openctx.mdx index a113546da..ba0e8c13f 100644 --- a/docs/cody/capabilities/openctx.mdx +++ b/docs/cody/capabilities/openctx.mdx @@ -4,16 +4,7 @@ MCP is the recommended method for adding external context in Cody due to its broad community adoption and extensive tool support. [Read the docs](/cody/capabilities/agentic-context-fetching#mcp-support) to learn more about configuring MCP. -[OpenCtx](https://openctx.org/) is an open standard for bringing contextual info about code into your dev tools. OpenCtx context providers are in the Experimental stage for all Cody users. Enterprise users can use this, but with limited support. Cody Free and Pro users can use OpenCtx providers to fetch and use context from the following sources: - -- [Webpages (via URL)](https://openctx.org/docs/providers/web) (enabled in Cody by default) -- [Jira tickets](https://openctx.org/docs/providers/jira) -- [Linear issues](https://openctx.org/docs/providers/linear-issues) -- [Notion pages](https://openctx.org/docs/providers/notion) -- [Google Docs](https://openctx.org/docs/providers/google-docs) -- [Slack](https://openctx.org/docs/providers/slack) -- [Storybook](https://openctx.org/docs/providers/storybook) -- [Sourcegraph code search](https://openctx.org/docs/providers/sourcegraph-search) +[OpenCtx](https://openctx.org/) is an open standard for bringing contextual info about code into your dev tools. OpenCtx context providers are in the Experimental stage for Enterprise users with limited support. 
## Enable OpenCtx context providers diff --git a/docs/cody/capabilities/prompts.mdx b/docs/cody/capabilities/prompts.mdx index 6c8c8f1b6..dc6953ba8 100644 --- a/docs/cody/capabilities/prompts.mdx +++ b/docs/cody/capabilities/prompts.mdx @@ -2,18 +2,18 @@

Learn how prompts can automate and accelerate your workflow with Cody.

-Cody offers quick, ready-to-use **Prompts** to automate key tasks in your workflow. Prompts are created and saved in the **Prompt Library** and can be accessed from the top navigation bar in the Sourcegraph.com instance. +Cody offers quick, ready-to-use **Prompts** to automate key tasks in your workflow. Prompts are created and saved in the **Prompt Library** and can be accessed from the top navigation bar in your Sourcegraph Enterprise instance. To run Prompts and access Prompt Library, you must have the following: -- Free account on Sourcegraph.com or Sourcegraph Enterprise instance with Cody enabled +- A Sourcegraph Enterprise account with Cody enabled - Cody extension installed in your IDE (VS Code, JetBrains, Visual Studio) ## Prompt Library The **Prompt Library** allows you to create, edit, share, and save prompts you’ve created or shared within your organization. You can also search for prompts, filter the list to find a specific prompt by the owner, and sort by name or updated recently. -Go to **Tools > Prompt Library** from the top navigation bar in the Sourcegraph.com instance. Alternatively, you can access the **Prompt Library** from the **Cody** extension in your IDE, which directs you to the Prompt Library page. +Go to **Prompts** from the top navigation bar in your Sourcegraph Enterprise instance. Alternatively, you can access the **Prompt Library** from the **Cody** extension in your IDE, which directs you to the Prompt Library page. Here, you can view all prompts (shared with you in an organization or created by you) and some core (built-in) prompts to help you get started. @@ -32,7 +32,7 @@ You can run these prompts by clicking the **play** icon next to the prompt name, ## Create prompts -Click the **New prompt** button from the **Prompt Library** page. +Click the **Create new prompt** button from the **Prompt Library** page. 
- Select the **Owner** and **Prompt Name** - Write a prompt description diff --git a/docs/cody/capabilities/supported-models.mdx b/docs/cody/capabilities/supported-models.mdx index 0aa56bf69..0f2dc0ea2 100644 --- a/docs/cody/capabilities/supported-models.mdx +++ b/docs/cody/capabilities/supported-models.mdx @@ -6,30 +6,30 @@ Cody supports a variety of cutting-edge large language models for use in chat an Newer versions of Sourcegraph Enterprise, starting from v5.6, it will be even easier to add support for new models and providers, see [Model Configuration](/cody/enterprise/model-configuration) for more information. -| **Provider** | **Model** | **Free** | **Pro** | **Enterprise** | | | | | -| :------------ | :-------------------------------------------------------------------------------------------------------------------------------------------- | :----------- | :----------- | :------------- | --- | --- | --- | --- | -| OpenAI | [GPT-4 Turbo](https://platform.openai.com/docs/models/gpt-4-and-gpt-4-turbo#:~:text=TRAINING%20DATA-,gpt%2D4%2D0125%2Dpreview,-New%20GPT%2D4) | - | ✅ | ✅ | | | | | -| OpenAI | [GPT-4o](https://platform.openai.com/docs/models#gpt-4o) | - | ✅ | ✅ | | | | | -| OpenAI | [GPT-4o-mini](https://platform.openai.com/docs/models#gpt-4o-mini) | ✅ | ✅ | ✅ | | | | | -| OpenAI | [o3-mini-medium](https://openai.com/index/openai-o3-mini/) (experimental) | ✅ | ✅ | ✅ | | | | | -| OpenAI | [o3-mini-high](https://openai.com/index/openai-o3-mini/) (experimental) | - | - | ✅ | | | | | -| OpenAI | [o3](https://platform.openai.com/docs/models#o3) | - | ✅ | ✅ | | | | | -| OpenAI | [o4-mini](https://platform.openai.com/docs/models/o4-mini) | ✅ | ✅ | ✅ | | | | | -| OpenAI | [GPT-4.1](https://platform.openai.com/docs/models/gpt-4.1) | - | ✅ | ✅ | | | | | -| OpenAI | [GPT-4.1-mini](https://platform.openai.com/docs/models/gpt-4o-mini) | ✅ | ✅ | ✅ | | | | | -| OpenAI | [GPT-4.1-nano](https://platform.openai.com/docs/models/gpt-4.1-nano) | ✅ | ✅ | ✅ | | | | | -| 
Anthropic | [Claude 3.5 Haiku](https://docs.anthropic.com/claude/docs/models-overview#model-comparison) | ✅ | ✅ | ✅ | | | | | -| Anthropic | [Claude 3.5 Sonnet](https://docs.anthropic.com/claude/docs/models-overview#model-comparison) | ✅ | ✅ | ✅ | | | | | -| Anthropic | [Claude 3.7 Sonnet](https://docs.anthropic.com/claude/docs/models-overview#model-comparison) | - | ✅ | ✅ | | | | | -| Anthropic | [Claude Sonnet 4](https://docs.anthropic.com/en/docs/about-claude/models/overview) | ✅ | ✅ | ✅ | | | | | -| Anthropic | [Claude Sonnet 4 w/Thinking](https://docs.anthropic.com/en/docs/about-claude/models/overview) | - | ✅ | ✅ | | | | | -| Anthropic | [Claude Opus 4](https://docs.anthropic.com/en/docs/about-claude/models/overview) | - | - | ✅ | | | | | -| Anthropic | [Claude Opus 4 w/Thinking](https://docs.anthropic.com/en/docs/about-claude/models/overview) | - | - | ✅ | | | | | -| Google | [Gemini 1.5 Pro](https://deepmind.google/technologies/gemini/pro/) | ✅ | ✅ | ✅ (beta) | | | | | -| Google | [Gemini 2.0 Flash](https://deepmind.google/technologies/gemini/flash/) | ✅ | ✅ | ✅ | | | | | -| Google | [Gemini 2.0 Flash](https://deepmind.google/technologies/gemini/flash/) | ✅ | ✅ | ✅ | | | | | -| Google | [Gemini 2.5 Pro Preview](https://cloud.google.com/vertex-ai/generative-ai/docs/models/gemini/2-5-pro) | - | ✅ | ✅ | | | | | -| Google | [Gemini 2.5 Flash Preview](https://cloud.google.com/vertex-ai/generative-ai/docs/models/gemini/2-5-flash) (experimental) | ✅ | ✅ | ✅ | | | | | +| **Provider** | **Model** | **Status** | +| :----------- | :-------------------------------------------------------------------------------------------------------------------------------------------- | :--------------- | +| OpenAI | [GPT-4 Turbo](https://platform.openai.com/docs/models/gpt-4-and-gpt-4-turbo#:~:text=TRAINING%20DATA-,gpt%2D4%2D0125%2Dpreview,-New%20GPT%2D4) | ✅ | +| OpenAI | [GPT-4o](https://platform.openai.com/docs/models#gpt-4o) | ✅ | +| OpenAI | 
[GPT-4o-mini](https://platform.openai.com/docs/models#gpt-4o-mini) | ✅ | +| OpenAI | [o3-mini-medium](https://openai.com/index/openai-o3-mini/) | ✅ (experimental) | +| OpenAI | [o3-mini-high](https://openai.com/index/openai-o3-mini/) | ✅ (experimental) | +| OpenAI | [o3](https://platform.openai.com/docs/models#o3) | ✅ | +| OpenAI | [o4-mini](https://platform.openai.com/docs/models/o4-mini) | ✅ | +| OpenAI | [GPT-4.1](https://platform.openai.com/docs/models/gpt-4.1) | ✅ | +| OpenAI | [GPT-4.1-mini](https://platform.openai.com/docs/models/gpt-4.1-mini) | ✅ | +| OpenAI | [GPT-4.1-nano](https://platform.openai.com/docs/models/gpt-4.1-nano) | ✅ | +| Anthropic | [Claude 3.5 Haiku](https://docs.anthropic.com/claude/docs/models-overview#model-comparison) | ✅ | +| Anthropic | [Claude 3.5 Sonnet](https://docs.anthropic.com/claude/docs/models-overview#model-comparison) | ✅ | +| Anthropic | [Claude 3.7 Sonnet](https://docs.anthropic.com/claude/docs/models-overview#model-comparison) | ✅ | +| Anthropic | [Claude Sonnet 4](https://docs.anthropic.com/en/docs/about-claude/models/overview) | ✅ | +| Anthropic | [Claude Sonnet 4 w/ Thinking](https://docs.anthropic.com/en/docs/about-claude/models/overview) | ✅ | +| Anthropic | [Claude Opus 4](https://docs.anthropic.com/en/docs/about-claude/models/overview) | ✅ | +| Anthropic | [Claude Opus 4 w/ Thinking](https://docs.anthropic.com/en/docs/about-claude/models/overview) | ✅ | +| Google | [Gemini 1.5 Pro](https://deepmind.google/technologies/gemini/pro/) | ✅ (beta) | +| Google | [Gemini 2.0 Flash](https://deepmind.google/technologies/gemini/flash/) | ✅ | +| Google | [Gemini 2.5 Pro Preview](https://cloud.google.com/vertex-ai/generative-ai/docs/models/gemini/2-5-pro) | ✅ | +| Google | [Gemini 2.5 Flash Preview](https://cloud.google.com/vertex-ai/generative-ai/docs/models/gemini/2-5-flash) | ✅ (experimental) | To use Claude 3 Sonnet models with Cody
Enterprise, make sure you've upgraded your Sourcegraph instance to the latest version. @@ -49,28 +49,25 @@ See [Model Configuration: Reasoning models](/cody/enterprise/model-configuration Cody uses a set of models for autocomplete which are suited for the low latency use case. -| **Provider** | **Model** | **Free** | **Pro** | **Enterprise** | | | | | -| :----------- | :---------------------------------------------------------------------------------------- | :------- | :------ | :------------- | --- | --- | --- | --- | -| Fireworks.ai | [DeepSeek-Coder-V2](https://huggingface.co/deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct) | ✅ | ✅ | ✅ | | | | | -| Anthropic | [claude Instant](https://docs.anthropic.com/claude/docs/models-overview#model-comparison) | - | - | ✅ | | | | | -| | | | | | | | | | +| **Provider** | **Model** | **Status** | +| :----------- | :---------------------------------------------------------------------------------------- | :------------- | +| Fireworks.ai | [DeepSeek-Coder-V2](https://huggingface.co/deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct) | ✅ | +| Anthropic | [Claude Instant](https://docs.anthropic.com/claude/docs/models-overview#model-comparison) | ✅ | -The default autocomplete model for Cody Free, Pro and Enterprise users is DeepSeek-Coder-V2. + +The default autocomplete model for Enterprise users is DeepSeek-Coder-V2. The DeepSeek model used by Sourcegraph is hosted by Fireworks.ai as a single-tenant service in a US-based data center. For more information see our [Cody FAQ](https://sourcegraph.com/docs/cody/faq#is-any-of-my-data-sent-to-deepseek).
## Smart Apply -| **Provider** | **Model** | **Free** | **Pro** | **Enterprise** | | | | | | | -| :----------- | :------------- | :------- | :------ | :------------- | --- | --- | --- | --- | --- | --- | -| Fireworks.ai | Qwen 2.5 Coder | ✅ | ✅ | ✅ | | | | | | | - -Fireworks.ai is the default model for cody-gateway, but if you wish to switch to Claude models, Site admins can do it following these steps- -Go to "Site admin" -Click on the "Feature flags" +| **Provider** | **Model** | **Status** | +| :----------- | :------------- | :--------- | +| Fireworks.ai | Qwen 2.5 Coder | ✅ | -Search for cody-smart-apply-instant-mode-enabled feature flag +The Fireworks.ai-hosted Qwen 2.5 Coder is the default model on Cody Gateway; site admins who wish to switch to Claude models can follow these steps: -Turn off/delete the "cody-smart-apply-instant-mode-enabled" feature flag +- Go to **Site admin** +- Click **Feature flags** +- Search for the `cody-smart-apply-instant-mode-enabled` feature flag +- Turn off or delete the `cody-smart-apply-instant-mode-enabled` feature flag diff --git a/docs/cody/clients/cody-with-sourcegraph.mdx b/docs/cody/clients/cody-with-sourcegraph.mdx index ed2b967d3..2f8f330cf 100644 --- a/docs/cody/clients/cody-with-sourcegraph.mdx +++ b/docs/cody/clients/cody-with-sourcegraph.mdx @@ -1,35 +1,33 @@ # Cody for Web -

Learn how to use Cody in the web interface with your Sourcegraph.com instance.

+

Learn how to use Cody in the web interface with your Sourcegraph.com Enterprise instance.

-In addition to the Cody extensions for [VS Code](/cody/clients/install-vscode), [JetBrains](/cody/clients/install-jetbrains), and [Visual Studio](/cody/clients/install-visual-studio ) IDEs, Cody is also available in the Sourcegraph web app. Community users can use Cody for free by logging into their accounts on Sourcegraph.com, and enterprise users can use Cody within their Sourcegraph instance.
+In addition to the Cody extensions for [VS Code](/cody/clients/install-vscode), [JetBrains](/cody/clients/install-jetbrains), and [Visual Studio](/cody/clients/install-visual-studio) IDEs, Cody is also available in the Sourcegraph web app.

-
+

## Initial setup

-Create a [Sourcegraph.com account](https://sourcegraph.com/sign-up) by logging in through codehosts like GitHub and GitLab or via traditional Google sign-in. This takes you to Sourcegraph’s web interface. From here, there are two ways to access the Cody chat:
+Log in to your Sourcegraph.com Enterprise instance through codehosts like GitHub and GitLab or via traditional Google sign-in. This takes you to Sourcegraph’s web interface. From here, there are two ways to access the Cody chat:

-1. Run any search query via **Code Search** and click the **Cody** button on the left to open the chat window
+1. Run any search query via **Code Search** and click the **Cody** button on the right to open the chat window
2. Directly click the **Chat** tab from the top header to open the chat interface

![cody-web](https://storage.googleapis.com/sourcegraph-assets/Docs/cody-web-2025.png)

-Enterprise users can also log in to their Sourcegraph.com Enterprise instance and use Cody in the web interface.
-
## Chat interface

-The Cody chat interface for the web is similar to the one you get with the IDE extensions. However, the chat experience is slightly different depending on whether you use Cody with your search query results or directly from the top header.
+The Cody chat interface for the web is similar to the one you get with the IDE extensions. However, the chat experience is slightly different depending on whether you use Cody with your search query results or directly from the top Chat header.

-The chat interface with your Code Search queries opens parallel to your query search results, similar to the chat window in the IDE extensions. However, when you click **Cody** from the top header in your Sourcegraph.com instance, the chat interface opens on a new page.
+The chat interface with your Code Search queries opens parallel to your query search results, similar to the chat window in the IDE extensions. However, when you click **Chat** from the top header in your Sourcegraph.com Enterprise instance, the chat interface opens on a new page.

The new and improved chat UI for Cody for the web is currently available to users on Sourcegraph versions >=5.5. To use this new chat interface, you should update your Sourcegraph instance to the latest version.

## Chat with Cody on the web interface

-The feature set for the Cody chat is the same as the IDE extensions. Your previous chats can be viewed from the **History** tab. Claude 3.5 Sonnet (New) is selected as the default chat model. You can change this LLM model based on your use case to optimize speed, accuracy, or cost. Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. You can read about these supported LLM models [here](/cody/capabilities/supported-models#chat-and-commands).
+The feature set for the Cody chat is the same as the IDE extensions. Your previous chats can be viewed from the **History** tab. Claude 3.5 Sonnet is selected as the default chat model. You can change this LLM model based on your use case to optimize speed, accuracy, or cost. Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model. You can read about these supported LLM models [here](/cody/capabilities/supported-models#chat-and-commands).

To help you automate your key tasks in your development workflow, you get **[Prompts](/cody/capabilities/commands)**. If you are a part of an organization on Sourcegraph.com or a self-hosted Sourcegraph instance, you can view these pre-built Prompts created by your teammates. On the contrary, you can create your Prompts via the **Prompt Library** from your Sourcegraph instance.

diff --git a/docs/cody/clients/feature-reference.mdx b/docs/cody/clients/feature-reference.mdx
index 01ac4b628..a26e6ac16 100644
--- a/docs/cody/clients/feature-reference.mdx
+++ b/docs/cody/clients/feature-reference.mdx
@@ -4,26 +4,29 @@

## Chat

-| **Feature** | **VS Code** | **JetBrains** | **Visual Studio** | **Web** | **CLI** |
-| ---------------------------------------- | ----------- | ------------- | ----------------- | -------------------- | ------- |
-| Chat | ✅ | ✅ | ✅ | ✅ | ✅ |
-| Chat history | ✅ | ✅ | ✅ | ✅ | ❌ |
-| Clear chat history | ✅ | ✅ | ✅ | ✅ | ❌ |
-| Edit sent messages | ✅ | ✅ | ✅ | ✅ | ❌ |
-| SmartApply/Execute | ✅ | ❌ | ❌ | ❌ | ❌ |
-| Show context files | ✅ | ✅ | ✅ | ✅ | ❌ |
-| @-file | ✅ | ✅ | ✅ | ✅ | ❌ |
-| @-symbol | ✅ | ❌ | ✅ | ✅ | ❌ |
-| LLM Selection | ✅ | ✅ | ✅ | ✅ | ❌ |
-| Agentic Context Fetching | ✅ | ✅ | ✅ | ✅ | ✅ |
-| **Context Selection** | | | | | |
-| Single-repo context | ✅ | ✅ | ✅ | ✅ | ❌ |
-| Multi-repo context | ❌ | ❌ | ❌ | ✅ (public code only) | ❌ |
-| Local context | ✅ | ✅ | ✅ | ❌ | ✅ |
-| OpenCtx context providers (experimental) | ✅ | ❌ | ❌ | ❌ | ❌ |
-| **Prompts** | | | | | |
-| Access to prompts and Prompt library | ✅ | ✅ | ✅ | ✅ | ❌ |
-| Promoted Prompts | ✅ | ❌ | ❌ | ✅ | ❌ |
+| **Feature** | **VS Code** | **JetBrains** | **Visual Studio** | **Web** | **CLI** |
+| ------------------------------------ | ----------- | ------------- | ----------------- | ------- | ------- |
+| Chat | ✅ | ✅ | ✅ | ✅ | ✅ |
+| Chat history | ✅ | ✅ | ✅ | ✅ | ❌ |
+| Clear chat history | ✅ | ✅ | ✅ | ✅ | ❌ |
+| Edit sent messages | ✅ | ✅ | ✅ | ✅ | ❌ |
+| SmartApply/Execute | ✅ | ❌ | ❌ | ❌ | ❌ |
+| Show context files | ✅ | ✅ | ✅ | ✅ | ❌ |
+| @-file | ✅ | ✅ | ✅ | ✅ | ❌ |
+| @-symbol | ✅ | ❌ | ✅ | ✅ | ❌ |
+| @-directories | ✅ | ✅ | ✅ | ✅ | ❌ |
+| LLM Selection | ✅ | ✅ | ✅ | ✅ | ❌ |
+| Admin LLM Selection | ✅ | ✅ | ✅ | ✅ | ❌ |
+| Agentic Context Fetching | ✅ | ✅ | ✅ | ✅ | ✅ |
+| **Context Selection** | | | | | |
+| Single-repo context | ✅ | ✅ | ✅ | ✅ | ❌ |
+| Multi-repo context | ✅ | ✅ | ✅ | ✅ | ❌ |
+| Local context | ✅ | ✅ | ✅ | ❌ | ✅ |
+| Guardrails | ✅ | ✅ | ❌ | ✅ | ❌ |
+| Repo-based context filters | ✅ | ✅ | ✅ | ✅ | ✅ |
+| **Prompts** | | | | | |
+| Access to prompts and Prompt library | ✅ | ✅ | ✅ | ✅ | ❌ |
+| Promoted Prompts | ✅ | ❌ | ❌ | ✅ | ❌ |

## Code Autocomplete and Auto-edit

@@ -33,21 +36,4 @@
| Cycle through multiple completion suggestions | ✅ | ✅ | ✅ |
| Accept suggestions word-by-word | ✅ | ❌ | ❌ |
| Auto-edit suggestions via cursor movements and typing | ✅ | ✅ | ❌ |
-
-Few exceptions that apply to Cody Pro and Cody Enterprise users:
-
-
-
-- Multi-repo context is not supported in VS Code, JetBrains, or the Web UI for Cody Pro
-
-
-
-
-
-- Admin LLM selection is suported on VS Code, JetBrains, Visual Studio, and Web both for chat and code autocomplete
-- Multi-repo context is supported on VS Code, JetBrains, Visual Studio, and Web
-- [Guardrails](/cody/clients/enable-cody-enterprise#guardrails) are supported on VS Code, JetBrains, and Web
-- [Repo-based Cody Context Filters](/cody/capabilities/ignore-context#cody-context-filters) are supported on all Cody clients.
-- `@-mention` directories are supported on VS Code, JetBrains, Visual Studio, and Web
-
-
+| Admin LLM selection | ✅ | ✅ | ✅ |

diff --git a/docs/cody/clients/index.mdx b/docs/cody/clients/index.mdx
index c90372f3a..c992e57f1 100644
--- a/docs/cody/clients/index.mdx
+++ b/docs/cody/clients/index.mdx
@@ -3,9 +3,9 @@

There are multiple ways to use Cody: you can install its extension in your favorite IDEs, access it via the Sourcegraph web app, or use it through the Cody CLI.

-
-
-
+
+
+

diff --git a/docs/cody/clients/install-jetbrains.mdx b/docs/cody/clients/install-jetbrains.mdx
index 1aa87efaf..105c56c98 100644
--- a/docs/cody/clients/install-jetbrains.mdx
+++ b/docs/cody/clients/install-jetbrains.mdx
@@ -5,13 +5,13 @@

The Cody plugin by Sourcegraph enhances your coding experience in your IDE by providing intelligent code suggestions, context-aware completions, and advanced code analysis. This guide will walk you through the steps to install and set up Cody within your JetBrains environment.

-
+

## Prerequisites

- You have the latest version of JetBrains IDEs installed
-- You have a Free or Pro account via Sourcegraph.com or a Sourcegraph Enterprise account
+- A Sourcegraph Enterprise account with Cody enabled
- Cody is compatible with the following JetBrains IDEs:
  - [Android Studio](https://developer.android.com/studio)
  - [AppCode](https://www.jetbrains.com/objc/)

@@ -41,14 +41,6 @@ Alternatively, you can also [download and install the plugin from the JetBrains

After a successful installation, the Cody icon appears in the Tool Windows Bar.

-### Cody Free or Cody Pro Users
-
-Cody Free and Pro users can sign in to their Sourcegraph.com accounts using SSO through GitHub, GitLab, or Google.
-
-![cody-for-intellij-login](https://storage.googleapis.com/sourcegraph-assets/Docs/sign-in-cody-jb-2025.jpg)
-
-### Sourcegraph Enterprise Cody Users
-
Sourcegraph Enterprise users should connect Cody to their Enterprise instance by clicking **Sign in with an Enterprise Instance**.

To connect the plugin with your Enterprise instance,

@@ -94,13 +86,9 @@ Since your first message to Cody anchors the conversation, you can return to the

Users must be on JetBrains v2023.2 and Cody plugin v7.0.0 or above to get the new and improved chat UI.

-### Chat History
-
-A chat history icon at the top of your chat input window allows you to navigate between chats (and search chats) without opening the Cody sidebar.
-
### Changing LLM model for chat

- You need to be a Cody Free or Pro user to have multi-model selection capability. You can view which LLMs you can access on our [supported LLMs page](/cody/capabilities/supported-models). Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model.
+You can view which LLMs you can access on our [supported LLMs page](/cody/capabilities/supported-models). Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model.

For Chat:

@@ -115,7 +103,7 @@ For Edit:

- Select the default model available
- See the selection of models and click the model you desire. This model will now be the default model for any new edits

-### Selecting Context with @-mentions
+### Selecting context with @-mentions

Cody's chat allows you to add files as context in your messages.

@@ -149,17 +137,13 @@ If Cody's answer isn't helpful, you can try asking again with a different contex

![jb-rerun-context](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-rerun-context-2025.png)

-## Context fetching mechanism
-
-JetBrains users on the Free or Pro plan use [local context](/cody/core-concepts/context#context-sources).
+## Context fetching mechanism and sources

Enterprise users can leverage the full power of the Sourcegraph search engine as Cody's primary context provider. Read more about [Context fetching mechanisms](/cody/core-concepts/context/#context-fetching-mechanism) in detail.

-## Context sources
-You can @-mention files and web pages in Cody. Cody Enterprise also supports @-mentioning repositories to search for context in a broader scope.
-Cody Free and Pro offer single-repo context, and Cody Enterprise supports multi-repo context.
+You can @-mention files and web pages in Cody. Cody Enterprise also supports @-mentioning multi-repo context to search in a broader scope.

### Cody Context Filters

@@ -171,28 +155,6 @@ For repos mentioned in the `exclude` field, Cody prompts are disabled, and you c

[Read more about Cody Context Filters here →](/cody/capabilities/ignore-context)

-## Autocomplete
-
-Cody provides multi-line autocomplete as you type. Autocomplete suggestions appear as inlay suggestions and are enabled by default in your JetBrains IDE. This setting lists the programming languages supported and enabled by default.
-
-To manually configure the Autocomplete feature,
-
-- Go to the **Cody Settings...** from the Cody icon in the sidebar
-- Next, click the **Sourcegraph & Cody** dropdown and select **Cody**
-- The **Autocomplete** settings will appear with the list of **Enabled Languages**
-
-Autocomplete suggestions use the same color as inline parameter hints according to your configured editor theme. However, you can optionally enable the **Custom color for completions** checkbox to customize the color of your choice.
-
-In addition, you can use the following keyboard shortcuts to interact with Cody's autocomplete suggestions:
-
-- `Tab` to accept a suggestion
-- `Alt + [` (Windows) or `Opt + [` (macOS) to cycle suggestions
-- `Alt + \` (Windows) or `Opt + \` (macOS) to manually trigger autocomplete if no suggestions have been returned
-
-

## Prompts

Cody allows you create quick, ready-to-use [prompts](/cody/capabilities/commands) to automate key tasks in your workflow. Prompts are created and saved in the Prompt Library and can be accessed from the **Tools > Prompt Library** in the top navigation bar in your Sourcegraph instance.

@@ -226,28 +188,11 @@ Cody with JetBrains can also propose fixes and updates to errors in your code. T

All you need to do is select and highlight the code line with the error and click the lightbulb icon. Then select **Ask Cody to Fix**. You can then view the diff and accept or undo the suggested change.

-## Updating the plugin
-
-JetBrains IDEs will typically notify you when updates are available for installed plugins. Follow the prompts to update the Cody AI plugin to the latest version.
-
-## Change the update channel for stable or nightly releases
-
-Our nightly release channel gets updated much more frequently, which might help verify bug fixes that will be included in the next stable release.
-To update your update channel, you can do the following:
-
-1. Open your JetBrains IDE settings by selecting **IDE Name | Settings** on macOS or **File | Settings** on Windows and Linux from the main menu.
-1. Get to the Cody Settings by navigating to `Tools -> Sourcegraph & Cody`
-1. Under the update channel, select `Stable` or `Nightly`
-
## Supported LLM models

-Cody Free and Pro users can choose from a list of supported LLM models for chat.
-
-![llm-selection-cody](https://storage.googleapis.com/sourcegraph-assets/Docs/jb-llm-select-2025.png)
-
Enterprise users who have [model configuration](/Cody/clients/model-configuration#model-configuration) configured can also select from the available models for their instance. On instances with the ["completions" configuration](/Cody/clients/model-configuration#completions-configuration), a site admin determines the LLM, which cannot be changed within the editor.

-Read and learn more about the [supported LLMs](/cody/capabilities/supported-models) and [token limits](/cody/core-concepts/token-limits) on Cody Free, Pro and Enterprise.
+Read and learn more about the [supported LLMs](/cody/capabilities/supported-models) and [token limits](/cody/core-concepts/token-limits).

## Add/remove account

@@ -264,6 +209,15 @@ Alternatively, you can also manage multiple accounts in Cody Settings:

1. Under authentication, see the accounts that are currently logged in
1. To remove, select your account and hit `-`. To add click `+` and choose the appropriate login method

+## Change the update channel for stable or nightly releases
+
+Our nightly release channel gets updated much more frequently, which might help verify bug fixes that will be included in the next stable release.
+To update your update channel, you can do the following:
+
+1. Open your JetBrains IDE settings by selecting **IDE Name | Settings** on macOS or **File | Settings** on Windows and Linux from the main menu.
+1. Get to the Cody Settings by navigating to `Tools -> Sourcegraph & Cody`
+1. Under the update channel, select `Stable` or `Nightly`
+
## Find Cody features

You can find and discover all Cody features and actions using the **Search Everywhere** option in JetBrains IDEs. Press `Shift` twice to open the `Search Everywhere` window. Then, type in the `Cody:` prefix to get a list of all supported Cody actions.

diff --git a/docs/cody/clients/install-visual-studio.mdx b/docs/cody/clients/install-visual-studio.mdx
index ea92cc12d..0edafd8b0 100644
--- a/docs/cody/clients/install-visual-studio.mdx
+++ b/docs/cody/clients/install-visual-studio.mdx
@@ -2,18 +2,18 @@

Learn how to use Cody and its features with the Visual Studio editor.

-Cody for Visual Studio is currently in the Experimental stage and currently supports chat and autocomplete.
+Cody for Visual Studio is currently in the Experimental stage.

Cody extension for Visual Studio enhances your coding experience by providing intelligent and contextually aware answers to your questions. This guide will walk you through installing and setting Cody within your Visual Studio editor.

-
+

## Prerequisites

- You have the latest version of [Visual Studio](https://visualstudio.microsoft.com/) installed
-- You have a Free or Pro account via Sourcegraph.com or a Sourcegraph Enterprise account
+- A Sourcegraph Enterprise account with Cody enabled

## Install the Visual Studio extension

@@ -23,15 +23,13 @@ Cody extension for Visual Studio enhances your coding experience by providing in

## Connect the extension to Sourcegraph

-Cody for Visual Studio is available for all Cody plans, including Cody Free, Pro, and Enterprise.
-
After a successful installation, go to **Tools** from the main toolbar at the top and click the **Cody Chat** from the drop-down. This opens the dialog box to connect to your Sourcegraph instance.

-Cody Free or Pro users can sign in to their Sourcegraph.com accounts through GitHub, GitLab, or Google. Meanwhile, Sourcegraph Enterprise users should connect Cody via their Enterprise instance URL and the Access Token.
+Sourcegraph Enterprise users should connect Cody via their Enterprise instance URL and the Access Token.

Complete these steps, and you'll be ready to start using Cody in Visual Studio.

-![install-cody-vscode](https://storage.googleapis.com/sourcegraph-assets/Docs/cody-vs-setup-102024-2.png)
+![install-cody-vscode](https://storage.googleapis.com/sourcegraph-assets/Docs/cody-vs-setup-0725.png)

## Chat

@@ -43,7 +41,7 @@ The chat input field has a default `@-mention` [context chips](#context-retrieva

## LLM selection

-Cody offers a variety of large language models (LLMs) to power your chat experience. Cody Free users can access the latest base models from Anthropic, OpenAI, Google. At the same time, Cody Pro and Enterprise users can access more extended models.
+Cody offers a variety of large language models (LLMs) to power your chat experience. You can access the latest base models from Anthropic, OpenAI, and Google.

You can read more about it in our [Supported LLM models docs](/cody/capabilities/supported-models).

@@ -84,5 +82,3 @@ Cody for Visual Studio supports single and multi-line autocompletions. The autoc

-
-Advanced features like [auto-edit](/cody/capabilities/auto-edit) are not yet supported.

To disable the autocomplete feature, you can do it from your Cody settings section.

diff --git a/docs/cody/clients/install-vscode.mdx b/docs/cody/clients/install-vscode.mdx
index 1f0ac77fc..28cd08745 100644
--- a/docs/cody/clients/install-vscode.mdx
+++ b/docs/cody/clients/install-vscode.mdx
@@ -5,13 +5,13 @@

The Cody extension by Sourcegraph enhances your coding experience in VS Code by providing intelligent code suggestions, context-aware autocomplete, and advanced code analysis. This guide will walk you through the steps to install and set up Cody within your VS Code environment.

-
+

## Prerequisites

- You have the latest version of [VS Code](https://code.visualstudio.com/) installed
-- You have a Free or Pro account via Sourcegraph.com or a Sourcegraph Enterprise account
+- A Sourcegraph Enterprise account with Cody enabled

## Install the VS Code extension

@@ -30,14 +30,6 @@ Alternatively, you can also [download and install the extension from the VS Code

After a successful installation, the Cody icon appears in the [Activity sidebar](https://code.visualstudio.com/api/ux-guidelines/activity-bar).

-### Cody Free or Cody Pro Users
-
-Cody Free and Cody Pro users can sign in to their Sourcegraph.com accounts through GitHub, GitLab, or Google.
-
-![](https://storage.googleapis.com/sourcegraph-assets/Docs/cody-new-ui-2025.jpg)
-
-### Sourcegraph Enterprise Cody Users
-
If you are using an older version of Cody, uninstall it and reload VS Code. It's always recommended to install the latest version before proceeding to the next steps.

Sourcegraph Enterprise users should connect Cody to their Enterprise instance by clicking **Sign In to Your Enterprise Instance**.

@@ -124,7 +116,7 @@ A chat history icon at the top of your chat input window allows you to navigate

### Changing LLM model for chat

- You need to be a Cody Free or Pro user to have multi-model selection capability. You can view which LLMs you have access to on our [supported LLMs page](/cody/capabilities/supported-models). Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model.
+You can view which LLMs you have access to on our [supported LLMs page](/cody/capabilities/supported-models). Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model.

For Chat:

@@ -164,31 +156,6 @@ At any point in time, you can edit these context chips or remove them completely

When you have both a repository and files @-mentioned, Cody will search the repository for context while prioritizing the mentioned files.

-### @-mention context providers with OpenCtx
-
-OpenCtx context providers are in Experimental stage for all Cody users. Enterprise users can also use this but with limited support. If you have feedback or questions, please visit our [support forum](https://community.sourcegraph.com/c/openctx/10).
-
-[OpenCtx](https://openctx.org/) is an open standard for bringing contextual info about code into your dev tools. Cody Free and Pro users can use OpenCtx providers to fetch and use context from the following sources:
-
-- [Webpages](https://openctx.org/docs/providers/web) (via URL)
-- [Jira tickets](https://openctx.org/docs/providers/jira)
-- [Linear issues](https://openctx.org/docs/providers/linear-issues)
-- [Notion pages](https://openctx.org/docs/providers/notion)
-- [Google Docs](https://openctx.org/docs/providers/google-docs)
-- [Sourcegraph code search](https://openctx.org/docs/providers/sourcegraph-search)
-
-To try it out, add context providers to your VS Code settings. For example, to use the [DevDocs provider](https://openctx.org/docs/providers/devdocs), add the following to your `settings.json`:
-
-```json
-"openctx.providers": {
-    "https://openctx.org/npm/@openctx/provider-devdocs": {
-        "urls": ["https://devdocs.io/go/", "https://devdocs.io/angular~16/"]
-    }
-},
-```
-
-You don't need the OpenCtx VS Code extension to use context fetching with OpenCtx. We recommend uninstalling the extension before using this feature in Cody.
-
### Rerun prompts with different context

If Cody's answer isn't helpful, you can try asking again with different context:

@@ -198,19 +165,13 @@ If Cody's answer isn't helpful, you can try asking again with different context:

![re-run-with-context](https://storage.googleapis.com/sourcegraph-assets/Docs/re-run-with-context-2025.png)

-## Context fetching mechanism
-
-VS Code users on the Free or Pro plan use [local context](/cody/core-concepts/context#context-sources).
+## Context fetching mechanism and sources

Enterprise users can use the full power of the Sourcegraph search engine as Cody's primary context provider. Read more about [Context fetching mechanism](/cody/core-concepts/context/#context-fetching-mechanism) in detail.

-## Context sources
-
-You can @-mention files, symbols, and web pages in Cody. Cody Enterprise also supports @-mentioning repositories to search for context in a broader scope. Cody's experimental [OpenCtx](https://openctx.org) support adds even more context sources, including Jira, Linear, Google Docs, Notion, and more.
-
-Cody Free and Pro offer single-repo context, and Cody Enterprise supports multi-repo context.
+You can @-mention files, symbols, and web pages in Cody. It also supports @-mentioning multi-repo context to search for context in a broader scope.

### Cody Context Filters

@@ -257,10 +218,6 @@ Cody provides a set of powerful keyboard shortcuts to streamline your workflow a

* `Cmd+.` (macOS) or `Ctrl+.` (Windows/Linux): Opens the Quick Fix menu, which includes options for Cody to edit or generate code based on your current context.

-## Updating the extension
-
-VS Code will typically notify you when updates are available for installed extensions. Follow the prompts to update the Cody AI extension to the latest version.
-
## Authenticating Cody with VS Code forks

Cody also works with Cursor, Gitpod, IDX, and other similar VS Code forks. To access VS Code forks like Cursor, select **Sign in with URL and access token** and generate an access token. Next, copy and paste into the allocated field, using `https://sourcegraph.com` as the URL.

@@ -269,9 +226,7 @@ Cody also works with Cursor, Gitpod, IDX, and other similar VS Code forks. To ac

Claude 3.5 Sonnet is the default LLM model for inline edits and prompts. If you've used a different or older LLM model for inline edits before, remember to manually change your model to Claude 3.5 Sonnet. Default model changes only affect new users.

-Users on Cody **Free** and **Pro** can choose from a list of [supported LLM models](/cody/capabilities/supported-models) for chat.
-
-![LLM-models-for-cody-free](https://storage.googleapis.com/sourcegraph-assets/Docs/llm-dropdown-options-0225.jpg)
+Here's a list of [supported LLM models](/cody/capabilities/supported-models) for chat.

Enterprise users get Claude 3.5 Sonnet as the default LLM models without extra cost. Moreover, Enterprise users can use Claude 3.5 models through Cody Gateway, Anthropic BYOK, Amazon Bedrock (limited availability), and GCP Vertex.

@@ -281,89 +236,6 @@ You also get additional capabilities like BYOLLM (Bring Your Own LLM), supportin

Read more about all the supported LLM models [here](/cody/capabilities/supported-models)

-## Experimental models
-
-Support for the following models is currently in the Experimental stage, and available for Cody Free and Pro plans.
-
-The following experimental model providers can be configured in Cody's extension settings JSON:
-
-- Google (requires [Google AI Studio API key](https://aistudio.google.com/app/apikey))
-- Groq (requires [GroqCloud API key](https://console.groq.com/docs/api-keys))
-- OpenAI & OpenAI-Compatible API (requires [OpenAI API key](https://platform.openai.com/account/api-keys))
-- Ollama (remote)
-
-Once configured, and VS Code has been restarted, you can select the configured model from the dropdown both for chat and for edits.
-
-Example VS Code user settings JSON configuration:
-
-```json
-{
-  "cody.dev.models": [
-    {
-      "provider": "google",
-      "model": "gemini-2.0-flash-exp",
-      "inputTokens": 1048576,
-      "outputTokens": 8192,
-      "apiKey": "",
-      "options": {
-        "temperature": 0.0
-      }
-    },
-    {
-      "provider": "groq",
-      "model": "llama2-70b-4096",
-      "inputTokens": 4000,
-      "outputTokens": 4000,
-      "apiKey": "",
-      "options": {
-        "temperature": 0.0
-      }
-    },
-    {
-      "provider": "openai",
-      "model": "some-model-id",
-      "inputTokens": 32000,
-      "outputTokens": 4000,
-      "apiKey": "",
-      "options": {
-        "temperature": 0.0
-      },
-      "apiEndpoint": "https://host.domain/path"
-    },
-    {
-      "provider": "ollama",
-      "model": "some-model-id",
-      "apiEndpoint": "https://host.domain/path"
-    }
-  ]
-}
-```
-
-### Provider configuration options
-
-- `"provider"`: `"google"`, `"groq"`, `"ollama"` or `"openai"`
-  - The LLM provider type.
-- `"model"`: `string`
-  - The ID of the model, e.g. `"gemini-2.0-flash-exp"`
-- `"inputTokens"`: `number` - optional
-  - The context window size of the model's input. Default: 7000.
-- `"outputTokens"`: `number` - optional
-  - The context window size of the model's output. Default: 4000.
-- `"apiKey"`: `string` - optional
-  - The API key for the endpoint. Required if the provider is `"google"`, `"groq"`, `"ollama"` or `"openai"`.
-- `"apiEndpoint"`: `string` - optional
-  - The endpoint URL, if you don't want to use the provider’s default endpoint.
-- `"options"` : `object` - optional
-  - Additional parameters like `temperature`, `topK`, `topP` based on provider documentation.
-
-### Debugging experimental models
-
-To debug problems with the experimental models, use the VS Code output panel which can be opened using the following steps:
-
-- Open the Cody Sidebar
-- Next to "Settings and Support" click the "..." icon
-- Click "Open Output Channel"
-
## Add/remove account

To add/remove an account you can do the following:

diff --git a/docs/cody/core-concepts/context.mdx b/docs/cody/core-concepts/context.mdx
index c0d9b3076..194877c15 100644
--- a/docs/cody/core-concepts/context.mdx
+++ b/docs/cody/core-concepts/context.mdx
@@ -27,32 +27,25 @@ All these methods collectively ensure Cody's ability to provide relevant and hig

Cody uses @-mentions to retrieve context from your codebase. Inside the chat window, there is an `@` icon that you can click to select a context source. Alternatively, you can press `@` to open the context picker.
-Based on your Cody tier, you can @-mention the following:
-
-| **Tier** | **Client** | **Files** | **Symbols** | **Web URLs** | **Remote Files/Directories** | **OpenCtx** |
-| -------------- | ------------- | --------- | ----------- | ------------ | ---------------------------- | ----------- |
-| **Free/Pro** | VS Code | ✅ | ✅ | ✅ | ❌ | ✅ |
-| | JetBrains | ✅ | ❌ | ✅ | ❌ | ❌ |
-| | Visual Studio | ✅ | ✅ | ✅ | ❌ | ❌ |
-| | Cody Web | ✅ | ✅ | ✅ | ❌ | ❌ |
-| **Enterprise** | VS Code | ✅ | ✅ | ✅ | ✅ | ✅ |
-| | JetBrains | ✅ | ❌ | ✅ | ✅ | ❌ |
-| | Visual Studio | ✅ | ✅ | ✅ | ✅ | ✅ |
-| | Cody Web | ✅ | ✅ | ✅ | ✅ | ❌ |
+Enterprise users can @-mention the following context sources:
+
+| **Client** | **Files** | **Symbols** | **Web URLs** | **Remote Files/Directories** | **OpenCtx** |
+| ------------- | --------- | ----------- | ------------ | ---------------------------- | ----------- |
+| VS Code | ✅ | ✅ | ✅ | ✅ | ✅ |
+| JetBrains | ✅ | ❌ | ✅ | ✅ | ❌ |
+| Visual Studio | ✅ | ✅ | ✅ | ✅ | ✅ |
+| Cody Web | ✅ | ✅ | ✅ | ✅ | ❌ |

## Repo-based context

-Cody supports repo-based context. You can link single or multiple repositories based on your tier. Here's a detailed breakdown of the number of repositories supported by each client for Cody Free, Pro, and Enterprise users:
-
-| **Tier** | **Client** | **Repositories** |
-| -------------- | ------------- | ---------------- |
-| **Free/Pro** | VS Code | 1 |
-| | JetBrains | 1 |
-| | Visual Studio | 1 |
-| **Enterprise** | Cody Web | Multi |
-| | VS Code | Multi |
-| | JetBrains | Multi |
-| | Visual Studio | Multi |
+Cody supports repo-based context. You can link single or multiple repositories. Here's a detailed breakdown of the number of repositories supported by each client:
+
+| **Client** | **Repositories** |
+| ------------- | ---------------- |
+| Cody Web | Multi |
+| VS Code | Multi |
+| JetBrains | Multi |
+| Visual Studio | Multi |

## How does context work with Cody prompts?
diff --git a/docs/cody/core-concepts/token-limits.mdx b/docs/cody/core-concepts/token-limits.mdx
index b68fad569..609324fa0 100644
--- a/docs/cody/core-concepts/token-limits.mdx
+++ b/docs/cody/core-concepts/token-limits.mdx
@@ -11,46 +11,6 @@ All other models are currently capped at **7,000 tokens** of shared context betw

Here's a detailed breakdown of the token limits by model:

-
-
-| **Model** | **Conversation Context** | **@-mention Context** | **Output** |
-| ----------------------------- | ------------------------ | --------------------- | ---------- |
-| GPT 4o mini | 7,000 | shared | 4,000 |
-| GPT o3 mini medium | 7,000 | shared | 4,000 |
-| Claude 3.5 Haiku | 7,000 | shared | 4,000 |
-| Claude 3.5 Sonnet (New) | 15,000 | 30,000 | 4,000 |
-| **Claude Sonnet 4** | **15,000** | **45,000** | **4,000** |
-| Gemini 1.5 Pro | 7,000 | shared | 4,000 |
-| Gemini 2.0 Flash | 7,000 | shared | 4,000 |
-| Gemini 2.0 Flash-Lite Preview | 7,000 | shared | 4,000 |
-
-
-
-
-
-The Pro tier supports the token limits for the LLM models on Free tier, plus:
-
-| **Model** | **Conversation Context** | **@-mention Context** | **Output** |
-| ------------------------------ | ------------------------ | --------------------- | ---------- |
-| GPT 4o mini | 7,000 | shared | 4,000 |
-| GPT o3 mini medium | 7,000 | shared | 4,000 |
-| GPT 4 Turbo | 7,000 | shared | 4,000 |
-| GPT 4o | 7,000 | shared | 4,000 |
-| o1 | 7,000 | shared | 4,000 |
-| Claude 3.5 Haiku | 7,000 | shared | 4,000 |
-| Claude 3.5 Sonnet (New) | 15,000 | 30,000 | 4,000 |
-| **Claude Sonnet 4 w/Thinking** | **15,000** | **45,000** | **4,000** |
-| Claude 3.7 Sonnet | 15,000 | 30,000 | 4,000 |
-| Gemini 1.5 Pro | 15,000 | 30,000 | 4,000 |
-| Gemini 2.0 Flash | 7,000 | shared | 4,000 |
-| Gemini 2.0 Flash-Lite Preview | 7,000 | shared | 4,000 |
-
-
-
-
-
-The Enterprise tier supports the token limits for the LLM models on Free and Pro tier, plus:
-
| **Model** | **Conversation Context** | **@-mention Context** | **Output** |
| ------------------------------ | ------------------------ | --------------------- | ---------- |
| GPT 4o mini | 7,000 | shared | 4,000 |

@@ -65,14 +25,10 @@

| **Claude Opus 4** | **15,000** | **45,000** | **4,000** |
| **Claude Opus 4 w/Thinking** | **15,000** | **45,000** | **4,000** |
| Claude 3.7 Sonnet | 15,000 | 30,000 | 4,000 |
+| Gemini 1.5 Pro | 15,000 | 30,000 | 4,000 |
| Gemini 2.0 Flash | 7,000 | shared | 4,000 |
| Gemini 2.0 Flash-Lite Preview | 7,000 | shared | 4,000 |

-
-
-
- For Cody Enterprise, the token limits are the standard limits. Exact token limits may vary depending on your deployment. Please get in touch with your Sourcegraph representative.

For more information on how Cody builds context, see our [docs here](/cody/core-concepts/context).

## What is a Context Window?

diff --git a/docs/cody/faq.mdx b/docs/cody/faq.mdx
index 46b0e3020..820d0b5e9 100644
--- a/docs/cody/faq.mdx
+++ b/docs/cody/faq.mdx
@@ -6,9 +6,7 @@

### Does Cody train on my code?

-For Enterprise customers, Sourcegraph will not train on your company’s data. For Free and Pro tier users, Sourcegraph will not train on your data without your permission.
-
-Our third-party Language Model (LLM) providers do not train on your specific codebase. Cody operates by following a specific process to generate answers to your queries:
+For Enterprise customers, Sourcegraph will not train on your company’s data. Our third-party Language Model (LLM) providers do not train on your specific codebase. Cody operates by following a specific process to generate answers to your queries:

- **User query**: A user asks a question
- **Code retrieval**: Sourcegraph, our underlying code intelligence platform, performs a search and code intelligence operation to retrieve code snippets relevant to the user's question. During this process, strict permissions are enforced to ensure that only code that the user has read permission for is retrieved

@@ -24,10 +22,6 @@ Yes, Cody is compatible with self-hosted Sourcegraph instances. However, there a

- Cody operates by sending code snippets (up to 28 KB per request) to a third-party cloud service. By default, this service is Anthropic but can also be OpenAI
- To use Cody effectively, your self-hosted Sourcegraph instance must have internet access for these interactions with external services

-### Is Cody licensed for private code, and does it allow GPL-licensed code?
- -There are no checks or exclusions for Cody PLG (VS Code, JetBrains) for private and GPL-licensed code. We are subject to whatever the LLMs are trained on. However, Cody can be used with [StarCoder for autocomplete](/cody/clients/enable-cody-enterprise#use-starcoder-for-autocomplete) which is trained only on permissively licensed code. - ### Is there a public facing Cody API? Currently, there is no public-facing Cody API available. @@ -76,7 +70,7 @@ Cody Chat is optimized for coding related use cases and can be used primarily fo ### What happened to the Cody App? -We’ve deprecated the Cody App to streamline the experience for our Cody Free and Cody Pro users.The Cody App is no longer available for download. +We’ve deprecated the Cody App to streamline the experience. The Cody App is no longer available for download. ## Embeddings @@ -91,9 +85,9 @@ Cody leverages **Sourcegraph Search** as a primary context provider, which comes We leverage multiple retrieval mechanisms to give Cody the right context and will be constantly iterating to improve Cody's quality. The most important aspect is getting the files from the codebase, not the specific algorithm used to find those files. -### Why are embeddings no longer supported on Cody PLG and Enterprise? +### Why are embeddings no longer supported on Enterprise? -Cody does not support embeddings on Cody PLG and Cody Enterprise because we have replaced them with Sourcegraph Search. There are two driving factors: +Cody does not support embeddings on Cody Enterprise because we have replaced them with Sourcegraph Search. There are two driving factors: - The need for a retrieval system that can scale across repos and to repos of greater size - A system that is secure and requires low maintenance on the part of users @@ -126,7 +120,7 @@ Please refer to this [terms and conditions](https://about.sourcegraph.com/terms/ ### Can I use my own API keys? 
-Yes, [you can use your own API keys](https://sourcegraph.com/docs/cody/clients/install-vscode#experimental-models). However, this is an experimental feature. Bring-your-own-API-key is fully supported in the Enterprise plan. +Yes, BYOK (Bring Your Own Key) is fully supported in the Enterprise plan. ### Can I use Cody with my Cloud IDE? @@ -139,7 +133,7 @@ Yes, Cody supports the following cloud development environments: Yes you can. In the CLI you can use the following command to get started. Please replace `$name_of_the_model` with the LLM model of your choice. -``` +```shell cody chat --model '$name_of_the_model' -m 'Hi Cody!' ``` diff --git a/docs/cody/index.mdx b/docs/cody/index.mdx index e6b662293..2fd0cc118 100644 --- a/docs/cody/index.mdx +++ b/docs/cody/index.mdx @@ -3,7 +3,7 @@ Supported on [Sourcegraph Enterprise](https://about.sourcegraph.com/pricing). - Available on VS Code, JetBrains, Visual Studio, and the Web. + Available on VS Code, JetBrains, Visual Studio, and the Web app. @@ -21,12 +21,12 @@ Cody connects seamlessly with codehosts like [GitHub](https://github.com/login?c ## Getting started -You can start using Cody with one of the following options: +You can start using Cody with the following options: - - - + + + @@ -39,7 +39,7 @@ Cody's main features include: | **Feature** | **Description** | | -------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | | **[Chat](/cody/capabilities/chat)** | Chat directly with AI to ask questions about your code, generate code, and edit code. Cody has the context of your open file and repository by default, and you can use `@` to add context on specific files, symbols, remote repositories, or other non-code artifacts. 
| -| **[Autocomplete](/cody/capabilities/autocomplete)** | Cody predicts what you're trying to write before you type it. It makes single-line and multi-line suggestions as you type, using the context of the code around your cursor to make accurate suggestions. | +| **[Auto-edit](/cody/capabilities/auto-edit)** | Suggests code changes by analyzing cursor movements and typing. After you've made at least one character edit in your codebase, it begins proposing contextual modifications based on your cursor position and recent changes. | | **[Prompts](/cody/capabilities/commands)** | Automate key tasks in your workflow with premade and customizable prompts. Any common query or task can be built into a prompt to save and share with your team. | | **[Context](/cody/core-concepts/context)** | Cody provides the best LLM models and context to power chat. It uses the powerful Sourcegraph's advanced Search API to pull context from both local and remote codebases. | | **[Debug code](/cody/capabilities/debug-code)** | Cody is optimized to identify and fix errors in your code. Its debugging capability and autocomplete suggestions can significantly accelerate your debugging process, increasing developer productivity. | diff --git a/docs/cody/prompts-guide.mdx b/docs/cody/prompts-guide.mdx index 9a5501cab..05ac2ebac 100644 --- a/docs/cody/prompts-guide.mdx +++ b/docs/cody/prompts-guide.mdx @@ -96,9 +96,9 @@ Cody leverages the `@-mention` syntax to source context via files, symbols, web You can learn more about context [here](/cody/core-concepts/context). ### Indexing your repositories for context -@-mention local and current repositories are only available if you have your repository indexed. Enterprise and Enterprise Starter users can request their admins to add their local project for indexing to get access to @-mention context. +@-mention local and current repositories are only available if you have your repository indexed. 
Enterprise users can request their admins to add their local project for indexing to get access to @-mention context. -Repository indexing is only available to supported [Code Hosts](https://sourcegraph.com/docs/admin/code_hosts), please reach out to your admins if you require assistance with indexing. +Repository indexing is only available to supported [Code Hosts](https://sourcegraph.com/docs/admin/code_hosts). Please reach out to your admins if you require assistance with indexing. ## Selecting the right LLM @@ -139,7 +139,7 @@ Create a function that takes the total amount and loyalty points as input and re While preparing your codebase for Cody, you learned about the importance of context chips. In addition to this default context, you can provide additional and more specific context to help Cody better understand your codebase. -You can continue to `@-mention` files, symbols, and other context sources (as supported by your Cody tier) to make your search more specific and granular. You should approach this as if explaining the situation to a new team member. +You can continue to `@-mention` files, symbols, and other context sources to make your search more specific and granular. You should approach this as if explaining the situation to a new team member.
You should: - Reference important files and symbols - Provide examples from other similar functions diff --git a/docs/cody/quickstart.mdx b/docs/cody/quickstart.mdx index f8d3e2b13..4894c4b0a 100644 --- a/docs/cody/quickstart.mdx +++ b/docs/cody/quickstart.mdx @@ -14,14 +14,14 @@ Before you start, you'll need the following: - [Cody extension installed](/cody/clients/install-vscode) in your VS Code editor -- Free or Pro account via Sourcegraph.com or a Sourcegraph Enterprise account +- A Sourcegraph Enterprise account with Cody enabled - A project open in VS Code ## Getting started with Cody -After installing the extension and connecting to a Sourcegraph instance, you can leverage Cody to use the best LLM and context to understand, write, and fix code. Click the **Cody** icon from the VS Code side activity bar to open the Cody chat panel. +After installing the extension and connecting to your Sourcegraph Enterprise instance, you can leverage Cody to use the best LLM and context to understand, write, and fix code. Click the **Cody** icon from the VS Code side activity bar to open the Cody chat panel. -By default, the chat input will have the context of your entire codebase, and Claude 3.5 Sonnet (New) is selected as the default chat model. Based on your [Cody tier](https://sourcegraph.com/pricing?product=cody), you can change the LLM model and context based on your use case to optimize speed, accuracy, or cost. +By default, the chat input will have the context of your entire codebase, and Claude 3.5 Sonnet is selected as the default chat model. You can change the LLM model and context based on your use case to optimize speed, accuracy, or cost. To help you automate your key tasks in your development workflow, you get **[Prompts](/cody/capabilities/commands)**. If you are a part of an organization on Sourcegraph.com or a self-hosted Sourcegraph instance, you can view these pre-built Prompts created by your teammates. 
Alternatively, you can create your own Prompts via the **Prompt Library** from your Sourcegraph instance. @@ -57,9 +57,9 @@ Cody automatically predicts the function body in gray-dimmed text. Press `Tab` t ## Use Cody to refactor code -You can refactor your code with inline edits. All you need to do is highlight the code, hit the edit hotkey (`Opt + K`), and describe a change. Cody will generate a diff for the change in seconds. +You can refactor your code with inline edits. All you need to do is highlight the code, hit the edit hotkey (`Opt+K`), and describe a change. Cody will generate a diff for the change in seconds. -Let's use the same `bubbleSort()` function from the previous section. Now, refactor the function to sort dates in descending order. Highlight the function and press `Opt + K`. +Let's use the same `bubbleSort()` function from the previous section. Now, refactor the function to sort dates in descending order. Highlight the function and press `Opt+K`. ![cody-refactor](https://storage.googleapis.com/sourcegraph-assets/Docs/cody-refactor-code-1124.png) diff --git a/docs/cody/troubleshooting.mdx b/docs/cody/troubleshooting.mdx index c8694e37c..457848ddb 100644 --- a/docs/cody/troubleshooting.mdx +++ b/docs/cody/troubleshooting.mdx @@ -2,7 +2,7 @@

Learn about common reasons for errors that you might run into when using Cody and how to troubleshoot them.

-If you encounter errors or bugs while using Cody, try applying these troubleshooting methods to understand and configure the issue better. If the problem persists, you can report Cody bugs using the [issue tracker](https://github.com/sourcegraph/cody/issues), by using the [Support Forum](https://community.sourcegraph.com/), or by asking in the [Discord](https://discord.gg/s2qDtYGnAE) server. +If you encounter errors or bugs while using Cody, try these troubleshooting methods to better understand and resolve the issue. If the problem persists, you can report it via the [Support Forum](https://community.sourcegraph.com/) or by asking in the [Discord](https://discord.gg/s2qDtYGnAE) server. ## VS Code extension @@ -82,13 +82,7 @@ To troubleshoot further: ### Rate limits -Cody Free provides **unlimited autocomplete suggestions** and **200 chat invocations** per user per month. - -On Cody Pro and Enterprise plans, usage limits are increased, and controlled by **Fair Usage**. This means that some users occasionally experience a limitation placed on their account. This limitation resets within 24 hours. If this issue persists, contact us through our [community forum](https://community.sourcegraph.com), Discord, or email support@sourcegraph.com. - -#### 429 errors - -A 429 status code means you are on a free account and hit your usage limit/quota for the day. It can also mean you were sending too many requests in a short period of time. If you have Cody Pro and you are seeing 429 errors, you can contact us at [support@sourcegraph.com](mailto:support@sourcegraph.com) to resolve this. +On Enterprise plans, usage limits are controlled by **Fair Usage**. This means that some users occasionally experience a limitation placed on their account. This limitation resets within 24 hours. If this issue persists, contact us through our [community forum](https://community.sourcegraph.com), Discord, or email support@sourcegraph.com.
### Error logging in VS Code on Linux and Windows @@ -110,7 +104,7 @@ On Windows, If you encounter this error: -``` +```text Request Failed: Request to https://sourcegraph.com/.api/completions/stream?api-version=1&client-name=vscode&client-version=1.34.3 failed with 403 Forbidden ``` @@ -122,7 +116,7 @@ Consider disabling anonymizers, VPNs, or open proxies. If using a VPN is essenti The `contextFilters` setting in Cody is used to control which files are included or excluded when Cody searches for relevant context while answering questions or providing code assistance. Sometimes, you can see the following error: -``` +```text Edit failed to run: file is ignored (due to cody.contextFilters Enterprise configuration setting) ``` @@ -132,13 +126,6 @@ If the error occurs with a file that's not been excluded, the workaround is to u This should clear the error. -### VS Code Pro License Issues - -If VS Code prompts you to upgrade to Pro despite already having a Pro license, this usually happens because you're logged into a free Cody/Sourcegraph account rather than your Pro account. To fix this: - -- Check which account you're currently logged into -- If needed, log out and sign in with your PRO account credentials - ### Error exceeding `localStorage` quota When using Cody chat, you may come across this error: @@ -162,6 +149,7 @@ You can get performance traces from the Cody VS Code extension in production wit ``` Note that you may need to quit VSCode first, then run that command to re-launch it. It will open all of your windows and tabs again. + - After VS Code is started, head over to Chrome and go to `chrome://inspect`, which takes you to the following: ![](https://gist.github.com/assets/458591/0a17881b-5449-48d5-a53e-5556f4f2dedd) @@ -301,30 +289,6 @@ $filteredResults = preg_grep('*\.' . basename($inputPath) .
'\.*', $fileList); If you would like to add a forked repository as Cody context, you may need to add `"search.includeForks": true` to the [global settings](/admin/config/settings#editing-global-settings-for-site-admins) for your instance. -{/* ## Eclipse extension - -### See a white screen the first time you open Cody chat - -This can happen if Eclipse prompts you to set up a password for secure storage and Cody timeouts while waiting. Simply close and re-open the Cody chat. - -### "No password provided" in the error log - -If you see this error in the error log, it happens because the default OS password integration has been corrupted. Go to **Preferences > General > Security > Secure Storage** and ensure your OS integration is checked. - -Then click **Clear Passwords** at the top, and then click **Change Password**. If you see a dialog saying **An error occurred while decrypting stored values... Do you want to cancel password change?** Click **No**. - -This will reset the secure storage master password for OS integration. You will be asked if you want to provide additional information for password recovery, which is optional. Click **Apply and Close** and then restart Eclipse. - -### General Tips - -You can open the Cody Log view using the same steps as above, but instead, select **Cody Log**. - -![cody-logs](https://storage.googleapis.com/sourcegraph-assets/Docs/eclipse-cody-log-1124.png) - -This will include more information about what Cody is doing, including any errors. There is a copy button at the top right of the log view that you can use to copy the log to your clipboard and send it to us. Be careful not to include any sensitive information, as the log communication is verbose and may contain tokens. - -Additionally, Eclipse's built-in Error Log can be used to view any uncaught exceptions and their stack traces. You can open the Error Log using the **Window > Show View > Error Log** menu. 
*/} - ## OpenAI o1 ### Context Deadline Exceeded Error @@ -358,7 +322,7 @@ Symptoms: Solutions: - Break down complex requests into smaller steps -- Consider using Sonnet 3.5 for tasks requiring longer outputs +- Consider using Sonnet 4 for tasks requiring longer outputs Limits: @@ -377,7 +341,6 @@ Solutions: - Restart IDE/VS Code - Sign out and sign back in -- Check Pro subscription status - Contact support if issues persist ### Response Format Errors diff --git a/docs/cody/usage-and-pricing.mdx b/docs/cody/usage-and-pricing.mdx deleted file mode 100644 index 8a0e05ce9..000000000 --- a/docs/cody/usage-and-pricing.mdx +++ /dev/null @@ -1,156 +0,0 @@ -# Cody Pricing Plans - -

Learn about the different plans available for Cody.

- -Cody provides three subscription plans: **Free**, **Pro**, and **Enterprise**. Each plan is aimed to cater to a diverse range of users, from individual projects to large-scale enterprises. Cody Free includes basic features, while the Pro and Enterprise plans offer additional advanced features and resources to meet varying user requirements. - - - - -The free plan is designed for individuals to get started with Cody. It comes with a set of features to enhance your coding experience. It includes **unlimited autocompletion suggestions** per month, covering both whole and multi-line suggestions. You also get **200 chats/prompts** per month with access to creating custom prompts. - -The free plan provides access to local context to enhance Cody's understanding of your code and ability to provide accurate suggestions. Local context uses keyword search to search the local repository (the one currently open in your IDE). - -Cody Free uses the DeepSeek-Coder-V2 model for autocomplete. It uses the Claude 3.5 Sonnet (New) model for chat and prompts, and users have several other model options they can switch to. See [Supported LLMs](https://sourcegraph.com/docs/cody/capabilities/supported-models) for more information. - -### Billing cycle - -There is no billing cycle for Cody Free, as it is free to use. If you complete your monthly chat/prompts limit, you'll be prompted to upgrade to the Pro plan. Otherwise, you'll have to wait approximately 30 days for the limits to reset. - -The reset date is based on your sign-up date. For example, if you sign up on December 15th, your limit will reset on January 15th. 
- -### Upgrading from Free to Pro - -![upgrade-cody-pro](https://storage.googleapis.com/sourcegraph-assets/Docs/upgrade-cody-pro.png) - -To upgrade from Cody Free to Pro: - -- Log in to your [Cody dashboard](https://sourcegraph.com/cody/manage) -- Click the **Upgrade** button in the top right corner -- It takes you to the [Subscription plans](https://sourcegraph.com/cody/subscription) page -- Under the Pro tier, click **Get Pro** -- Enter your card details for billing -- Click confirm and upgrade to Cody Pro for **$9 per month** - -Once your Pro subscription is confirmed, click **My subscription** to manage and view your activation details, or click **Dashboard** for the overall view. - - - - - -Cody Pro, designed for individuals or small teams at **$9 per user per month**, offers an enhanced coding experience beyond the free plan. It provides unlimited autocompletion suggestions plus increased limits for chat and prompts. It also uses local repository context to enhance Cody's understanding and accuracy. - -Cody Pro uses DeepSeek-Coder-V2 by default for autocomplete. Pro accounts also default to the Claude 3.5 Sonnet (New) model for chat and prompts, but users can switch to other LLM model choices. You can refer to the [supported LLM models](/cody/capabilities/supported-models) docs for more information. - -Support for Cody Pro is available through our Support team via support@sourcegraph.com, ensuring prompt assistance and guidance. 
- -### Downgrading from Pro to Free - -To revert back to Cody Free from Pro: - -- Go to your Sourcegraph dashboard **Cody > Dashboard** -- Next, **Manage subscription** that takes you to **Cody > Subscription** -- Clicks **Cancel** on the Pro tier to cancel your Pro subscription -- This automatically downgrades you to Cody Free once after your billing cycle ends - -### Upgrading from Pro to Enterprise - -To upgrade from Cody Pro to Cody Enterprise, you should [Contact Sales](https://sourcegraph.com/contact/request-info) and connect with one of our account teams. They will help you set up your account and start with Cody Enterprise. - -## Billing FAQs for Cody Pro - -### What is Cody's pricing plan? - -You can get this info from our [pricing page](https://sourcegraph.com/pricing?product=cody). - -### When are payments taken? - -We charge payments at the beginning of each billing cycle, and they get automatically renewed once you've added your credit card details. - -### What is a billing cycle? - -The billing cycle refers to the start date you start your Cody Pro plan. Your Cody Pro plan will automatically renew at each billing date. You can view your current and previous billing cycles from the **My subscription** tab. - -### What payment methods are available? - -We currently only support payments through a credit card. - -### What happens when I cannot pay or don't want to renew the Pro plan? - -If you do not want to renew Cody Pro, you can cancel your Cody Pro subscription at any time by going to the [accounts page](https://accounts.sourcegraph.com/cody/subscription). If you cancel the Pro plan, your Cody Pro will continue until the end of the billing cycle. Once the billing cycle ends, your plan will not renew, and your card will not be charged. This automatically downgrades your plan from Cody Pro to Cody Free. - -If you change your mind after canceling your plan please contact our Support team at support@sourcegraph.com and we can help you get re-subscribed. 
- -### Are there any refunds for the Pro subscription? - -We don't offer refunds, but if you have any queries regarding the Cody Pro plan, please write to support@sourcegraph.com, and we'll help resolve the issue. - -### How do I access previous invoices? - -You can access your invoices via the [Cody Dashboard](https://sourcegraph.com/cody/manage) and clicking "Manage Subscription". Note that invoices are not emailed every month. - -## Enterprise Starter - -Cody Pro users can also switch to the Enterprise Starter plan for **$19 per user per month**. This plan includes all the features of Cody Pro plus a multi-tenant Sourcegraph instance with core features like a fully managed version of Sourcegraph (AI + code search + intent detection with integrated search results, with privately indexed code) through a self-serve flow. - -Read more about the [Enterprise Starter plan](/pricing/enterprise-starter). - - - - - -Cody Enterprise is designed for enterprises prioritizing security and administrative controls. We offer either seat-based or token based pricing models, depending on what makes most sense for your organization. You get Claude Haiku 3.5 and Claude Sonnet 3.5 as the default LLM models without extra cost. You also get additional capabilities like BYOLLM (Bring Your Own LLM), supporting Single-Tenant and Self Hosted setups for flexible coding environments. - -Security features include SAML/SSO for enhanced authentication and guardrails to enforce coding standards. Cody Enterprise supports advanced Code Graph context and multi-code host context for a deeper understanding of codebases, especially in complex projects. With 24/5 enhanced support, Cody Enterprise ensures timely assistance. - -## Billing FAQs for Cody Enterprise - -### How are active users calculated for Cody? - -### Billable active users - -This only applies to Enterprise Cody users. Cody Pro users pay for a seat every month, regardless of usage. 
- -A billable active Cody user is signed in to their Enterprise Sourcegraph account and actively interacts with Cody (for example, they see suggested autocompletions, run commands, or chat with Cody, they start new discussions, clear chat history, or copy text from chats. They change Cody's settings and more). - -Having Cody installed and signing in is not enough to be considered a billable user. - -Specific categories of user actions that qualify as active Cody usage for billing include, but are not limited to: - -- Signing in to Cody in an editor -- Seeing suggested autocompletions and/or accepting/rejecting them -- Running commands or chatting with Cody -- Starting new chats, clearing chat history, copying text from chats or otherwise interacting with the Cody chat interface in the editor or on the web -- Adding or changing context in a Cody chat interface or changing models -- Interacting with Cody tutorials -- Changing Cody personal or team settings - -### Authenticated users - -An authenticated Cody user is authenticated (or signed in) to Cody in their editor within a given time period. This includes all users who interact with Cody on the web. - -Authenticated users are a superset of [billable users](#billable-active-users) defined above. They include users who are signed in but do not actively engage with Cody. - - - - - -## Plans Comparison - -The following table shows a high-level comparison of the three plans available on Cody. 
- -| **Features** | **Free** | **Pro** | **Enterprise Starter** | **Enterprise** | -| --------------------------------- | ---------------------------------------------------------- | ----------------------------------------------------------------- | -------------------------------------------------- | -------------------------------------------------- | -| **Autocompletion suggestions** | Unlimited | Unlimited | Unlimited | Unlimited | -| **Chat Executions** | 200 per month | Increased limits | Increased limits | Unlimited | -| **Keyword Context (local code)** | Supported | Supported | Supported | Supported | -| **Developer Limitations** | 1 developer | 1 developer | Scalable, per-seat pricing | Scalable, consumption-based pricing | -| **LLM Support** | [View latest](/cody/capabilities/supported-models) | [View latest](/cody/capabilities/supported-models) | [View latest](/cody/capabilities/supported-models) | [View latest](/cody/capabilities/supported-models) | -| **Code Editor Support** | VS Code, JetBrains IDEs, Visual Studio (Preview) | VS Code, JetBrains IDEs, Visual Studio (Preview) | VS Code, JetBrains IDEs, Visual Studio (Preview) | VS Code, JetBrains IDEs, Visual Studio (Preview) | -| **Single-Tenant and Self Hosted** | N/A | N/A | N/A | Yes | -| **SAML/SSO** | N/A | N/A | N/A | Yes | -| **Guardrails** | N/A | N/A | N/A | Yes | -| **Advanced Code Graph Context** | N/A | N/A | N/A | Included | -| **Multi-Code Host Context** | N/A | N/A | GitHub only | Included | -| **Discord Support** | Yes | Yes | Yes | Yes | -| **24/5 Enhanced Support** | N/A | N/A | Yes | Yes | diff --git a/docs/index.mdx b/docs/index.mdx index eefd0d199..1fd049a91 100644 --- a/docs/index.mdx +++ b/docs/index.mdx @@ -1,20 +1,13 @@ Sourcegraph is a Code Intelligence platform that deeply understands your code, no matter how large or where it’s hosted, to power modern developer experiences. 
-- **Cody**: Use Cody our AI code assistant to read, write, and understand your entire codebase faster -- **Code search:** Search through all of your repositories across all branches and all code hosts -- **Code intelligence:** Navigate code, find references, see code owners, trace history, and more -- **Fix & refactor:** Roll out and track large-scale changes and migrations across repos at once +- **Code Search:** Search through all of your repositories across all branches and all code hosts +- **Code Intelligence:** Navigate code, find references, see code owners, trace history, and more +- **Fix and Refactor:** Roll out and track large-scale changes and migrations across repos at once +- **AI Assistant:** Use Cody, our AI code assistant, to read, write, and understand your entire codebase faster ## Quickstart - + This only applies to Cody Enterprise contracts. Cody Pro and Enterprise Starter users pay for a seat every month, regardless of usage. +This only applies to Cody Enterprise contracts. A billable user is one who is signed in to their Enterprise account and actively interacts with the product (e.g., they see suggested autocompletions, run commands or chat with Cody, start new discussions, clear chat history, or copy text from chats, change settings, and more). Simply having Cody installed is not enough to be considered a billable user. @@ -89,7 +67,7 @@ The Enterprise Starter plan is currently compatible with GitHub. Its limit for i
Adding additional seats gives you 1GB of additional storage per seat, for a maximum total of 10GB. ## Billing FAQs for Enterprise Starter diff --git a/docs/pricing/plan-comparison.mdx b/docs/pricing/plan-comparison.mdx index a074b65bc..1c98b6bdf 100644 --- a/docs/pricing/plan-comparison.mdx +++ b/docs/pricing/plan-comparison.mdx @@ -2,41 +2,39 @@

This page lists a detailed comparison of the features available in each plan.

-| **Features** | **Free** | **Enterprise Starter** | **Enterprise** | -| -------------------------------- | ----------------------------------------------------- | ----------------------------------------------------- | ----------------------------------------------------- | -| **AI** | | | | -| Autocomplete | Unlimited | Unlimited | Unlimited | -| Chat messages and prompts | 200/month | Increased limits | Unlimited | -| Code context and personalization | Local codebase | Remote codebase (GitHub only) | Remote, enterprise-scale codebases | -| Integrated search results | - | ✓ | ✓ | -| Prompt Library | ✓ | ✓ | ✓ | -| Bring your own LLM Key | - | - | Self-Hosted only | -| Auto-edit | - | Beta | Beta | -| Aentic chat experience | - | Experimental | Experimental | -| **Code Search** | | | | -| Code Search | - | ✓ | ✓ | -| Code Navigation | - | ✓ | ✓ | -| Code Insights | - | - | ✓ | -| Code Monitoring | - | - | ✓ | -| Batch Changes | - | - | ✓ | -| **Deployment** | | | | -| Cloud deployment | Multi-tenant | Multi-tenant | Single tenant | -| Self hosted option | - | - | ✓ | -| Private workspace | - | ✓ | ✓ | -| **Admin and Security** | | | | -| SSO/SAML | Basic (GH/GL/Google) | Basic (GH/GL/Google) | ✓ | -| Role-based access control | - | - | ✓ | -| Analytics | - | Basic | ✓ | -| Audit logs | - | - | ✓ | -| Guardrails | - | - | Beta | -| Indexed code | - | Private | Private | -| Context Filters | - | - | ✓ | -| **Compatibility** | | | | -| Code hosts | Local codebase | GitHub | All major codehosts | -| IDEs | VS Code, JetBrains IDEs, Visual Studio (Experimental) | VS Code, JetBrains IDEs, Visual Studio (Experimental) | VS Code, JetBrains IDEs, Visual Studio (Experimental) | -| Human languages | Many human languages, dependent on the LLM used | Many human languages, dependent on the LLM used | Many human languages, dependent on the LLM used | -| Programming languages | All popular programming languages | All popular programming languages | All popular programming languages | -| **Support** | | | | -| Support level | Community support | Community support | Enterprise support | -| Dedicated TA support | - | - | Add-on | -| Premium support | - | - | Add-on | +| **Features** | **Free** | **Enterprise Starter** | **Enterprise** | +| -------------------------------- | --------------------------------- | --------------------------------- | ----------------------------------------------------- | +| **AI** | | | | +| Autocomplete | N/A | N/A | Unlimited | +| Chat messages and prompts | N/A | N/A | Unlimited | +| Code context and personalization | N/A | N/A | Remote, enterprise-scale codebases | +| Prompt Library | N/A | N/A | ✓ | +| Bring your own LLM Key | N/A | N/A | Self-Hosted only | +| Auto-edit | N/A | N/A | ✓ | +| Agentic chat experience | N/A | N/A | ✓ | +| **Code Search** | | | | +| Code Search | Public Code | ✓ | ✓ | +| Code Navigation | - | ✓ | ✓ | +| Code Insights | - | - | ✓ | +| Code Monitoring | - | - | ✓ | +| Batch Changes | - | - | ✓ | +| **Deployment** | | | | +| Cloud deployment | Multi-tenant | Multi-tenant | Single tenant | +| Self hosted option | - | - | ✓ | +| Private workspace | - | ✓ | ✓ | +| **Admin and Security** | | | | +| SSO/SAML | Basic (GH/GL/Google) | Basic (GH/GL/Google) | ✓ | +| Role-based access control | - | - | ✓ | +| Analytics | - | Basic | ✓ | +| Audit logs | - | - | ✓ | +| Guardrails | - | - | Beta | +| Indexed code | - | Private | Private | +| Context Filters | - | - | ✓ | +| **Compatibility** | | | | +| Code hosts | Local codebase | GitHub | All major codehosts | +| IDEs | N/A | N/A | VS Code, JetBrains IDEs, Visual Studio (Experimental) | +| Programming languages | All popular programming languages | All popular programming languages | All popular programming
languages | -| **Support** | | | | -| Support level | Community support | Community support | Enterprise support | -| Dedicated TA support | - | - | Add-on | -| Premium support | - | - | Add-on | +| **Features** | **Free** | **Enterprise Starter** | **Enterprise** | +| -------------------------------- | --------------------------------- | --------------------------------- | ----------------------------------------------------- | +| **AI** | | | | +| Autocomplete | N/A | N/A | Unlimited | +| Chat messages and prompts | N/A | N/A | Unlimited | +| Code context and personalization | N/A | N/A | Remote, enterprise-scale codebases | +| Prompt Library | N/A | N/A | ✓ | +| Bring your own LLM Key | N/A | N/A | Self-Hosted only | +| Auto-edit | N/A | N/A | ✓ | +| Aentic chat experience | N/A | N/A | ✓ | +| **Code Search** | | | | +| Code Search | Public Code | ✓ | ✓ | +| Code Navigation | - | ✓ | ✓ | +| Code Insights | - | - | ✓ | +| Code Monitoring | - | - | ✓ | +| Batch Changes | - | - | ✓ | +| **Deployment** | | | | +| Cloud deployment | Multi-tenant | Multi-tenant | Single tenant | +| Self hosted option | - | - | ✓ | +| Private workspace | - | ✓ | ✓ | +| **Admin and Security** | | | | +| SSO/SAML | Basic (GH/GL/Google) | Basic (GH/GL/Google) | ✓ | +| Role-based access control | - | - | ✓ | +| Analytics | - | Basic | ✓ | +| Audit logs | - | - | ✓ | +| Guardrails | - | - | Beta | +| Indexed code | - | Private | Private | +| Context Filters | - | - | ✓ | +| **Compatibility** | | | | +| Code hosts | Local codebase | GitHub | All major codehosts | +| IDEs | N/A | N/A | VS Code, JetBrains IDEs, Visual Studio (Experimental) | +| Programming languages | All popular programming languages | All popular programming languages | All popular programming languages | +| **Support** | | | | +| Support level | Community support | Community support | Enterprise support | +| Dedicated TA support | - | - | Add-on | +| Premium support | - | - | Add-on | diff --git 
a/docs/pricing/plans/enterprise-starter.mdx b/docs/pricing/plans/enterprise-starter.mdx index e7918823d..50caf2d74 100644 --- a/docs/pricing/plans/enterprise-starter.mdx +++ b/docs/pricing/plans/enterprise-starter.mdx @@ -2,11 +2,11 @@

Learn about the Enterprise Starter plan tailored for individuals and teams wanting private code indexing and search to leverage the Sourcegraph platform better.

-The Enterprise Starter plan offers a multi-tenant Sourcegraph instance designed for individuals and teams. It provides the core features of a traditional Sourcegraph instance but with a simplified management experience. This plan provides a fully managed version of Sourcegraph (AI + code search with integrated search results, with privately indexed code) through a self-serve flow.
+The Enterprise Starter plan offers a multi-tenant Sourcegraph instance designed for individuals and teams. It provides the core features of a traditional Sourcegraph instance but with a simplified management experience. This plan provides a fully managed version of Sourcegraph through a self-serve flow.

 ## Team seats

-The Enterprise Starter plan is priced at **$19 per month per seat**. You can add or remove team members at any time. Existing Cody Pro users can also sign up for the Enterprise Starter by paying $19 per seat. However, their Cody Pro subscription will neither be upgraded nor canceled. Instead, they will have two live subscriptions.
+The Enterprise Starter plan is priced at **$19 per month per seat**. You can add or remove team members at any time.

 ## Enterprise Starter team roles

@@ -25,14 +25,13 @@ Please also see [FAQs](faqs.mdx) for more FAQs, including how to downgrade Enter

 ## Features supported

-The Enterprise Starter plan supports a variety of AI and search-based features like:
+The Enterprise Starter plan supports a variety of search-based features, including:

-| **AI features** | **Code Search** | **Management** | **Support** |
-| -------------------------------------- | ------------------------------ | --------------------------------------------------------- | ------------------------- |
-| Code autocompletions and chat messages | Indexed Code Search | Simplified admin experience with UI-based repo-management | Support with limited SLAs |
-| Powerful LLM models for chat | Indexed Symbol Search | User management | - |
-| Integrated search results | Searched based code-navigation | GitHub code host integration | - |
-| Cody integration | - | - | - |
+| **Code Search** | **Management** | **Support** |
+| ------------------------------ | --------------------------------------------------------- | ------------------------- |
+| Indexed Code Search | Simplified admin experience with UI-based repo management | Support with limited SLAs |
+| Indexed Symbol Search | User management | - |
+| Search-based code navigation | GitHub code host integration | - |

 ## Limits

diff --git a/docs/pricing/plans/enterprise.mdx b/docs/pricing/plans/enterprise.mdx
index 9c6c5cf88..05e3f1d77 100644
--- a/docs/pricing/plans/enterprise.mdx
+++ b/docs/pricing/plans/enterprise.mdx
@@ -2,7 +2,7 @@

Learn about the Sourcegraph's Enterprise plan and the features included.

-Sourcegraph offers multiple Enterprise plan options, including Enterprise Dedicated Cloud (default) and Enterprise Self Hosted (on-request) for organizations and enterprises that need AI and search with enterprise-level security, scalability, and flexibility.
+Sourcegraph offers multiple Enterprise plan options, including Enterprise Dedicated Cloud (default) and Enterprise Self Hosted (on-request) for organizations and enterprises that need search with enterprise-level security, scalability, and flexibility.

 ## Features breakdown

@@ -10,9 +10,9 @@ Here's a detailed breakdown of features included in the different Enterprise pla

 | **Feature** | **Enterprise Dedicated Cloud** | **Enterprise Self Hosted** |
 | ------------------------ | ------------------------------ | -------------------------- |
-| **Description** | - AI and search with enterprise-level security, scalability, and flexibility | - AI and search with enterprise-level security, scalability, and flexibility |
+| **Description** | - Search with enterprise-level security, scalability, and flexibility | - Search with enterprise-level security, scalability, and flexibility |
 | **Price** | - $59/user/month<br/>- 25+ devs | - $59/user/month<br/>- Contact Sales |
-| **AI features** | - Everything in Enterprise Starter | - Everything in Enterprise Starter<br/>- Bring your own LLM key |
+| **AI features** | - Cody AI Assistant | - Cody AI Assistant<br/>- Bring your own LLM key |
 | **Code Search features** | - Everything in Enterprise Starter, plus:<br/>- Batch Changes<br/>- Code Insights<br/>- Code Navigation | - Everything in Enterprise Starter, plus:<br/>- Batch Changes<br/>- Code Insights<br/>- Code Navigation |
 | **Deployment types** | - Single-tenant Coud | - Self- Hosted |
 | **Compatibility** | - Everything in Enterprise Starter, plus:<br/>- Enterprise admin and security features<br/>- All major code hosts<br/>- Guardrails<br/>- Context Filters | - Everything in Enterprise Starter, plus:<br/>- Enterprise admin and security features<br/>- All major code hosts<br/>- Guardrails<br/>- Context Filters |

diff --git a/docs/pricing/plans/free.mdx b/docs/pricing/plans/free.mdx
index 35186d398..1f7eddd4d 100644
--- a/docs/pricing/plans/free.mdx
+++ b/docs/pricing/plans/free.mdx
@@ -2,21 +2,11 @@

Learn about the Sourcegraph's Free plan and the features included.

-Sourcegraph's Free plan is designed for hobbyists, and light usage is aimed at users with personal projects and small-scale applications. It offers an AI editor assistant with a generous set of features for individual users, like autocompletion and multiple LLM choices for chat.
-
-## Features
-
-The Free plan includes the following features:
-
-| **AI features** | **Compatibility** | **Deployment** | **Admin/Security** | **Support** |
-| ----------------------------------------------------------------------------- | --------------------------------------------------- | ------------------ | ------------------------------------------ | ---------------------- |
-| Reasonable use autocomplete limits | VS Code, JetBrains IDEs, and Visual Studio | Multi-tenant Cloud | SSO/SAML with basic GitHub, GitLab, Google | Community support only |
-| Reasonable use chat messages and prompts per month | All popular coding languages | - | - | - |
-| Multiple LLM selection (Claude 3.5 Sonnet, Gemini 1.5 Pro and Flash) | Natural language search | - | - | - |
+Sourcegraph's Free plan is designed for hobbyists and light usage, aimed at users with personal projects and small-scale applications.

 ## Pricing and billing cycle

-There is no billing cycle, as it's free to use and supports one user per account. If you exceed your daily limits, you will have to wait until the end of the month to use the feature again. You can upgrade to our Enterprise Starter plan for more advanced features and usage limits.
+There is no billing cycle, as it's free to use and supports one user per account. You can upgrade to our Enterprise Starter plan for more advanced features.

 ## Free vs. Enterprise Starter comparison

@@ -24,12 +14,11 @@ The Enterprise Starter plan provides extended usage limits and advanced features

 | **Feature** | **Free** | **Enterprise Starter** |
 | ------------------------ | -------------------------------------------------- | ------------------------------------------------------------- |
-| **Description** | - AI editor assistant for hobbyists or light usage | - AI and search for growing organizations hosted on our cloud |
+| **Description** | - Public Code Search for hobbyists or light usage | - Code Search for growing organizations hosted on our cloud |
 | **Price** | - $0/month<br/>- 1 user | - $19/user/month<br/>- Up to 50 devs |
-| **AI features** | - Autocompletions<br/>- 200 chat messages and prompts per month<br/>- Multiple LLM choices for chat | - Code autocomplete and chat<br/>- More powerful LLMs for chat (GPT-4o, Claude 3 Opus)<br/>- Integrated search results |
 | **Code Search features** | N/A | - Code Search<br/>- Symbol Search |
 | **Deployment types** | - Multi-tenant Coud | - Multi-tenant Cloud<br/>- Private Workspace<br/>- Privately indexed code (100 repos) |
-| **Compatibility** | - VS Code, JetBrains IDEs, and Visual Studio<br/>- All popular coding languages<br/>Natural language search<br/>- All major code hosts | - VS Code, JetBrains IDEs, and Visual Studio<br/>- All popular coding languages<br/>Natural language search<br/>- Code hosted on GitHub |
+| **Compatibility** | - All popular coding languages<br/>- Natural language search<br/>- All major code hosts | - All popular coding languages<br/>- Natural language search<br/>- Code hosted on GitHub |
 | **Support** | - Community support only | - 9x5 Support |

 ## Moving to Enterprise Starter plan

diff --git a/src/components/Layout.tsx b/src/components/Layout.tsx
index ee67d704b..b7df58d46 100644
--- a/src/components/Layout.tsx
+++ b/src/components/Layout.tsx
@@ -11,7 +11,6 @@ import { usePathname } from 'next/navigation';
 import { useEffect, useState } from 'react';
 import { LogoMark } from './LogoMark';
 import { Search } from './search/Search';
-import { TopBanner } from './TopBanner';
 import VersionSelector from './VersionSelector';

 function GitHubIcon(props: React.ComponentPropsWithoutRef<'svg'>) {
@@ -46,24 +45,24 @@ function Header() {
             className="sticky top-0 z-50"
         >
             {/* Cody docs banner */}
-            {isCodyDocs && !isopenCtxDocs && }
+            />*/}
             {/* Pricing docs banner */}
-            {isPricingDocs && !isopenCtxDocs && }
+            />*/}
             {/* Openctx docs banner */}
             {/* {isopenCtxDocs && {