
Commit 9367e47

Add Enhanced Context Windows documentation for Cody Token Limits docs page (#1333)
Documenting the feature flag is important for customers to understand that they can take advantage of enhanced context limits. I added this section to the Cody Input and Output Limits page to reflect how the feature flag enables larger context windows for the various models supported by Sourcegraph.

- Enhanced context windows require the `enhanced-context-window` feature flag to be enabled
- Available since Sourcegraph 6.5 for Enterprise customers
- Input: Claude/Gemini up to 150k, OpenAI GPT up to 102k, OpenAI o-series up to 93k
- Output: Claude 64k, Gemini 65k, OpenAI GPT 16k, OpenAI o-series 100k, reasoning models 100k

Testing: Ran docs locally, confirmed everything looks good.
1 parent 3bcdd3c commit 9367e47

File tree

1 file changed (+21, −0 lines)


docs/cody/core-concepts/token-limits.mdx

Lines changed: 21 additions & 0 deletions
@@ -31,6 +31,27 @@ Here's a detailed breakdown of the token limits by model:
<Callout type="info">For Cody Enterprise, the token limits are the standard limits. Exact token limits may vary depending on your deployment. Please get in touch with your Sourcegraph representative. For more information on how Cody builds context, see our [docs here](/cody/core-concepts/context).</Callout>
## Enhanced Context Windows (Feature Flag)

Starting with Sourcegraph 6.5 for Enterprise, the `enhanced-context-window` feature flag significantly expands Cody's context capabilities. It addresses developers' need to work with more context by expanding both the input and output context windows.
When the `enhanced-context-window` feature flag is enabled, Cody Enterprise customers get access to:
**Input context window (via @mention and user input):**

- Anthropic Claude: up to **150k tokens**
- Google Gemini: up to **150k tokens**
- OpenAI GPT-series: up to **102k tokens**
- OpenAI o-series: up to **93k tokens**
**Output context window:**

- Anthropic Claude: up to **64k tokens**
- Google Gemini: up to **65k tokens**
- OpenAI GPT-series: **16k tokens**
- OpenAI o-series: **100k tokens**
- Reasoning models: up to **100k tokens**
<Callout type="note">Enhanced context windows require the `enhanced-context-window` feature flag to be set to `true` on your Sourcegraph instance. Contact Sourcegraph support if you need help enabling this feature.</Callout>
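For illustration only, the per-family limits listed above could be captured in a small lookup table. The family keys and the helper function below are hypothetical names invented for this sketch; they are not part of any Sourcegraph API, and the sketch assumes the `enhanced-context-window` flag is already enabled:

```python
# Hypothetical lookup of the enhanced context limits listed above,
# keyed by an invented model-family name. Illustrative only; these
# names and this helper are not part of any Sourcegraph API.
ENHANCED_LIMITS = {
    # family:            (input tokens, output tokens)
    "anthropic-claude": (150_000, 64_000),
    "google-gemini":    (150_000, 65_000),
    "openai-gpt":       (102_000, 16_000),
    "openai-o-series":  (93_000, 100_000),
    "reasoning":        (None, 100_000),  # input limit not listed above
}

def enhanced_limits(family: str) -> tuple:
    """Return (input, output) token limits for a model family when the
    enhanced-context-window feature flag is enabled."""
    try:
        return ENHANCED_LIMITS[family]
    except KeyError:
        raise ValueError(f"unknown model family: {family!r}")
```

For example, `enhanced_limits("anthropic-claude")` returns `(150_000, 64_000)`, matching the Claude input and output rows above.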
## What is a Context Window?

A context window in large language models refers to the maximum number of tokens (words or subwords) the model can process simultaneously. This window determines how much context the model can consider when generating text or code.
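To make the idea concrete, here is a rough sketch of checking and trimming text against a token budget. Real models use subword tokenizers, so splitting on whitespace below is only a crude stand-in, and both function names are invented for this illustration:

```python
# Rough sketch of fitting text into a context window. Real models count
# subword tokens; whitespace word-splitting here is only a stand-in.
def fits_context_window(text: str, max_tokens: int) -> bool:
    """Check whether text fits within a budget of max_tokens."""
    return len(text.split()) <= max_tokens

def truncate_to_window(text: str, max_tokens: int) -> str:
    """Keep only the first max_tokens (approximate) tokens of text."""
    words = text.split()
    return " ".join(words[:max_tokens])
```

With a budget of 3 "tokens", `fits_context_window("a b c", 3)` is true while a four-word string is not, and `truncate_to_window` drops the overflow from the end.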
