Commit c9d4259 (parent: 434de5a)

docs: Clarify context window limit description for automatic summarization in Intelligent Context Condensation feature

2 files changed (+2 additions, -4 deletions)

docs/update-notes/v3.17.0.md (1 addition, 2 deletions)

```diff
@@ -28,10 +28,9 @@ We've introduced an experimental feature called **Intelligent Context Condensati
 
 Here's how it works:
 
-* **Automatic Summarization:** When a conversation approaches its context window limit (specifically, when it's 50% full), Roo Code now automatically uses a Large Language Model (LLM) to summarize the existing conversation history.
+* **Automatic Summarization:** When a conversation approaches its context window limit (specifically, when the context window is almost full), Roo Code now automatically uses a Large Language Model (LLM) to summarize the existing conversation history.
 * **Preserving Key Information:** The goal is to reduce the token count of the history while retaining the most essential information, ensuring the LLM has a coherent understanding of past interactions. This helps avoid the silent dropping of older messages.
 * **Checkpoint Integrity:** While summarized for ongoing LLM calls, all original messages are preserved when you rewind to old checkpoints.
-* **Image Handling:** Images within messages are specially managed during the summarization process.
 * **Opt-in Experimental Feature:** Disabled by default, this feature can be enabled in "Advanced Settings" under "Experimental Features." Please note that the LLM call for summarization incurs a cost, which is not currently displayed in the UI's cost tracking.
 
 <img src="/img/intelligent-context-condensation/intelligent-context-condensation.png" alt="Settings for Intelligent Context Condensation" width="600" />
```

docs/update-notes/v3.17.md (1 addition, 2 deletions)

```diff
@@ -28,10 +28,9 @@ We've introduced an experimental feature called **Intelligent Context Condensati
 
 Here's how it works:
 
-* **Automatic Summarization:** When a conversation approaches its context window limit (specifically, when it's 50% full), Roo Code now automatically uses a Large Language Model (LLM) to summarize the existing conversation history.
+* **Automatic Summarization:** When a conversation approaches its context window limit (specifically, when the context window is almost full), Roo Code now automatically uses a Large Language Model (LLM) to summarize the existing conversation history.
 * **Preserving Key Information:** The goal is to reduce the token count of the history while retaining the most essential information, ensuring the LLM has a coherent understanding of past interactions. This helps avoid the silent dropping of older messages.
 * **Checkpoint Integrity:** While summarized for ongoing LLM calls, all original messages are preserved when you rewind to old checkpoints.
-* **Image Handling:** Images within messages are specially managed during the summarization process.
 * **Opt-in Experimental Feature:** Disabled by default, this feature can be enabled in "Advanced Settings" under "Experimental Features." Please note that the LLM call for summarization incurs a cost, which is not currently displayed in the UI's cost tracking.
 
 <img src="/img/intelligent-context-condensation/intelligent-context-condensation.png" alt="Settings for Intelligent Context Condensation" width="600" />
```
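The trigger behavior described in the updated bullet ("condense when the context window is almost full") can be sketched as a simple threshold check. This is an illustrative sketch only: the `ALMOST_FULL_THRESHOLD` value and the `shouldCondense` name are assumptions for demonstration, not Roo Code's actual implementation or configuration.

```typescript
// Hypothetical sketch of the condensation trigger.
// The 0.9 threshold is an assumed value; the real cutoff in Roo Code
// is not specified by this commit.
const ALMOST_FULL_THRESHOLD = 0.9;

function shouldCondense(usedTokens: number, contextWindow: number): boolean {
  // Summarize the conversation history once token usage approaches
  // the model's context window, instead of silently dropping messages.
  return usedTokens / contextWindow >= ALMOST_FULL_THRESHOLD;
}

// A 200k-token window with 150k tokens used is not yet "almost full".
console.log(shouldCondense(150_000, 200_000)); // false
// At 190k of 200k tokens, summarization would kick in.
console.log(shouldCondense(190_000, 200_000)); // true
```

The point of a ratio-based check rather than the old fixed "50% full" wording is that the trigger scales with the model's actual context window size.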
