
Conversation

@logancyang (Owner) commented Jan 12, 2026

Auto-Compact Context Feature

Summary

Implements automatic context compaction using map-reduce summarization when context exceeds a configurable token threshold. This prevents context overflow errors and reduces token usage for large context windows.

Changes

New Files

  • src/core/ContextCompactor.ts - Singleton class that compresses large context using map-reduce pattern
  • src/types/compaction.ts - Type definitions for CompactionResult and ParsedContextItem

Modified Files

  • src/core/ContextManager.ts - Integrated compaction, preserves compacted content across turns
  • src/settings/model.ts - Added autoCompactThreshold setting
  • src/constants.ts - Added default threshold (128k tokens) and "Compacting" loading message
  • src/settings/v2/components/ModelSettings.tsx - Added threshold slider in Models tab
  • src/state/ChatUIState.ts - Thread loading message callback
  • src/core/ChatManager.ts - Thread loading message callback
  • src/components/Chat.tsx - Pass loading message callback to sendMessage
  • src/core/ChatManager.test.ts - Added mocks and updated test assertions
  • src/LLMProviders/chainRunner/utils/chatHistoryUtils.ts - Added addChatHistoryWithCompaction for conversation history compaction
  • src/LLMProviders/chainRunner/CopilotPlusChainRunner.ts - Uses compaction-aware chat history
  • src/LLMProviders/chainRunner/AutonomousAgentChainRunner.ts - Uses compaction-aware chat history
  • src/contextProcessor.ts - Removed verbose "Processing note" logs

How It Works

When context attached to a user message exceeds the threshold (in tokens), the compactor:

  1. PARSE: Extracts XML-structured context blocks (note_context, active_note, url_content, etc.) into discrete items
  2. MAP: Summarizes large items (>50k chars) in parallel using the current chat model with low temperature (0.1)
  3. REDUCE: Rebuilds XML structure with summarized content, preserving metadata (title, path, timestamps)
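The three steps above can be sketched roughly as follows. This is an illustrative sketch only: `ParsedContextItem`'s exact shape, `parseContextBlocks`, and the injected `summarize` callback are assumptions, not the plugin's actual API.

```typescript
// Hypothetical item shape; the real ParsedContextItem lives in src/types/compaction.ts
interface ParsedContextItem {
  tag: string;        // e.g. "note_context", "active_note", "url_content"
  attributes: string; // raw attribute string (title, path, timestamps)
  content: string;
}

const SUMMARIZE_CHAR_THRESHOLD = 50_000; // items above this get summarized

// PARSE: extract XML-structured context blocks into discrete items
function parseContextBlocks(context: string, tags: string[]): ParsedContextItem[] {
  const items: ParsedContextItem[] = [];
  for (const tag of tags) {
    const re = new RegExp(`<${tag}([^>]*)>([\\s\\S]*?)</${tag}>`, "g");
    let m: RegExpExecArray | null;
    while ((m = re.exec(context)) !== null) {
      items.push({ tag, attributes: m[1], content: m[2] });
    }
  }
  return items;
}

// MAP: summarize oversized items in parallel (summarize stands in for an LLM
// call at temperature 0.1); REDUCE: rebuild the XML, keeping metadata intact
async function compact(
  context: string,
  summarize: (text: string) => Promise<string>
): Promise<string> {
  const items = parseContextBlocks(context, ["note_context", "active_note", "url_content"]);
  const mapped = await Promise.all(
    items.map(async (item) =>
      item.content.length > SUMMARIZE_CHAR_THRESHOLD
        ? { ...item, content: `[SUMMARIZED] ${await summarize(item.content)}` }
        : item
    )
  );
  return mapped.map((i) => `<${i.tag}${i.attributes}>${i.content}</${i.tag}>`).join("\n");
}
```

Because metadata travels on the rebuilt tags rather than through the summarizer, titles, paths, and timestamps survive compaction verbatim.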

Multi-Turn Compaction Preservation

Key improvement: Once context is compacted, it stays compacted across turns.

  • Turn 1: User attaches folders A, B, C → compacted → stored in envelope L3 with compactedPaths metadata
  • Turn 2: Same folders attached again
    • L2 uses Turn 1's compacted content (not re-read from disk)
    • L3 only includes new files not already in L2 (deduped via compactedPaths)
    • No re-compaction of already-summarized content

This is achieved by:

  1. buildL2ContextFromPreviousTurns reads stored L3 content from previous envelopes
  2. Extracts paths from segment.metadata.compactedPaths for deduplication
  3. Returns l2Paths set to filter out files already in L2
  4. After compaction, buildCompactedEnvelope stores original paths in metadata for future turns
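The dedup flow above could look roughly like this; the `Segment`/`Envelope` shapes and function names are simplified assumptions, not the real `ContextManager` types.

```typescript
// Assumed shapes; the real envelope/segment types live in ContextManager
interface Segment {
  content: string;
  metadata?: { notePath?: string; compactedPaths?: string[] };
}
interface Envelope {
  layers: { id: string; segments: Segment[] }[];
}

// Steps 1-3: read stored L3 content from previous envelopes and collect every
// path already represented in L2, including paths recorded after compaction
function collectL2Paths(previousEnvelopes: Envelope[]): Set<string> {
  const l2Paths = new Set<string>();
  for (const env of previousEnvelopes) {
    const l3 = env.layers.find((l) => l.id === "L3_TURN");
    for (const seg of l3?.segments ?? []) {
      if (seg.metadata?.notePath) l2Paths.add(seg.metadata.notePath);
      for (const p of seg.metadata?.compactedPaths ?? []) l2Paths.add(p);
    }
  }
  return l2Paths;
}

// Step 3 continued: a new turn only attaches files not already in L2
function dedupeAttachments(paths: string[], l2Paths: Set<string>): string[] {
  return paths.filter((p) => !l2Paths.has(p));
}
```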

Conversation History Compaction

When total context (system + chat history + current message) exceeds the threshold, older conversation turns are automatically summarized:

  • Keeps the 2 most recent turns intact
  • Summarizes older turns into a concise summary preserving key decisions and context
  • Shows "Compacting..." during summarization
  • Prevents context overflow errors from long conversations
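A minimal sketch of the keep-recent/summarize-older split; the `Turn` shape and `summarize` callback are illustrative, not the actual `chatHistoryUtils` signatures.

```typescript
// Illustrative turn shape
interface Turn {
  user: string;
  ai: string;
}

const KEEP_RECENT_TURNS = 2;

// Keep the most recent turns verbatim; hand older turns to a summarizer
// (an LLM call in the real implementation)
async function compactHistory(
  turns: Turn[],
  summarize: (older: Turn[]) => Promise<string>
): Promise<{ summary: string | null; recent: Turn[] }> {
  if (turns.length <= KEEP_RECENT_TURNS) {
    return { summary: null, recent: turns };
  }
  const older = turns.slice(0, -KEEP_RECENT_TURNS);
  const recent = turns.slice(-KEEP_RECENT_TURNS);
  return { summary: await summarize(older), recent };
}
```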

Fail-safes

  • Max 3 concurrent LLM calls, to avoid API overload
  • Aborts if more than 50% of summarizations fail
  • Truncates items over 500k characters before summarization
  • Projects mode uses a fixed 1M-token threshold
  • Lazy-loads ContextCompactor to avoid test dependency issues
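The concurrency cap and failure-rate abort could be sketched as a bounded worker pool; this is illustrative only, and the real ContextCompactor internals may differ.

```typescript
const MAX_CONCURRENT = 3; // cap on simultaneous LLM calls

// Run fn over items with at most MAX_CONCURRENT in flight; abort if more
// than half of the calls fail
async function mapWithLimit<T, R>(
  items: T[],
  fn: (item: T) => Promise<R>
): Promise<(R | undefined)[]> {
  const results: (R | undefined)[] = new Array(items.length);
  let next = 0;
  let failures = 0;
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++; // single-threaded JS, so this claim is race-free
      try {
        results[i] = await fn(items[i]);
      } catch {
        failures++;
      }
    }
  }
  const workers = Array.from(
    { length: Math.min(MAX_CONCURRENT, items.length) },
    () => worker()
  );
  await Promise.all(workers);
  if (failures * 2 > items.length) {
    throw new Error("Aborting compaction: over 50% of summarizations failed");
  }
  return results;
}
```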

Configuration

Settings > Models > Auto-compact threshold

  • Range: 64k - 1M tokens
  • Default: 128k tokens
  • Step: 64k tokens

Setting a higher threshold effectively disables compaction, since context rarely grows large enough to reach it.
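A hypothetical guard mirroring these slider constraints; the constant and function names are illustrative, not the plugin's settings API.

```typescript
// Illustrative constants mirroring the slider (64k-1M range, 64k step)
const STEP = 64_000;
const MIN_THRESHOLD = 64_000;
const MAX_THRESHOLD = 1_000_000;
const DEFAULT_THRESHOLD = 128_000;

// Snap an arbitrary value to the slider's step, then clamp to its range
function clampThreshold(value: number): number {
  const snapped = Math.round(value / STEP) * STEP;
  return Math.min(MAX_THRESHOLD, Math.max(MIN_THRESHOLD, snapped));
}

function shouldCompact(contextTokens: number, threshold = DEFAULT_THRESHOLD): boolean {
  return contextTokens > threshold;
}
```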

User Experience

  • Shows "Compacting..." with animated dots during compaction
  • Summarized content is marked with [SUMMARIZED] prefix
  • Original metadata preserved for accurate citations

@chatgpt-codex-connector (bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 4c71e17861

ℹ️ About Codex in GitHub: Codex automatically reviews pull requests in this repo. Reviews are triggered when you open a pull request for review, mark a draft as ready, or comment "@codex review". If Codex has suggestions, it will comment; otherwise it will react with 👍.

@logancyang (Owner) requested "@codex review" after each subsequent push. Codex posted the same automated review notice for commits d50896b484, 8224ec7ab6, ffe851797e, c10b070d54, 79e53830ca, 5b68559d21, 9d62172a13, 8490870042, 9d7612b3c8, 6863bfacb9, 1a358f1c71, and b1ffb143ef (reviewed twice).

@chatgpt-codex-connector (bot) left a comment

💡 Codex Review — Reviewed commit: 881ace1d85

Comment on lines +385 to +420
// Track paths from ALL messages for deduplication
for (const segment of l3Layer.segments || []) {
  if (segment.metadata?.notePath) {
    l2Paths.add(segment.metadata.notePath as string);
  }


P2: Track tag/folder note paths in L2 dedupe

The L2 dedupe set only collects segment.metadata.notePath (or compactedPaths), but tag/folder context is stored as a single L3 segment without notePath metadata (via appendTurnContextSegment). That means notes added via tags/folders in a previous turn are pulled into L2 (their <note_context> blocks are concatenated into l2Context), yet their paths are missing from l2Paths, so the same notes are reprocessed and appended to L3 again on later turns. This duplicates large note content and can prematurely trigger compaction/overflow. Consider storing per-note notePath metadata for tag/folder segments or parsing paths out of those segments when building l2Paths.


@logancyang (Owner): @codex review

@chatgpt-codex-connector (bot): 💡 Codex Review — Reviewed commit: 31fc31744d

@logancyang (Owner): @codex review

@chatgpt-codex-connector (bot) left a comment

💡 Codex Review — Reviewed commit: 015975af0f

Comment on lines +392 to +401
const l3Layer = msg.contextEnvelope?.layers?.find((l) => l.id === "L3_TURN");

if (l3Layer) {


P2: Preserve L2 context for messages without envelopes

The new L2 builder only reads msg.contextEnvelope and skips messages without an envelope. Persisted chats only store the lightweight context metadata (see ChatPersistenceManager), so any loaded history will have contextEnvelope === undefined. That means continuing a saved chat drops all prior context from L2 and defeats deduplication/compaction for those messages. Users who resume a saved conversation will lose the “context library” from earlier turns. Consider rebuilding L2 from message.context when contextEnvelope is missing (or persisting envelopes).


- Add ContextCompactor class for map-reduce style context summarization
- Implement auto-compact when context exceeds configurable threshold
- Track note paths for tags/folders in L3 segment metadata for L2 deduplication
- Add loadAndAddChatHistory for streamlined chat history loading
- Store compacted paths in envelope metadata for multi-turn deduplication
- Fix: Only track L3 context paths in compactedPaths (excludes L5 user message files)

Co-Authored-By: Claude Opus 4.5 <[email protected]>
@logancyang (Owner): @codex review

@chatgpt-codex-connector (bot) left a comment

💡 Codex Review — Reviewed commit: 7e63742ab2

Comment on lines +78 to +82
private readonly BLOCK_TYPES = [
  "note_context",
  "active_note",
  "url_content",
  "selected_text",


P2: Compact web-selected text blocks too

Web selections are emitted as <web_selected_text> blocks (see processSelectedTextContexts), but the compactor only parses the BLOCK_TYPES list shown here. Because web_selected_text is missing, large web selections are never summarized, so auto-compaction can no-op when the oversized context is dominated by web-selected text. That leaves prompts above the threshold and defeats the overflow protection for that common case.


Comment on lines +397 to +401
for (let i = 0; i < previousUserMessages.length; i++) {
  const msg = previousUserMessages[i];
  const l3Layer = msg.contextEnvelope?.layers?.find((l) => l.id === "L3_TURN");

  if (l3Layer) {


P2: Rebuild L2 when envelopes are missing

This loop only consumes contextEnvelope to build L2. ChatPersistenceManager does not persist envelopes, so messages loaded from disk have no L3_TURN layer and are skipped here. After reopening a saved chat, the prior “context library” is silently lost and note paths won’t be deduped, so users can re-attach notes that were already present before the save. Consider falling back to message.context when contextEnvelope is absent, or persisting envelopes.


@logancyang merged commit a576ad7 into master on Jan 14, 2026 — 1 check passed.
@logancyang deleted the implement-compact branch on January 14, 2026 at 06:15.
@logancyang mentioned this pull request on Jan 15, 2026.