
v2.2.6-beta0 - Compression nudge bounds and ID consistency

Pre-release

@Tarquinen Tarquinen released this 28 Feb 05:57
· 15 commits to master since this release

What's Changed

  • Added configurable upper and lower (max/min) nudge limits for compression guidance.
  • Tightened message-id injection for ignored user messages and first subagent prompt messages.
  • Added subagent safety guidance to system prompts.
  • Bumped package metadata to 2.2.6-beta0.

Config Changes

  • Rename compress.contextLimit -> compress.maxContextLimit.
  • Rename compress.modelLimits -> compress.modelMaxLimits.
  • Add optional lower-bound settings: compress.minContextLimit and compress.modelMinLimits.
"compress": {
  "maxContextLimit": "80%",
  "minContextLimit": 30000,
  "modelMaxLimits": {
    "openai/gpt-5.3-codex": 120000,
    "anthropic/claude-sonnet-4.6": "75%"
  },
  "modelMinLimits": {
    "openai/gpt-5.3-codex": 30000,
    "anthropic/claude-sonnet-4.6": "20%"
  }
}

Defaults:

  • compress.maxContextLimit: 100000
  • compress.minContextLimit: 30000
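
A minimal sketch of how these settings might be resolved at runtime, assuming limits can be either absolute token counts or percentage strings of a model's context window, and that per-model entries override the global settings (the helper names here are hypothetical, not part of the actual codebase):

```python
def parse_limit(value, window):
    """Resolve a limit that is either an absolute token count (int)
    or a percentage string like "80%" of the model's context window."""
    if isinstance(value, str) and value.endswith("%"):
        return int(window * float(value[:-1]) / 100)
    return int(value)

def resolve_limits(config, model, window):
    """Pick the per-model override when present, falling back to the
    global setting, then to the documented defaults (100000 / 30000).
    The lower bound is clamped so it never exceeds the upper bound."""
    max_limit = parse_limit(
        config.get("modelMaxLimits", {}).get(
            model, config.get("maxContextLimit", 100000)),
        window,
    )
    min_limit = parse_limit(
        config.get("modelMinLimits", {}).get(
            model, config.get("minContextLimit", 30000)),
        window,
    )
    return min(min_limit, max_limit), max_limit
```

With the example config above and a hypothetical 200k-token window, `openai/gpt-5.3-codex` resolves to an absolute range of 30000..120000, while `anthropic/claude-sonnet-4.6` resolves its percentage bounds against the window.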

Full Changelog: v2.2.5-beta0...v2.2.6-beta0