
v0.3.2 — Hermes Agent, Memory Plugin Fix, Configurable Context Window


@richard-peng-xia richard-peng-xia released this 23 Mar 16:57

What's New

Hermes Agent support (`claw_type: hermes`)

MetaClaw now auto-configures Hermes Agent on `metaclaw start`. It injects a `metaclaw` entry into the `custom_providers` section of `~/.hermes/config.yaml` and sets `model.provider: custom:metaclaw`. The gateway is restarted automatically.

```shell
metaclaw config claw_type hermes
metaclaw start
```
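For reference, the injected configuration has roughly this shape. This is a sketch only: apart from `custom_providers` and `model.provider`, which the text above names, the key names and the gateway URL are assumptions.

```yaml
# ~/.hermes/config.yaml — sketch of the entry MetaClaw injects.
# base_url and the nested key layout are assumptions, not confirmed keys.
custom_providers:
  metaclaw:
    base_url: http://localhost:8000/v1   # assumed local MetaClaw gateway
model:
  provider: custom:metaclaw
```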

Fix: OpenClaw memory plugins — `rawMessage` undefined (Hindsight, mem0, memory-lancedb)

Memory plugins relying on the `before_prompt_build` hook (Hindsight, openclaw-mem0, memory-lancedb, and any custom plugin using the same pattern) now receive `event.rawMessage` correctly. MetaClaw now configures OpenClaw using the anthropic-messages API format, which is the path that populates all hook event fields. MetaClaw's `/v1/messages` Anthropic-compatible endpoint fully supports tool calls, streaming, and all other agent features — no functionality is lost.
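As a rough illustration of the pattern such plugins use (all names here are hypothetical; the real OpenClaw plugin API may differ), a hook like this previously had to bail out because `event.rawMessage` arrived undefined, silently skipping the memory write:

```python
class PromptBuildEvent:
    """Simplified stand-in for the event OpenClaw passes to plugin hooks."""

    def __init__(self, raw_message, messages):
        self.rawMessage = raw_message   # was undefined/None before this fix
        self.messages = messages


def before_prompt_build(event):
    # Defensive guard from the pre-fix era: with rawMessage missing,
    # the plugin had nothing to index and the memory write was dropped.
    if event.rawMessage is None:
        return None
    # With the anthropic-messages format, rawMessage is populated and
    # the plugin can record the user's message as a memory entry.
    return {"memory_entry": event.rawMessage}


event = PromptBuildEvent("remember my deploy key is in vault", [])
print(before_prompt_build(event))
```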

Configurable context window (removes hardcoded 20k / 32k caps)

Two new config fields replace the previous hardcoded limits:

| Field | Default | Description |
| --- | --- | --- |
| `max_context_tokens` | `20000` | Prompt token cap before truncation. Set to `0` to disable truncation entirely. Recommended for `skills_only` mode with large-context cloud models (MiniMax M2.7, Kimi K2, etc.). |
| `context_window` | `0` (auto) | Context window advertised to OpenClaw (compaction threshold). `0` = auto: 200,000 in `skills_only` mode, 32,768 in `rl`/`madmax` mode. Set explicitly to match your model. |

```shell
# skills_only + large-context model — remove all artificial caps:
metaclaw config max_context_tokens 0
# context_window auto-resolves to 200000 in skills_only mode

# explicit override for any mode:
metaclaw config context_window 128000
metaclaw config max_context_tokens 120000
```
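How the two fields interact can be summarized in a short sketch. This is illustrative only: the function names are hypothetical, and the assumption that truncation keeps the most recent tokens (the tail) is ours, not stated in the release notes.

```python
def resolve_context_window(context_window, mode):
    """0 = auto: 200,000 for skills_only, 32,768 for rl/madmax (per the table)."""
    if context_window != 0:
        return context_window          # explicit override wins in any mode
    return 200_000 if mode == "skills_only" else 32_768


def truncate_prompt(tokens, max_context_tokens):
    """Apply the prompt token cap; 0 disables truncation entirely."""
    if max_context_tokens == 0:
        return tokens
    # Assumption: the cap drops the oldest tokens and keeps the tail.
    return tokens[-max_context_tokens:]
```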

Previous release notes (v0.3.1 — 03/16/2026): Multi-claw support: IronClaw, PicoClaw, ZeroClaw, CoPaw, NanoClaw, and NemoClaw alongside OpenClaw. NanoClaw via the `/v1/messages` Anthropic-compatible endpoint; NemoClaw via OpenShell inference routing. OpenRouter LLM platform support.