LCM compaction with github-copilot fails: missing token exchange + baseUrl (fixed upstream in PR #322 / v0.8.0) #21

@allenliang2022

Description

Summary

lossless-claw-enhanced (fork at v0.5.2) is missing the GitHub Copilot token-exchange fix that has already landed upstream in Martian-Engineering/lossless-claw PR #322 (released in v0.8.0+). As a result, all LCM compactions using github-copilot/* summary models fail with Connection error. and silently fall back to truncation, losing summaries.

Repro

  1. Configure lossless-claw-enhanced with a github-copilot/* model as the LCM summary model.
  2. Run any session long enough to trigger compaction.
  3. Observe gateway.err.log:

```
[lcm] empty normalized summary on first attempt; provider=github-copilot;
  model=gpt-5.4-mini; ... finish=error; error_message=Connection error.;
  retrying with conservative settings
[lcm] retry also returned empty summary; ... falling back to truncation
```

This happens for every Copilot-backed compaction, including with gpt-5.4-mini, gemini-3.1-pro-preview, and Claude models routed through github-copilot.

Root cause

In src/plugin/index.ts, the complete() bridge calls runtime.modelAuth.getApiKeyForModel() and forwards the resulting key directly to completeSimple(). For the github-copilot provider this returns the raw GitHub user token (ghu_...), which the downstream pi-ai SDK can't use:

  • Copilot endpoints require an exchanged Copilot API token (tid=...)
  • The correct runtime baseUrl must be derived from the exchanged token's proxy-ep field (e.g. api.enterprise.githubcopilot.com vs api.individual.githubcopilot.com)
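
To make the second point concrete: the exchanged Copilot token is a semicolon-delimited list of key=value fields, and the proxy-ep field names the runtime endpoint host. A minimal sketch of deriving the baseUrl from such a token (the helper name and the sample field layout are illustrative, not part of either codebase):

```typescript
// Illustrative only: parse the semicolon-delimited key=value fields of an
// exchanged Copilot token and derive the runtime baseUrl from `proxy-ep`.
function copilotBaseUrlFromToken(token: string): string | undefined {
  const fields = new Map(
    token.split(";").map((kv) => {
      const i = kv.indexOf("=");
      return [kv.slice(0, i), kv.slice(i + 1)] as const;
    }),
  );
  const proxyEp = fields.get("proxy-ep");
  return proxyEp ? `https://${proxyEp}` : undefined;
}

// Example with a made-up token:
copilotBaseUrlFromToken(
  "tid=abc123;exp=1760000000;proxy-ep=api.individual.githubcopilot.com",
);
// → "https://api.individual.githubcopilot.com"
```

If proxy-ep is absent, the helper returns undefined and the caller should fall back to a default endpoint rather than guessing.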

OpenClaw core does this in its agent-runtime layer (setRuntimeApiKeyForCompletion → resolveCopilotApiToken), but that layer isn't reachable from the plugin path, so LCM's complete() skips the exchange entirely.

This is the exact same root cause as upstream issue Martian-Engineering/lossless-claw#269, fixed by PR #322 ("fix: use runtime-ready model auth for summarization", merged 2026-04-08).

Suggested fix

Easiest path: rebase / cherry-pick upstream PR #322 (and ideally sync to the latest upstream v0.9.1, which also contains a number of compatibility fixes for newer OpenClaw releases).

Local workaround currently running in production on my side, in case it helps as a reference (the real fix should still be the upstream sync):

```ts
// in src/plugin/index.ts, after `completeOptions` is built,
// before requestMetadata / completeSimple:
if (providerId === "github-copilot" && resolvedApiKey) {
  try {
    // Exchange the raw GitHub user token (ghu_...) for a Copilot API token,
    // reusing the helper bundled in the installed OpenClaw dist.
    const mod = await import(
      "/opt/homebrew/lib/node_modules/openclaw/dist/github-copilot-token-4lekVV0W.js"
    );
    const exchanged = await mod.resolveCopilotApiToken({
      githubToken: resolvedApiKey,
    });
    if (exchanged?.token) {
      resolvedApiKey = exchanged.token;
      (completeOptions as any).apiKey = exchanged.token;
    }
    if (exchanged?.baseUrl) {
      (resolvedModel as any).baseUrl = exchanged.baseUrl;
    }
  } catch (err) {
    // On failure, fall through with the raw token; the request will still
    // fail, but the cause is at least visible in the logs.
    console.error("[lcm-patch] copilot token exchange failed:", err);
  }
}
```
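
For comparison, the upstream fix effectively makes the plugin path resolve runtime-ready auth before calling completeSimple. A rough sketch of that shape, with hypothetical names (I have not copied the upstream diff, so none of these identifiers should be assumed to match PR #322 literally):

```typescript
// Hypothetical sketch of runtime-ready auth resolution for summarization
// calls; all names here are illustrative, not the actual upstream API.
interface RuntimeAuth {
  apiKey: string;
  baseUrl?: string;
}

type CopilotExchange = (
  githubToken: string,
) => Promise<{ token: string; baseUrl?: string }>;

async function resolveRuntimeAuth(
  providerId: string,
  rawKey: string,
  exchangeCopilotToken: CopilotExchange,
): Promise<RuntimeAuth> {
  // Non-Copilot providers can use the stored key as-is.
  if (providerId !== "github-copilot") return { apiKey: rawKey };
  // Copilot: trade the GitHub user token for a runtime-ready API token
  // plus the endpoint it must be sent to.
  const { token, baseUrl } = await exchangeCopilotToken(rawKey);
  return { apiKey: token, baseUrl };
}
```

The point is simply that the exchange happens once, centrally, before any completion call, instead of each plugin call site forwarding the raw ghu_ token.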

After this patch (and a gateway restart), the `Connection error.` count in LCM logs drops to zero and compactions produce real summaries again.

Environment

  • lossless-claw-enhanced 0.5.2 (installed via clawhub)
  • OpenClaw 2026.4.15
  • Provider: github-copilot (Individual plan)
  • Models tried: gpt-5.4-mini, gemini-3.1-pro-preview, claude-opus-4.7

Why it matters

Compaction silently falling back to truncation means users on Copilot-backed setups quietly lose summary fidelity — the symptom shows up not as an error to the user, but as worse long-conversation memory and degraded recall. Worth syncing upstream sooner rather than later.

Happy to test once a fix lands. Thanks for maintaining the fork! 🙏
