
[Bug]: Large MCP responses #2281

Opened by @tusharmath

Description


Bug Description

Large MCP responses are not truncated the way other tool-call outputs are, which causes a failure from the upstream provider.

[Screenshot attached]

Steps to Reproduce

A dump of the session is attached:

2025-10-31_18-38-11-dump.html

Expected Behavior

The large MCP response should be written to a file, and a link to that file should be provided to the LLM instead.
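A minimal sketch of the proposed behavior, assuming a character-count threshold as a stand-in for Forge's actual token accounting (the `cap_tool_response` name, the limit, and the temp-file location are all illustrative, not Forge's real implementation):

```python
import os
import tempfile
from pathlib import Path

# Hypothetical cap; the real limit would come from the model's context budget.
MAX_TOOL_RESPONSE_CHARS = 40_000

def cap_tool_response(response: str, limit: int = MAX_TOOL_RESPONSE_CHARS) -> str:
    """Pass small responses through unchanged; spill oversized ones to a file
    and return a short pointer message for the LLM instead."""
    if len(response) <= limit:
        return response
    # Write the oversized payload to a temp file the agent can read back later.
    fd, name = tempfile.mkstemp(prefix="mcp-response-", suffix=".txt")
    os.close(fd)
    Path(name).write_text(response, encoding="utf-8")
    return (
        f"[Tool response was {len(response)} characters, exceeding the "
        f"{limit}-character cap. Full output saved to: {name}]"
    )
```

With a policy like this, the prompt sent upstream stays bounded regardless of how much the MCP server returns, which would avoid the 400 error below.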

Actual Behavior

ERROR: POST https://openrouter.ai/api/v1/chat/completions
    Caused by: 400 Bad Request Reason: {"error":{"message":"This endpoint's maximum context length is 204800 tokens. However, you requested about 220799 tokens (193088 of text input, 7231 of tool input, 20480 in the output). Please reduce the length of either one, or use the \"middle-out\" transform to compress your prompt automatically.","code":400,"metadata":{"p

(The error message above is itself truncated, due to a separate bug in the CLI.)

Error Logs

Forge Version

1.18.0

Operating System

macOS

OS Version

26.2

AI Provider

OpenRouter

Model

GLM 4.7

Installation Method

npx forgecode@latest

Configuration

Additional Context

No response

Metadata

Labels

type: bug (Something isn't working.)
