Open
Labels
type: bug (Something isn't working.)
Bug Description
Large MCP responses are not truncated the way other tool-call results are, so the oversized payload is sent to the upstream provider and the request fails.
Steps to Reproduce
Adding a dump
Expected Behavior
The large MCP response should be written to a file, and a link to that file should be provided to the LLM instead.
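A possible shape for this behavior, sketched in Python (the threshold, function name, and file location here are assumptions for illustration, not Forge's actual API):

```python
import tempfile

# Assumed threshold; Forge's real limit is not documented in this issue.
MAX_TOOL_RESULT_CHARS = 40_000

def truncate_tool_result(content: str) -> str:
    """If an MCP tool response is too large, spill it to a temp file and
    return a short message pointing the LLM at that file instead."""
    if len(content) <= MAX_TOOL_RESULT_CHARS:
        return content
    # Persist the full response so nothing is lost.
    with tempfile.NamedTemporaryFile(
        mode="w", suffix=".txt", prefix="mcp-result-", delete=False
    ) as f:
        f.write(content)
        path = f.name
    # Hand the model a pointer instead of the oversized payload.
    return (
        f"[Tool response was {len(content)} characters, exceeding the "
        f"{MAX_TOOL_RESULT_CHARS}-character limit. Full output saved to: {path}]"
    )
```

This mirrors how other tool calls are handled and keeps the prompt under the provider's context limit while the full output remains retrievable on disk.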
Actual Behavior
ERROR: POST https://openrouter.ai/api/v1/chat/completions
Caused by: 400 Bad Request Reason: {"error":{"message":"This endpoint's maximum context length is 204800 tokens. However, you requested about 220799 tokens (193088 of text input, 7231 of tool input, 20480 in the output). Please reduce the length of either one, or use the \"middle-out\" transform to compress your prompt automatically.","code":400,"metadata":{"p
(The error output above is itself truncated because of a separate bug in the CLI.)
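The figures in the error message add up exactly; a quick check of the overflow:

```python
# Values taken verbatim from the OpenRouter 400 response above.
context_limit = 204_800   # endpoint's maximum context length
text_input = 193_088      # tokens of text input
tool_input = 7_231        # tokens of tool input
output_reserved = 20_480  # tokens reserved for the output

requested = text_input + tool_input + output_reserved
overflow = requested - context_limit
print(requested, overflow)  # 220799 tokens requested, 15999 over the limit
```

So the request exceeds the model's context window by roughly 16k tokens, which truncation of the MCP response would have avoided.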
Error Logs
Forge Version
1.18.0
Operating System
macOS
OS Version
26.2
AI Provider
OpenRouter
Model
GLM 4.7
Installation Method
npx forgecode@latest
Configuration
Additional Context
No response