First message is dropped w/ MCP servers selected and custom endpoints #8890
Replies: 3 comments 4 replies
-
You are running into a model context issue. GPT-4 has a maximum context window of about 8k tokens, and you can easily exceed that with MCP servers configured, since their tool definitions add a lot to the context. Use a different model, and/or an agent where you can configure the max context window and selectively attach individual tools from an MCP server, to help manage context.
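To get a feel for why attached MCP servers eat into the window, here is a rough sketch of estimating the token overhead their tool schemas add before any user message is sent. The tool names and the ~4-characters-per-token heuristic are illustrative assumptions, not LibreChat's actual accounting:

```python
import json

def estimate_tokens(text: str) -> int:
    # Rough heuristic: roughly 4 characters per token for English/JSON.
    return len(text) // 4

# Hypothetical MCP tool definitions, as they might be serialized into the prompt.
tools = [
    {
        "name": "search_files",
        "description": "Search the workspace for files matching a glob pattern.",
        "inputSchema": {"type": "object", "properties": {"pattern": {"type": "string"}}},
    },
    {
        "name": "read_file",
        "description": "Read the contents of a file at the given path.",
        "inputSchema": {"type": "object", "properties": {"path": {"type": "string"}}},
    },
]

# Every attached tool's schema is sent with each request, so the overhead
# scales with the number of tools, before the conversation even starts.
overhead = sum(estimate_tokens(json.dumps(t)) for t in tools)
print(f"~{overhead} tokens of tool-schema overhead before any user message")
```

With a few MCP servers exposing dozens of tools each, this fixed overhead alone can approach an 8k window, which is why selectively attaching individual tools helps.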
-
I'm seeing this too, particularly when an MCP server provides a lot of context. But it's odd that the failure mode is vLLM reporting that the messages list is empty; that seems like a bug. And it's not actually overflowing our model's context window: some context limit internal to LibreChat is being crossed, which then causes the chat messages to be cleared. The exact same input, with the same LLM, works fine via OpenWebUI.
-
Definitely seems like a bug.