How do I trigger MCPs? #6284
Replies: 4 comments
-
Having the same problem here.
-
https://www.librechat.ai/docs/features/agents#model-context-protocol-mcp
-
OK, thank you @danny-avila. This kind of works, but with llama-server I have this issue:
Unfortunately, llama-server itself does not seem to have a startup flag that disables streaming. It can be passed as an argument during a request, but there is no option to include this in the request sent to the API, not as far as I can see. Can/how can
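For anyone hitting the same wall: llama-server exposes an OpenAI-compatible /v1/chat/completions endpoint, and `stream` is a per-request field in the JSON body rather than a startup flag. A minimal sketch of a non-streaming request (the host, port, and payload shape here assume a default local setup and may need adjusting):

```python
import json
import urllib.request

# Assumed default address of a local llama-server instance.
LLAMA_SERVER_URL = "http://127.0.0.1:8080/v1/chat/completions"

def build_request(prompt: str) -> dict:
    """Build a chat-completion payload with streaming disabled."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask the server for a single JSON response
    }

def ask(prompt: str) -> str:
    """POST the payload and return the assistant's reply."""
    data = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        LLAMA_SERVER_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The catch described in the comment above is that the client, not the user, composes this body, so the client has to be willing to put `"stream": false` in it.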
Beta Was this translation helpful? Give feedback.
-
@vmajor Hah! I have been tearing the code apart. I have added logging and tracked down as many places as I could to try to work around this. From my exploration, the Agent client extension of the Base client uses processStream, and somewhere in there stream=false is being overwritten in the model options. @danny-avila Am I understanding correctly that using Agents forces streaming tokens? I.e., is this a dead end, and I can't use llama.cpp as my local inference engine for agents and tool calling?
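The behavior described above, a user's stream=false being silently overwritten, usually comes down to the order in which client defaults are merged with user options. A minimal, hypothetical Python sketch of the two merge orders (these names are illustrative only, not LibreChat's actual code):

```python
def merge_model_options(user_options: dict) -> tuple[dict, dict]:
    """Illustrate how merge order decides whether stream=False survives."""
    # Hypothetical client defaults, not taken from any real codebase.
    client_defaults = {"stream": True, "temperature": 0.7}

    # Defaults merged last: the user's stream=False is clobbered.
    clobbered = {**user_options, **client_defaults}

    # Defaults merged first: the user's explicit setting wins.
    respected = {**client_defaults, **user_options}
    return clobbered, respected
```

If the agent pipeline does the first merge anywhere along the way, no user-level setting can turn streaming off, which would match the behavior observed above.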
-
In Claude Desktop they trigger autonomously as needed, and in my own GUI I can trigger them with custom prompts that I inject depending on which MCPs I enable. But before I start tearing LibreChat apart: what is the correct way to let the LLM know that MCPs exist? I have these enabled right now and none can be triggered:
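The custom-prompt approach mentioned above can be sketched as a small helper that surfaces the enabled MCP tools to the model via the system prompt. This is a hypothetical illustration of the technique, not a LibreChat API; the tool names and wording are placeholders:

```python
def mcp_system_prompt(tools: dict[str, str]) -> str:
    """Build a system prompt advertising the enabled MCP tools.

    `tools` maps a tool name to a one-line description; both are
    placeholders chosen for illustration.
    """
    lines = ["You have access to the following external tools:"]
    for name, description in tools.items():
        lines.append(f"- {name}: {description}")
    lines.append("Call a tool whenever it would help answer the user.")
    return "\n".join(lines)
```

Injecting something like this per conversation is one way a GUI can make the model aware of MCPs without relying on the inference engine's native tool-calling support.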