MCP issue with Llama cpp docker models #8736
Replies: 4 comments 3 replies
-
This seems like a shortcoming with the "llama-swap" custom endpoint you are using; maybe it can't process specific function definitions that are needed for the MCP servers you are using. Have you tried another endpoint? Which specific server are you attempting to use at the time of the error?
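For context, the error quoted in the original post below complains about a tool parameter schema of the form `{"not":{}}`, which is a JSON Schema that matches no value (schema generators sometimes emit it for "never"/disallowed fields). A hypothetical MCP tool definition that could trip up an OpenAI-compatible conversion layer might look like this (tool name and fields are made up purely for illustration):

```json
{
  "name": "example_tool",
  "description": "Hypothetical tool used only to illustrate the failing schema shape",
  "inputSchema": {
    "type": "object",
    "properties": {
      "query": { "type": "string" },
      "forbidden_field": { "not": {} }
    },
    "required": ["query"]
  }
}
```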
-
@danny-avila Llama-swap works fine with the same MCP server in Jan and Open WebUI. I suspect something is happening between LibreChat and llama-swap; notice from the logs that LibreChat is failing to fetch models from the llama-swap API (API error).
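If the model list really isn't coming back, one quick sanity check is whether llama-swap's OpenAI-compatible models endpoint is reachable from where LibreChat runs. A minimal check, assuming llama-swap is listening on the default host/port (adjust as needed):

```bash
# Should return the same model list LibreChat tries to fetch
# when `models.fetch: true` is set for the custom endpoint.
curl http://localhost:8080/v1/models
```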
-
@danny-avila, I ran llama.cpp in Docker and I still get the same error, so it's not llama-swap related.
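For anyone trying to reproduce this without llama-swap, a rough sketch of running llama.cpp's server directly in Docker as an OpenAI-compatible endpoint follows; the image tag, model path, and flags are assumptions that may need adjusting for your setup (`--jinja` is the flag that enables tool/function calling in recent llama-server builds):

```bash
docker run --rm -p 8080:8080 \
  -v /path/to/models:/models \
  ghcr.io/ggml-org/llama.cpp:server \
  -m /models/your-model.gguf \
  --host 0.0.0.0 --port 8080 \
  --jinja
```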
-
I have this problem with llama.cpp too.
-
What happened?
I have LibreChat set up with a custom endpoint pointing at llama-swap.
I have set up a couple of MCP servers, but I get an error message when I try to call an MCP tool:
error: [api/server/controllers/agents/client.js #sendCompletion] Unhandled error type 500 JSON schema conversion failed: Unrecognized schema: {"not":{}}
Any thoughts?
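For reference, the kind of setup described above might look roughly like the sketch below in `librechat.yaml`; the endpoint name, base URL, model names, and MCP server entry are placeholders, not the actual configuration from this report:

```yaml
# librechat.yaml (sketch -- names, URLs, and models are placeholders)
version: 1.2.1

mcpServers:
  filesystem:
    command: npx
    args: ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]

endpoints:
  custom:
    - name: "llama-swap"
      apiKey: "not-needed"
      baseURL: "http://host.docker.internal:8080/v1"
      models:
        default: ["qwen2.5-7b-instruct"]
        fetch: true
      titleConvo: true
      titleModel: "current_model"
```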
Version Information
ce3b00be875a
Steps to Reproduce
What browsers are you seeing the problem on?
Microsoft Edge
Relevant log output
Screenshots
No response