MCP tools not working for LibreChat + LiteLLM + Ollama #11253
Asked by snowstopxt in Q&A
Answered by snowstopxt on Jun 2, 2025
Replies: 1 comment · 5 replies
What transport does your MCP server use? Right now LiteLLM only supports SSE. You can also open the LiteLLM proxy UI to inspect and run the tool from there. One other thing you can try is to use the docker-compose service identifier as the network hostname.
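To illustrate the docker-compose suggestion: containers in the same compose project share a default network and can reach each other by service name. The service names, image tags, and ports below are illustrative assumptions, not taken from the thread.

```yaml
# Hypothetical docker-compose.yml fragment (names and ports are examples).
# On the shared default network, LibreChat/LiteLLM can reach Ollama at
# http://ollama:11434 — the service name, not localhost.
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    ports:
      - "4000:4000"
    volumes:
      - ./litellm-config.yaml:/app/config.yaml
    command: ["--config", "/app/config.yaml"]
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
```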
I see. I have also managed to get the MCP tool call to work in LibreChat thanks to this discussion: danny-avila/LibreChat#2215. I needed to use the `openai/` prefix for the model and add `/v1` to the `api_base` in `litellm-config.yaml`.
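For reference, a minimal `litellm-config.yaml` sketch of that fix. The model name and host are placeholders; the relevant parts are the `openai/` prefix on `model` and the `/v1` suffix on `api_base`, which tell LiteLLM to talk to Ollama's OpenAI-compatible endpoint.

```yaml
model_list:
  - model_name: ollama-llama3            # name clients (e.g. LibreChat) will request
    litellm_params:
      model: openai/llama3               # openai/ prefix: treat the backend as OpenAI-compatible
      api_base: http://ollama:11434/v1   # Ollama's OpenAI-compatible API lives under /v1
      api_key: "dummy"                   # Ollama ignores the key, but a value must be set
```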