After installing with the Helm chart, LibreChat always returns an error #7620
Replies: 4 comments 9 replies
-
I tried to capture traffic inside the cluster; I don't see any traffic related to port 11434.
-
@hofq can you look into this?
-
@danny-avila @hofq could you suggest what exactly is wrong with my
-
I'm not familiar with MCP support in LibreChat, but as there is no separate container, I don't see how this has anything to do with the Helm chart. Correct me if I'm wrong though, @danny-avila
-
What happened?
I use the following deploy approach:
- install Ollama models with ollama-operator
- install the gateway database MCP server
- install librechat + librechat-rag-api
librechat-values.yaml
librechat-rag-api-values.yaml
All pods are running, the ollama and gateway services are available, and

curl -vvv http://ollama-srv-graphite.ollama-operator-system.svc.cluster.local:11434/api/tags

returns the expected model description. The LibreChat UI runs on port 3080:
kubectl port-forward -n librechat svc/librechat --address 0.0.0.0 3080
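As a quick sanity check independent of the UI, the body of the /api/tags response above can be parsed to list the models Ollama actually serves. This is a minimal sketch: it assumes the endpoint returns a JSON object with a "models" array of objects carrying a "name" field (the shape documented for Ollama's /api/tags), and the model names in the sample are hypothetical.

```python
import json

def list_model_names(tags_json: str) -> list:
    """Extract model names from an Ollama /api/tags response body.

    Assumes the documented shape: {"models": [{"name": "..."}, ...]}.
    Returns an empty list when no models are reported.
    """
    payload = json.loads(tags_json)
    return [m["name"] for m in payload.get("models", [])]

# Hypothetical response body, abbreviated; real responses carry more fields
# (size, digest, modified_at, ...) per the Ollama API docs.
sample = '{"models": [{"name": "graphite:latest"}, {"name": "llama3:8b"}]}'
print(list_model_names(sample))  # prints ['graphite:latest', 'llama3:8b']
```

If the list comes back empty while curl succeeds, the problem is on the model side (nothing pulled into Ollama) rather than in LibreChat's configuration.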
I can log in with a user created with npm run create-user, but any chat message returns an error.
Inside the LibreChat logs I see only the following:
Which agent shall I add to allow using MCP + Ollama?
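For wiring Ollama and an MCP server into LibreChat, the usual place is a librechat.yaml mounted into the pod, with Ollama registered as a custom (OpenAI-compatible) endpoint and the MCP server listed under mcpServers. The fragment below is a hedged sketch only: the field names follow LibreChat's custom-endpoint and MCP documentation as I understand it, but the version number, model name, MCP server name, transport type, and gateway URL are all assumptions that must be checked against your actual deployment and LibreChat release.

```yaml
# Hypothetical librechat.yaml fragment -- verify every field against the
# LibreChat docs for your version before use.
version: 1.2.1                  # assumed config schema version
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"          # any non-empty string; Ollama ignores it
      baseURL: "http://ollama-srv-graphite.ollama-operator-system.svc.cluster.local:11434/v1"
      models:
        default: ["graphite"]   # hypothetical model name
        fetch: true             # let LibreChat query available models
mcpServers:
  gateway:                      # hypothetical server name
    type: sse                   # assumes the gateway exposes an SSE endpoint
    url: "http://gateway.default.svc.cluster.local:8000/sse"  # hypothetical URL
```

With the Helm chart, this file typically has to be provided via a ConfigMap referenced in the chart values so it lands inside the librechat container; an endpoint defined only on the cluster side is invisible to LibreChat until the config is mounted.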
Version Information
Steps to Reproduce
What browsers are you seeing the problem on?
No response
Relevant log output
Screenshots
No response
Code of Conduct