Replies: 1 comment
-
I'm having this exact issue as well. I have no issues with litellm-proxy using docker-compose.yml locally, but the same or similar configuration isn't working with docker-deploy.yml.
-
Still banging my head against the wall on this one. It used to work, but not in the latest version. Any glaring issues with the configuration below?
My LiteLLM proxy is running on the same Docker network (librechat_default) as my LibreChat app, but I am getting this error when chatting with the Bedrock model through LibreChat:
{"cause":{"code":"ECONNREFUSED","errno":"ECONNREFUSED","message":"request to http://librechat-litellm-1:4000/chat/completions failed, reason: connect ECONNREFUSED 172.18.0.8:4000","type":"system"},"level":"error","message":"[handleAbortError] AI response error; aborting request: Connection error.","stack":"Error: Connection error.\n at OpenAI.makeRequest
In my librechat.yaml, the LiteLLM proxy container is configured as a custom endpoint, roughly as sketched below.
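(Sketch only; the endpoint name, API key, and model alias here are placeholders rather than my exact values.)

```yaml
# librechat.yaml (sketch): a custom endpoint pointing at the LiteLLM container
endpoints:
  custom:
    - name: "LiteLLM"
      apiKey: "sk-1234"                            # placeholder LiteLLM key
      baseURL: "http://librechat-litellm-1:4000"   # container name on librechat_default
      models:
        default: ["bedrock-claude-3-sonnet"]       # placeholder alias defined in the LiteLLM config
        fetch: true
      titleConvo: true
      titleModel: "bedrock-claude-3-sonnet"
```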
I can see from inspecting the Docker network (`docker inspect librechat_default`) that both the `api` and `librechat-litellm-1` containers are on the same network. It also doesn't work if I specify `baseURL: "http://litellm:4000"`; same error message. The curl command to LiteLLM directly does work.
If it makes a difference, I am using deploy-compose.yml instead of docker-compose.yml and passing both the deploy file and the override file with `-f` flags to the `docker compose up -d` command.
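For context, the override file adds the LiteLLM service roughly like this (a sketch; the image tag, config path, and service name are assumptions, not my exact file):

```yaml
# docker-compose.override.yml (sketch): adds LiteLLM to the same Compose project,
# so it joins the project's default network (librechat_default) alongside api.
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    volumes:
      - ./litellm/config.yaml:/app/config.yaml
    command: ["--config", "/app/config.yaml", "--port", "4000"]
```

and everything is brought up with both files:

```bash
docker compose -f ./deploy-compose.yml -f ./docker-compose.override.yml up -d
```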