o3-mini returns error 404 from OpenAI API response [REQUIRES UPDATE] #5663
What happened?

Greetings, I tried to use o3-mini on my self-hosted LibreChat and it returns the error:

```
Something went wrong. Here's the specific error message we encountered: Failed to send message. HTTP 404 - { "error": { "message": "This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?", "type": "invalid_request_error", "param": "model", "code": null } }
```

Weirdly enough, all other models work, even o1.

What browsers are you seeing the problem on?

Chrome

Relevant log output

The same HTTP 404 error shown above.
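The 404 means the request was routed to the legacy `/v1/completions` endpoint, while chat models such as o3-mini are only served from `/v1/chat/completions`. A hypothetical sketch of that routing decision (illustrative only, not LibreChat's actual code; the prefix list is an assumption):

```shell
# Hypothetical helper (NOT LibreChat's real logic): chat-era models must be
# sent to /v1/chat/completions rather than the legacy /v1/completions endpoint.
endpoint_for() {
  case "$1" in
    gpt-4*|gpt-3.5-turbo*|o1*|o3*) echo "/v1/chat/completions" ;;
    *)                             echo "/v1/completions" ;;
  esac
}

endpoint_for "o3-mini"   # → /v1/chat/completions
```

An older LibreChat build that does not yet know the o3 model family falls through to the legacy endpoint, which is exactly the failure mode reported above; hence the fix is simply updating.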
Requires an update. Updating instructions (Docker):

```shell
# Linux: remove all existing LibreChat images
docker images -a | grep "librechat" | awk '{print $3}' | xargs docker rmi

# Windows PowerShell equivalent (the image ID is the third whitespace-separated column)
docker images -a | findstr "librechat" | ForEach-Object { docker rmi ($_ -split '\s+')[2] }
```

Then follow the directions here: https://www.librechat.ai/docs/local/docker#update-librechat
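To preview what the Linux pipeline would delete before actually running `docker rmi`, you can replay the same `grep`/`awk` filter against sample output (the image list below is made-up sample data, with `echo` standing in for `docker rmi`):

```shell
# Simulated `docker images -a` output (sample data; the image IDs are made up)
images='REPOSITORY TAG IMAGE_ID CREATED SIZE
librechat-api latest abc123def456 2_days_ago 1.2GB
mongo latest 789fed654cba 3_weeks_ago 700MB'

# Same filter as the removal command: keep librechat rows, print column 3 (the
# image ID), and hand it to xargs — here echoing instead of removing.
echo "$images" | grep "librechat" | awk '{print $3}' | xargs echo "would remove:"
# → would remove: abc123def456
```

Note that the unrelated `mongo` image is left alone, so your database container's image survives the cleanup.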