Model o4-mini not working #7033
Closed
ThomasZoellinger
started this conversation in
Help Wanted
Replies: 1 comment 1 reply
-
Updating instructions (Docker):

# Linux: remove all existing LibreChat images
docker images -a | grep "librechat" | awk '{print $3}' | xargs docker rmi

# Windows PowerShell equivalent
docker images -a | findstr "librechat" | ForEach-Object { docker rmi ($_ -split '\s+')[2] }

Then follow the directions here: https://www.librechat.ai/docs/local/docker#update-librechat
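Updating to a current build is what resolves the 404; after that, the model still has to appear in the environment config. A minimal sketch of the `.env` line, assuming the `OPENAI_MODELS` variable quoted elsewhere in this thread (the surrounding model names are the original poster's; only `o4-mini` is an addition):

```env
# Hypothetical: prepend o4-mini to the existing model list in .env
OPENAI_MODELS=o4-mini,gpt-4.1-mini,gpt-4.1,gpt-4o-search-preview,gpt-4.1-nano
```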
-
I get the following error when I try the new o4-mini model:
Something went wrong. Here's the specific error message we encountered:
Failed to send message. HTTP 404 - { "error": { "code": null, "message": "Invalid URL (POST /v1/completions)", "param": null, "type": "invalid_request_error" } }
The other OpenAI models work just fine. These are the ones I currently have working in production:
OPENAI_MODELS=gpt-4.1-mini,gpt-4.1,gpt-4o-search-preview,gpt-4.1-nano