O1-Mini & O3-Mini rejected the request #5817
ramifara asked this question in Troubleshooting (unanswered)
- Can't reproduce, no relevant log output provided, can't help.
- There's a way in LibreChat to drop certain parameters, e.g. for o3-mini (a fuller sketch follows below):

  ```yaml
  - group: "openai-models"
    ...
    dropParams: ["stream", "temperature", "top_p", "presence_penalty", "frequency_penalty", "best_of", "max_tokens", "stop"]
    ...
  ```
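A minimal sketch of where `dropParams` can sit in `librechat.yaml`, assuming the o-series models are served through a custom OpenAI-compatible endpoint; the endpoint name, `baseURL`, and model list below are illustrative placeholders, not taken from this thread:

```yaml
# librechat.yaml (sketch): a custom endpoint using dropParams.
# Endpoint name, baseURL, and model list are assumptions; adjust to your setup.
endpoints:
  custom:
    - name: "OpenAI o-series"           # hypothetical display name
      apiKey: "${OPENAI_API_KEY}"
      baseURL: "https://api.openai.com/v1"
      models:
        default: ["o1-mini", "o3-mini"]
      # Parameters these models may reject are stripped before the request is sent
      dropParams: ["temperature", "top_p", "presence_penalty", "frequency_penalty", "stop"]
```

After editing `librechat.yaml`, restart the LibreChat containers so the updated endpoint configuration is loaded.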
What happened?
When trying to use any version of o1-mini or o3-mini, I get the error shown in the screenshot below.
I have checked with OpenAI, and I have access to those models.
Version Information
```
ghcr.io/danny-avila/librechat-dev-api            latest   5c28cb2558cc   12 hours ago    989MB
ghcr.io/danny-avila/librechat-rag-api-dev-lite   latest   1ae8fe67976f   10 months ago   1.51GB
```
Steps to Reproduce
1. Choose any o1-mini/o3-mini model.
2. Send any request.
What browsers are you seeing the problem on?
Chrome
Relevant log output
Screenshots
No response