[Question]: How to deploy o3-mini azure model #5630
Unanswered · andremachado94 asked this question in Q&A
Replies: 3 comments · 7 replies
-
Update to latest to be able to use o3-mini.
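For a Docker-based install, updating usually looks something like this (a rough sketch assuming the standard docker compose setup; exact steps depend on your deployment):

```bash
# pull the latest LibreChat code and images, then recreate the containers
git pull
docker compose pull
docker compose up -d
```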
-
I just pulled the latest dev (7c8a930) and it's still throwing errors. No issues with o1.
-
This is what I have...
What is your question?
o3-mini was recently added to the available Azure models, and I wanted to see if I could use it with LibreChat. I simply tried to add it to the list of available models. My configuration looks like this:
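For illustration, a minimal azureOpenAI section along these lines (the group name, instance name, and API key variable are placeholders, not my real values) looks roughly like:

```yaml
# librechat.yaml -- sketch of an Azure OpenAI endpoint listing both models
endpoints:
  azureOpenAI:
    groups:
      - group: "my-azure-group"            # placeholder group name
        apiKey: "${AZURE_API_KEY}"         # placeholder env var
        instanceName: "my-azure-instance"  # placeholder Azure resource name
        version: "2025-01-01-preview"
        models:
          gpt-4o-mini:
            deploymentName: "gpt-4o-mini"
          o3-mini:
            deploymentName: "o3-mini"
```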
Model gpt-4o-mini works fine as expected, however o3-mini fails, and the only log message I can get is:

Am I missing anything in this configuration?
More Details
Doing a curl request to the Azure chat completions endpoint (sketched below) works fine. While doing that, I noticed that some params like temperature, frequency_penalty, or presence_penalty were not supported by this model. I figured LibreChat could be sending these params by default, leading to the 400 error, so I also added the following property:
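Namely, a dropParams entry for the group that serves o3-mini (assuming dropParams is honored for azureOpenAI groups the way it is for custom endpoints):

```yaml
# on the group (or model) entry that serves o3-mini in librechat.yaml
dropParams: ["temperature", "frequency_penalty", "presence_penalty"]
```

For reference, the curl request I mean has roughly this shape (resource name, deployment name, and key are placeholders):

```bash
curl -s "https://my-azure-instance.openai.azure.com/openai/deployments/o3-mini/chat/completions?api-version=2025-01-01-preview" \
  -H "Content-Type: application/json" \
  -H "api-key: $AZURE_API_KEY" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```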
Using API version: 2025-01-01-preview
Using LibreChat version: v0.7.6
The issue persists.
What is the main subject of your question?
Endpoints
Screenshots
No response
Code of Conduct
I agree to follow this project's Code of Conduct