Help with OpenAI o Models #6283
OfmanAI started this conversation in Help Wanted
Replies: 1 comment
-
This problem was resolved in a recent commit: 🔃 refactor: Allow streaming for o1 models in OpenAIClient and agent runs (https://github.com/danny-avila/LibreChat/pull/6509, commit https://github.com/danny-avila/LibreChat/commit/c4fea9cd79686bd675256be8d336d462873450c3)
-
After migrating to OpenAI's new reasoning models, the models appear in the model list but fail to function. The following error is returned:
"error": {
"message": "This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?",
"type": "invalid_request_error",
"param": "model",
"code": null
}
}
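For programmatic handling, the error body can be inspected to detect this case. A minimal sketch in Python (the field names match the JSON shown above; the detection heuristic is an assumption, not an official error code):

```python
import json

# Error body as returned by the API (copied from the response above).
body = '''{
  "error": {
    "message": "This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?",
    "type": "invalid_request_error",
    "param": "model",
    "code": null
  }
}'''

err = json.loads(body)["error"]

# Heuristic check (assumption): the API hints at the correct endpoint in the
# message text, since "code" is null and there is no dedicated error code.
needs_chat_endpoint = (
    err["type"] == "invalid_request_error"
    and "v1/chat/completions" in err["message"]
)

if needs_chat_endpoint:
    print("Model requires the chat completions endpoint")
```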
Previous GPT-4 versions work flawlessly, but the new reasoning models fail with the same API setup.
Steps to Reproduce:
Attempt to call a reasoning model via the v1/completions endpoint.
Observe the 404 error returned.
Potential Cause
The error suggests that the reasoning models require v1/chat/completions instead of v1/completions.
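That cause can be sketched as a routing rule. The helper below is hypothetical (not part of LibreChat or the OpenAI SDK) and the model-name prefixes are illustrative assumptions: reasoning and chat models go to `v1/chat/completions` with a `messages` array, while legacy models keep `v1/completions` with a `prompt` string.

```python
# Hypothetical helper: route chat-only models (reasoning models, GPT-4 family)
# to /v1/chat/completions instead of the legacy /v1/completions endpoint.
# The prefix list is an illustrative assumption, not an official registry.
CHAT_ONLY_PREFIXES = ("o1", "o3", "gpt-4")

def endpoint_for(model: str) -> str:
    """Return the API path a given model must be called on."""
    if model.startswith(CHAT_ONLY_PREFIXES):
        return "/v1/chat/completions"
    # Legacy completion models (e.g. gpt-3.5-turbo-instruct) keep the old path.
    return "/v1/completions"

def build_request(model: str, prompt: str) -> dict:
    """Build the minimal JSON payload for whichever endpoint applies."""
    if endpoint_for(model) == "/v1/chat/completions":
        # Chat endpoint expects a list of role-tagged messages, not a prompt.
        return {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return {"model": model, "prompt": prompt}
```

Sending a reasoning model's request to the path returned by `endpoint_for` avoids the `invalid_request_error` above.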
Has anyone successfully migrated to the new reasoning models?