How to use OpenAI's o1-preview model? #4037
I added the o1-preview model to the .env file, but I got an error when using it:
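For context, a minimal sketch of the relevant `.env` lines, assuming LibreChat's standard `OPENAI_API_KEY` and `OPENAI_MODELS` variables (the key and the exact model list are placeholders):

```
# .env — hypothetical LibreChat config fragment
OPENAI_API_KEY=sk-your-key-here
OPENAI_MODELS=gpt-4o,gpt-4o-mini,o1-preview,o1-mini
```

Adding the model name here only exposes it in the UI; the backend still has to know how to route it, which is where the error below comes from.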
Replies: 7 comments 18 replies
Hi @danny-avila again. Something went wrong. Here's the specific error message we encountered:

Failed to send message. HTTP 404 - { "error": { "message": "This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?", "type": "invalid_request_error", "param": "model", "code": null } }

The version I'm using is LibreChat v0.7.5-rc2, deployed on Kubernetes via the Helm chart. It worked with the Docker image on my local machine, though. Any suggestions, or is this a bug we have to wait for a new release to fix? Thank you
I guess the problem is that the API uses an old version of `openai-node`. That version does not include the new models, so it falls back to the legacy completions endpoint instead of the chat completions endpoint. Latest: https://github.com/openai/openai-node/blob/master/src/resources/chat/chat.ts
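The 404 above boils down to the shape of the request body. A minimal sketch of the difference between the two OpenAI endpoints (the helper functions here are illustrative, not part of any client library):

```python
def build_completions_payload(model: str, prompt: str) -> dict:
    # Legacy v1/completions: takes a flat "prompt" string.
    # Chat-only models such as gpt-4o or o1-preview return
    # HTTP 404 when sent to this endpoint.
    return {"model": model, "prompt": prompt}


def build_chat_payload(model: str, prompt: str) -> dict:
    # v1/chat/completions: takes a list of role-tagged messages,
    # which is the only endpoint chat models accept.
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}


legacy = build_completions_payload("o1-preview", "Hello")  # would 404
chat = build_chat_payload("o1-preview", "Hello")           # correct shape
```

A client library that doesn't recognize a new model name may route it to the legacy shape, which is consistent with the error message in the reply above.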
v0.7.5 is now live, which includes the o1 updates. If you are seeing a green icon and an error, you're using an older version of LibreChat. At the time of writing, the Docker image for releases is still building, but should be done momentarily.
v0.7.6: I'm on Tier 1 but have access to o1-preview according to https://platform.openai.com/settings/organization/limits. Tested access in Witsy; all works.
I have the same issue :( Are you able to tell me why this code returns these errors? Please see my code below (the model list was truncated in my paste; I'm testing gpt-4o and o1-preview):

```python
import os

from dotenv import load_dotenv
from openai import OpenAI

# Load environment variables from .env file
load_dotenv()

# Retrieve the API key from environment variables
api_key = os.getenv("OPENAI_API_KEY")

# Initialize OpenAI client
client = OpenAI(api_key=api_key)

# List of models to test
models = ["gpt-4o", "o1-preview"]

# Prompt to send to each model
prompt = "What is the capital of France?"

# Iterate over each model and get a response
for model in models:
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"Response from {model}: {response.choices[0].message.content}")
```
Response from gpt-4o: The capital of France is Paris. Paris, capital of France. In India, Mumbai is the capital. Capital. Capital of France is Paris. Capital of France is a capital? I think so. Yes. Okay. Capital of India is Bangalore. Capital of
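One thing to check when o1-preview itself errors: at its release the o1 models had stricter request constraints than gpt-4o. A sketch of a request that respects them, assuming the restrictions as documented when o1-preview launched (verify against current OpenAI docs; `build_o1_request` is a hypothetical helper, not a library function):

```python
def build_o1_request(prompt: str, model: str = "o1-preview") -> dict:
    # o1-preview initially rejected system messages, streaming, and
    # temperature/top_p overrides, and used "max_completion_tokens"
    # instead of "max_tokens".
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],  # user role only
        "max_completion_tokens": 1024,  # note: not "max_tokens"
    }


req = build_o1_request("What is the capital of France?")
```

Passing a gpt-4o-style request (system message, `max_tokens`, `temperature`) straight to o1-preview was a common source of invalid_request errors in that period.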
https://www.librechat.ai/changelog/v0.7.5