how to use azureOpenAI GPT-5 (Responses API) #9188
-
Hello, I have been using Azure OpenAI models from a locally hosted Docker LibreChat instance. Mainly, I have been using the router model for GPT-4. Since the GPT-5 model was released, I tried to use it, but it didn’t work properly. What I did:
The URL and key are exactly as shown on Microsoft’s site, but to verify, I tested with curl and it worked:
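For reference, a working curl call against the Azure OpenAI Responses API looks roughly like this (resource name, deployment name, and the api-version value are placeholders, not the exact values from this post):

```bash
# Placeholders: replace <resource>, <deployment>, and the api-version with your own values
curl -sS "https://<resource>.openai.azure.com/openai/v1/responses?api-version=preview" \
  -H "Content-Type: application/json" \
  -H "api-key: $AZURE_OPENAI_API_KEY" \
  -d '{
        "model": "<deployment>",
        "input": "Say hello"
      }'
```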
Is there something wrong with my configuration? Here are my current settings:
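For context, a typical azureOpenAI block in librechat.yaml looks roughly like the sketch below; the instance name, deployment name, and api-version are placeholders rather than the poster's actual settings:

```yaml
endpoints:
  azureOpenAI:
    groups:
      - group: "azure-gpt-5"                # placeholder group name
        apiKey: "${AZURE_OPENAI_API_KEY}"
        instanceName: "<resource>"          # the *.openai.azure.com prefix
        version: "2025-04-01-preview"       # placeholder; use the api-version your deployment expects
        models:
          gpt-5:
            deploymentName: "<deployment>"
```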
Any help would be greatly appreciated.
-
I tried overriding global.fetch to check the logs, and it seems that the request destination is not set as expected. The URL that works when sending via curl (and set in librechat.yml):
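For illustration only (placeholder resource name; the api-version value stands in for whatever is configured):

```text
https://<resource>.openai.azure.com/openai/v1/responses?api-version=preview
```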
The URL accessed during chat operation:
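Something along these lines, with a different api-version substituted by the client (again a placeholder, not the exact URL that was logged):

```text
https://<resource>.openai.azure.com/openai/v1/responses?api-version=<client default>
```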
The api-version is incorrect... why?
-
I'm not very familiar with this kind of code, so I can't quite tell where the URL is being modified… As a brute-force workaround, I just wanted to report that I was able to get it working by overriding global.fetch as shown below:
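A minimal sketch of such an override (the host check and the api-version value are placeholders; adjust them to your own deployment):

```js
// Wrap the built-in fetch so that requests to the Azure OpenAI host are
// rewritten to use the api-version that works with curl.
const originalFetch = global.fetch;

global.fetch = (input, init) => {
  if (typeof input === 'string' || input instanceof URL) {
    const url = new URL(input);
    if (url.hostname.endsWith('.openai.azure.com')) {
      // Force the api-version the deployment actually expects (placeholder value)
      url.searchParams.set('api-version', 'preview');
      input = url.toString();
    }
  }
  // Request-object inputs are passed through untouched in this sketch
  return originalFetch(input, init);
};
```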
-
Hey, I'm facing the same issue after adding GPT-5. Could you please show where you added your global.fetch code? Thanks!
-
I actually managed to fix this error by changing the librechat.yaml config:
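A sketch of the relevant azureOpenAI section (instance, deployment, and api-version values are placeholders; the point is that the configured version matches what the GPT-5 deployment expects):

```yaml
endpoints:
  azureOpenAI:
    groups:
      - group: "gpt-5-group"                # placeholder
        apiKey: "${AZURE_OPENAI_API_KEY}"
        instanceName: "<resource>"
        version: "2025-04-01-preview"       # placeholder; match your deployment's api-version
        models:
          gpt-5:
            deploymentName: "<deployment>"
```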
Also, under modelSpecs, make sure to use
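Something along these lines under modelSpecs (name and label are arbitrary; the preset just needs to point at the azureOpenAI endpoint and the model key defined above):

```yaml
modelSpecs:
  list:
    - name: "gpt-5"
      label: "GPT-5 (Azure)"
      preset:
        endpoint: "azureOpenAI"
        model: "gpt-5"
```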