Required azureOpenAIApiInstanceName with non-OpenAI models #6402
- https://www.librechat.ai/docs/configuration/azure#serverless-inference-endpoints
- Recreated as issue: #6425
What happened?
The model cannot be used with the agents endpoint; it yields the following log:
LibreChat | 2025-03-18 19:40:09 error: [api/server/controllers/agents/client.js #titleConvo] Error azureOpenAIApiInstanceName is required when using azureOpenAIApiKey
Note: this is not an Azure OpenAI Service but an Azure AI Service; however, its config does work under LibreChat's Azure OpenAI endpoint.
Version Information
docker images | grep librechat
ghcr.io/danny-avila/librechat-dev latest 0fcd67e41a63 23 hours ago 872MB
ghcr.io/danny-avila/librechat-rag-api-dev-lite latest 333b44cb7d38 12 days ago 1.27GB
ghcr.io/danny-avila/librechat-dev 499a0d44ad00 6 weeks ago 871MB
ghcr.io/danny-avila/librechat-rag-api-dev-lite cf692235882f 3 months ago 1.26GB
git rev-parse HEAD
4d04904
Steps to Reproduce
Configure the Azure OpenAI endpoint in librechat.yaml and add a non-OpenAI model config (model assumed already deployed); see the sketch after the note below.
Note: I have another Azure OpenAI group config with OpenAI models; removing those makes no difference, though.
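For reference, a minimal sketch of the kind of group entry involved, following the shape of the serverless inference endpoint docs linked in the reply above. The group name, environment variable, endpoint URL, and model name are placeholders, not the reporter's actual values:

```yaml
endpoints:
  azureOpenAI:
    groups:
      - group: "azure-ai-serverless"        # placeholder group name
        apiKey: "${AZURE_AI_API_KEY}"       # placeholder environment variable
        # Placeholder serverless endpoint URL for an Azure AI (non-OpenAI) deployment
        baseURL: "https://example-endpoint.eastus.models.ai.azure.com/v1/"
        serverless: true                    # marks the group as a serverless inference endpoint
        models:
          Mistral-large: true               # placeholder non-OpenAI model deployment
```

With a config of this shape, the model works under the Azure OpenAI endpoint but triggers the azureOpenAIApiInstanceName error when used via agents, as described above.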
What browsers are you seeing the problem on?
No response
Relevant log output
Screenshots
No response
Code of Conduct