Custom Azure OpenAI hosted model + MCP server - azureOpenAIApiInstanceName is required when using azureOpenAIApiKey #7307
Replies: 1 comment
This issue was resolved after pulling the latest code changes last week.
Hello @danny-avila, I am trying LibreChat out with our custom internally hosted LLM. I was able to set it up successfully using an azureOpenAI endpoint; the apiKey is sent as the bearer token. Alternatively, I can pass an http_api_key to my proxy, which always attaches it as the bearer token header.
I also wanted to integrate a built-in MCP server, and I was able to hook that up in the librechat.yaml configuration as well.
Issue:
After the MCP server integration, when I start the backend with npm run backend:dev and ask any question, I get the error azureOpenAIApiInstanceName is required when using azureOpenAIApiKey. With debug enabled, it points to AzureChatOpenAI._getClientOptions().
Any help on this would be greatly appreciated.
```
2025-05-09T21:06:40.130Z error: [api/server/controllers/agents/client.js #sendCompletion] Operation aborted azureOpenAIApiInstanceName is required when using azureOpenAIApiKey
2025-05-09T21:06:40.130Z error: [api/server/controllers/agents/client.js #sendCompletion] Unhandled error type azureOpenAIApiInstanceName is required when using azureOpenAIApiKey
Error: azureOpenAIApiInstanceName is required when using azureOpenAIApiKey
    at Object.getEndpoint (XXX/LibreChat/node_modules/@librechat/agents/node_modules/@langchain/openai/dist/utils/azure.cjs:42:19)
    at AzureChatOpenAI._getClientOptions (XXX/LibreChat/node_modules/@librechat/agents/dist/cjs/llm/openai/index.cjs:109:37)
    at AzureChatOpenAI.completionWithRetry (XXX/LibreChat/node_modules/@librechat/agents/node_modules/@langchain/openai/dist/chat_models.cjs:2336:37)
    at AzureChatOpenAI._streamResponseChunks (XXX/node_modules/@librechat/agents/node_modules/@langchain/openai/dist/chat_models.cjs:1941:43)
    at _streamResponseChunks.next ()
    at AzureChatOpenAI._generateUncached (XXX/LibreChat/node_modules/@langchain/core/dist/language_models/chat_models.cjs:215:34)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async AzureChatOpenAI.invoke (XXX/LibreChat/node_modules/@langchain/core/dist/language_models/chat_models.cjs:92:24)
    at async RunnableCallable.func (XXX/@librechat/agents/dist/cjs/graphs/Graph.cjs:382:39)
    at async RunnableCallable.invoke (/LibreChat/node_modules/@langchain/langgraph/dist/utils.cjs:82:27) {
  pregelTaskId: 'b6a8b7c9-06d9-54ae-8462-39f143556600'
}
```
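For background on why this error fires: my reading of the @langchain/openai endpoint resolver (simplified and paraphrased below, not the verbatim source — see dist/utils/azure.cjs in the stack trace) is that it only skips the instance-name requirement when it can build the URL another way, e.g. from a base path. A rough sketch of the check:

```typescript
// Paraphrased sketch of the getEndpoint logic in @langchain/openai
// (simplified; field names taken from the error and stack trace above).
interface AzureEndpointConfig {
  azureOpenAIApiKey?: string;
  azureOpenAIApiInstanceName?: string;
  azureOpenAIApiDeploymentName?: string;
  azureOpenAIBasePath?: string;
  baseURL?: string;
}

function getEndpointSketch(config: AzureEndpointConfig): string {
  // With a base path and deployment name, the URL can be built
  // without an instance name.
  if (config.azureOpenAIBasePath && config.azureOpenAIApiDeploymentName) {
    return `${config.azureOpenAIBasePath}/${config.azureOpenAIApiDeploymentName}`;
  }
  // Otherwise an API key forces the instance-name requirement --
  // this is the branch the error above appears to come from.
  if (config.azureOpenAIApiKey && !config.azureOpenAIApiInstanceName) {
    throw new Error(
      "azureOpenAIApiInstanceName is required when using azureOpenAIApiKey"
    );
  }
  return config.baseURL ?? "";
}
```

If that reading is right, the agents code path is constructing the AzureChatOpenAI client with the apiKey but without the baseURL (or instance name) that my proxy setup relies on.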
librechat.yaml:
```yaml
endpoints:
  azureOpenAI:
    # Endpoint-level configuration
    titleModel: "gpt-4-turbo"
    plugins: true
    assistants: false
    groups:
      # Group-level configuration
      - group: "XXXXX"
        apiKey: "XXXX"
        deploymentName: "gpt-4o_2024-05-13"
        version: "2024-10-21"
        baseURL: "http://localhost/openai/deployments/gpt-4o_2024-05-13/"
```
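One thing that may be worth checking (an assumption on my part based on the error, not a confirmed fix): the group schema also accepts instanceName, which is exactly the value the Azure client is complaining about. A hypothetical variant of the group above:

```yaml
groups:
  - group: "XXXXX"
    apiKey: "XXXX"
    # Hypothetical -- substitute your own Azure resource name; this is
    # the value that maps to azureOpenAIApiInstanceName in the client.
    instanceName: "your-azure-resource-name"
    deploymentName: "gpt-4o_2024-05-13"
    version: "2024-10-21"
    baseURL: "http://localhost/openai/deployments/gpt-4o_2024-05-13/"
```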
MCP server configuration:

```yaml
mcpServers:
  mcpServerName:
    type: stdio
    command: docker
    args:
      - run
      - -i
      - --rm
      - -e
      - TOKEN=${TOKEN}
      - our_mcp_image:latest
    env:
      TOKEN: "XXXXXX"
```
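For context on the stdio entry (my understanding of how these configs behave, not something confirmed in the logs above): LibreChat spawns the listed command with those args, so this block is roughly equivalent to running `docker run -i --rm -e TOKEN=... our_mcp_image:latest`, with the env block setting environment variables on the spawned process.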
Based on my research, it looks like the code is failing somewhere in this method:
LibreChat/api/server/controllers/agents/client.js, line 545 at commit 4af72aa