Serverless deployments running into error with 7.9-rc1 version #8547
Replies: 2 comments
-
Hi Danny, this is what I found after some experimentation. If only the model name is changed (e.g. `grok-3-version-1` instead of `grok-3`), the error still persists. But if the endpoint name is changed, for example from `xai` to `x-AI`,
then the `grok-3` model works absolutely fine. So just by changing the endpoint name, the xAI models from Azure Foundry started working. Can you please fix this? Before 7.9-rc1, the grok models from xAI worked normally. (The config for this setup, the working one above, is under `endpoints:` as shared earlier.) Thanks!!
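One possible explanation, purely a guess and not confirmed against the LibreChat source: if 7.9-rc1 started matching custom endpoint names against known provider names case-insensitively, an endpoint named `xai` would be routed through the built-in xAI provider handling (with its own key/baseURL expectations) instead of the custom config, while `x-AI` would not match. The provider list and function below are hypothetical, only to illustrate the pattern:

```javascript
// Hypothetical sketch (assumed names, not LibreChat's actual code):
// a case-insensitive match of custom endpoint names against known
// providers would explain why "xai" fails but "x-AI" works.
const KNOWN_PROVIDERS = ['openai', 'anthropic', 'xai']; // assumed list

function resolvesToKnownProvider(endpointName) {
  return KNOWN_PROVIDERS.includes(endpointName.toLowerCase());
}

console.log(resolvesToKnownProvider('xai'));  // true  -> custom baseURL/key possibly overridden
console.log(resolvesToKnownProvider('x-AI')); // false -> custom config honored
```

Note that `'x-AI'.toLowerCase()` is `'x-ai'`, which would not equal `'xai'`, matching the observed behavior of the rename workaround.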
-
Because the grok models do not work when the custom endpoint is named `xai` (while `x-AI` works, as shared above), the xAI icon is also not rendered.
-
What happened?
Hi Danny, with version 7.9-rc1 (commit 62b4f3b) I tried deploying the serverless models from the same Azure Foundry, so naturally all the models share the same base URL and the same API key. My config is as follows:
```yaml
endpoints:
  custom:
    - name: "Deepseek"
      apiKey: "${DEEPSEEK_API_KEY}"
      baseURL: "https://regionname.api.cognitive.microsoft.com/models"
      models:
        default: ["DeepSeek-R1"]
        fetch: false
      titleConvo: true
      directEndpoint: false
      titleModel: "DeepSeek-V3-0324"
      modelDisplayLabel: "Deepseek"
    - name: "xai"
      apiKey: "${XAI_API_KEY}"
      baseURL: "https://regionname.api.cognitive.microsoft.com/models"
      models:
        default: ["grok-3", "grok-3-mini"]
        fetch: false
      titleConvo: true
      directEndpoint: false
      titleModel: "grok-3"
      modelDisplayLabel: "xAI"
```
So the Deepseek and xai API keys are defined separately but hold the same value, since the models in Azure Foundry share the same base URL and key. Deepseek works perfectly fine, but xai fails with an "incorrect API key" error, even though the key is correct and worked until this version.
I had even tried keeping the complete URL:
```yaml
custom:
  - name: "Deepseek"
    apiKey: "${DEEPSEEK_API_KEY}"
    baseURL: "https://region.api.cognitive.microsoft.com/models/chat/completions?api-version=2024-05-01-preview"
    models:
      default: ["DeepSeek-R1"]
      fetch: false
    titleConvo: true
    directEndpoint: true
    titleModel: "DeepSeek-V3-0324"
    modelDisplayLabel: "Deepseek"
  - name: "xai"
    apiKey: "${XAI_API_KEY}"
    baseURL: "https://region.api.cognitive.microsoft.com/models/chat/completions?api-version=2024-05-01-preview"
    models:
      default: ["grok-3", "grok-3-mini"]
      fetch: false
    titleConvo: true
    directEndpoint: true
    titleModel: "grok-3"
    modelDisplayLabel: "xAI"
```
I still face the same issue.
The odd part is that the very same serverless models work well through the Azure AI endpoint configuration:
```yaml
apiKey: "${DEEPSEEK_API_KEY}" # whether I use the deepseek or xai key, it's the same value, and all the models work
baseURL: "https://regionname.api.cognitive.microsoft.com/models/"
version: "2024-05-01-preview"
serverless: true
models:
  grok-3: true
  grok-3-mini: true
  DeepSeek-R1: true
```
Can you please suggest something here? Thanks!
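For anyone hitting the same error before a fix lands, here is a workaround sketch based on the rename described in the replies above. It assumes the endpoint name is the only trigger: everything is identical to the config shown earlier except that the exact name `xai` is avoided (e.g. `x-AI` instead):

```yaml
# Workaround sketch (assumption: only the custom endpoint name matters).
# Renaming "xai" to "x-AI" avoided the "incorrect API key" error above.
endpoints:
  custom:
    - name: "x-AI"              # was "xai"; renamed to avoid the failure
      apiKey: "${XAI_API_KEY}"
      baseURL: "https://regionname.api.cognitive.microsoft.com/models"
      models:
        default: ["grok-3", "grok-3-mini"]
        fetch: false
      titleConvo: true
      directEndpoint: false
      titleModel: "grok-3"
      modelDisplayLabel: "xAI"
```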
Version Information
Using the LibreChat version at commit 62b4f3b (7.9-rc1).
Steps to Reproduce
What browsers are you seeing the problem on?
Microsoft Edge
Relevant log output
Screenshots
No response
Code of Conduct