
Commit 5936d07

Update reference-model-inference-api.md

1 parent 9ec4ede commit 5936d07

1 file changed (+3, −2 lines)

articles/ai-studio/reference/reference-model-inference-api.md

Lines changed: 3 additions & 2 deletions
@@ -136,7 +136,7 @@ Content-Type: application/json
 
 The Azure AI Model Inference API specifies a set of modalities and parameters that models can subscribe to. However, some models may have further capabilities beyond the ones the API indicates. In those cases, the API allows the developer to pass them as extra parameters in the payload.
 
-By setting a header `extra-parameters: allow`, the API will attempt to pass any unknown parameter directly to the underlying model. If the model can handle that parameter, the request completes.
+By setting a header `extra-parameters: pass-through`, the API will attempt to pass any unknown parameter directly to the underlying model. If the model can handle that parameter, the request completes.
 
 The following example shows a request passing the parameter `safe_prompt` supported by Mistral-Large, which isn't specified in the Azure AI Model Inference API:
 
@@ -163,6 +163,7 @@ var messages = [
 ];
 
 var response = await client.path("/chat/completions").post({
+    "extra-parameters": "pass-through",
     body: {
         messages: messages,
         safe_mode: true
@@ -178,7 +179,7 @@ __Request__
 POST /chat/completions?api-version=2024-04-01-preview
 Authorization: Bearer <bearer-token>
 Content-Type: application/json
-extra-parameters: allow
+extra-parameters: pass-through
 ```
 
 ```JSON
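
For context beyond the lines visible in the diff above, the sketch below shows how the renamed header would be supplied in a complete call. It is a minimal sketch, not part of the commit: it assumes the `@azure-rest/ai-inference` JavaScript client (the `client.path(...).post(...)` pattern seen in the snippet), assumes custom headers are passed through the `headers` option of the request, uses placeholder values for the endpoint, key, and messages, and uses the `safe_prompt` name from the surrounding prose rather than the `safe_mode` name shown in the snippet.

```JavaScript
// Minimal sketch (assumptions noted above); endpoint, key, and messages are placeholders.
import ModelClient from "@azure-rest/ai-inference";
import { AzureKeyCredential } from "@azure/core-auth";

const client = ModelClient(
  "https://<your-endpoint>.inference.ai.azure.com", // placeholder endpoint
  new AzureKeyCredential("<your-api-key>")          // placeholder credential
);

const messages = [
  { role: "system", content: "You are a helpful assistant" },
  { role: "user", content: "How many languages are in the world?" },
];

// `safe_prompt` is a Mistral-Large-specific parameter, so the request opts in to
// forwarding unknown parameters with the `extra-parameters: pass-through` header.
const response = await client.path("/chat/completions").post({
  headers: { "extra-parameters": "pass-through" },
  body: {
    messages: messages,
    safe_prompt: true,
  },
});

console.log(response.body.choices[0].message.content);
```

If the header is omitted, unknown parameters such as `safe_prompt` are rejected by the API rather than forwarded to the model, which is why the commit also adds the header to the JavaScript example and the raw HTTP request.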
