
Commit 349a028

Merge pull request #7475 from MicrosoftDocs/main
Auto Publish – main to live - 2025-10-05 22:00 UTC
2 parents 6633d97 + 734a83b commit 349a028

13 files changed: +1622 −1206 lines


.gitignore

Lines changed: 2 additions & 1 deletion
@@ -21,4 +21,5 @@ _repo.*/
 .github/prompts/*.zip
 .github/patterns/*.md
 
-articles/ai-foundry/includes/get-started-fdp.md
+articles/ai-foundry/includes/get-started-fdp.md
+articles/ai-foundry/toc-files/setup-management/toc.yml

articles/ai-foundry/foundry-local/get-started.md

Lines changed: 4 additions & 4 deletions
@@ -58,10 +58,10 @@ Get started fast with Foundry Local:
     winget install Microsoft.FoundryLocal
     ```
   - **macOS**: Open a terminal and run the following command:
-    `bash
-    brew tap microsoft/foundrylocal
-    brew install foundrylocal
-    `
+    ```bash
+    brew tap microsoft/foundrylocal
+    brew install foundrylocal
+    ```
   Alternatively, you can download the installer from the [Foundry Local GitHub repository](https://aka.ms/foundry-local-installer).
 
 1. **Run your first model**. Open a terminal and run this command:

articles/ai-foundry/openai/api-version-lifecycle.md

Lines changed: 190 additions & 60 deletions
@@ -6,7 +6,7 @@ manager: nitinme
 ms.service: azure-ai-foundry
 ms.subservice: azure-ai-foundry-openai
 ms.topic: conceptual
-ms.date: 10/01/2025
+ms.date: 10/06/2025
 author: mrbullwinkle
 ms.author: mbullwin
 recommendations: false
@@ -31,6 +31,7 @@ Starting in August 2025, you can now opt in to our next generation v1 Azure Open
 - Faster API release cycle with new features launching more frequently.
 - OpenAI client support with minimal code changes to swap between OpenAI and Azure OpenAI when using key-based authentication (see the sketch below).
 - OpenAI client support for token-based authentication and automatic token refresh without the need to take a dependency on a separate Azure OpenAI client.
+- Make chat completions calls with models from other providers, like DeepSeek and Grok, that support the v1 chat completions syntax.
 
 Access to new API calls that are still in preview is controlled by passing feature-specific preview headers, allowing you to opt in to the features you want without having to swap API versions. Alternatively, some features indicate preview status through their API path and don't require an additional header.
 
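The key-based swap described in the bullet list above isn't shown in the examples added later in this diff, which authenticate with Entra ID tokens. A minimal sketch, assuming the standard `openai` Python package, an `AZURE_OPENAI_API_KEY` environment variable, and a deployment named `gpt-4.1-nano`; the commented `default_headers` line uses a hypothetical feature name only to illustrate the preview-header opt-in mentioned above:

```python
import os
from openai import OpenAI

# The same OpenAI client works against OpenAI or Azure OpenAI with the v1 API:
# only base_url and api_key change when swapping between the two.
client = OpenAI(
    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    # default_headers={"example-feature": "preview"},  # hypothetical preview-header opt-in
)

completion = client.chat.completions.create(
    model="gpt-4.1-nano",  # your Azure deployment name
    messages=[{"role": "user", "content": "This is a test"}],
)
print(completion.choices[0].message.content)
```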
@@ -271,70 +272,199 @@ curl -X POST https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/responses \
 }'
 ```
 
-# [Output](#tab/output)
-
-```json
-{
-  "id": "resp_682f7eb5dc408190b491cbbe57be2fbf0f98d661c3dc276d",
-  "created_at": 1747943093.0,
-  "error": null,
-  "incomplete_details": null,
-  "instructions": null,
-  "metadata": {},
-  "model": "gpt-4.1-nano",
-  "object": "response",
-  "output": [
-    {
-      "id": "msg_682f7eb61d908190926a004c15c5ddd00f98d661c3dc276d",
-      "content": [
-        {
-          "annotations": [],
-          "text": "Hello! It looks like you've sent a test message. How can I assist you today?",
-          "type": "output_text"
+---
+
+## Model support
+
+For Azure OpenAI models, we recommend using the [Responses API](./supported-languages.md). However, the v1 API also allows you to make chat completions calls with models from other providers, like DeepSeek and Grok, that support the OpenAI v1 chat completions syntax.
+
+`base_url` accepts both the `https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/` and `https://YOUR-RESOURCE-NAME.services.ai.azure.com/openai/v1/` formats.
+
+# [Python](#tab/python)
+
+```python
+from openai import OpenAI
+from azure.identity import DefaultAzureCredential, get_bearer_token_provider
+
+token_provider = get_bearer_token_provider(
+    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
+)
+
+client = OpenAI(
+    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
+    api_key=token_provider,
+)
+completion = client.chat.completions.create(
+    model="grok-3-mini",  # Replace with your model deployment name.
+    messages=[
+        {"role": "system", "content": "You are a helpful assistant."},
+        {"role": "user", "content": "Tell me about the attention is all you need paper"}
+    ]
+)
+
+#print(completion.choices[0].message)
+print(completion.model_dump_json(indent=2))
+```
+
+# [C#](#tab/dotnet)
+
+```csharp
+using Azure.Identity;
+using OpenAI;
+using OpenAI.Chat;
+using System.ClientModel.Primitives;
+
+#pragma warning disable OPENAI001
+
+BearerTokenPolicy tokenPolicy = new(
+    new DefaultAzureCredential(),
+    "https://cognitiveservices.azure.com/.default");
+
+ChatClient client = new(
+    model: "grok-3-mini", // Replace with your model deployment name.
+    authenticationPolicy: tokenPolicy,
+    options: new OpenAIClientOptions() {
+        Endpoint = new Uri("https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1")
+    }
+);
+
+ChatCompletion completion = client.CompleteChat("Tell me about the attention is all you need paper");
+
+Console.WriteLine($"[ASSISTANT]: {completion.Content[0].Text}");
+```
+
+# [JavaScript](#tab/javascript)
+
+```javascript
+import { DefaultAzureCredential, getBearerTokenProvider } from "@azure/identity";
+import { OpenAI } from "openai";
+
+const tokenProvider = getBearerTokenProvider(
+  new DefaultAzureCredential(),
+  'https://cognitiveservices.azure.com/.default');
+const client = new OpenAI({
+  baseURL: "https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
+  apiKey: tokenProvider
+});
+
+const messages = [
+  { role: 'system', content: 'You are a helpful assistant.' },
+  { role: 'user', content: 'Tell me about the attention is all you need paper' }
+];
+
+// Make the API request with top-level await
+const result = await client.chat.completions.create({
+  messages,
+  model: 'grok-3-mini', // model deployment name
+  max_tokens: 100
+});
+
+// Print the full response
+console.log('Full response:', result);
+
+// Print just the message content from the response
+console.log('Response content:', result.choices[0].message.content);
+```
+
+# [Go](#tab/go)
+
+```go
+package main
+
+import (
+    "context"
+    "fmt"
+
+    "github.com/Azure/azure-sdk-for-go/sdk/azidentity"
+    "github.com/openai/openai-go/v2"
+    "github.com/openai/openai-go/v2/azure"
+    "github.com/openai/openai-go/v2/option"
+)
+
+func main() {
+    // Create an Azure credential
+    tokenCredential, err := azidentity.NewDefaultAzureCredential(nil)
+    if err != nil {
+        panic(fmt.Sprintf("Failed to create credential: %v", err))
+    }
+
+    // Create a client with Azure OpenAI endpoint and token credential
+    client := openai.NewClient(
+        option.WithBaseURL("https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/"),
+        azure.WithTokenCredential(tokenCredential),
+    )
+
+    // Make a completion request
+    chatCompletion, err := client.Chat.Completions.New(context.TODO(), openai.ChatCompletionNewParams{
+        Messages: []openai.ChatCompletionMessageParamUnion{
+            openai.UserMessage("Explain what the bitter lesson is?"),
+        },
+        Model: "grok-3-mini", // Use your deployed model name on Azure
+    })
+    if err != nil {
+        panic(err.Error())
+    }
+
+    fmt.Println(chatCompletion.Choices[0].Message.Content)
+}
+```
+
+# [Java](#tab/Java)
+
+```java
+package com.example;
+
+import com.openai.client.OpenAIClient;
+import com.openai.client.okhttp.OpenAIOkHttpClient;
+import com.openai.models.ChatModel;
+import com.openai.models.chat.completions.ChatCompletion;
+import com.openai.models.chat.completions.ChatCompletionCreateParams;
+
+public class OpenAITest {
+    public static void main(String[] args) {
+        // Get API key from environment variable for security
+        String apiKey = System.getenv("OPENAI_API_KEY");
+        String resourceName = "https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1";
+        String modelDeploymentName = "grok-3-mini"; // Replace with your model deployment name.
+
+        try {
+            OpenAIClient client = OpenAIOkHttpClient.builder()
+                .baseUrl(resourceName)
+                .apiKey(apiKey)
+                .build();
+
+            ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
+                .addUserMessage("Explain what the bitter lesson is?")
+                .model(modelDeploymentName)
+                .build();
+            ChatCompletion chatCompletion = client.chat().completions().create(params);
 }
-      ],
-      "role": "assistant",
-      "status": "completed",
-      "type": "message"
 }
-  ],
-  "parallel_tool_calls": true,
-  "temperature": 1.0,
-  "tool_choice": "auto",
-  "tools": [],
-  "top_p": 1.0,
-  "background": null,
-  "max_output_tokens": null,
-  "previous_response_id": null,
-  "reasoning": {
-    "effort": null,
-    "generate_summary": null,
-    "summary": null
-  },
-  "service_tier": "default",
-  "status": "completed",
-  "text": {
-    "format": {
-      "type": "text"
-    }
-  },
-  "truncation": "disabled",
-  "usage": {
-    "input_tokens": 12,
-    "input_tokens_details": {
-      "cached_tokens": 0
-    },
-    "output_tokens": 19,
-    "output_tokens_details": {
-      "reasoning_tokens": 0
-    },
-    "total_tokens": 31
-  },
-  "user": null,
-  "store": true
 }
 ```
 
+# [REST](#tab/rest)
+
+```bash
+curl -X POST https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/chat/completions \
+  -H "Content-Type: application/json" \
+  -H "Authorization: Bearer $AZURE_OPENAI_AUTH_TOKEN" \
+  -d '{
+        "model": "grok-3-mini",
+        "messages": [
+          {
+            "role": "developer",
+            "content": "You are a helpful assistant."
+          },
+          {
+            "role": "user",
+            "content": "Explain what the bitter lesson is?"
+          }
+        ]
+      }'
+```
+
 ---
 
 ## v1 API support
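The Model support section above states that `base_url` accepts both the `*.openai.azure.com` and `*.services.ai.azure.com` endpoint formats, but the examples only show the first. A minimal sketch of the second format, assuming the same `openai` Python client and Entra ID token provider used in the Python tab above:

```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import OpenAI

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

# Same client and call as the *.openai.azure.com example; only the host differs.
client = OpenAI(
    base_url="https://YOUR-RESOURCE-NAME.services.ai.azure.com/openai/v1/",
    api_key=token_provider,
)

completion = client.chat.completions.create(
    model="grok-3-mini",  # Replace with your model deployment name.
    messages=[{"role": "user", "content": "Tell me about the attention is all you need paper"}],
)
print(completion.model_dump_json(indent=2))
```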

articles/ai-foundry/openai/includes/language-overview/dotnet.md

Lines changed: 39 additions & 2 deletions
@@ -170,8 +170,6 @@ BearerTokenPolicy tokenPolicy = new(
     new DefaultAzureCredential(),
     "https://cognitiveservices.azure.com/.default");
 
-#pragma warning disable OPENAI001
-
 OpenAIResponseClient client = new(
     model: "o4-mini",
     authenticationPolicy: tokenPolicy,
@@ -244,6 +242,45 @@ await foreach (StreamingResponseUpdate update
 }
 ```
 
+### MCP Server
+
+```csharp
+using OpenAI;
+using OpenAI.Responses;
+using System.ClientModel.Primitives;
+using Azure.Identity;
+
+#pragma warning disable OPENAI001 // currently required for token-based authentication
+
+BearerTokenPolicy tokenPolicy = new(
+    new DefaultAzureCredential(),
+    "https://cognitiveservices.azure.com/.default");
+
+OpenAIResponseClient client = new(
+    model: "o4-mini",
+    authenticationPolicy: tokenPolicy,
+    options: new OpenAIClientOptions()
+    {
+        Endpoint = new Uri("https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1")
+    }
+);
+
+ResponseCreationOptions options = new();
+options.Tools.Add(ResponseTool.CreateMcpTool(
+    serverLabel: "microsoft_learn",
+    serverUri: new Uri("https://learn.microsoft.com/api/mcp"),
+    toolCallApprovalPolicy: new McpToolCallApprovalPolicy(GlobalMcpToolCallApprovalPolicy.NeverRequireApproval)
+));
+
+OpenAIResponse response = (OpenAIResponse)client.CreateResponse([
+    ResponseItem.CreateUserMessageItem([
+        ResponseContentPart.CreateInputTextPart("Search for information about Azure Functions")
+    ])
+], options);
+
+Console.WriteLine(response.GetOutputText());
+```
+
 ## Error handling
 
 ### Error codes

articles/ai-foundry/openai/includes/language-overview/javascript.md

Lines changed: 31 additions & 0 deletions
@@ -114,6 +114,37 @@ for await (const event of stream) {
 }
 ```
 
+### MCP Server
+
+```javascript
+import { DefaultAzureCredential, getBearerTokenProvider } from "@azure/identity";
+import { OpenAI } from "openai";
+
+const tokenProvider = getBearerTokenProvider(
+  new DefaultAzureCredential(),
+  'https://cognitiveservices.azure.com/.default');
+const client = new OpenAI({
+  baseURL: "https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
+  apiKey: tokenProvider
+});
+
+const resp = await client.responses.create({
+  model: "gpt-5",
+  tools: [
+    {
+      type: "mcp",
+      server_label: "microsoft_learn",
+      server_description: "Microsoft Learn MCP server for searching and fetching Microsoft documentation.",
+      server_url: "https://learn.microsoft.com/api/mcp",
+      require_approval: "never",
+    },
+  ],
+  input: "Search for information about Azure Functions",
+});
+
+console.log(resp.output_text);
+```
+
 ## Chat
 
 `chat.completions.create`
