Commit 909b8e3

Merge pull request #3401 from MicrosoftDocs/main
3/7/2025 AM Publish
2 parents 8cfedf3 + a40fa22 commit 909b8e3

24 files changed: +891 -87 lines


articles/ai-foundry/model-inference/concepts/models.md

Lines changed: 2 additions & 0 deletions
@@ -139,6 +139,8 @@ Phi is a family of lightweight, state-of-the-art open models. These models were
| [Phi-3-small-128k-instruct](https://ai.azure.com/explore/models/Phi-3-small-128k-instruct/version/4/registry/azureml) | chat-completion | Global standard | - **Input:** text (131,072 tokens) <br /> - **Output:** (4,096 tokens) <br /> - **Languages:** en <br /> - **Tool calling:** No <br /> - **Response formats:** Text |
| [Phi-3.5-mini-instruct](https://ai.azure.com/explore/models/Phi-3.5-mini-instruct/version/6/registry/azureml) | chat-completion | Global standard | - **Input:** text (131,072 tokens) <br /> - **Output:** (4,096 tokens) <br /> - **Languages:** en, ar, zh, cs, da, nl, fi, fr, de, he, hu, it, ja, ko, no, pl, pt, ru, es, sv, th, tr, and uk <br /> - **Tool calling:** No <br /> - **Response formats:** Text |
| [Phi-4](https://ai.azure.com/explore/models/Phi-4/version/2/registry/azureml) | chat-completion | Global standard | - **Input:** text (16,384 tokens) <br /> - **Output:** (16,384 tokens) <br /> - **Languages:** en, ar, bn, cs, da, de, el, es, fa, fi, fr, gu, ha, he, hi, hu, id, it, ja, jv, kn, ko, ml, mr, nl, no, or, pa, pl, ps, pt, ro, ru, sv, sw, ta, te, th, tl, tr, uk, ur, vi, yo, and zh - **Tool calling:** No <br /> - **Response formats:** Text |
+ | [Phi-4-mini-instruct](https://ai.azure.com/explore/models/Phi-4-mini-instruct/version/1/registry/azureml) | chat-completion | Global standard | - **Input:** text (131,072 tokens) <br /> - **Output:** (4,096 tokens) <br /> - **Languages:** `ar`, `zh`, `cs`, `da`, `nl`, `en`, `fi`, `fr`, `de`, `he`, `hu`, `it`, `ja`, `ko`, `no`, `pl`, `pt`, `ru`, `es`, `sv`, `th`, `tr`, and `uk` <br /> - **Tool calling:** No <br /> - **Response formats:** Text |
+ | [Phi-4-multimodal-instruct](https://ai.azure.com/explore/models/Phi-4-multimodal-instruct/version/1/registry/azureml) | chat-completion | Global standard | - **Input:** text, images, and audio (131,072 tokens) <br /> - **Output:** (4,096 tokens) <br /> - **Languages:** `ar`, `zh`, `cs`, `da`, `nl`, `en`, `fi`, `fr`, `de`, `he`, `hu`, `it`, `ja`, `ko`, `no`, `pl`, `pt`, `ru`, `es`, `sv`, `th`, `tr`, and `uk` <br /> - **Tool calling:** No <br /> - **Response formats:** Text |

Lines changed: 54 additions & 0 deletions
@@ -0,0 +1,54 @@
---
title: How to use image and audio in chat completions with Azure AI model inference
titleSuffix: Azure AI Foundry
description: Learn how to process audio and images with chat completions models with Azure AI model inference
manager: scottpolly
author: msakande
reviewer: santiagxf
ms.service: azure-ai-model-inference
ms.topic: how-to
ms.date: 1/21/2025
ms.author: mopeakande
ms.reviewer: fasantia
ms.custom: generated
zone_pivot_groups: azure-ai-inference-samples
---

# How to use image and audio in chat completions with Azure AI model inference

::: zone pivot="programming-language-python"

[!INCLUDE [python](../includes/use-chat-multi-modal/python.md)]
::: zone-end

::: zone pivot="programming-language-javascript"

[!INCLUDE [javascript](../includes/use-chat-multi-modal/javascript.md)]
::: zone-end

::: zone pivot="programming-language-java"

[!INCLUDE [java](../includes/use-chat-multi-modal/java.md)]
::: zone-end

::: zone pivot="programming-language-csharp"

[!INCLUDE [csharp](../includes/use-chat-multi-modal/csharp.md)]
::: zone-end

::: zone pivot="programming-language-rest"

[!INCLUDE [rest](../includes/use-chat-multi-modal/rest.md)]
::: zone-end

## Related content

* [Use embeddings models](use-embeddings.md)
* [Use image embeddings models](use-image-embeddings.md)
* [Use reasoning models](use-chat-reasoning.md)
* [Azure AI Model Inference API](.././reference/reference-model-inference-api.md)
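
For context (the language-specific includes aren't shown in this commit), here's a minimal sketch of what image input to a chat completions model can look like with the `azure-ai-inference` Python package. The environment variable names, the local file name, and the model choice are assumptions for illustration, not taken from this diff:

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import (
    ImageContentItem,
    ImageDetailLevel,
    ImageUrl,
    SystemMessage,
    TextContentItem,
    UserMessage,
)
from azure.core.credentials import AzureKeyCredential

# Assumed environment variables pointing at an Azure AI model inference endpoint and key.
client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_CREDENTIAL"]),
)

# A user message can mix text and image content items.
response = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant that describes images."),
        UserMessage(
            content=[
                TextContentItem(text="What does this chart show?"),
                ImageContentItem(
                    image_url=ImageUrl.load(
                        image_file="chart.jpg",  # hypothetical local file
                        image_format="jpg",
                        detail=ImageDetailLevel.AUTO,
                    )
                ),
            ]
        ),
    ],
    model="Phi-4-multimodal-instruct",  # a multimodal model from the table above
)

print(response.choices[0].message.content)
```
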
Lines changed: 20 additions & 0 deletions
@@ -0,0 +1,20 @@
---
manager: nitinme
ms.service: azure-ai-model-inference
ms.topic: include
ms.date: 1/21/2025
ms.author: fasantia
author: santiagxf
---

* Install the [Azure AI inference package](https://aka.ms/azsdk/azure-ai-inference/csharp/reference) with the following command:

  ```bash
  dotnet add package Azure.AI.Inference --prerelease
  ```

* If you're using Microsoft Entra ID, you also need the following package:

  ```bash
  dotnet add package Azure.Identity
  ```
Lines changed: 44 additions & 0 deletions
@@ -0,0 +1,44 @@
---
manager: nitinme
ms.service: azure-ai-model-inference
ms.topic: include
ms.date: 1/21/2025
ms.author: fasantia
author: santiagxf
---

* Add the [Azure AI inference package](https://aka.ms/azsdk/azure-ai-inference/java/reference) to your project:

  ```xml
  <dependency>
      <groupId>com.azure</groupId>
      <artifactId>azure-ai-inference</artifactId>
      <version>1.0.0-beta.1</version>
  </dependency>
  ```

* If you're using Microsoft Entra ID, you also need the following package:

  ```xml
  <dependency>
      <groupId>com.azure</groupId>
      <artifactId>azure-identity</artifactId>
      <version>1.13.3</version>
  </dependency>
  ```

* Import the following namespaces:

  ```java
  package com.azure.ai.inference.usage;

  import com.azure.ai.inference.EmbeddingsClient;
  import com.azure.ai.inference.EmbeddingsClientBuilder;
  import com.azure.ai.inference.models.EmbeddingsResult;
  import com.azure.ai.inference.models.EmbeddingItem;
  import com.azure.core.credential.AzureKeyCredential;
  import com.azure.core.util.Configuration;

  import java.util.ArrayList;
  import java.util.List;
  ```
Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
---
manager: nitinme
ms.service: azure-ai-model-inference
ms.topic: include
ms.date: 1/21/2025
ms.author: fasantia
author: santiagxf
---

* Install the [Azure Inference library for JavaScript](https://aka.ms/azsdk/azure-ai-inference/javascript/reference) with the following command:

  ```bash
  npm install @azure-rest/ai-inference
  ```
Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
---
manager: nitinme
ms.service: azure-ai-model-inference
ms.topic: include
ms.date: 1/21/2025
ms.author: fasantia
author: santiagxf
---

* Install the [Azure AI inference package for Python](https://aka.ms/azsdk/azure-ai-inference/python/reference) with the following command:

  ```bash
  pip install -U azure-ai-inference
  ```
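
Not part of the include, but a quick way to confirm the package is available after the install above (standard library only, so nothing else is assumed):

```python
# Print the installed version of the azure-ai-inference package.
from importlib.metadata import version

print(version("azure-ai-inference"))
```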

articles/ai-foundry/model-inference/includes/use-chat-completions/javascript.md

Lines changed: 2 additions & 6 deletions
@@ -24,13 +24,9 @@ To use chat completion models in your application, you need:

[!INCLUDE [how-to-prerequisites](../how-to-prerequisites.md)]

- * A chat completions model deployment. If you don't have one read [Add and configure models to Azure AI services](../../how-to/create-model-deployments.md) to add a chat completions model to your resource.
-
- * Install the [Azure Inference library for JavaScript](https://aka.ms/azsdk/azure-ai-inference/javascript/reference) with the following command:
+ [!INCLUDE [how-to-prerequisites-javascript](../how-to-prerequisites-javascript.md)]

-   ```bash
-   npm install @azure-rest/ai-inference
-   ```
+ * A chat completions model deployment. If you don't have one, read [Add and configure models to Azure AI services](../../how-to/create-model-deployments.md) to add a chat completions model to your resource.

## Use chat completions

articles/ai-foundry/model-inference/includes/use-chat-completions/python.md

Lines changed: 5 additions & 12 deletions
@@ -24,14 +24,9 @@ To use chat completion models in your application, you need:

[!INCLUDE [how-to-prerequisites](../how-to-prerequisites.md)]

- * A chat completions model deployment. If you don't have one read [Add and configure models to Azure AI services](../../how-to/create-model-deployments.md) to add a chat completions model to your resource.
-
- * Install the [Azure AI inference package for Python](https://aka.ms/azsdk/azure-ai-inference/python/reference) with the following command:
-
-   ```bash
-   pip install -U azure-ai-inference
-   ```
+ [!INCLUDE [how-to-prerequisites-python](../how-to-prerequisites-python.md)]

## Use chat completions

First, create the client to consume the model. The following code uses an endpoint URL and key that are stored in environment variables.
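
That client-creation code isn't included in this hunk; a minimal sketch of the pattern the sentence describes (the environment variable names are assumptions for illustration):

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.core.credentials import AzureKeyCredential

# Endpoint URL and key read from environment variables (names assumed here).
client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_CREDENTIAL"]),
)
```
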
@@ -179,15 +174,13 @@ Some models can create JSON outputs. Set `response_format` to `json_object` to e

```python
- from azure.ai.inference.models import ChatCompletionsResponseFormatJSON
-
response = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant that always generate responses in JSON format, using."
                      " the following format: { ""answer"": ""response"" }."),
        UserMessage(content="How many languages are in the world?"),
    ],
-   response_format={ "type": ChatCompletionsResponseFormatJSON() }
+   response_format="json_object"
)
```
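
Since the request asks for a JSON object, the reply can be parsed directly. A small follow-on sketch, assuming the `response` object from the snippet above:

```python
import json

# The assistant's reply is a JSON string when response_format="json_object" is honored.
answer = json.loads(response.choices[0].message.content)
print(answer["answer"])  # "answer" is the key requested in the system prompt
```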

@@ -218,9 +211,9 @@ The following code example creates a tool definition that is able to look from f

```python
- from azure.ai.inference.models import FunctionDefinition, ChatCompletionsFunctionToolDefinition
+ from azure.ai.inference.models import FunctionDefinition, ChatCompletionsToolDefinition

- flight_info = ChatCompletionsFunctionToolDefinition(
+ flight_info = ChatCompletionsToolDefinition(
    function=FunctionDefinition(
        name="get_flight_info",
        description="Returns information about the next flight between two cities. This includes the name of the airline, flight number and the date and time of the next flight",

articles/ai-foundry/model-inference/includes/use-chat-completions/rest.md

Lines changed: 2 additions & 2 deletions
@@ -28,7 +28,7 @@ To use chat completion models in your application, you need:

## Use chat completions

- To use the text embeddings, use the route `/chat/completions` appended to the base URL along with your credential indicated in `api-key`. `Authorization` header is also supported with the format `Bearer <key>`.
+ To use the chat completions API, use the route `/chat/completions` appended to the base URL along with your credential indicated in `api-key`. The `Authorization` header is also supported with the format `Bearer <key>`.

```http
POST https://<resource>.services.ai.azure.com/models/chat/completions?api-version=2024-05-01-preview
```
@@ -553,7 +553,7 @@ Some models can reason across text and images and generate text completions base
To see this capability, download an image and encode the information as `base64` string. The resulting data should be inside of a [data URL](https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/Data_URLs):

> [!TIP]
- > You will need to construct the data URL using a scripting or programming language. This tutorial use [this sample image](../../../../ai-foundry/media/how-to/sdks/small-language-models-chart-example.jpg) in JPEG format. A data URL has a format as follows: `data:image/jpg;base64,0xABCDFGHIJKLMNOPQRSTUVWXYZ...`.
+ > You will need to construct the data URL using a scripting or programming language. This tutorial uses [this sample image](../../../../ai-foundry/media/how-to/sdks/small-language-models-chart-example.jpg) in JPEG format. A data URL has a format as follows: `data:image/jpg;base64,0xABCDFGHIJKLMNOPQRSTUVWXYZ...`.

Visualize the image:
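
As a companion to the tip above, a minimal Python sketch of building such a data URL from a local image file (the file name is a placeholder):

```python
import base64
import mimetypes


def image_to_data_url(path: str) -> str:
    """Encode a local image file as a data URL for the request payload."""
    mime_type, _ = mimetypes.guess_type(path)  # e.g. image/jpeg
    with open(path, "rb") as image_file:
        encoded = base64.b64encode(image_file.read()).decode("ascii")
    return f"data:{mime_type};base64,{encoded}"


# Placeholder file name; point this at the sample JPEG referenced in the tip.
print(image_to_data_url("small-language-models-chart-example.jpg")[:80] + "...")
```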
