
Commit c4032a8: Update phi 3 vision
1 parent 641ae65 commit c4032a8

File tree: 2 files changed (+35, -37 lines)

articles/machine-learning/how-to-deploy-models-phi-3-vision.md

Lines changed: 34 additions & 37 deletions
@@ -1,8 +1,9 @@
 ---
-title: How to use Phi-3 chat model with vision with Azure AI Studio
-titleSuffix: Azure AI Studio
-description: Learn how to use Phi-3 chat model with vision with Azure AI Studio.
-ms.service: azure-ai-studio
+title: How to use Phi-3 chat model with vision with Azure Machine Learning
+titleSuffix: Azure Machine Learning
+description: Learn how to use Phi-3 chat model with vision with Azure Machine Learning.
+ms.service: azure-machine-learning
+ms.subservice: inferencing
 manager: scottpolly
 ms.topic: how-to
 ms.date: 08/19/2024
@@ -14,14 +15,12 @@ ms.custom: references_regions, generated
 zone_pivot_groups: azure-ai-model-catalog-samples-chat
 ---
 
-# How to use Phi-3 chat model with vision
-
-[!INCLUDE [Feature preview](~/reusable-content/ce-skilling/azure/includes/ai-studio/includes/feature-preview.md)]
+# How to use Phi-3 chat model with vision with Azure Machine Learning
 
 In this article, you learn about Phi-3 chat model with vision and how to use them.
 The Phi-3 family of small language models (SLMs) is a collection of instruction-tuned generative text models.
 
-
+[!INCLUDE [machine-learning-preview-generic-disclaimer](includes/machine-learning-preview-generic-disclaimer.md)]
 
 ::: zone pivot="programming-language-python"

@@ -32,12 +31,11 @@ Phi-3 Vision is a lightweight, state-of-the-art, open multimodal model. The mode
 
 You can learn more about the models in their respective model card:
 
-* [Phi-3-vision-128k-Instruct](https://aka.ms/azureai/landing/Phi-3-vision-128k-Instruct)
-
+* Phi-3-vision-128k-Instruct
 
 ## Prerequisites
 
-To use Phi-3 chat model with vision with Azure AI Studio, you need the following prerequisites:
+To use Phi-3 chat model with vision with Azure Machine Learning, you need the following prerequisites:
 
 ### A model deployment
 
@@ -48,7 +46,7 @@ Phi-3 chat model with vision can be deployed to our self-hosted managed inferenc
 For deployment to a self-hosted managed compute, you must have enough quota in your subscription. If you don't have enough quota available, you can use our temporary quota access by selecting the option **I want to use shared quota and I acknowledge that this endpoint will be deleted in 168 hours.**
 
 > [!div class="nextstepaction"]
-> [Deploy the model to managed compute](../concepts/deployments-overview.md)
+> [Deploy the model to managed compute](concept-model-catalog.md#deploy-models-for-inference-with-managed-compute)
 
 ### The inference package installed

@@ -68,10 +66,10 @@ Read more about the [Azure AI inference package and reference](https://aka.ms/az
 
 ## Work with chat completions
 
-In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.
+In this section, you use the [Azure AI Model Inference API](reference-model-inference-api.md) with a chat completions model for chat.
 
 > [!TIP]
-> The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure AI Studio with the same code and structure, including Phi-3 chat model with vision.
+> The [Azure AI Model Inference API](reference-model-inference-api.md) allows you to talk with most models deployed in Azure Machine Learning studio with the same code and structure, including Phi-3 chat model with vision.
 
 ### Create a client to consume the model
 
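The hunks above only retarget links; the request flow the article describes is unchanged: create a client against the deployment, then send chat completions requests that pair a text prompt with an image. As a sketch of that payload (assuming the OpenAI-style message shape used by the Azure AI Model Inference API; the helper name `build_vision_message` and the example values are hypothetical, not part of the commit or the SDK), a multimodal message can be assembled with the standard library alone:

```python
import base64
import json


def build_vision_message(prompt: str, image_bytes: bytes, mime: str = "image/jpeg") -> dict:
    """Pair a text prompt with an inline image in one user message.

    The image travels as a base64 data URL inside an `image_url` content
    part, the shape chat completions payloads use for multimodal input.
    """
    data_url = f"data:{mime};base64," + base64.b64encode(image_bytes).decode("ascii")
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": prompt},
            {"type": "image_url", "image_url": {"url": data_url}},
        ],
    }


# Placeholder bytes stand in for a real chart image.
message = build_vision_message("Which conclusion can be drawn from this chart?", b"\x89PNG-placeholder")
body = json.dumps({"messages": [message], "max_tokens": 2048})
```

The resulting `body` is what a client ultimately serializes and sends; the SDK shown in the article builds the same structure from typed message objects.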
@@ -143,7 +141,7 @@ response = client.complete(
 ```
 
 > [!NOTE]
-> Phi-3-vision-128k-Instruct doesn't support system messages (`role="system"`). When you use the Azure AI model inference API, system messages are translated to user messages, which is the closest capability available. This translation is offered for convenience, but it's important for you to verify that the model is following the instructions in the system message with the right level of confidence.
+> Phi-3-vision-128k-Instruct don't support system messages (`role="system"`). When you use the Azure AI model inference API, system messages are translated to user messages, which is the closest capability available. This translation is offered for convenience, but it's important for you to verify that the model is following the instructions in the system message with the right level of confidence.
 
 The response is as follows, where you can see the model's usage statistics:

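The system-to-user translation that the NOTE in the hunk above describes can be pictured with a small helper. This is a sketch of the observable behavior only, assuming a straightforward role rewrite; the function is illustrative and is not the service's actual implementation:

```python
def translate_system_messages(messages: list[dict]) -> list[dict]:
    """Fold system turns into user turns, mimicking the translation the
    article's NOTE describes for models without system-role support."""
    translated = []
    for message in messages:
        if message.get("role") == "system":
            translated.append({"role": "user", "content": message["content"]})
        else:
            translated.append(message)
    return translated


chat = [
    {"role": "system", "content": "Answer concisely."},
    {"role": "user", "content": "Describe the attached chart."},
]
translated = translate_system_messages(chat)  # the system turn is resent with role="user"
```

Because both turns end up with `role="user"`, the article's caveat applies: verify that the model actually honors the folded-in instructions.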
@@ -357,7 +355,7 @@ You can learn more about the models in their respective model card:
 
 ## Prerequisites
 
-To use Phi-3 chat model with vision with Azure AI Studio, you need the following prerequisites:
+To use Phi-3 chat model with vision with Azure Machine Learning studio, you need the following prerequisites:
 
 ### A model deployment

@@ -368,7 +366,7 @@ Phi-3 chat model with vision can be deployed to our self-hosted managed inferenc
 For deployment to a self-hosted managed compute, you must have enough quota in your subscription. If you don't have enough quota available, you can use our temporary quota access by selecting the option **I want to use shared quota and I acknowledge that this endpoint will be deleted in 168 hours.**
 
 > [!div class="nextstepaction"]
-> [Deploy the model to managed compute](../concepts/deployments-overview.md)
+> [Deploy the model to managed compute](concept-model-catalog.md#deploy-models-for-inference-with-managed-compute)
 
 ### The inference package installed

@@ -386,10 +384,10 @@ npm install @azure-rest/ai-inference
 
 ## Work with chat completions
 
-In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.
+In this section, you use the [Azure AI Model Inference API](reference-model-inference-api.md) with a chat completions model for chat.
 
 > [!TIP]
-> The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure AI Studio with the same code and structure, including Phi-3 chat model with vision.
+> The [Azure AI Model Inference API](reference-model-inference-api.md) allows you to talk with most models deployed in Azure Machine Learning studio with the same code and structure, including Phi-3 chat model with vision.
 
 ### Create a client to consume the model

@@ -463,7 +461,7 @@ var response = await client.path("/chat/completions").post({
 ```
 
 > [!NOTE]
-> Phi-3-vision-128k-Instruct doesn't support system messages (`role="system"`). When you use the Azure AI model inference API, system messages are translated to user messages, which is the closest capability available. This translation is offered for convenience, but it's important for you to verify that the model is following the instructions in the system message with the right level of confidence.
+> Phi-3-vision-128k-Instruct don't support system messages (`role="system"`). When you use the Azure AI model inference API, system messages are translated to user messages, which is the closest capability available. This translation is offered for convenience, but it's important for you to verify that the model is following the instructions in the system message with the right level of confidence.
 
 The response is as follows, where you can see the model's usage statistics:

@@ -700,7 +698,7 @@ You can learn more about the models in their respective model card:
 
 ## Prerequisites
 
-To use Phi-3 chat model with vision with Azure AI Studio, you need the following prerequisites:
+To use Phi-3 chat model with vision with Azure Machine Learning studio, you need the following prerequisites:
 
 ### A model deployment
@@ -711,7 +709,7 @@ Phi-3 chat model with vision can be deployed to our self-hosted managed inferenc
 For deployment to a self-hosted managed compute, you must have enough quota in your subscription. If you don't have enough quota available, you can use our temporary quota access by selecting the option **I want to use shared quota and I acknowledge that this endpoint will be deleted in 168 hours.**
 
 > [!div class="nextstepaction"]
-> [Deploy the model to managed compute](../concepts/deployments-overview.md)
+> [Deploy the model to managed compute](concept-model-catalog.md#deploy-models-for-inference-with-managed-compute)
 
 ### The inference package installed

@@ -752,10 +750,10 @@ using System.Reflection;
 
 ## Work with chat completions
 
-In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.
+In this section, you use the [Azure AI Model Inference API](reference-model-inference-api.md) with a chat completions model for chat.
 
 > [!TIP]
-> The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure AI Studio with the same code and structure, including Phi-3 chat model with vision.
+> The [Azure AI Model Inference API](reference-model-inference-api.md) allows you to talk with most models deployed in Azure Machine Learning studio with the same code and structure, including Phi-3 chat model with vision.
 
 ### Create a client to consume the model

@@ -820,7 +818,7 @@ Response<ChatCompletions> response = client.Complete(requestOptions);
 ```
 
 > [!NOTE]
-> Phi-3-vision-128k-Instruct doesn't support system messages (`role="system"`). When you use the Azure AI model inference API, system messages are translated to user messages, which is the closest capability available. This translation is offered for convenience, but it's important for you to verify that the model is following the instructions in the system message with the right level of confidence.
+> Phi-3-vision-128k-Instruct don't support system messages (`role="system"`). When you use the Azure AI model inference API, system messages are translated to user messages, which is the closest capability available. This translation is offered for convenience, but it's important for you to verify that the model is following the instructions in the system message with the right level of confidence.
 
 The response is as follows, where you can see the model's usage statistics:
@@ -1040,7 +1038,7 @@ You can learn more about the models in their respective model card:
 
 ## Prerequisites
 
-To use Phi-3 chat model with vision with Azure AI Studio, you need the following prerequisites:
+To use Phi-3 chat model with vision with Azure Machine Learning studio, you need the following prerequisites:
 
 ### A model deployment

@@ -1051,21 +1049,21 @@ Phi-3 chat model with vision can be deployed to our self-hosted managed inferenc
 For deployment to a self-hosted managed compute, you must have enough quota in your subscription. If you don't have enough quota available, you can use our temporary quota access by selecting the option **I want to use shared quota and I acknowledge that this endpoint will be deleted in 168 hours.**
 
 > [!div class="nextstepaction"]
-> [Deploy the model to managed compute](../concepts/deployments-overview.md)
+> [Deploy the model to managed compute](concept-model-catalog.md#deploy-models-for-inference-with-managed-compute)
 
 ### A REST client
 
-Models deployed with the [Azure AI model inference API](https://aka.ms/azureai/modelinference) can be consumed using any REST client. To use the REST client, you need the following prerequisites:
+Models deployed with the [Azure AI Model Inference API](reference-model-inference-api.md) can be consumed using any REST client. To use the REST client, you need the following prerequisites:
 
 * To construct the requests, you need to pass in the endpoint URL. The endpoint URL has the form `https://your-host-name.your-azure-region.inference.ai.azure.com`, where `your-host-name` is your unique model deployment host name and `your-azure-region` is the Azure region where the model is deployed (for example, `eastus2`).
 * Depending on your model deployment and authentication preference, you need either a key to authenticate against the service, or Microsoft Entra ID credentials. The key is a 32-character string.
 
 ## Work with chat completions
 
-In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.
+In this section, you use the [Azure AI Model Inference API](reference-model-inference-api.md) with a chat completions model for chat.
 
 > [!TIP]
-> The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure AI Studio with the same code and structure, including Phi-3 chat model with vision.
+> The [Azure AI Model Inference API](reference-model-inference-api.md) allows you to talk with most models deployed in Azure Machine Learning studio with the same code and structure, including Phi-3 chat model with vision.
 
 ### Create a client to consume the model
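Putting the REST prerequisites above together, a request can be assembled with the standard library. The host name, region, and key below are placeholders, and the `/chat/completions` path and `Authorization: Bearer` header layout are assumptions based on key authentication against the Azure AI Model Inference API, not details recorded in this commit:

```python
import json
import urllib.request


def build_chat_request(host_name: str, azure_region: str, key: str, messages: list[dict]) -> urllib.request.Request:
    """Assemble a POST against a managed deployment's chat completions route.

    The URL follows the endpoint shape given in the prerequisites; the
    Authorization header layout is an assumption for key authentication.
    """
    url = f"https://{host_name}.{azure_region}.inference.ai.azure.com/chat/completions"
    payload = json.dumps({"messages": messages}).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {key}",
    }
    return urllib.request.Request(url, data=payload, headers=headers, method="POST")


request = build_chat_request(
    "my-phi3-vision",  # placeholder deployment host name
    "eastus2",         # placeholder Azure region
    "0" * 32,          # the key is a 32-character string
    [{"role": "user", "content": "What is in this image?"}],
)
# urllib.request.urlopen(request) would send it; skipped here since the endpoint is a placeholder.
```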
@@ -1115,7 +1113,7 @@ The following example shows how you can create a basic chat completions request
 ```
 
 > [!NOTE]
-> Phi-3-vision-128k-Instruct doesn't support system messages (`role="system"`). When you use the Azure AI model inference API, system messages are translated to user messages, which is the closest capability available. This translation is offered for convenience, but it's important for you to verify that the model is following the instructions in the system message with the right level of confidence.
+> Phi-3-vision-128k-Instruct don't support system messages (`role="system"`). When you use the Azure AI model inference API, system messages are translated to user messages, which is the closest capability available. This translation is offered for convenience, but it's important for you to verify that the model is following the instructions in the system message with the right level of confidence.
 
 The response is as follows, where you can see the model's usage statistics:

@@ -1421,9 +1419,8 @@ It is a good practice to start with a low number of instances and scale up as ne
 
 ## Related content
 
-
-* [Azure AI Model Inference API](../reference/reference-model-inference-api.md)
-* [Deploy models as serverless APIs](deploy-models-serverless.md)
-* [Consume serverless API endpoints from a different Azure AI Studio project or hub](deploy-models-serverless-connect.md)
-* [Region availability for models in serverless API endpoints](deploy-models-serverless-availability.md)
-* [Plan and manage costs (marketplace)](costs-plan-manage.md#monitor-costs-for-models-offered-through-the-azure-marketplace)
+* [Azure AI Model Inference API](reference-model-inference-api.md)
+* [Model Catalog and Collections](concept-model-catalog.md)
+* [Deploy models as serverless API endpoints](how-to-deploy-models-serverless.md)
+* [Plan and manage costs for Azure AI Studio](concept-plan-manage-cost.md)
+* [Region availability for models in serverless API endpoints](concept-endpoint-serverless-availability.md)

articles/machine-learning/how-to-deploy-models-phi-3.md

Lines changed: 1 addition & 0 deletions
@@ -1461,6 +1461,7 @@ It is a good practice to start with a low number of instances and scale up as ne
 
 ## Related content
 
+- [Azure AI Model Inference API](reference-model-inference-api.md)
 - [Model Catalog and Collections](concept-model-catalog.md)
 - [Deploy models as serverless API endpoints](how-to-deploy-models-serverless.md)
 - [Plan and manage costs for Azure AI Studio](concept-plan-manage-cost.md)

0 commit comments