---
title: Monitor model deployments in Azure AI Foundry Models
description: Learn how to use Azure Monitor tools like Log Analytics to capture and analyze metrics and data logs for Foundry Models.
author: santiagxf
ms.author: fasantia
ms.service: azure-ai-model-inference
ms.topic: how-to
ms.date: 4/30/2025
---

# Monitor model deployments in Azure AI Foundry Models

[!INCLUDE [Feature preview](../includes/feature-preview.md)]

When you have critical applications and business processes that rely on Azure resources, you need to monitor and get alerts for your system. The Azure Monitor service collects and aggregates metrics and logs from every component of your system, including Foundry Models deployments. You can use this information to view availability, performance, and resilience, and get notifications of issues.

This document explains how you can use metrics and logs to monitor model deployments in Foundry Models.

## Prerequisites

To use monitoring capabilities for model deployments in Foundry Models, you need the following:

* An Azure AI services resource. For more information, see [Create an Azure AI Services resource](quickstart-create-resources.md).

  > [!TIP]
  > If you are using Serverless API Endpoints and you want to take advantage of the monitoring capabilities explained in this document, [migrate your Serverless API Endpoints to Foundry Models](quickstart-ai-project.md).

* At least one model deployment.

* Access to diagnostic information for the resource.

## Metrics

Azure Monitor collects metrics from Foundry Models automatically. **No configuration is required**. These metrics are:

* Stored in the Azure Monitor time-series metrics database.
* Lightweight and capable of supporting near real-time alerting.