
Commit 3ee90b1

update studio tab for model performance metrics
1 parent c79b7b3 commit 3ee90b1

6 files changed: 33 additions, 30 deletions

articles/machine-learning/how-to-monitor-model-performance.md

Lines changed: 33 additions & 30 deletions
@@ -18,7 +18,7 @@ ms.custom: devplatv2, update-code
 [!INCLUDE [dev v2](includes/machine-learning-dev-v2.md)]
-Learn to use Azure Machine Learning's model monitoring to continuously track the performance of machine learning models in production. Model monitoring provides you with a broad view of monitoring signals and alert you to potential issues. By monitoring models in production, you can critically evaluate the inherent risks associated with them and identify blind spots that could adversely affect your business.
+Learn to use Azure Machine Learning's model monitoring to continuously track the performance of machine learning models in production. Model monitoring provides you with a broad view of monitoring signals and alerts you to potential issues. When you monitor signals and performance metrics of models in production, you can critically evaluate the inherent risks associated with them and identify blind spots that could adversely affect your business.
 In this article, you learn to perform the following tasks:

@@ -399,13 +399,15 @@ To set up advanced monitoring:
 1. Select **Add** to open the **Edit Signal** window.
 1. Select **Feature attribution drift (preview)** to configure the feature attribution drift signal as follows:
-    1. Select the production data asset with your model inputs and the desired lookback window size.
-    1. Select the production data asset with your model outputs.
-    1. Select the common column between these data assets to join them on. If the data was collected with the [data collector](how-to-collect-production-data.md), the common column is `correlationid`.
-    1. (Optional) If you used the data collector to collect data where your model inputs and outputs are already joined, select the joined dataset as your production data asset and **Remove** step 2 in the configuration panel.
-    1. Select your training dataset to use as the reference dataset.
-    1. Select the target (output) column for your training dataset.
-    1. Select your preferred metric and threshold.
+    1. In step 1, select the production data asset that has your model inputs.
+       - Also, select the desired lookback window size.
+    1. In step 2, select the production data asset that has your model outputs.
+       - Also, select the common column between these data assets to join them on. If the data was collected with the [data collector](how-to-collect-production-data.md), the common column is `correlationid`.
+    1. (Optional) If you used the data collector to collect data where your model inputs and outputs are already joined, select the joined dataset as your production data asset (in step 1).
+       - Also, **Remove** step 2 in the configuration panel.
+    1. In step 3, select your training dataset to use as the reference dataset.
+       - Also, select the target (output) column for your training dataset.
+    1. In step 4, select your preferred metric and threshold.
 :::image type="content" source="media/how-to-monitor-models/model-monitoring-configure-feature-attribution-drift.png" alt-text="Screenshot showing how to configure feature attribution drift signal." lightbox="media/how-to-monitor-models/model-monitoring-configure-feature-attribution-drift.png":::
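The join that the signal configuration describes, matching separately collected model inputs and outputs on the common `correlationid` column, can be pictured with a minimal plain-Python sketch (the IDs, feature names, and values here are hypothetical, and the monitor performs this join for you):

```python
# Model inputs and model outputs logged as separate data assets; each row
# carries the correlationid that the data collector assigns.
inputs = [
    {"correlationid": "c1", "amount": 250.0, "merchant": "store_a"},
    {"correlationid": "c2", "amount": 12.5, "merchant": "store_b"},
]
outputs = [
    {"correlationid": "c1", "is_fraud": 1},
    {"correlationid": "c2", "is_fraud": 0},
]

# Join the two data assets on the common column.
outputs_by_id = {row["correlationid"]: row for row in outputs}
joined = [
    {**inp, **outputs_by_id[inp["correlationid"]]}
    for inp in inputs
    if inp["correlationid"] in outputs_by_id
]
```

If your data was already joined at collection time, this step is unnecessary, which is why the configuration lets you remove step 2 in that case.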

@@ -455,16 +457,16 @@ You must satisfy the following requirements for you to configure your model perf
 * (Optional) Have a pre-joined tabular dataset with model outputs and ground truth data already joined together.
-### When should you monitor model performance?
+### Example scenario for monitoring model performance
-The following example describes model performance monitoring for a scenario when you deploy a model to predict whether credit card transactions are fraudulent or not.
+To understand the concepts associated with model performance monitoring, consider this example. Suppose you're deploying a model to predict whether credit card transactions are fraudulent or not. Follow these steps to monitor the model's performance:
-1. Suppose your deployment uses the data collector to collect the model's production inference data (input and output data), and the output data is stored in a column `is_fraud`.
-1. For each row of the collected inference data, you log a unique ID. The unique ID can come from your application, or you can use the `correlationid` that Azure Machine Learning uniquely generates for each logged JSON object.
+1. Configure your deployment to use the data collector to collect the model's production inference data (input and output data). Let's say that the output data is stored in a column `is_fraud`.
+1. For each row of the collected inference data, log a unique ID. The unique ID can come from your application, or you can use the `correlationid` that Azure Machine Learning uniquely generates for each logged JSON object.
 1. Later, when the ground truth (or actual) `is_fraud` data becomes available, it also gets logged and mapped to the same unique ID that was logged with the model's outputs.
 1. This ground truth `is_fraud` data is also collected, maintained, and registered to Azure Machine Learning as a data asset.
-1. You can then create a model performance monitoring signal that joins the model's production inference and ground truth data assets, using the unique ID columns.
-1. Finally, you can compute the model performance metrics.
+1. Create a model performance monitoring signal that joins the model's production inference and ground truth data assets, using the unique ID columns.
+1. Finally, compute the model performance metrics.
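The scenario above boils down to joining predictions with ground truth on the unique ID and then scoring the matched rows. A minimal plain-Python sketch (the IDs and labels are hypothetical; the monitoring signal computes the metrics for you):

```python
# Model predictions from the collected inference data, keyed by correlationid.
predictions = {"a1": 1, "a2": 0, "a3": 1}
# Ground truth is_fraud labels, logged later under the same unique IDs.
actuals = {"a1": 1, "a2": 1, "a3": 1}

# Join on the shared unique ID, then compute a performance metric (accuracy)
# over only the rows present in both data assets.
common_ids = predictions.keys() & actuals.keys()
matched = [(predictions[k], actuals[k]) for k in common_ids]
accuracy = sum(pred == actual for pred, actual in matched) / len(matched)
```

In this example two of the three matched predictions agree with the ground truth, so the accuracy is 2/3; precision, recall, or other metrics follow the same join-then-score pattern.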
# [Azure CLI](#tab/azure-cli)

@@ -667,39 +669,40 @@ created_monitor = poller.result()
 To set up model performance monitoring:
-1. Complete the entires on the **Basic settings** page as described earlier in the [Set up out-of-box model monitoring](#set-up-out-of-box-model-monitoring) section.
+1. Complete the entries on the **Basic settings** page as described earlier in the [Set up out-of-box model monitoring](#set-up-out-of-box-model-monitoring) section.
 1. Select **Next** to open the **Configure data asset** page of the **Advanced settings** section.
-1. **Add** a dataset to be used as the ground truth dataset. Ensure that your model outputs is also included. The ground truth dataset you add should have a unique ID for each row which matches a unique ID for each row in the model outputs. This is necessary for them to be joined together prior to metric computation.
+1. Select **+ Add** to add a dataset for use as the ground truth dataset. Ensure that your model outputs dataset is also included in the list of added datasets. The ground truth dataset you add should have a unique ID column. The values in the unique ID columns for the ground truth dataset and the model outputs dataset must match in order for the two datasets to be joined prior to metric computation.
-    NOTE: This screenshot should show adding model_outputs and the ground truth data asset
+    :::image type="content" source="media/how-to-monitor-models/model-monitoring-advanced-config-data2.png" alt-text="Screenshot showing how to add datasets for the monitoring signals to use." lightbox="media/how-to-monitor-models/model-monitoring-advanced-config-data2.png":::
-    :::image type="content" source="media/how-to-monitor-models/model-monitoring-advanced-config-data.png" alt-text="Screenshot showing how to add datasets for the monitoring signals to use." lightbox="media/how-to-monitor-models/model-monitoring-advanced-config-data.png":::
+    :::image type="content" source="media/how-to-monitor-models/model-monitoring-added-ground-truth-dataset.png" alt-text="Screenshot showing the ground truth dataset and the model outputs and inputs datasets for the monitoring signals to connect to." lightbox="media/how-to-monitor-models/model-monitoring-added-ground-truth-dataset.png":::
 1. Select **Next** to go to the **Select monitoring signals** page. On this page, you will see some monitoring signals already added (if you selected an Azure Machine Learning online deployment earlier).
+1. Delete the existing monitoring signals on the page, since you're only interested in creating a model performance monitoring signal.
 1. Select **Add** to open the **Edit Signal** window.
 1. Select **Model performance (preview)** to configure the model performance signal as follows:
-    1. Select the production data asset with your model outputs and the desired lookback window size and lookback window offset. Select the appropriate target column (for example, `is_fraud`).
-    1. Select the reference data asset, which should be the ground truth data asset you added earlier. Select the appropriate target_column. Select column to join with the model outputs. This column should be the column which is common between the two datasets and is a unique ID for each for (for example, `correlationid`).
-    1. Select your desired performance metrics and the respective thresholds.
+    1. In step 1, for the production data asset, select your model outputs dataset. Also, make the following selections:
+       - Select the appropriate target column (for example, `is_fraud`).
+       - Select the desired lookback window size and lookback window offset.
+    1. In step 2, for the reference data asset, select the ground truth data asset that you added earlier. Also, make the following selections:
+       - Select the appropriate target column.
+       - Select the column on which to perform the join with the model outputs dataset. The join column should be the column that is common between the two datasets and that holds a unique ID for each row (for example, `correlationid`).
+    1. In step 3, select your desired performance metrics and specify their respective thresholds.
 NOTE: This screenshot should show a fully configured model performance view
-    :::image type="content" source="media/how-to-monitor-models/model-monitoring-configure-feature-attribution-drift.png" alt-text="Screenshot showing how to configure feature attribution drift signal." lightbox="media/how-to-monitor-models/model-monitoring-configure-feature-attribution-drift.png":::
+    :::image type="content" source="media/how-to-monitor-models/model-monitoring-configure-model-performance.png" alt-text="Screenshot showing how to configure a model performance signal." lightbox="media/how-to-monitor-models/model-monitoring-configure-model-performance.png":::
 1. Select **Save** to return to the **Select monitoring signals** page.
-    NOTE: This screenshot should show the model performance signal configured
-    :::image type="content" source="media/how-to-monitor-models/model-monitoring-configured-signals.png" alt-text="Screenshot showing the configured signals." lightbox="media/how-to-monitor-models/model-monitoring-configured-signals.png":::
+    :::image type="content" source="media/how-to-monitor-models/model-monitoring-configured-model-performance-signal.png" alt-text="Screenshot showing the configured model performance signal." lightbox="media/how-to-monitor-models/model-monitoring-configured-model-performance-signal.png":::
-1. When you're finished with your monitoring signals configuration, select **Next** to go to the **Notifications** page.
-1. On the **Notifications** page, enable alert notifications for each signal and select **Next**.
+1. Select **Next** to go to the **Notifications** page.
+1. On the **Notifications** page, enable alert notification for the model performance signal and select **Next**.
 1. Review your settings on the **Review monitoring settings** page.
-    NOTE: This screenshot should show the review page with model performance added
-    :::image type="content" source="media/how-to-monitor-models/model-monitoring-advanced-config-review.png" alt-text="Screenshot showing review page of the advanced configuration for model monitoring." lightbox="media/how-to-monitor-models/model-monitoring-advanced-config-review.png":::
+    :::image type="content" source="media/how-to-monitor-models/model-monitoring-review-monitoring-details.png" alt-text="Screenshot showing review page that includes the configured model performance signal." lightbox="media/how-to-monitor-models/model-monitoring-review-monitoring-details.png":::
 1. Select **Create** to create your model performance monitor.
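The metric-and-threshold configuration in step 3, together with the notification settings, amounts to comparing each computed metric against its threshold. A hypothetical sketch with made-up values (the actual evaluation happens inside the monitoring job):

```python
# Hypothetical thresholds configured for the model performance signal.
thresholds = {"accuracy": 0.90, "recall": 0.80}

# Metrics computed by a monitoring run over the joined data (example values).
computed = {"accuracy": 0.87, "recall": 0.85}

# A metric that falls below its threshold would trigger an alert notification.
alerts = {metric: computed[metric] < limit for metric, limit in thresholds.items()}
```

Here accuracy (0.87) falls below its 0.90 threshold and would raise an alert, while recall (0.85) stays above its 0.80 threshold and would not.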

