#Customer intent: As a data scientist, I want to understand Azure Machine Learning monitoring so I can keep my machine learning models fresh and performant.
---
## Model monitoring authentication options
Azure Machine Learning model monitoring supports both credential-based and credential-less authentication to the datastore that holds the production inference data collected from your model. To configure credential-less authentication, follow these steps (a Python sketch of the first step appears after the list):
1. Create a User-Assigned Managed Identity (UAMI) and attach it to your Azure Machine Learning workspace.
1. Grant the UAMI [proper permissions](how-to-identity-based-service-authentication.md#user-assigned-managed-identity) to access your datastore.
1. Update the value of the workspace-level property `systemDatastoresAuthMode` to `'identity'`.
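
The following Python sketch shows one possible way to perform the first step with the `azure-ai-ml` SDK by attaching an existing UAMI to the workspace identity. The subscription, resource group, workspace, and identity values are placeholders; granting permissions (step 2) and setting `systemDatastoresAuthMode` (step 3) are still done through your usual workspace management tooling.

```python
# Illustrative sketch only: attach an existing user-assigned managed identity
# (UAMI) to an Azure Machine Learning workspace with the azure-ai-ml SDK.
# All subscription, resource group, workspace, and identity values are placeholders.
from azure.identity import DefaultAzureCredential
from azure.ai.ml import MLClient
from azure.ai.ml.constants import ManagedServiceIdentityType
from azure.ai.ml.entities import (
    IdentityConfiguration,
    ManagedIdentityConfiguration,
    Workspace,
)

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<WORKSPACE_NAME>",
)

# Resource ID of the UAMI you created (placeholder path).
uami_id = (
    "/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>"
    "/providers/Microsoft.ManagedIdentity/userAssignedIdentities/<UAMI_NAME>"
)

# Update the workspace identity so that it includes the UAMI alongside the
# system-assigned identity.
ws_update = Workspace(
    name="<WORKSPACE_NAME>",
    identity=IdentityConfiguration(
        type=ManagedServiceIdentityType.SYSTEM_ASSIGNED_USER_ASSIGNED,
        user_assigned_identities=[ManagedIdentityConfiguration(resource_id=uami_id)],
    ),
)
ml_client.workspaces.begin_update(ws_update).result()
```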
Alternatively, you can add credentials to the datastore where your production inference data is stored.
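
If you take the credential-based route instead, one way to register a blob datastore with stored credentials is sketched below with the `azure-ai-ml` SDK; the storage account, container, key, and datastore name are placeholders.

```python
# Illustrative sketch only: register a blob datastore with saved credentials so
# model monitoring can read the collected production inference data from it.
# The storage account, container, key, and datastore name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.ai.ml import MLClient
from azure.ai.ml.entities import AccountKeyConfiguration, AzureBlobDatastore

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<WORKSPACE_NAME>",
)

datastore = AzureBlobDatastore(
    name="model_inference_data",  # hypothetical datastore name
    account_name="<STORAGE_ACCOUNT_NAME>",
    container_name="<CONTAINER_NAME>",
    credentials=AccountKeyConfiguration(account_key="<STORAGE_ACCOUNT_KEY>"),
)
ml_client.create_or_update(datastore)
```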
To learn more about credential-less authentication with Azure Machine Learning, see [User-assigned managed identity](how-to-identity-based-service-authentication.md#user-assigned-managed-identity).
## Model monitoring limitations
Azure Machine Learning model monitoring has the following limitations:
- It doesn't support the `AllowOnlyApprovedOutbound` managed virtual network isolation setting. To learn more about managed virtual network isolation in Azure Machine Learning, see [Workspace Managed Virtual Network Isolation](how-to-managed-network.md).
- It depends on `Spark` to compute metrics over large-scale datasets. Because `MLTable` isn't well supported by `Spark`, it's best to avoid using `MLTable` with model monitoring jobs whenever possible. Only basic `MLTable` files have guaranteed support. For complex or custom operations, consider using the `Spark` API directly in your code, as in the sketch that follows this list.
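
For instance, rather than describing the collected data with a complex `MLTable`, a custom preprocessing or metric step can read the underlying files directly with the Spark API. The following PySpark sketch assumes the data collector wrote JSONL files to a hypothetical datastore path; the column name used for deduplication is only an example.

```python
# Illustrative sketch only: read collected production inference data directly
# with the Spark API instead of going through a complex MLTable definition.
# The datastore path and the "correlationid" column are example values.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read the JSONL files written to the datastore by the production data collector.
df = spark.read.json(
    "azureml://datastores/workspaceblobstore/paths/model-inference-data/*.jsonl"
)

# Apply custom operations that a basic MLTable definition can't express.
df = df.dropna(subset=["correlationid"]).dropDuplicates(["correlationid"])
df.show(10, truncate=False)
```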