
Commit 91758aa

Merge pull request #110984 from Blackmist/model-management-updates
updates from customer feedback.
2 parents 7d09ef3 + 29252d7

File tree

1 file changed: +21 −3 lines changed

articles/machine-learning/concept-model-management-and-deployment.md

Lines changed: 21 additions & 3 deletions
@@ -119,6 +119,16 @@ To deploy the model as a web service, you must provide the following items:
 
 For more information, see [Deploy models](how-to-deploy-and-where.md).
 
+#### Controlled rollout
+
+When deploying to Azure Kubernetes Service, you can use controlled rollout to enable the following scenarios:
+
+* Create multiple versions of an endpoint for a deployment.
+* Perform A/B testing by routing traffic to different versions of the endpoint.
+* Switch between endpoint versions by updating the traffic percentage in endpoint configuration.
+
+For more information, see [Controlled rollout of ML models](how-to-deploy-azure-kubernetes-service.md#deploy-models-to-aks-using-controlled-rollout-preview).
+
 #### IoT Edge devices
 
 You can use models with IoT devices through **Azure IoT Edge modules**. IoT Edge modules are deployed to a hardware device, which enables inference, or model scoring, on the device.
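The traffic-percentage routing behind the controlled-rollout feature added above can be pictured with a small, self-contained sketch. This is generic illustration code, not the Azure ML SDK (which exposes this through `AksEndpoint` version configuration); the `route_request` function, the version names, and the 90/10 split below are all invented for the example:

```python
import hashlib

def route_request(request_id: str, traffic: dict) -> str:
    """Pick an endpoint version for a request based on traffic percentages.

    `traffic` maps version name -> percentage; percentages must sum to 100.
    A stable hash of the request id keeps routing deterministic per request,
    so the same caller always lands on the same version between config updates.
    """
    if sum(traffic.values()) != 100:
        raise ValueError("traffic percentages must sum to 100")
    # Map the request id to a stable bucket in [0, 100).
    digest = hashlib.md5(request_id.encode()).digest()
    bucket = int.from_bytes(digest[:8], "big") % 100
    cumulative = 0
    for version, pct in traffic.items():
        cumulative += pct
        if bucket < cumulative:
            return version
    return version  # unreachable when percentages sum to 100

# Hypothetical 90/10 A/B split between two endpoint versions.
traffic = {"version-1": 90, "version-2": 10}
counts = {"version-1": 0, "version-2": 0}
for i in range(1000):
    counts[route_request(f"req-{i}", traffic)] += 1
```

Switching between versions then amounts to updating the percentages in `traffic` and letting new requests hash into the new buckets, which mirrors the "update the traffic percentage in endpoint configuration" scenario.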
@@ -131,12 +141,20 @@ Microsoft Power BI supports using machine learning models for data analytics. Fo
 
 ## Capture the governance data required for capturing the end-to-end ML lifecycle
 
-Azure ML gives you the capability to track the end-to-end audit trail of all of your ML assets. Specifically:
+Azure ML gives you the capability to track the end-to-end audit trail of all of your ML assets by using metadata.
 
 - Azure ML [integrates with Git](how-to-set-up-training-targets.md#gitintegration) to track information on which repository / branch / commit your code came from.
-- [Azure ML Datasets](how-to-create-register-datasets.md) help you track, profile, and version data.
+- [Azure ML Datasets](how-to-create-register-datasets.md) help you track, profile, and version data.
+- [Interpretability](how-to-machine-learning-interpretability.md) allows you to explain your models, meet regulatory compliance, and understand how models arrive at a result for given input.
 - Azure ML Run history stores a snapshot of the code, data, and computes used to train a model.
 - The Azure ML Model Registry captures all of the metadata associated with your model (which experiment trained it, where it is being deployed, if its deployments are healthy).
+- [Integration with Azure Event Grid](concept-event-grid-integration.md) allows you to act on events in the ML lifecycle. For example, model registration, deployment, data drift, and training (run) events.
+
+> [!TIP]
+> While some information on models and datasets is automatically captured, you can add additional information by using __tags__. When looking for registered models and datasets in your workspace, you can use tags as a filter.
+>
+> Associating a dataset with a registered model is an optional step. For information on referencing a dataset when registering a model, see the [Model](https://docs.microsoft.com/python/api/azureml-core/azureml.core.model(class)?view=azure-ml-py) class reference.
+
 
 ## Notify, automate, and alert on events in the ML lifecycle
 Azure ML publishes key events to Azure EventGrid, which can be used to notify and automate on events in the ML lifecycle. For more information, please see [this document](how-to-use-event-grid.md).
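The tags-as-filter tip in the governance section above can be pictured with plain data. This is an illustrative sketch, not the SDK call itself (in the real SDK, `Model.list` accepts a `tags` filter); the `filter_by_tags` helper and the metadata records below are invented for the example:

```python
def filter_by_tags(records, required_tags):
    """Return records whose 'tags' dict contains every key/value in required_tags."""
    return [
        r for r in records
        if all(r.get("tags", {}).get(k) == v for k, v in required_tags.items())
    ]

# Hypothetical registered-model metadata, as a registry might expose it.
models = [
    {"name": "churn-model", "version": 3, "tags": {"stage": "prod", "team": "crm"}},
    {"name": "churn-model", "version": 2, "tags": {"stage": "archived", "team": "crm"}},
    {"name": "forecast", "version": 1, "tags": {"stage": "prod", "team": "ops"}},
]

# Narrow the registry down to everything tagged as production.
prod_models = filter_by_tags(models, {"stage": "prod"})
```

Filtering on multiple tags (for example `{"stage": "prod", "team": "crm"}`) narrows the result further, which is the workflow the tip describes for locating registered models and datasets in a workspace.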
@@ -152,7 +170,7 @@ For more information, see [How to enable model data collection](how-to-enable-da
 
 ## Retrain your model on new data
 
-Often, you'll want to update your model, or even retrain it from scratch, as you receive new information. Sometimes, receiving new data is an expected part of the domain. Other times, as discussed in [Detect data drift (preview) on datasets](how-to-monitor-datasets.md), model performance can degrade in the face of such things as changes to a particular sensor, natural data changes such as seasonal effects, or features shifting in their relation to other features.
+Often, you'll want to validate your model, update it, or even retrain it from scratch, as you receive new information. Sometimes, receiving new data is an expected part of the domain. Other times, as discussed in [Detect data drift (preview) on datasets](how-to-monitor-datasets.md), model performance can degrade in the face of such things as changes to a particular sensor, natural data changes such as seasonal effects, or features shifting in their relation to other features.
 
 There is no universal answer to "How do I know if I should retrain?" but Azure ML event and monitoring tools previously discussed are good starting points for automation. Once you have decided to retrain, you should:
 
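The retrain-decision idea in the hunk above can be illustrated with a toy mean-shift check on a single feature. This is not the Azure ML data drift implementation, just a sketch of the kind of signal that might trigger retraining; the `mean_shift_score` function, sample readings, and threshold are all invented:

```python
from statistics import mean, stdev

def mean_shift_score(baseline, current):
    """How many baseline standard deviations the current mean has moved."""
    if len(baseline) < 2:
        raise ValueError("need at least two baseline samples")
    spread = stdev(baseline)
    if spread == 0:
        return 0.0 if mean(current) == mean(baseline) else float("inf")
    return abs(mean(current) - mean(baseline)) / spread

# Example: a sensor whose readings have drifted upward since training.
baseline = [10.0, 10.5, 9.8, 10.2, 10.1, 9.9]
current = [12.1, 12.4, 11.9, 12.2]

score = mean_shift_score(baseline, current)
should_retrain = score > 3.0  # illustrative threshold, tune per domain
```

A real drift monitor would look at distributions rather than means and at many features at once, but the shape is the same: compare new data against a baseline and alert when the gap crosses a threshold.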
