articles/machine-learning/concept-model-management-and-deployment.md
To deploy the model as a web service, you must provide the following items:
For more information, see [Deploy models](how-to-deploy-and-where.md).
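A deployment's entry (scoring) script typically follows an `init()`/`run()` pattern: `init()` loads the model once per container, and `run()` handles each request. The sketch below is illustrative only; the stand-in model and the input schema are assumptions, not the Azure ML API:

```python
import json

MODEL = None  # populated once per container in init()


def init():
    """Called once when the service starts; load the model here.
    A real entry script would deserialize a registered model (for example
    with joblib) from the path Azure ML mounts it at. A stub is used here."""
    global MODEL
    MODEL = lambda features: sum(features)  # hypothetical stand-in model


def run(raw_data):
    """Called per request; parse JSON input, score, return a JSON-serializable result."""
    data = json.loads(raw_data)["data"]  # assumed input schema
    predictions = [MODEL(row) for row in data]
    return {"predictions": predictions}
```

Keeping model loading in `init()` rather than `run()` avoids paying the deserialization cost on every request.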
#### Controlled rollout
When deploying to Azure Kubernetes Service, you can use controlled rollout to enable the following scenarios:
* Create multiple versions of an endpoint for a deployment.
* Perform A/B testing by routing traffic to different versions of the endpoint.
* Switch between endpoint versions by updating the traffic percentage in endpoint configuration.
For more information, see [Controlled rollout of ML models](how-to-deploy-azure-kubernetes-service.md#deploy-models-to-aks-using-controlled-rollout-preview).
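The traffic-splitting idea behind controlled rollout can be sketched as a weighted router. The version names and percentages below are hypothetical, and in a real deployment the split is configured on the AKS endpoint rather than in application code:

```python
import random


def route(traffic, rng=random):
    """Pick an endpoint version according to its configured traffic percentage.

    traffic maps version name -> percentage; percentages must sum to 100.
    """
    assert sum(traffic.values()) == 100
    roll = rng.uniform(0, 100)
    cumulative = 0.0
    for version, pct in traffic.items():
        cumulative += pct
        if roll < cumulative:
            return version
    return version  # guard against the floating-point edge at exactly 100


# Hypothetical A/B split: 90% of requests to v1, 10% to the new v2.
traffic = {"model-v1": 90, "model-v2": 10}
```

Updating the percentages (for example, shifting from 90/10 to 0/100) is what switches traffic between versions without redeploying the endpoint.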
#### IoT Edge devices
You can use models with IoT devices through **Azure IoT Edge modules**. IoT Edge modules are deployed to a hardware device, which enables inference, or model scoring, on the device.
Microsoft Power BI supports using machine learning models for data analytics.
## Capture the governance data required for the end-to-end ML lifecycle
Azure ML gives you the capability to track the end-to-end audit trail of all of your ML assets by using metadata.
- Azure ML [integrates with Git](how-to-set-up-training-targets.md#gitintegration) to track which repository, branch, and commit your code came from.
- [Azure ML Datasets](how-to-create-register-datasets.md) help you track, profile, and version data.
- [Interpretability](how-to-machine-learning-interpretability.md) allows you to explain your models, meet regulatory compliance, and understand how models arrive at a result for given input.
- Azure ML Run history stores a snapshot of the code, data, and compute resources used to train a model.
- The Azure ML Model Registry captures all of the metadata associated with your model: which experiment trained it, where it is deployed, and whether its deployments are healthy.
- [Integration with Azure Event Grid](concept-event-grid-integration.md) allows you to act on events in the ML lifecycle, such as model registration, deployment, data drift, and training (run) events.
> [!TIP]
> While some information on models and datasets is automatically captured, you can add additional information by using __tags__. When looking for registered models and datasets in your workspace, you can use tags as a filter.
>
> Associating a dataset with a registered model is an optional step. For information on referencing a dataset when registering a model, see the [Model](https://docs.microsoft.com/python/api/azureml-core/azureml.core.model(class)?view=azure-ml-py) class reference.
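Tag-based filtering can be illustrated with plain Python over an assumed in-memory registry. This is a sketch of the idea only, not the Azure ML SDK; the model entries and tag names are hypothetical:

```python
def filter_by_tags(registered_models, required_tags):
    """Return models whose tags include every key/value pair in required_tags."""
    return [
        m for m in registered_models
        if all(m.get("tags", {}).get(k) == v for k, v in required_tags.items())
    ]


# Hypothetical registry entries.
models = [
    {"name": "churn", "version": 3, "tags": {"stage": "prod", "area": "sales"}},
    {"name": "churn", "version": 2, "tags": {"stage": "dev"}},
]
```

For example, filtering on `{"stage": "prod"}` selects only the production-tagged model version.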
## Notify, automate, and alert on events in the ML lifecycle
Azure ML publishes key events to Azure Event Grid, which can be used to notify and automate on events in the ML lifecycle. For more information, see [this document](how-to-use-event-grid.md).
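A consumer of these events typically dispatches on the event type. The sketch below assumes payloads shaped like Event Grid events (an `eventType` field plus a `data` object); the specific event type strings and handler logic are assumptions to verify against the current Event Grid schema docs:

```python
def on_model_registered(event):
    """Hypothetical reaction: announce the newly registered model."""
    return f"registered: {event['data'].get('modelName', '?')}"


def on_run_completed(event):
    """Hypothetical reaction: kick off follow-up automation."""
    return "run completed"


# Assumed Azure ML event type names (check the Event Grid schema reference).
HANDLERS = {
    "Microsoft.MachineLearningServices.ModelRegistered": on_model_registered,
    "Microsoft.MachineLearningServices.RunCompleted": on_run_completed,
}


def dispatch(event):
    """Route an incoming event to its handler; ignore unknown event types."""
    handler = HANDLERS.get(event["eventType"])
    return handler(event) if handler else None
```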
For more information, see [How to enable model data collection](how-to-enable-da
## Retrain your model on new data
Often, you'll want to validate your model, update it, or even retrain it from scratch, as you receive new information. Sometimes, receiving new data is an expected part of the domain. Other times, as discussed in [Detect data drift (preview) on datasets](how-to-monitor-datasets.md), model performance can degrade in the face of such things as changes to a particular sensor, natural data changes such as seasonal effects, or features shifting in their relation to other features.
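One crude way to turn such drift into a retraining decision is to compare a feature's statistics between a baseline window and a recent window and retrain when the shift crosses a threshold. This is a minimal sketch of the idea, not the dataset-monitor API; the metric and threshold are assumptions to tune:

```python
def mean_shift(baseline, recent):
    """Relative shift in the mean of a single feature between a baseline
    window and a recent window -- a crude stand-in for a drift metric."""
    base = sum(baseline) / len(baseline)
    new = sum(recent) / len(recent)
    return abs(new - base) / (abs(base) or 1.0)


def should_retrain(baseline, recent, threshold=0.25):
    """Hypothetical policy: flag retraining when the shift exceeds the threshold."""
    return mean_shift(baseline, recent) > threshold
```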
There is no universal answer to "How do I know if I should retrain?" but the Azure ML event and monitoring tools discussed previously are good starting points for automation. Once you have decided to retrain, you should: