
Commit 77c524b

Merge pull request #5453 from cdpark/refresh-ml-designer-2
Feature 438779: Q&M: Azure ML Designer doc Freshness - batch 2
2 parents: 8b05832 + 17a5292

8 files changed: +99 −121 lines
articles/machine-learning/v1/tutorial-designer-automobile-price-deploy.md

Lines changed: 39 additions & 39 deletions
@@ -1,5 +1,5 @@
---
- title: 'Tutorial: Use designer to deploy no-code models'
+ title: Use Designer to Deploy No-Code Models
titleSuffix: Azure Machine Learning
description: Learn how to deploy a machine learning model to predict car prices with the Azure Machine Learning designer.
ms.reviewer: None
@@ -9,28 +9,28 @@ services: machine-learning
ms.service: azure-machine-learning
ms.subservice: core
ms.topic: tutorial
- ms.date: 03/04/2024
+ ms.date: 06/09/2025
ms.custom: UpdateFrequency5, designer
---

- # Tutorial: Use the designer to deploy a machine learning model
+ # Tutorial: Deploy a machine learning model using designer

[!INCLUDE [v1 deprecation](../includes/sdk-v1-deprecation.md)]

In [part one of this tutorial](tutorial-designer-automobile-price-train-score.md), you trained a linear regression model that predicts car prices. In this second part, you use the Azure Machine Learning designer to deploy the model so that others can use it.

- >[!Note]
- > The designer supports two types of components: classic prebuilt components (v1) and custom components (v2). These two types of components are NOT compatible.
+ > [!NOTE]
+ > Designer supports two types of components: classic prebuilt components (v1) and custom components (v2). These two types of components are NOT compatible.
>
- >*Classic prebuilt components* provide prebuilt components mainly for data processing and traditional machine learning tasks like regression and classification. This type of component continues to be supported but no new components will be added.
+ >Classic prebuilt components are intended primarily for data processing and traditional machine learning tasks like regression and classification. This type of component continues to be supported, but no new components will be added.
>
- >*Custom components* allow you to wrap your own code as a component. They support sharing components across workspaces and seamless authoring across Machine Learning Studio, CLI v2, and SDK v2 interfaces.
+ >Custom components allow you to wrap your own code as a component. They support sharing components across workspaces and seamless authoring across Studio, CLI v2, and SDK v2 interfaces.
>
- >For new projects, we advise that you use custom components, which are compatible with Azure Machine Learning v2 and will keep receiving new updates.
+ >For new projects, we recommend that you use custom components, which are compatible with Azure Machine Learning v2 and will continue to receive new updates.
>
>This article applies to classic prebuilt components and isn't compatible with CLI v2 and SDK v2.

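For orientation on the custom components mentioned in the note: a custom component is typically defined in a local YAML spec and registered to the workspace with the v2 SDK. The following is a minimal sketch only; this tutorial itself uses classic prebuilt components, and the workspace values and the `train_component.yml` spec file are placeholders, not assets from this tutorial.

```python
# Illustrative sketch: registering a custom component with the v2 SDK (azure-ai-ml).
# Workspace values and the local YAML spec are placeholders.
from azure.ai.ml import MLClient, load_component
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Load a component definition from a hypothetical local YAML spec.
train_component = load_component(source="./train_component.yml")

# Register it so it can be shared and reused across the workspace.
registered = ml_client.components.create_or_update(train_component)
print(registered.name, registered.version)
```
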
- In this tutorial, you:
+ In this tutorial, you learn how to:

> [!div class="checklist"]
> * Create a real-time inference pipeline.
@@ -43,7 +43,7 @@ In this tutorial, you:
Complete [part one of the tutorial](tutorial-designer-automobile-price-train-score.md) to learn how to train and score a machine learning model in the designer.

> [!IMPORTANT]
- > If you don't see graphical elements mentioned in this document, such as buttons in studio or designer, you might not have the right level of permissions to the workspace. Please contact your Azure subscription administrator to verify that you have been granted the correct level of access. For more information, see [Manage users and roles](../how-to-assign-roles.md).
+ > If you don't see graphical elements mentioned in this document, such as buttons in studio or designer, you might not have the right level of permissions to the workspace. Contact your Azure subscription administrator to verify that you have been granted the correct level of access. For more information, see [Manage users and roles](../how-to-assign-roles.md).

## Create a real-time inference pipeline

@@ -54,7 +54,7 @@ To deploy your pipeline, you must first convert the training pipeline into a rea

### Create a real-time inference pipeline

- 1. Select **Pipelines** from the side navigation panel, then open the pipeline job that you created. On the detail page, above the pipeline canvas, select the ellipses **...** then choose **Create inference pipeline** > **Real-time inference pipeline**.
+ 1. Select **Jobs** from the sidebar menu, then open the pipeline job that you created. On the detail page, above the pipeline canvas, select the ellipsis (**...**), and then choose **Create inference pipeline** > **Real-time inference pipeline**.

:::image type="content" source="media/tutorial-designer-automobile-price-deploy/create-real-time-inference.png" alt-text="Screenshot of create inference pipeline in pipeline job detail page." lightbox="media/tutorial-designer-automobile-price-deploy/create-real-time-inference.png":::

@@ -81,24 +81,24 @@ To deploy your pipeline, you must first convert the training pipeline into a rea

1. Select **Deploy** in the job detail page.

- :::image type="content" source="./media/tutorial-designer-automobile-price-deploy/deploy-in-job-detail-page.png" alt-text="Screenshot showing deploying in job detail page.":::
+ :::image type="content" source="./media/tutorial-designer-automobile-price-deploy/deploy-in-job-detail-page.png" alt-text="Screenshot showing deploying in job detail page." lightbox="./media/tutorial-designer-automobile-price-deploy/deploy-in-job-detail-page.png":::

## Create an inferencing cluster

In the dialog box that appears, you can select from any existing Azure Kubernetes Service (AKS) clusters to deploy your model to. If you don't have an AKS cluster, use the following steps to create one.

- 1. Go to the **Compute** page by selecting **Compute** in the dialog box.
+ 1. Go to the **Compute** page by selecting **Compute** in the sidebar menu.

- 1. On the navigation ribbon, select **Kubernetes Clusters** > **+ New**.
+ 1. On the navigation ribbon, select **Kubernetes Clusters**. Then select **+ New** > **AksCompute**.

:::image type="content" source="./media/tutorial-designer-automobile-price-deploy/new-inference-cluster.png" alt-text="Screenshot showing how to get to the new inference cluster pane.":::

1. In the inference cluster pane, configure a new Kubernetes Service.

- 1. Enter *aks-compute* for the **Compute name**.
1. For **Region**, select a nearby region that's available.

+ 1. On the next screen, enter *aks-compute* for the **Compute name**.

1. Select **Create**.

> [!NOTE]
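The same AKS inference cluster can also be provisioned programmatically. A minimal sketch with the v1 Python SDK (azureml-core), assuming a workspace `config.json` is available locally; the cluster name and region here are placeholders:

```python
# Sketch: provision an AKS inference cluster with the v1 SDK (azureml-core).
# Workspace config, cluster name, and region are placeholders.
from azureml.core import Workspace
from azureml.core.compute import AksCompute, ComputeTarget

ws = Workspace.from_config()  # reads config.json for the workspace

# Default AKS provisioning; pick a nearby region that's available.
prov_config = AksCompute.provisioning_configuration(location="eastus")

aks_target = ComputeTarget.create(
    workspace=ws,
    name="aks-compute",
    provisioning_configuration=prov_config,
)
aks_target.wait_for_completion(show_output=True)
```
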
@@ -116,19 +116,19 @@ After your AKS service finishes provisioning, return to the real-time inferencin

:::image type="content" source="./media/tutorial-designer-automobile-price-deploy/setup-endpoint.png" alt-text="Screenshot showing how to set up a new real-time endpoint.":::

- You can also change the **Advanced** setting for your real-time endpoint.
+ You can also change the **Advanced** settings for your real-time endpoint.

|Advanced setting|Description|
|---|---|
- |Enable Application Insights diagnostics and data collection| Allows Azure Application Insights to collect data from the deployed endpoints. <br> By default: false. |
- |Scoring timeout| A timeout in milliseconds to enforce for scoring calls to the web service. <br> By default: 60000.|
- |Auto scale enabled| Allows autoscaling for the web service.<br> By default: true.|
- |Min replicas| The minimum number of containers to use when autoscaling this web service.<br> By default: 1.|
- |Max replicas| The maximum number of containers to use when autoscaling this web service.<br> By default: 10.|
- |Target utilization|The target utilization (as a percentage) that the autoscaler should attempt to maintain for this web service.<br> By default: 70.|
- |Refresh period|How often (in seconds) the autoscaler attempts to scale this web service.<br> By default: 1.|
- |CPU reserve capacity|The number of CPU cores to allocate for this web service.<br> By default: 0.1.|
- |Memory reserve capacity|The amount of memory (in GB) to allocate for this web service.<br> By default: 0.5.|
+ |Enable Application Insights diagnostics and data collection| Allows Azure Application Insights to collect data from the deployed endpoints. <br> By default: false |
+ |Scoring timeout| A timeout in milliseconds to enforce for scoring calls to the web service. <br> By default: 60000|
+ |Auto scale enabled| Allows autoscaling for the web service.<br> By default: true|
+ |Min replicas| The minimum number of containers to use when autoscaling this web service.<br> By default: 1|
+ |Max replicas| The maximum number of containers to use when autoscaling this web service.<br> By default: 10|
+ |Target utilization|The target utilization (as a percentage) that the autoscaler should attempt to maintain for this web service.<br> By default: 70|
+ |Refresh period|How often (in seconds) the autoscaler attempts to scale this web service.<br> By default: 1|
+ |CPU reserve capacity|The number of CPU cores to allocate for this web service.<br> By default: 0.1|
+ |Memory reserve capacity|The amount of memory (in GB) to allocate for this web service.<br> By default: 0.5|

1. Select **Deploy**.

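For reference, these Advanced settings correspond closely to the AKS deployment configuration options in the v1 Python SDK. The designer UI sets them for you, so the following is an illustrative sketch only, mirroring the defaults listed in the table:

```python
# Sketch: the Advanced settings above, expressed as a v1 SDK AKS deployment configuration.
# Values mirror the defaults listed in the table.
from azureml.core.webservice import AksWebservice

deployment_config = AksWebservice.deploy_configuration(
    enable_app_insights=False,        # Application Insights diagnostics and data collection
    scoring_timeout_ms=60000,         # Scoring timeout (milliseconds)
    autoscale_enabled=True,           # Auto scale enabled
    autoscale_min_replicas=1,         # Min replicas
    autoscale_max_replicas=10,        # Max replicas
    autoscale_target_utilization=70,  # Target utilization (%)
    autoscale_refresh_seconds=1,      # Refresh period (seconds)
    cpu_cores=0.1,                    # CPU reserve capacity
    memory_gb=0.5,                    # Memory reserve capacity
)
```
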
@@ -144,13 +144,13 @@ After your AKS service finishes provisioning, return to the real-time inferencin

After deployment finishes, you can view your real-time endpoint by going to the **Endpoints** page.

- 1. On the **Endpoints** page, select the endpoint you deployed.
+ 1. Select **Endpoints** on the sidebar menu, then select the endpoint you deployed.

- In the **Details** tab, you can see more information such as the REST URI, Swagger definition, status, and tags.
+ - In the **Details** tab, you can see more information such as the REST URI, Swagger definition, status, and tags.

- In the **Consume** tab, you can find sample consumption code, security keys, and set authentication methods.
+ - In the **Consume** tab, you can find sample consumption code and security keys, and set authentication methods.

- In the **Deployment logs** tab, you can find the detailed deployment logs of your real-time endpoint.
+ - In the **Deployment logs** tab, you can find the detailed deployment logs of your real-time endpoint.

1. To test your endpoint, go to the **Test** tab. From here, you can enter test data and select **Test** to verify the output of your endpoint.

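Beyond the **Test** tab, the endpoint can be called over REST from your own code. A minimal sketch, assuming the scoring URI and key are copied from the **Consume** tab; the input column names shown are placeholders and must match your pipeline's **Web Service Input**:

```python
# Sketch: call the deployed real-time endpoint over REST.
# scoring_uri and key come from the endpoint's Consume tab; payload columns are placeholders.
import json
import urllib.request

scoring_uri = "<scoring-uri-from-consume-tab>"
key = "<primary-key-from-consume-tab>"

payload = {
    "Inputs": {
        "WebServiceInput0": [
            {"make": "toyota", "horsepower": 95, "engine-size": 110}  # placeholder columns
        ]
    }
}

request = urllib.request.Request(
    scoring_uri,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {key}",
    },
)

with urllib.request.urlopen(request) as response:
    print(json.loads(response.read()))
```
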
@@ -166,35 +166,35 @@ You can update the online endpoint with new model trained in the designer. On th

1. After you submit the modified training pipeline, go to the job detail page.

- 1. When the job completes, right click **Train Model** and select **Register data**.
+ 1. When the job completes, right-click **Train Model** and select **Register data**.

:::image type="content" source="media/how-to-run-batch-predictions-designer/register-train-model-as-dataset.png" alt-text="Screenshot showing register trained model as dataset." lightbox="media/how-to-run-batch-predictions-designer/register-train-model-as-dataset.png":::

- Input name and select **File** type.
+ Enter a name and select the **File** type.

- :::image type="content" source="./media/how-to-run-batch-predictions-designer/register-train-model-as-dataset-2.png" alt-text="Screenshot of register as a data asset with new data asset selected.":::
+ :::image type="content" source="./media/how-to-run-batch-predictions-designer/register-train-model-as-dataset-2.png" alt-text="Screenshot of register as a data asset with new data asset selected." lightbox="./media/how-to-run-batch-predictions-designer/register-train-model-as-dataset-2.png":::

- 1. After the dataset registers successfully, open your inference pipeline draft, or clone the previous inference pipeline job into a new draft. In the inference pipeline draft, replace the previous trained model shown as **MD-XXXX** node connected to the **Score Model** component with the newly registered dataset.
+ 1. After the dataset registers successfully, open your inference pipeline draft, or clone the previous inference pipeline job into a new draft. In the inference pipeline draft, replace the previously trained model, shown as the **MD-xxxx** node connected to the **Score Model** component, with the newly registered dataset.

:::image type="content" source="media/tutorial-designer-automobile-price-deploy/modify-inference-pipeline.png" alt-text="Screenshot showing how to modify inference pipeline." lightbox="media/tutorial-designer-automobile-price-deploy/modify-inference-pipeline.png":::

- 1. If you need to update the data preprocessing part in your training pipeline, and would like to update that into the inference pipeline, the processing is similar as steps above.
+ 1. If you need to update the data preprocessing part of your training pipeline and want to carry that update into the inference pipeline, the process is similar to the preceding steps.

You just need to register the transformation output of the transformation component as a dataset.

- Then, manually replace the **TD-** component in the inference pipeline with the registered dataset.
+ Then, manually replace the **TD-xxxx** component in the inference pipeline with the registered dataset.

:::image type="content" source="media/tutorial-designer-automobile-price-deploy/replace-td-module.png" alt-text="Screenshot showing how to replace transformation component." lightbox="media/tutorial-designer-automobile-price-deploy/replace-td-module.png":::

1. After modifying your inference pipeline with the newly trained model or transformation, submit it. When the job completes, deploy it to the existing online endpoint that you deployed previously.

- :::image type="content" source="./media/tutorial-designer-automobile-price-deploy/deploy-to-existing-endpoint.png" alt-text="Screenshot showing how to replace existing real-time endpoint.":::
+ :::image type="content" source="./media/tutorial-designer-automobile-price-deploy/deploy-to-existing-endpoint.png" alt-text="Screenshot showing how to replace existing real-time endpoint." lightbox="./media/tutorial-designer-automobile-price-deploy/deploy-to-existing-endpoint.png":::

## Limitations

- * Due to datastore access limitation, if your inference pipeline contains **Import Data** or **Export Data** components, they're auto-removed when deployed to real-time endpoint.
+ * Because of a datastore access limitation, if your inference pipeline contains **Import Data** or **Export Data** components, they're automatically removed when deployed to a real-time endpoint.

- * If you have datasets in the real-time inference pipeline and want to deploy it to real-time endpoint, currently this flow only supports datasets registered from **Blob** datastore. If you want to use datasets from other type datastores, you can use **Select Column** to connect with your initial dataset with settings of selecting all columns, register the outputs of **Select Column** as File dataset and then replace the initial dataset in the real-time inference pipeline with this newly registered dataset.
+ * If your real-time inference pipeline contains datasets and you want to deploy it to a real-time endpoint, this flow currently supports only datasets registered from a **Blob** datastore. To use datasets from other datastore types, use **Select Column** to connect to your initial dataset and select all columns, register the output of **Select Column** as a File dataset, and then replace the initial dataset in the real-time inference pipeline with this newly registered dataset.

* If your inference graph contains an **Enter Data Manually** component that isn't connected to the same port as the **Web Service Input** component, the **Enter Data Manually** component isn't executed during HTTP call processing. A workaround is to register the outputs of that **Enter Data Manually** component as a dataset, then in the inference pipeline draft, replace the **Enter Data Manually** component with the registered dataset.
