articles/machine-learning/how-to-deploy-mlflow-models-online-endpoints.md
+59 −75 lines changed: 59 additions & 75 deletions
@@ -33,7 +33,7 @@ For no-code-deployment, Azure Machine Learning:

## About the example

-The example shows how you can deploy an MLflow model to an online endpoint to perform predictions. The example uses an MLflow model that's based on the [Diabetes dataset](https://www4.stat.ncsu.edu/~boos/var.select/diabetes.html). This dataset contains ten baseline variables: age, sex, body mass index, average blood pressure, and six blood serum measurements obtained from 442 diabetes patients. It also contains the response of interest, a quantitative measure of disease progression one year after baseline.
+The example shows how you can deploy an MLflow model to an online endpoint to perform predictions. The example uses an MLflow model that's based on the [Diabetes dataset](https://www4.stat.ncsu.edu/~boos/var.select/diabetes.html). This dataset contains 10 baseline variables: age, sex, body mass index, average blood pressure, and six blood serum measurements obtained from 442 diabetes patients. It also contains the response of interest, a quantitative measure of disease progression one year after baseline.

The model was trained using a `scikit-learn` regressor, and all the required preprocessing has been packaged as a pipeline, making this model an end-to-end pipeline that goes from raw data to predictions.

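As an illustrative sketch only (the article doesn't show its training code), a model like this might be produced by fitting a small scikit-learn pipeline on the Diabetes dataset and logging it with MLflow; the run setup and the `model` artifact path are assumptions:

```python
import mlflow
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Train an end-to-end pipeline (preprocessing + regressor) on the Diabetes dataset.
X, y = load_diabetes(return_X_y=True, as_frame=True)
pipeline = Pipeline([("scale", StandardScaler()), ("regressor", Ridge())])
pipeline.fit(X, y)

# Log the whole pipeline as an MLflow model, so raw features go in and predictions come out.
with mlflow.start_run():
    mlflow.sklearn.log_model(pipeline, artifact_path="model")
```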
@@ -56,7 +56,7 @@ Before following the steps in this article, make sure you have the following pre
- Azure role-based access controls (Azure RBAC) are used to grant access to operations in Azure Machine Learning. To perform the steps in this article, your user account must be assigned the owner or contributor role for the Azure Machine Learning workspace, or a custom role allowing `Microsoft.MachineLearningServices/workspaces/onlineEndpoints/*`. For more information on roles, see [Manage access to an Azure Machine Learning workspace](how-to-assign-roles.md).

- You must have an MLflow model registered in your workspace. This article registers a model trained for the [Diabetes dataset](https://www4.stat.ncsu.edu/~boos/var.select/diabetes.html) in the workspace.

-Additionally, you need to:
+Also, you need to:

# [Azure CLI](#tab/cli)

@@ -289,7 +289,7 @@ version = registered_model.version

# [Python (MLflow SDK)](#tab/mlflow)

-We can configure the properties of this endpoint using a configuration file. In this case, we are configuring the authentication mode of the endpoint to be "key".
+You can configure the properties of this endpoint using a configuration file. In this case, you're configuring the authentication mode of the endpoint to be "key".

```python
endpoint_config = {
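    # The rest of this block falls outside the diff hunk. As a sketch only (an
    # assumption, not the article's code), the dictionary might simply continue
    # with the authentication mode the paragraph above describes:
    #     "auth_mode": "key",
    # }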
@@ -509,7 +509,7 @@ version = registered_model.version

## Invoke the endpoint

-Once your deployment is ready, you can use it to serve request. One way to test the deployment is by using the built-in invocation capability in the deployment client you are using. The following JSON is a sample request for the deployment.
+Once your deployment is ready, you can use it to serve requests. One way to test the deployment is by using the built-in invocation capability in the deployment client you're using. The following JSON is a sample request for the deployment.
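The sample JSON itself sits outside this diff hunk. As a hedged sketch, a request for the diabetes model might be built and sent like this; the `input_data` wrapper, column names, and endpoint values are assumptions for illustration:

```python
import json
import urllib.request

# Hypothetical endpoint details; take the real values from your endpoint's Consume page.
scoring_uri = "https://<endpoint-name>.<region>.inference.ml.azure.com/score"
api_key = "<endpoint-key>"

# One row with the 10 baseline variables of the Diabetes dataset (assumed column names).
payload = {
    "input_data": {
        "columns": ["age", "sex", "bmi", "bp", "s1", "s2", "s3", "s4", "s5", "s6"],
        "index": [0],
        "data": [[0.038, 0.051, 0.062, 0.022, -0.044, -0.035, -0.043, -0.003, 0.02, -0.018]],
    }
}

request = urllib.request.Request(
    scoring_uri,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json", "Authorization": f"Bearer {api_key}"},
)
with urllib.request.urlopen(request) as response:
    print(response.read().decode("utf-8"))  # the model's predictions
```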
@@ -571,59 +571,59 @@ The response will be similar to the following text:

## Customize MLflow model deployments

-MLflow models can be deployed to online endpoints without indicating a scoring script in the deployment definition. However, you can opt to customize how inference is executed.
+You don't have to specify a scoring script in the deployment definition of an MLflow model to an online endpoint. However, you can opt to do so and customize how inference gets executed.

-You will typically select this workflow when:
+You'll typically want to customize your MLflow model deployment when:

> [!div class="checklist"]
> - The model doesn't have a `PyFunc` flavor on it.
-> - You need to customize the way the model is run, for instance, use an specific flavor to load it with `mlflow.<flavor>.load_model()`.
-> - You need to do pre/post processing in your scoring routine when it is not done by the model itself.
-> - The output of the model can't be nicely represented in tabular data. For instance, it is a tensor representing an image.
+> - You need to customize the way the model is run, for instance, to use a specific flavor to load the model, using `mlflow.<flavor>.load_model()`.
+> - You need to do pre/post processing in your scoring routine when it's not done by the model itself.
+> - The output of the model can't be nicely represented in tabular data. For instance, it's a tensor representing an image.

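As a small aside to the checklist above (illustrative only, not part of the change), loading the model with a specific flavor rather than the generic `PyFunc` one might look like this, assuming a scikit-learn model stored in a local folder named `model`:

```python
import mlflow.sklearn
import pandas as pd

# Load with the native scikit-learn flavor to get the estimator's full API
# instead of the generic predict-only interface of mlflow.pyfunc.
model = mlflow.sklearn.load_model("model")

# Score one hypothetical row; the column names are placeholders.
sample = pd.DataFrame([{
    "age": 0.038, "sex": 0.051, "bmi": 0.062, "bp": 0.022, "s1": -0.044,
    "s2": -0.035, "s3": -0.043, "s4": -0.003, "s5": 0.02, "s6": -0.018,
}])
print(model.predict(sample))
```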
> [!IMPORTANT]
-> If you choose to indicate an scoring script for an MLflow model deployment, you will also have to specify the environment where the deployment will run.
+> If you choose to specify a scoring script for an MLflow model deployment, you'll also have to specify the environment where the deployment will run.

### Steps

-Use the following steps to deploy an MLflow model with a custom scoring script.
+To deploy an MLflow model with a custom scoring script:

-1. Identify the folder where your MLflow model is placed.
+1. Identify the folder where your MLflow model is located.

a. Go to the [Azure Machine Learning studio](https://ml.azure.com).

-b. Go to the section __Models__.
+b. Go to the __Models__ section.

c. Select the model you're trying to deploy and go to its __Artifacts__ tab.

-d. Take note of the folder that is displayed. This folder was indicated when the model was registered.
+d. Take note of the folder that is displayed. This folder was specified when the model was registered.

:::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/mlflow-model-folder-name.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/mlflow-model-folder-name.png" alt-text="Screenshot showing the folder where the model artifacts are placed.":::

-1. Create a scoring script. Notice how the folder name `model` you identified before has been included in the `init()` function.
+1. Create a scoring script. Notice how the folder name `model` that you previously identified is included in the `init()` function.
+
+> [!TIP]
+> The following scoring script is provided as an example about how to perform inference with an MLflow model. You can adapt this script to your needs or change any of its parts to reflect your scenario.
-> The previous scoring script is provided as an example about how to perform inference of an MLflow model. You can adapt this example to your needs or change any of its parts to reflect your scenario.
> [!WARNING]
-> __MLflow 2.0 advisory__: The provided scoring script will work with both MLflow 1.X and MLflow 2.X. However, be advised that the expected input/output formats on those versions may vary. Check the environment definition used to ensure you are using the expected MLflow version. Notice that MLflow 2.0 is only supported in Python 3.8+.
+> __MLflow 2.0 advisory__: The provided scoring script will work with both MLflow 1.X and MLflow 2.X. However, be advised that the expected input/output formats on those versions might vary. Check the environment definition used to ensure you're using the expected MLflow version. Notice that MLflow 2.0 is only supported in Python 3.8+.

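The scoring script itself falls outside this hunk. As a rough sketch (not the article's script), a custom scoring script for an MLflow model typically follows the Azure Machine Learning `init()`/`run()` contract and loads the model from the `model` folder noted earlier; the payload shape below is an assumption:

```python
import json
import os

import mlflow
import pandas as pd


def init():
    """Load the MLflow model once, when the deployment starts."""
    global model
    # AZUREML_MODEL_DIR points to the root of the registered model;
    # "model" is the folder name identified in the earlier step.
    model_path = os.path.join(os.environ["AZUREML_MODEL_DIR"], "model")
    model = mlflow.pyfunc.load_model(model_path)


def run(raw_data):
    """Score an incoming JSON payload and return predictions as a list."""
    data = json.loads(raw_data)["input_data"]["data"]  # assumed payload shape
    predictions = model.predict(pd.DataFrame(data))
    return predictions.tolist()
```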
-1. Let's create an environment where the scoring script can be executed. Since our model is MLflow, the conda requirements are also specified in the model package (for more details about MLflow models and the files included on it see The MLmodel format). We are going then to build the environment using the conda dependencies from the file. However, we need also to include the package `azureml-inference-server-http` which is required for Online Deployments in Azure Machine Learning.
+1. Create an environment where the scoring script can be executed. Since the model is an MLflow model, the conda requirements are also specified in the model package. For more details about the files included in an MLflow model see [The MLmodel format](concept-mlflow-models.md#the-mlmodel-format). You'll then build the environment using the conda dependencies from the file. However, you need to also include the package `azureml-inference-server-http`, which is required for online deployments in Azure Machine Learning.
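As an aside (an illustrative sketch, not part of the change), building such an environment with the Azure Machine Learning Python SDK v2 might look like the following; the environment name, conda file path, and base image are assumptions:

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import Environment
from azure.identity import DefaultAzureCredential

# Hypothetical workspace details; replace them with your own.
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Build the environment from the model's conda dependencies, with
# azureml-inference-server-http added to that conda file, on top of a base image.
environment = Environment(
    name="sklearn-diabetes-env",
    conda_file="environment/conda.yaml",
    image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04:latest",
)
ml_client.environments.create_or_update(environment)
```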
@@ -680,9 +675,9 @@ Use the following steps to deploy an MLflow model with a custom scoring script.
instance_type: Standard_F2s_v2
instance_count: 1
```
-
+
Create the deployment:
-
+
```azurecli
az ml online-deployment create -f deployment.yml
```
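For comparison (an illustrative sketch rather than the article's code), the same deployment could be expressed with the Azure Machine Learning Python SDK v2; every name below (endpoint, model, environment, code folder, script) is an assumption:

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import CodeConfiguration, ManagedOnlineDeployment
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Mirrors the YAML above: a managed deployment with a custom scoring script and environment.
deployment = ManagedOnlineDeployment(
    name="sklearn-diabetes-custom",
    endpoint_name="<endpoint-name>",
    model="azureml:sklearn-diabetes@latest",
    environment="azureml:sklearn-diabetes-env@latest",
    code_configuration=CodeConfiguration(code="src", scoring_script="score.py"),
    instance_type="Standard_F2s_v2",
    instance_count=1,
)
ml_client.online_deployments.begin_create_or_update(deployment).result()
```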
@@ -706,39 +701,37 @@ Use the following steps to deploy an MLflow model with a custom scoring script.

# [Python (MLflow SDK)](#tab/mlflow)

-*This operation is not supported in MLflow SDK*
+*This operation isn't supported in MLflow SDK*

# [Studio](#tab/studio)
-
-In the [Azure Machine Learning studio](https://ml.azure.com), follow these steps:
-
+
1. From the __Endpoints__ page, Select **+Create**.
1. Select the MLflow model you registered previously.
1. Select __More options__ in the endpoint creation wizard to open up advanced options.
-
+
:::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/select-advanced-deployment-options.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/select-advanced-deployment-options.png" alt-text="Screenshot showing how to select advanced deployment options when creating an endpoint":::
-
+
1. Provide a name and authentication type for the endpoint, and then select __Next__ to see that the model you selected is being used for your deployment.
-1. Select __Next__ to continue to the "Deployment" page.
-1. Select __Next__ to go to the "Code + environment" page. When you select a model registered in MLflow format, you don't need to specify a scoring script or an environment on this page. However, you want to specify one in this section
+1. Select __Next__ to continue to the __Deployment__ page.
+1. Select __Next__ to go to the __Code + environment__ page. When you select a model registered in MLflow format, you don't need to specify a scoring script or an environment on this page. However, in this scenario, you want to specify both.
1. Select the slider next to __Customize environment and scoring script__.

:::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/configure-scoring-script-mlflow.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/configure-scoring-script-mlflow.png" alt-text="Screenshot showing how to indicate an environment and scoring script for MLflow models":::
-
-1. Browse to select the scoring script you created before.
+
+1. Browse to select the scoring script you created previously.
1. Select __Custom environments__ for the environment type.
-1. Select the custom environment you created before and select __Next__.
+1. Select the custom environment you created previously, and select __Next__.
1. Complete the wizard to deploy the model to the endpoint.

---

-1. Once your deployment completes, your deployment is ready to serve request. One of the easier ways to test the deployment is by using a sample request file along with the `invoke` method.
+1. Once your deployment completes, it is ready to serve requests. One way to test the deployment is by using a sample request file along with the `invoke` method.
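As a closing sketch (illustrative only, not part of the change), invoking the deployment with a sample request file from the Python SDK v2 might look like this; the endpoint, deployment, and file names are assumptions:

```python
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Send the contents of a local sample request file to the deployment and print the predictions.
response = ml_client.online_endpoints.invoke(
    endpoint_name="<endpoint-name>",
    deployment_name="<deployment-name>",
    request_file="sample-request.json",
)
print(response)
```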