Commit 7b5f59e ("refresh"), parent 67bac6a

1 file changed: 74 additions and 57 deletions
---
title: Deploy MLflow models as web services (v1)
titleSuffix: Azure Machine Learning
description: Set up MLflow with Azure Machine Learning by using the v1 SDK and deploy machine learning models as Azure web services.
services: machine-learning
author: msakande
ms.author: mopeakande
ms.reviewer: fasantia
ms.service: machine-learning
ms.subservice: core
ms.date: 07/16/2024
ms.topic: how-to
ms.custom: UpdateFrequency5, sdkv1

#customer intent: As a developer, I want to configure MLflow with Azure Machine Learning so I can deploy machine learning models as Azure web services.
---

# Deploy MLflow models as Azure web services

[!INCLUDE [sdk v1](../includes/machine-learning-sdk-v1.md)]

MLflow is an open-source library for managing the life cycle of machine learning experiments. MLflow integration with Azure Machine Learning lets you extend the management capabilities beyond model training to the deployment phase of production models. In this article, you deploy an [MLflow](https://www.mlflow.org/) model as an Azure web service and apply Azure Machine Learning model management and data drift detection features to your production models.

The following diagram demonstrates how the MLflow deploy API integrates with Azure Machine Learning to deploy models. You create models as Azure web services by using popular frameworks like PyTorch, TensorFlow, or scikit-learn, and manage the services in your workspace:

:::image type="content" source="./media/how-to-deploy-mlflow-models/mlflow-diagram-deploy.png" border="false" alt-text="Diagram that illustrates how the MLflow deploy API integrates with Azure Machine Learning to deploy models." lightbox="./media/how-to-deploy-mlflow-models/mlflow-diagram-deploy.png":::

> [!TIP]
> This article supports data scientists and developers who want to deploy an MLflow model to an Azure Machine Learning web service endpoint. If you're an admin who wants to monitor resource usage and events from Azure Machine Learning, such as quotas, completed training runs, or completed model deployments, see [Monitoring Azure Machine Learning](../monitor-azure-machine-learning.md).

## Prerequisites

- Train a machine learning model. If you don't have a trained model, download the notebook that best fits your compute scenario from the [Azure Machine Learning Notebooks repository](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/ml-frameworks/using-mlflow) on GitHub. Follow the instructions in the notebook and run the cells to prepare the model.

- [Set up the MLflow Tracking URI to connect Azure Machine Learning](how-to-use-mlflow.md#track-runs-from-your-local-machine-or-remote-compute).

- Install the **azureml-mlflow** package. This package automatically loads the **azureml-core** definitions in the [Azure Machine Learning Python SDK](/python/api/overview/azure/ml/install), which provides the connectivity for MLflow to access your workspace.

- Confirm that you have the required [access permissions for MLflow operations with your workspace](../how-to-assign-roles.md#mlflow-operations).

### Deployment options

Azure Machine Learning offers the following deployment configuration options:

- Azure Container Instances: Suitable for quick dev-test deployment.
- Azure Kubernetes Service (AKS): Recommended for scalable production deployment.

[!INCLUDE [endpoints-option](../includes/machine-learning-endpoints-preview-note.md)]

For more MLflow and Azure Machine Learning functionality integrations, see [MLflow and Azure Machine Learning (v2)](../concept-mlflow.md), which uses the v2 SDK.

## Deploy to Azure Container Instances

To deploy your MLflow model to an Azure Machine Learning web service, your model must be set up with the [MLflow Tracking URI to connect with Azure Machine Learning](how-to-use-mlflow.md).
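
The linked article covers the full setup. As a minimal sketch of one common pattern with the v1 SDK (it assumes the **azureml-mlflow** package is installed and that a workspace *config.json* file is available locally), you can point MLflow at your workspace tracking URI like this:

```python
import mlflow
from azureml.core import Workspace

# Load the workspace from a local config.json file downloaded from the Azure portal
ws = Workspace.from_config()

# Point MLflow at the workspace so that runs, models, and deployments go to Azure Machine Learning
mlflow.set_tracking_uri(ws.get_mlflow_tracking_uri())

print(mlflow.get_tracking_uri())
```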

For deployment to Azure Container Instances, you don't need to define any deployment configuration. The service defaults to an Azure Container Instances deployment when a configuration isn't provided. You can register and deploy the model in one step with MLflow's [deploy](https://www.mlflow.org/docs/latest/python_api/mlflow.azureml.html#mlflow.azureml.deploy) method for Azure Machine Learning.

<!-- Reviewer: The following cell returns an error:

File /anaconda/envs/azureml_py310_sdkv2/lib/python3.10/site-packages/packaging/specifiers.py:25
     10 from typing import (
    (...)
     21     Union,
     22 )
     24 from .utils import canonicalize_version
~~~> 25 from .version import LegacyVersion, Version, parse
     27 ParsedVersion = Union[Version, LegacyVersion]
     28 UnparsedVersion = Union[Version, LegacyVersion, str]

ImportError: cannot import name 'LegacyVersion' from 'packaging.version'
(/anaconda/envs/azureml_py310_sdkv2/lib/python3.10/site-packages/packaging/version.py)

I tried downgrading the packaging version to 21.3 (or older) per this StackOverflow report, but the problem persists:
https://stackoverflow.com/questions/74941714/importerror-cannot-import-name-legacyversion-from-packaging-version
-->

```python
import mlflow
from mlflow.deployments import get_deploy_client

# Create the deployment client by using the MLflow tracking URI
client = get_deploy_client(mlflow.get_tracking_uri())

# Set the model path within the run's artifacts
model_path = "model"

# Register and deploy the model in one step. The "name" parameter becomes the web service name,
# and the model is registered automatically under an autogenerated name.
# "run" is the MLflow run that logged the model (for example, your training run).
client.create_deployment(name="mlflow-test-aci", model_uri='runs:/{}/{}'.format(run.id, model_path))
```
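
After the call returns, the model is registered and the web service is created in your workspace. To confirm the deployment and retrieve the scoring endpoint from your notebook, one option is the v1 SDK `Webservice` class. The following sketch assumes `ws` is your Workspace object (for example, from the tracking URI sketch earlier) and reuses the service name from the previous step:

```python
from azureml.core.webservice import Webservice

# Retrieve the deployed service from the workspace by name
service = Webservice(workspace=ws, name="mlflow-test-aci")

print(service.state)        # "Healthy" indicates a successful deployment
print(service.scoring_uri)  # REST endpoint that accepts scoring requests
```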

### Customize the deployment config JSON file

If you prefer not to use the defaults, you can set up your deployment with a deployment config JSON file that uses parameters from the [deploy_configuration()](/python/api/azureml-core/azureml.core.webservice.aciwebservice#deploy-configuration-cpu-cores-none--memory-gb-none--tags-none--properties-none--description-none--location-none--auth-enabled-none--ssl-enabled-none--enable-app-insights-none--ssl-cert-pem-file-none--ssl-key-pem-file-none--ssl-cname-none--dns-name-label-none-) method as a reference.

### Define deployment config parameters

In your deployment config JSON file, you define each deployment config parameter in the form of a dictionary. The following snippet provides an example. For more information about what your deployment config JSON file can contain, see the [Azure Container Instance deployment configuration schema](reference-azure-machine-learning-cli.md#azure-container-instance-deployment-configuration-schema) in the Azure Machine Learning CLI reference.

```json
{
  "computeType": "aci",
  "containerResourceRequirements": {"cpu": 1, "memoryInGB": 1},
  "location": "eastus2"
}
```
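
If you prefer to generate the file from your notebook instead of creating it by hand, the following sketch uses only the Python standard library; the file name *deployment_config.json* is an assumption that matches the next step:

```python
import json

# Deployment config parameters expressed as a dictionary, as described previously
aci_config = {
    "computeType": "aci",
    "containerResourceRequirements": {"cpu": 1, "memoryInGB": 1},
    "location": "eastus2",
}

# Write the dictionary to the deployment config JSON file used in the next step
with open("deployment_config.json", "w") as config_file:
    json.dump(aci_config, config_file, indent=2)
```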

You can then use the config JSON file to create your deployment:

```python
# Set the deployment config JSON file
deploy_path = "deployment_config.json"
test_config = {'deploy-config-file': deploy_path}

client.create_deployment(model_uri='runs:/{}/{}'.format(run.id, model_path), config=test_config, name="mlflow-test-aci")
```

## Deploy to Azure Kubernetes Service (AKS)

To deploy your MLflow model to an Azure Machine Learning web service, your model must be set up with the [MLflow Tracking URI to connect with Azure Machine Learning](how-to-use-mlflow.md).

For deployment to AKS, you first create an AKS cluster by using the [ComputeTarget.create()](/python/api/azureml-core/azureml.core.computetarget#create-workspace--name--provisioning-configuration-) method. Creating a new cluster can take 20-25 minutes.

```python
from azureml.core.compute import AksCompute, ComputeTarget

# Use the default provisioning configuration for the AKS cluster
prov_config = AksCompute.provisioning_configuration()

aks_name = 'aks-mlflow'

# Create the cluster; "ws" is your Azure Machine Learning Workspace object
aks_target = ComputeTarget.create(workspace=ws, name=aks_name, provisioning_configuration=prov_config)

aks_target.wait_for_completion(show_output=True)

print(aks_target.provisioning_state)
print(aks_target.provisioning_errors)
```

Create a deployment config JSON file by using the [deploy_configuration()](/python/api/azureml-core/azureml.core.webservice.aks.aksservicedeploymentconfiguration#parameters) method values as a reference. Define each deployment config parameter as a dictionary, as shown in the following example:

```json
{"computeType": "aks", "computeTargetName": "aks-mlflow"}
```

Then, register and deploy the model in a single step with the MLflow [deployment client](https://www.mlflow.org/docs/latest/python_api/mlflow.deployments.html):

```python
import mlflow
from mlflow.deployments import get_deploy_client

# Create the deployment client by using the MLflow tracking URI
client = get_deploy_client(mlflow.get_tracking_uri())

# Set the model path within the run's artifacts
model_path = "model"

# Set the deployment config JSON file
deploy_path = "deployment_config.json"
test_config = {'deploy-config-file': deploy_path}

# Register and deploy the model in one step. The "name" parameter becomes the web service name,
# and the model is registered automatically under an autogenerated name.
# "run" is the MLflow run that logged the model (for example, your training run).
client.create_deployment(model_uri='runs:/{}/{}'.format(run.id, model_path), config=test_config, name="mlflow-test-aks")
```
148165

149166
The service deployment can take several minutes.
150167

151168
## Clean up resources
152169

153-
If you don't plan to use your deployed web service, use `service.delete()` to delete it from your notebook. For more information, see the documentation for [WebService.delete()](/python/api/azureml-core/azureml.core.webservice%28class%29#delete--).
170+
If you don't plan to use your deployed web service, use the `service.delete()` method to delete the service from your notebook. For more information, see the [delete() method of the WebService Class](/python/api/azureml-core/azureml.core.webservice%28class%29#azureml-core-webservice-delete) in the Python SDK documentation.
154171

155-
## Example notebooks
172+
## Explore example notebooks
156173

157174
The [MLflow with Azure Machine Learning notebooks](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/ml-frameworks/using-mlflow) demonstrate and expand upon concepts presented in this article.
158175

159176
> [!NOTE]
160-
> A community-driven repository of examples using mlflow can be found at https://github.com/Azure/azureml-examples.
177+
> For a community-driven repository of examples that use MLflow, see the [Azure Machine Learning examples repository](https://github.com/Azure/azureml-examples) on GitHub.
161178
162-
## Next steps
179+
## Related content
163180

164-
* [Manage your models](concept-model-management-and-deployment.md).
165-
* Monitor your production models for [data drift](how-to-enable-data-collection.md).
166-
* [Track Azure Databricks runs with MLflow](../how-to-use-mlflow-azure-databricks.md).
181+
- [Manage, deploy, and monitor models with Azure Machine Learning v1](concept-model-management-and-deployment.md)
182+
- [Detect data drift (preview) on datasets](how-to-monitor-datasets.md)
183+
- [Track Azure Databricks experiment runs with MLflow](../how-to-use-mlflow-azure-databricks.md)
