
Commit eae6635

Update how-to-deploy-mlflow-models.md
1 parent a1d0f7f commit eae6635

File tree

1 file changed


articles/machine-learning/how-to-deploy-mlflow-models.md

Lines changed: 33 additions & 32 deletions
@@ -26,7 +26,6 @@ In this article, learn how to deploy your [MLflow](https://www.mlflow.org) model
For no-code deployment, Azure Machine Learning:

* Dynamically installs Python packages provided in the `conda.yaml` file. This means the dependencies are installed during container runtime.
- * The base container image/curated environment used for dynamic installation is `mcr.microsoft.com/azureml/mlflow-ubuntu18.04-py37-cpu-inference` or `AzureML-mlflow-ubuntu18.04-py37-cpu-inference`
* Provides an MLflow base image/curated environment that contains the following items:
  * [`azureml-inference-server-http`](how-to-inference-server-http.md)
  * [`mlflow-skinny`](https://github.com/mlflow/mlflow/blob/master/README_SKINNY.rst)
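To illustrate where the `conda.yaml` used for dynamic installation comes from, here is a minimal sketch of logging a scikit-learn model with MLflow; the model folder that MLflow produces includes a `conda.yaml` describing the model's dependencies (the registered model name below is illustrative):

```python
import mlflow
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=200).fit(X, y)

# Logging the model creates an MLflow model folder whose conda.yaml lists the
# Python dependencies; no-code deployment installs them at container runtime.
with mlflow.start_run():
    mlflow.sklearn.log_model(
        sk_model=model,
        artifact_path="model",
        registered_model_name="my-sklearn-mlflow-model",  # illustrative name
    )
```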
@@ -35,13 +34,8 @@ For no-code-deployment, Azure Machine Learning
> [!IMPORTANT]
> If you're used to deploying models using scoring scripts and custom environments, and you want to achieve the same functionality using MLflow models, we recommend reading [Using MLflow models for no-code deployment](how-to-log-mlflow-models.md).

- > [!NOTE]
- > Consider the following limitations when deploying MLflow models to Azure Machine Learning:
- > - Spark flavor is not supported at the moment for deployment.
- > - Data type `mlflow.types.DataType.Binary` is not supported as column type in signatures. For models that work with images, we suggest you to use or (a) tensors inputs using the [TensorSpec input type](https://mlflow.org/docs/latest/python_api/mlflow.types.html#mlflow.types.TensorSpec), or (b) `Base64` encoding schemes with a `mlflow.types.DataType.String` column type, which is commonly used when there is a need to encode binary data that needs be stored and transferred over media.
- > - Signatures with tensors with unspecified shapes (`-1`) is only supported at the batch size by the moment. For instance, a signature with shape `(-1, -1, -1, 3)` is not supported but `(-1, 300, 300, 3)` is.
-
- For more information about how to specify requests to online endpoints, view [Considerations when deploying to real-time inference](#considerations-when-deploying-to-real-time-inference). For more information about the supported file types in batch endpoints, view [Considerations when deploying to batch inference](#considerations-when-deploying-to-batch-inference).
+ > [!WARNING]
+ > For information about input formats and limitations in online endpoints, view [Considerations when deploying to real-time inference](#considerations-when-deploying-to-real-time-inference). For information about the supported file types in batch endpoints, view [Considerations when deploying to batch inference](#considerations-when-deploying-to-batch-inference).

## Deployment tools
@@ -237,13 +231,13 @@ This example shows how you can deploy an MLflow model to an online endpoint usin
__create-endpoint.yaml__

- :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/online/mlflow/create-endpoint.yaml":::
+ :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/online/mlflow/create-endpoint.yaml":::

# [Batch endpoints](#tab/batch)

__create-endpoint.yaml__

- :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/batch/batch-endpoint.yml":::
+ :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/batch/batch-endpoint.yml":::

1. To create a new endpoint using the YAML configuration, use the following command:
@@ -365,37 +359,36 @@ You can use [Azure Machine Learning studio](https://ml.azure.com) to deploy mode
2. From [studio](https://ml.azure.com), select your workspace and then use either the __endpoints__ or __models__ page to create the endpoint deployment:

- # [Online endpoints](#tab/batch)
+ # [Online endpoints](#tab/mir)

- 1. From the __Endpoints__ page, y the __Batch endpoints__ section, select **+Create**.
+ 1. From the __Endpoints__ page, in the __Batch endpoints__ section, select **+Create**.

- :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/create-from-endpoints.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/create-from-endpoints.png" alt-text="Screenshot showing create option on the Endpoints UI page.":::
+ :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/create-from-endpoints.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/create-from-endpoints.png" alt-text="Screenshot showing create option on the Endpoints UI page.":::

- 1. Provide a name for the endpoint, and then select __Next__.
- 1. When selecting a model, select the MLflow model registered previously. Select __Next__ to continue.
+ 1. Provide a name for the endpoint, and then select __Next__.
+ 1. When selecting a model, select the MLflow model registered previously. Select __Next__ to continue.

- 1. When you select a model registered in MLflow format, in the Environment step of the wizard, you don't need a scoring script or an environment.
+ 1. When you select a model registered in MLflow format, in the Environment step of the wizard, you don't need a scoring script or an environment.

- :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" alt-text="Screenshot showing no code and environment needed for MLflow models.":::
+ :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" alt-text="Screenshot showing no code and environment needed for MLflow models.":::

- 1. Complete the wizard to deploy the model to the endpoint.
+ 1. Complete the wizard to deploy the model to the endpoint.

- :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/review-screen-ncd.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/review-screen-ncd.png" alt-text="Screenshot showing NCD review screen.":::
+ :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/review-screen-ncd.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/review-screen-ncd.png" alt-text="Screenshot showing NCD review screen.":::

- # [Batch endpoints](#tab/mir)
-
- 1. From the __Endpoints__ page, Select **+Create**.
+ # [Batch endpoints](#tab/batch)

- :::image type="content" source="media/how-to-deploy-mlflow-models/create-batch-endpoint.png" lightbox="media/how-to-deploy-mlflow-models/create-from-endpoints.png" alt-text="Screenshot showing create option on the Endpoints UI page.":::
+ 1. From the __Endpoints__ page, select **+Create**.

- 1. Provide a name and authentication type for the endpoint, and then select __Next__.
- 1. When selecting a model, select the MLflow model registered previously. Select __Next__ to continue.
+ :::image type="content" source="media/how-to-deploy-mlflow-models/create-batch-endpoint.png" lightbox="media/how-to-deploy-mlflow-models/create-from-endpoints.png" alt-text="Screenshot showing create option on the Endpoints UI page.":::

- 1. When you select a model registered in MLflow format, in the Environment step of the wizard, you don't need a scoring script or an environment.
+ 1. Provide a name and authentication type for the endpoint, and then select __Next__.
+ 1. When selecting a model, select the MLflow model registered previously. Select __Next__ to continue.
+ 1. When you select a model registered in MLflow format, in the Environment step of the wizard, you don't need a scoring script or an environment.

- :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" alt-text="Screenshot showing no code and environment needed for MLflow models.":::
+ :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" alt-text="Screenshot showing no code and environment needed for MLflow models.":::

- 1. Create your default deployment in your endpoint. For that provide the following values and then clic on __Next__:
+ 1. Create your default deployment in your endpoint. For that, provide the following values and then click __Next__:

* __Deployment name:__ Name of the default deployment you want.
* __Output action:__ Use __Append__ to append the predictions generated by the model to the output.
@@ -405,13 +398,13 @@ You can use [Azure Machine Learning studio](https://ml.azure.com) to deploy mode
* __Max retries:__ The number of times a worker will retry scoring a given mini batch if processing fails.
* __Max concurrency per instance:__ The number of workers each instance will have available. If your cluster has 2 nodes and you indicate __Max concurrency per instance__ = 2, then 4 workers will be available to you. Each of them will process __Mini batch size__ samples at a time.

- 1. For environment, you don't have to specify anything for MLflow models.
+ 1. For environment, you don't have to specify anything for MLflow models.

- 1. Configure the cluster the jobs will run on and the number of instances it will be utilized from it. Azure Machine Learning Batch Endpoints runs on Compute Clusters. You will need to have a compute cluster created where the batch endpoints will get deployed. The cluster is only utilized when jobs are submitted, so you can utilize the same cluster for multiple deployments if needed.
+ 1. Configure the compute cluster the jobs will run on and the number of instances to use from it. Azure Machine Learning batch endpoints run on compute clusters, so you need to have a compute cluster created before the batch endpoint can be deployed. The cluster is only used when jobs are submitted, so you can reuse the same cluster for multiple deployments if needed.

- :::image type="content" source="media/how-to-deploy-mlflow-models/create-batch-endpoint-2.png" lightbox="media/how-to-deploy-mlflow-models/create-batch-endpoint-2.png" alt-text="Screenshot showing cluster configuration.":::
+ :::image type="content" source="media/how-to-deploy-mlflow-models/create-batch-endpoint-2.png" lightbox="media/how-to-deploy-mlflow-models/create-batch-endpoint-2.png" alt-text="Screenshot showing cluster configuration.":::

- 1. Complete the wizard to deploy the model to the endpoint.
+ 1. Complete the wizard to deploy the model to the endpoint.
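The same settings shown in the wizard above (deployment name, output action, mini batch size, retries, and concurrency) can also be supplied programmatically. A minimal sketch using the Azure Machine Learning Python SDK v2 (`azure-ai-ml`), assuming a registered MLflow model, an existing batch endpoint, and a compute cluster; all names and workspace details are placeholders:

```python
from azure.ai.ml import MLClient
from azure.ai.ml.constants import BatchDeploymentOutputAction
from azure.ai.ml.entities import BatchDeployment, BatchRetrySettings
from azure.identity import DefaultAzureCredential

# Placeholder workspace details.
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<WORKSPACE>",
)

# MLflow models need no scoring script or environment in the deployment.
model = ml_client.models.get("my-mlflow-model", version="1")  # placeholder model

deployment = BatchDeployment(
    name="default",                      # deployment name
    endpoint_name="my-batch-endpoint",   # placeholder; the endpoint must already exist
    model=model,
    compute="cpu-cluster",               # placeholder compute cluster
    instance_count=2,
    max_concurrency_per_instance=2,      # 2 nodes x 2 workers = 4 workers in total
    mini_batch_size=10,
    output_action=BatchDeploymentOutputAction.APPEND_ROW,  # "Append" in the wizard
    retry_settings=BatchRetrySettings(max_retries=3, timeout=30),
)

ml_client.batch_deployments.begin_create_or_update(deployment).result()
```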
## Considerations when deploying to real time inference
@@ -480,6 +473,14 @@ Your inputs should be submitted inside a JSON payload containing a dictionary wi
}
```
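For illustration, a minimal sketch of submitting such a payload to the endpoint's scoring URI with `requests`, assuming the `input_data` dictionary format described above and key-based authentication; the URI, key, and column names are placeholders:

```python
import json

import requests

# Placeholder values; copy the real URI and key from the endpoint's Consume tab.
scoring_uri = "https://my-endpoint.<region>.inference.ml.azure.com/score"
api_key = "<API_KEY>"

payload = {
    "input_data": {
        "columns": ["age", "bmi", "bp"],
        "index": [0, 1],
        "data": [[24, 21.3, 80], [53, 27.1, 92]],
    }
}

response = requests.post(
    scoring_uri,
    data=json.dumps(payload),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    },
)
response.raise_for_status()
print(response.json())
```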
+ ### Limitations
+
+ > [!NOTE]
+ > Consider the following limitations when deploying MLflow models to Azure Machine Learning:
+ > - The Spark flavor isn't currently supported for deployment.
+ > - The data type `mlflow.types.DataType.Binary` isn't supported as a column type in signatures. For models that work with images, we suggest you use either (a) tensor inputs using the [TensorSpec input type](https://mlflow.org/docs/latest/python_api/mlflow.types.html#mlflow.types.TensorSpec), or (b) a `Base64` encoding scheme with a `mlflow.types.DataType.String` column type, which is commonly used to encode binary data that must be stored and transferred over media.
+ > - In tensor signatures, an unspecified dimension (`-1`) is currently supported only for the batch size. For instance, a signature with shape `(-1, -1, -1, 3)` isn't supported, but `(-1, 300, 300, 3)` is.
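As a sketch of the supported pattern for image models, the signature below leaves only the batch dimension unspecified (`-1`) and fixes the remaining dimensions; it can then be passed to the flavor's `log_model` call (the input name is illustrative):

```python
import numpy as np
from mlflow.models.signature import ModelSignature
from mlflow.types import Schema, TensorSpec

# Only the batch dimension is left unspecified (-1); height, width, and channels
# are fixed, which matches the supported tensor shapes described above.
input_schema = Schema([TensorSpec(np.dtype(np.float32), (-1, 300, 300, 3), name="image")])
signature = ModelSignature(inputs=input_schema)

# The signature is passed to any flavor's log_model call, for example:
# mlflow.pyfunc.log_model(artifact_path="model", python_model=wrapper, signature=signature)
```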
## Considerations when deploying to batch inference
Azure Machine Learning supports no-code deployment for batch inference in [managed endpoints](concept-endpoints.md). This represents a convenient way to deploy models that need to process large amounts of data in a batch fashion.
