In this article, learn how to deploy your [MLflow](https://www.mlflow.org) model to Azure ML for both real-time and batch inference. Azure ML supports no-code deployment of models created and logged with MLflow. This means that you don't have to provide a scoring script or an environment. Those models can be deployed to ACI (Azure Container Instances), AKS (Azure Kubernetes Service), or our managed inference services (referred to as MIR).
For no-code-deployment, Azure Machine Learning:
* `pandas`
* The scoring script baked into the image.
## Supported targets for MLflow models
The following table shows the target support for MLflow models in Azure ML:
> - <sup>4</sup> Data type `mlflow.types.DataType.Binary` is not supported as a column type. For models that work with images, we suggest that you use either (a) tensor inputs using the [TensorSpec input type](https://mlflow.org/docs/latest/python_api/mlflow.types.html#mlflow.types.TensorSpec), or (b) `Base64` encoding schemes with a `mlflow.types.DataType.String` column type, which are commonly used when binary data needs to be stored and transferred over media.
> - <sup>5</sup> Tensors with unspecified shapes (`-1`) are currently supported only for the batch dimension. For instance, a signature with shape `(-1, -1, -1, 3)` is not supported, but `(-1, 300, 300, 3)` is.
For more information about how to specify requests to online endpoints or the supported file types in batch endpoints, see [Considerations when deploying to real time inference](#considerations-when-deploying-to-real-time-inference) and [Considerations when deploying to batch inference](#considerations-when-deploying-to-batch-inference).
## Deployment tools
There are three workflows for deploying MLflow models to Azure ML:
### Which option to use?
If you are familiar with MLflow or your platform supports MLflow natively (like Azure Databricks) and you wish to continue using the same set of methods, use the `azureml-mlflow` plugin. On the other hand, if you are more familiar with the [Azure ML CLI v2](concept-v2.md), you want to automate deployments using automation pipelines, or you want to keep deployment configuration in a git repository, we recommend that you use the [Azure ML CLI v2](concept-v2.md). If you want to quickly deploy and test models trained with MLflow, you can use [Azure Machine Learning Studio](https://ml.azure.com) UI deployment.
## Deploy using the MLflow plugin
Deployments can be generated using either the Python API for MLflow or the MLflow CLI. In both cases, a JSON configuration file needs to be provided with the details of the deployment you want to achieve. The full specification of this configuration can be found at [Managed online deployment schema (v2)](reference-yaml-deployment-managed-online.md).
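For instance, with the MLflow CLI, passing the configuration file can look roughly like the sketch below. This is illustrative only: the target name, deployment name, model URI, and the `deploy-config-file` configuration key are assumptions, not values from this article.

```bash
# Sketch only: target, names, model URI, and the deploy-config-file key are illustrative assumptions.
mlflow deployments create --target azureml --name my-deployment \
    --model-uri models:/my-model/1 \
    --config deploy-config-file=deployment-config.json
```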
#### Configuration example for a Managed Inference Service deployment (real time)
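A minimal sketch of such a configuration file is shown below. The field names follow the [Managed online deployment schema (v2)](reference-yaml-deployment-managed-online.md) referenced above; the values are illustrative placeholders.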
```json
{
    "instance_type": "Standard_DS3_v2",
    "instance_count": 1
}
```
[!INCLUDE [clone repo & set defaults](../../includes/machine-learning-cli-prepare.md)]
In the code snippets used in this article, the `ENDPOINT_NAME` environment variable contains the name of the endpoint to create and use. To set it, use the following command from the CLI. Replace `<YOUR_ENDPOINT_NAME>` with the name of your endpoint:
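For example (a minimal sketch, assuming a Bash shell):

```bash
# Store the endpoint name so later commands can reference it.
export ENDPOINT_NAME="<YOUR_ENDPOINT_NAME>"
```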
> - <sup>1</sup> We suggest that you use the split orientation instead. The records orientation doesn't guarantee column ordering preservation.
> - <sup>2</sup> We suggest that you explore batch inference for processing files.
Regardless of the input type used, Azure Machine Learning requires inputs to be provided in a JSON payload, within the dictionary key `input_data`. Note that this key is not required when serving models using the command `mlflow models serve`, and hence the payloads can't be used interchangeably.
### Creating requests
Your inputs should be submitted inside a JSON payload containing a dictionary with key `input_data`.
#### Payload example for a JSON-serialized pandas DataFrame in the split orientation
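A sketch of what such a payload can look like follows; the column names, index values, and data are illustrative placeholders, not values taken from this article.

```json
{
    "input_data": {
        "columns": ["age", "sex", "bmi"],
        "index": [0, 1],
        "data": [[24, 0, 22.5], [53, 1, 31.2]]
    }
}
```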