
Commit 8f96cf9

Merge pull request #222657 from santiagxf/santiagxf-patch-2
Minor updates in the structure of the MLflow deployment guideline
2 parents e0364ff + 847ff1e commit 8f96cf9

File tree: 1 file changed (+16, −14 lines)


articles/machine-learning/how-to-deploy-mlflow-models.md

Lines changed: 16 additions & 14 deletions
@@ -39,6 +39,17 @@ For no-code-deployment, Azure Machine Learning:
 > [!WARNING]
 > Online Endpoints dynamically install the Python packages provided in the MLflow model package during container runtime. Deploying MLflow models to online endpoints with no-code deployment in a private network without egress connectivity isn't supported at the moment. If that's your case, either enable egress connectivity or indicate the environment to use in the deployment as explained in [Customizing MLflow model deployments (Online Endpoints)](how-to-deploy-mlflow-models-online-endpoints.md#customizing-mlflow-model-deployments). This limitation is not present in Batch Endpoints.
 
+### Python packages and dependencies
+
+Azure Machine Learning automatically generates environments to run inference of MLflow models. Those environments are built by reading the conda dependencies specified in the MLflow model. Azure Machine Learning also adds any package required to run the inferencing server, which varies depending on the type of deployment you're doing.
+
+__conda.yaml__
+
+:::code language="yaml" source="~/azureml-examples-main/sdk/python/endpoints/online/mlflow/sklearn-diabetes/model/conda.yaml" highlight="13-19":::
+
+> [!WARNING]
+> MLflow performs automatic package detection when logging models and pins their versions in the model's conda dependencies. However, this detection is done to the best of its knowledge, and there may be cases where it doesn't reflect your intentions or requirements. In those cases, consider [logging models with a custom conda dependencies definition](how-to-log-mlflow-models.md?#logging-models-with-a-custom-signature-environment-or-samples).
+
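For illustration, the following is a minimal sketch of logging a model with explicitly pinned dependencies instead of relying on MLflow's automatic package detection. The model, package names, and versions are placeholders, not the ones used in the linked example.

```python
import mlflow
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Train a small placeholder model (substitute your own training code).
X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=10).fit(X, y)

with mlflow.start_run():
    # Pin dependencies explicitly instead of relying on MLflow's automatic
    # package detection. Versions here are illustrative placeholders.
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        pip_requirements=[
            "scikit-learn==1.2.2",
            "pandas==2.0.3",
        ],
    )
```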
 ### Implications of models with signatures
 
 MLflow models can include a signature that indicates the expected inputs and their types. For models containing a signature, Azure Machine Learning enforces compliance with it, both in terms of the number of inputs and their types. This means that your data input should comply with the types indicated in the model signature. If the data can't be parsed as expected, the invocation fails. This applies to both online and batch endpoints.
@@ -52,19 +63,6 @@ You can inspect the model signature of your model by opening the MLmodel file as
 > [!TIP]
 > Signatures in MLflow models are optional, but they are highly encouraged as they provide a convenient way to detect data compatibility issues early. For more information about how to log models with signatures, read [Logging models with a custom signature, environment or samples](how-to-log-mlflow-models.md#logging-models-with-a-custom-signature-environment-or-samples).
 
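As an illustrative sketch of how a signature ends up in a model, the following logs a model with a signature inferred from sample data; the model and dataset are placeholders.

```python
import mlflow
from mlflow.models.signature import infer_signature
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Train a small placeholder model.
X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=10).fit(X, y)

# Infer the signature from sample inputs and predictions so that the expected
# column names and types are recorded in the MLmodel file.
signature = infer_signature(X, model.predict(X))

with mlflow.start_run():
    mlflow.sklearn.log_model(model, artifact_path="model", signature=signature)
```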
-
-### Python packages and dependencies
-
-Azure Machine Learning automatically generates environments to run inference of MLflow models. Those environments are built by reading the conda dependencies specified in the MLflow model. Azure Machine Learning also adds any required package to run the inferencing server, which will vary depending on the type of deployment you are doing.
-
-__conda.yaml__
-
-:::code language="yaml" source="~/azureml-examples-main/sdk/python/endpoints/online/mlflow/sklearn-diabetes/model/conda.yaml" highlight="13-19":::
-
-> [!WARNING]
-> MLflow performs automatic package detection when logging models, and pins their versions in the conda dependencies of the model. However, such action is performed at the best of its knowledge and there may be cases when the detection doesn't reflect your intentions or requirements. On those cases consider [logging models with a custom conda dependencies definition](how-to-log-mlflow-models.md?#logging-models-with-a-custom-signature-environment-or-samples).
-
-
 ## Deployment tools
 
 Azure Machine Learning offers many ways to deploy MLflow models into Online and Batch endpoints. You can deploy models using the following tools:
@@ -96,7 +94,11 @@ If you are familiar with MLflow or your platform support MLflow natively (like A
 
 ## Differences between models deployed in Azure Machine Learning and MLflow built-in server
 
-MLflow includes built-in deployment tools that model developers can use to test models locally. For instance, you can run a local instance of a model registered in MLflow server registry with `mlflow models serve -m my_model`. Since Azure Machine Learning online endpoints run our influencing server technology, the behavior of these two services is different.
+MLflow includes built-in deployment tools that model developers can use to test models locally. For instance, you can run a local instance of a model registered in the MLflow server registry with `mlflow models serve -m my_model`, or you can use the MLflow CLI `mlflow models predict`. Azure Machine Learning online and batch endpoints run different inferencing technologies, which may have different features. Read this section to understand their differences.
+
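As a minimal sketch of the kind of local testing the MLflow built-in server enables, the example below queries a model served locally with `mlflow models serve`; the port, column names, and values are placeholders, and the payload follows the MLflow 2.x scoring protocol.

```python
import json

import requests

# Assumes a model is already being served locally, for example with:
#   mlflow models serve -m my_model --port 5000
url = "http://127.0.0.1:5000/invocations"

# Payload in the MLflow 2.x scoring protocol; the column names and values
# below are placeholders for a diabetes-style regression model.
payload = {
    "dataframe_split": {
        "columns": ["age", "sex", "bmi", "bp", "s1", "s2", "s3", "s4", "s5", "s6"],
        "data": [[0.038, 0.05, 0.062, 0.022, -0.044, -0.035, -0.043, -0.003, 0.02, -0.018]],
    }
}

response = requests.post(
    url,
    headers={"Content-Type": "application/json"},
    data=json.dumps(payload),
)
print(response.json())
```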
+### Batch vs Online endpoints
+
+Azure Machine Learning supports deploying models to both online and batch endpoints. Online Endpoints compare to the [MLflow built-in server](https://www.mlflow.org/docs/latest/models.html#built-in-deployment-tools) and provide a scalable, synchronous, and lightweight way to run models for inference. Batch Endpoints, on the other hand, provide a way to run asynchronous inference over long-running inferencing processes that can scale to large amounts of data. This capability isn't currently present in the MLflow server, although a similar capability can be achieved using Spark jobs. The rest of this section mostly applies to online endpoints, but you can learn more about batch endpoints in [What are Azure Machine Learning endpoints?](concept-endpoints.md).
 
 ### Input formats
 