Commit 3f1e560

Merge pull request #222583 from santiagxf/santiagxf/aml-mlflow-refresh
MLflow refresh for 2.0
2 parents 7f49dfa + fb4a4f6

9 files changed: +971 -459 lines

articles/machine-learning/how-to-batch-scoring-script.md

Lines changed: 1 addition & 1 deletion

@@ -20,7 +20,7 @@ ms.custom: how-to

Batch endpoints let you deploy models to perform inference at scale. Because how inference is executed varies across model formats, model types, and use cases, batch endpoints require a scoring script (also known as a batch driver script) that tells the deployment how to use the model over the provided data. In this article, you learn how to use scoring scripts in different scenarios, along with their best practices.

> [!TIP]
- > MLflow models don't require a scoring script, as one is autogenerated for you. For details about how batch endpoints work with MLflow models, see the dedicated tutorial [Using MLflow models in batch deployments](how-to-mlflow-batch.md). Notice that this feature doesn't prevent you from writing a specific scoring script for MLflow models, as explained in [Using MLflow models with a scoring script](how-to-mlflow-batch.md#using-mlflow-models-with-a-scoring-script).
+ > MLflow models don't require a scoring script, as one is autogenerated for you. For details about how batch endpoints work with MLflow models, see the dedicated tutorial [Using MLflow models in batch deployments](how-to-mlflow-batch.md). Notice that this feature doesn't prevent you from writing a specific scoring script for MLflow models, as explained in [Using MLflow models with a scoring script](how-to-mlflow-batch.md#customizing-mlflow-models-deployments-with-a-scoring-script).

> [!WARNING]
> If you're deploying an Automated ML model to a batch endpoint, notice that the scoring script Automated ML provides only works for online endpoints and isn't designed for batch execution. Follow this guideline to learn how to create one, depending on what your model does.

articles/machine-learning/how-to-deploy-mlflow-models-online-endpoints.md

Lines changed: 2 additions & 2 deletions
@@ -92,7 +92,7 @@ The workspace is the top-level resource for Azure Machine Learning, providing a

# [Studio](#tab/studio)

- Navigate to [Azure Machine Learning Studio](https://ml.azure.com).
+ Navigate to [Azure Machine Learning studio](https://ml.azure.com).

---

@@ -280,7 +280,7 @@ Once your deployment completes, your deployment is ready to serve requests. One o

:::code language="json" source="~/azureml-examples-main/cli/endpoints/online/mlflow/sample-request-sklearn.json":::

> [!NOTE]
- > Notice that the key `input_data` is used in this example instead of `inputs`, which is used in MLflow serving. This is because Azure Machine Learning requires a different input format to be able to automatically generate the Swagger contracts for the endpoints. See [Considerations when deploying to real time inference](how-to-deploy-mlflow-models.md#considerations-when-deploying-to-real-time-inference) for details about the expected input format.
+ > Notice that the key `input_data` is used in this example instead of `inputs`, which is used in MLflow serving. This is because Azure Machine Learning requires a different input format to be able to automatically generate the Swagger contracts for the endpoints. See [Differences between models deployed in Azure Machine Learning and MLflow built-in server](how-to-deploy-mlflow-models.md#differences-between-models-deployed-in-azure-machine-learning-and-mlflow-built-in-server) for details about the expected input format.

To submit a request to the endpoint, you can do the following:
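To make the `input_data` vs. `inputs` distinction in the note above concrete, here is a hedged sketch of building and posting such a payload. The endpoint URL, API key, and column names are hypothetical placeholders, not values from this commit:

```python
# Hedged sketch: a scoring request using the `input_data` key that
# Azure Machine Learning expects (MLflow's own built-in server uses
# `inputs` instead). Endpoint URL, key, and columns are illustrative.
import json
import urllib.request

payload = {
    "input_data": {
        "columns": ["age", "bmi", "bp"],      # hypothetical feature names
        "index": [0, 1],
        "data": [[24, 21.5, 80], [53, 30.1, 92]],
    }
}


def score(endpoint_url: str, api_key: str, body: dict) -> bytes:
    """POST a JSON scoring request to the online endpoint's scoring URI."""
    req = urllib.request.Request(
        endpoint_url,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

If the same data were sent to MLflow's built-in server instead, the top-level key would be `inputs` rather than `input_data`; the nested `columns`/`index`/`data` layout follows pandas' split orientation.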
