
Commit 281be4c: "fixes"

1 parent 276aa07 commit 281be4c

File tree

5 files changed (+21, -20 lines changed)

articles/machine-learning/how-to-batch-scoring-script.md

Lines changed: 1 addition & 1 deletion

@@ -20,7 +20,7 @@ ms.custom: how-to
  Batch endpoints allow you to deploy models to perform inference at scale. Because how inference is executed varies with the model's format, type, and use case, batch endpoints require a scoring script (also known as a batch driver script) that tells the deployment how to use the model over the provided data. In this article you learn how to use scoring scripts in different scenarios, and their best practices.

  > [!TIP]
- > MLflow models don't require a scoring script, as it is autogenerated for you. For more details about how batch endpoints work with MLflow models, see the dedicated tutorial [Using MLflow models in batch deployments](how-to-mlflow-batch.md). Note that this feature doesn't prevent you from writing a specific scoring script for MLflow models, as explained in [Using MLflow models with a scoring script](how-to-mlflow-batch.md#using-mlflow-models-with-a-scoring-script).
+ > MLflow models don't require a scoring script, as it is autogenerated for you. For more details about how batch endpoints work with MLflow models, see the dedicated tutorial [Using MLflow models in batch deployments](how-to-mlflow-batch.md). Note that this feature doesn't prevent you from writing a specific scoring script for MLflow models, as explained in [Using MLflow models with a scoring script](how-to-mlflow-batch.md#customizing-mlflow-models-deployments-with-a-scoring-script).

  > [!WARNING]
  > If you are deploying an Automated ML model under a batch endpoint, note that the scoring script Automated ML provides only works for online endpoints and is not designed for batch execution. Follow this guideline to learn how to create one depending on what your model does.
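Since this hunk concerns batch driver scripts, a minimal sketch of the `init()`/`run()` contract such a script follows may help. The stub model, file handling, and output format here are illustrative assumptions, not Azure ML's autogenerated code:

```python
import os

model = None  # loaded once per worker by init()

def init():
    """Runs once when the batch deployment worker starts; load the model here."""
    global model
    # A real script would load the artifact from the model directory
    # (for example via the AZUREML_MODEL_DIR environment variable); a stub is used here.
    model = lambda text: len(text)  # hypothetical "model": scores text by length

def run(mini_batch):
    """Runs once per mini-batch of input file paths; returns one result per row."""
    results = []
    for file_path in mini_batch:
        with open(file_path) as f:
            for line in f:
                score = model(line.strip())
                results.append(f"{os.path.basename(file_path)},{score}")
    return results
```

The key idea is that `init()` pays the model-loading cost once per worker, while `run()` is invoked repeatedly with small batches of input files and returns one scored row per input item.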

articles/machine-learning/how-to-deploy-mlflow-models-online-endpoints.md

Lines changed: 1 addition & 1 deletion

@@ -280,7 +280,7 @@ Once your deployment completes, your deployment is ready to serve requests. One o
  :::code language="json" source="~/azureml-examples-main/cli/endpoints/online/mlflow/sample-request-sklearn.json":::

  > [!NOTE]
- > Notice how the key `input_data` is used in this example instead of `inputs` as used in MLflow serving. This is because Azure Machine Learning requires a different input format to be able to automatically generate the Swagger contracts for the endpoints. See [Considerations when deploying to real time inference](how-to-deploy-mlflow-models.md#considerations-when-deploying-to-real-time-inference) for details about the expected input format.
+ > Notice how the key `input_data` is used in this example instead of `inputs` as used in MLflow serving. This is because Azure Machine Learning requires a different input format to be able to automatically generate the Swagger contracts for the endpoints. See [Differences between models deployed in Azure Machine Learning and MLflow built-in server](how-to-deploy-mlflow-models.md#differences-between-models-deployed-in-azure-machine-learning-and-mlflow-built-in-server) for details about the expected input format.

  To submit a request to the endpoint, you can do as follows:
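As a sketch of how the two request shapes in the note differ: the column names and values below are hypothetical, and the split-orientation nesting is an assumption modeled on the linked sample request, not a verbatim contract:

```python
import json

# Hypothetical tabular payload; real column names come from your model's signature.
columns = ["age", "bmi"]
rows = [[42, 23.1], [51, 27.8]]

# MLflow's built-in scoring server reads the data from a key such as "inputs",
# while Azure ML online endpoints expect it under "input_data" so that the
# endpoint's Swagger contract can be generated automatically.
mlflow_request = {"inputs": {"columns": columns, "data": rows}}
azureml_request = {"input_data": {"columns": columns, "data": rows}}

body = json.dumps(azureml_request)  # this is what you would POST to the endpoint
```

Only the top-level key changes; the tabular payload itself is the same in both cases.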

articles/machine-learning/how-to-deploy-mlflow-models-online-progressive.md

Lines changed: 15 additions & 15 deletions

@@ -80,7 +80,7 @@ Online endpoints are endpoints that are used for online (real-time) inferencing.
  We are going to exploit this functionality by deploying multiple versions of the same model under the same endpoint. However, the new deployment will receive 0% of the traffic at the beginning. Once we are sure the new model works correctly, we will progressively move traffic from one deployment to the other.

- #. Endpoints require a name, which needs to be unique within the region. Let's make sure we create one that doesn't already exist:
+ 1. Endpoints require a name, which needs to be unique within the region. Let's make sure we create one that doesn't already exist:

  # [Azure CLI](#tab/cli)

@@ -117,7 +117,7 @@ We are going to exploit this functionality by deploying multiple versions of the
      print(f"Endpoint name: {endpoint_name}")
      ```

- #. Configure the endpoint
+ 1. Configure the endpoint

  # [Azure CLI](#tab/cli)

@@ -155,7 +155,7 @@ We are going to exploit this functionality by deploying multiple versions of the
      outfile.write(json.dumps(endpoint_config))
      ```

- #. Create the endpoint:
+ 1. Create the endpoint:

  # [Azure CLI](#tab/cli)

@@ -178,7 +178,7 @@ We are going to exploit this functionality by deploying multiple versions of the
      )
      ```

- #. Get the authentication secret for the endpoint.
+ 1. Get the authentication secret for the endpoint.

  # [Azure CLI](#tab/cli)

@@ -206,7 +206,7 @@ We are going to exploit this functionality by deploying multiple versions of the
  So far, the endpoint is empty. There are no deployments on it. Let's create the first one by deploying the same model we were working on before. We will call this deployment "default", and it will represent our "blue" deployment.

- #. Configure the deployment
+ 1. Configure the deployment

  # [Azure CLI](#tab/cli)

@@ -262,7 +262,7 @@ So far, the endpoint is empty. There are no deployments on it. Let's create the
      outfile.write(json.dumps(deploy_config))
      ```

- #. Create the deployment
+ 1. Create the deployment

  # [Azure CLI](#tab/cli)

@@ -287,7 +287,7 @@ So far, the endpoint is empty. There are no deployments on it. Let's create the
      )
      ```

- #. Test the deployment
+ 1. Test the deployment

  # [Azure CLI](#tab/cli)

@@ -322,7 +322,7 @@ So far, the endpoint is empty. There are no deployments on it. Let's create the
  Let's imagine that there is a new version of the model created by the development team, and it is ready to go to production. We can first try out this model, and once we are confident in it, we can update the endpoint to route traffic to it.

- #. Register a new model version
+ 1. Register a new model version

  # [Azure CLI](#tab/cli)

@@ -354,7 +354,7 @@ Let's imagine that there is a new version of the model created by the developmen
      version = registered_model.version
      ```

- #. Configure a new deployment
+ 1. Configure a new deployment

  # [Azure CLI](#tab/cli)

@@ -413,7 +413,7 @@ Let's imagine that there is a new version of the model created by the developmen
      outfile.write(json.dumps(deploy_config))
      ```

- #. Create the new deployment
+ 1. Create the new deployment

  # [Azure CLI](#tab/cli)

@@ -442,7 +442,7 @@ Let's imagine that there is a new version of the model created by the developmen
  Once we are confident with the new deployment, we can update the traffic to route some of it to the new deployment. Traffic is configured at the endpoint level:

- #. Configure the traffic:
+ 1. Configure the traffic:

  # [Azure CLI](#tab/cli)

@@ -470,7 +470,7 @@ Once we are confident with the new deployment, we can update the traffic to route
      outfile.write(json.dumps(traffic_config))
      ```

- #. Update the endpoint
+ 1. Update the endpoint

  # [Azure CLI](#tab/cli)

@@ -493,7 +493,7 @@ Once we are confident with the new deployment, we can update the traffic to route
      )
      ```

- #. If you decide to switch the entire traffic to the new deployment, update all the traffic:
+ 1. If you decide to switch the entire traffic to the new deployment, update all the traffic:

  # [Azure CLI](#tab/cli)

@@ -521,7 +521,7 @@ Once we are confident with the new deployment, we can update the traffic to route
      outfile.write(json.dumps(traffic_config))
      ```

- #. Update the endpoint
+ 1. Update the endpoint

  # [Azure CLI](#tab/cli)

@@ -544,7 +544,7 @@ Once we are confident with the new deployment, we can update the traffic to route
      )
      ```

- #. Since the old deployment doesn't receive any traffic, you can safely delete it:
+ 1. Since the old deployment doesn't receive any traffic, you can safely delete it:

  # [Azure CLI](#tab/cli)
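The blue/green traffic-shifting flow this file walks through can be sketched in plain Python, mirroring how the SDK tab serializes a JSON traffic configuration to disk before updating the endpoint. The deployment names, percentages, and the exact JSON shape are assumptions for illustration, not the article's verbatim config:

```python
import json
import os
import tempfile

def make_traffic_config(split):
    """Build an endpoint-level traffic map; percentages must total 100."""
    if sum(split.values()) != 100:
        raise ValueError("traffic percentages must add up to 100")
    return {"traffic": split}

# Canary step: route 10% of traffic to the new ("green") deployment while
# the existing "default" (blue) deployment keeps 90%.
traffic_config = make_traffic_config({"default": 90, "green": 10})

# The SDK examples write the config to a file before applying it to the endpoint.
path = os.path.join(tempfile.gettempdir(), "traffic_config.json")
with open(path, "w") as outfile:
    outfile.write(json.dumps(traffic_config))

# Final step: move all traffic to green, after which "default" can be deleted.
final_config = make_traffic_config({"default": 0, "green": 100})
```

Keeping the percentages summing to 100 is the invariant behind each endpoint update; the old deployment becomes safe to delete only once its share reaches 0.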

articles/machine-learning/how-to-deploy-mlflow-models.md

Lines changed: 3 additions & 3 deletions

@@ -57,7 +57,7 @@ A solution to this scenario is to implement machine learning pipelines that move
  ### Customize inference with a scoring script

- If you want to customize how inference is executed for MLflow models (or opt out of no-code deployment), you can refer to [Customizing MLflow model deployments (Online Endpoints)](how-to-deploy-mlflow-models-online-endpoints.md#customizing-mlflow-model-deployments) and [Customizing MLflow model deployments (Batch Endpoints)](how-to-mlflow-batch.md#using-mlflow-models-with-a-scoring-script).
+ If you want to customize how inference is executed for MLflow models (or opt out of no-code deployment), you can refer to [Customizing MLflow model deployments (Online Endpoints)](how-to-deploy-mlflow-models-online-endpoints.md#customizing-mlflow-model-deployments) and [Customizing MLflow model deployments (Batch Endpoints)](how-to-mlflow-batch.md#customizing-mlflow-models-deployments-with-a-scoring-script).

  > [!IMPORTANT]
  > When you opt in to indicate a scoring script, you also need to provide an environment for the deployment.

@@ -78,7 +78,7 @@ Each workflow has different capabilities, particularly around which type of comp
  | Deploy MLflow models to managed online endpoints | [See example](how-to-deploy-mlflow-models-online-progressive.md)<sup>1</sup> | [See example](how-to-deploy-mlflow-models-online-endpoints.md)<sup>1</sup> | [See example](how-to-deploy-mlflow-models-online-endpoints.md?tabs=studio)<sup>1</sup> |
  | Deploy MLflow models to managed online endpoints (with a scoring script) | Not supported | [See example](how-to-deploy-mlflow-models-online-endpoints.md#customizing-mlflow-model-deployments) | Not supported |
  | Deploy MLflow models to batch endpoints | | [See example](how-to-mlflow-batch.md) | [See example](how-to-mlflow-batch.md?tab=studio) |
- | Deploy MLflow models to batch endpoints (with a scoring script) | | [See example](how-to-mlflow-batch.md#using-mlflow-models-with-a-scoring-script) | Not supported |
+ | Deploy MLflow models to batch endpoints (with a scoring script) | | [See example](how-to-mlflow-batch.md#customizing-mlflow-models-deployments-with-a-scoring-script) | Not supported |
  | Deploy MLflow models to web services (ACI/AKS) | Supported<sup>2</sup> | <sup>2</sup> | <sup>2</sup> |
  | Deploy MLflow models to web services (ACI/AKS - with a scoring script) | <sup>2</sup> | <sup>2</sup> | Supported<sup>2</sup> |

@@ -91,7 +91,7 @@ Each workflow has different capabilities, particularly around which type of comp
  If you are familiar with MLflow, or your platform supports MLflow natively (like Azure Databricks) and you wish to continue using the same set of methods, use the MLflow SDK. On the other hand, if you are more familiar with the [Azure ML CLI v2](concept-v2.md), you want to automate deployments using automation pipelines, or you want to keep deployment configuration in a git repository, we recommend that you use the [Azure ML CLI v2](concept-v2.md). If you want to quickly deploy and test models trained with MLflow, you can use the [Azure Machine Learning studio](https://ml.azure.com) UI deployment.

- ## Differences between MLflow models deployed in Azure Machine Learning and MLflow built-in server
+ ## Differences between models deployed in Azure Machine Learning and MLflow built-in server

  MLflow includes built-in deployment tools that model developers can use to test models locally. For instance, you can run a local instance of a model registered in the MLflow server registry with `mlflow models serve -m my_model`. Since Azure Machine Learning online endpoints run our own inferencing server technology, the behavior of these two services is different.
articles/machine-learning/toc.yml

Lines changed: 1 addition & 0 deletions

@@ -622,6 +622,7 @@
  - name: Text processing with batch deployments
    href: how-to-nlp-processing-batch.md
  - name: Integrations
+   items:
  - name: Invoke batch endpoints from Azure Data Factory
    href: how-to-use-batch-azure-data-factory.md
  - name: Invoke batch endpoints from Event Grid events in storage
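The added line gives the `Integrations` node the `items:` key it was missing, so its child pages nest under it instead of appearing as siblings. The resulting structure, sketched (indentation here is illustrative of the intended nesting, not the verbatim file):

```yaml
- name: Text processing with batch deployments
  href: how-to-nlp-processing-batch.md
- name: Integrations
  items:
    - name: Invoke batch endpoints from Azure Data Factory
      href: how-to-use-batch-azure-data-factory.md
```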
