Commit 1cc66cb

Merge pull request #228343 from santiagxf/santiagxf/azureml-batch-landing
Update how-to-use-batch-endpoint.md
2 parents 3226369 + 46f1e6b commit 1cc66cb

File tree: 2 files changed (+15, −20 lines)


articles/machine-learning/how-to-secure-batch-endpoint.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -73,7 +73,7 @@ Consider the following limitations when working on batch endpoints deployed rega
 
 - If you change the networking configuration of the workspace from public to private, or from private to public, the networking configuration of existing batch endpoints isn't affected. Batch endpoints rely on the configuration of the workspace at the time of creation. Recreate your endpoints if you want them to reflect changes you made in the workspace.
 
-- When working on a private link-enabled workspace, batch endpoints can be created and managed using Azure Machine Learning studio. However, they can't be invoked from the UI in studio. Use the Azure ML CLI v2 instead for job creation. For more details about how to use it, see [Invoke the batch endpoint to start a batch scoring job](how-to-use-batch-endpoint.md#invoke-the-batch-endpoint-to-start-a-batch-job).
+- When working on a private link-enabled workspace, batch endpoints can be created and managed using Azure Machine Learning studio. However, they can't be invoked from the UI in studio. Use the Azure ML CLI v2 instead for job creation. For more details about how to use it, see [Run batch endpoint to start a batch scoring job](how-to-use-batch-endpoint.md#run-endpoint-and-configure-inputs-and-outputs).
 
 ## Recommended read
```
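The bullet above directs private-link users to the Azure ML CLI v2 for job creation. As a hedged illustration of what that looks like (the endpoint, resource group, workspace, and input path below are placeholders, not values from this commit; verify the flags against your installed `az ml` extension):

```shell
# Start a batch scoring job from the CLI instead of the studio UI.
# All names here are illustrative.
az ml batch-endpoint invoke \
  --name my-batch-endpoint \
  --resource-group my-resource-group \
  --workspace-name my-workspace \
  --input azureml://datastores/workspaceblobstore/paths/input-data
```

The command returns the created job, whose `name` can then be used to track scoring progress.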

articles/machine-learning/how-to-use-batch-endpoint.md

Lines changed: 14 additions & 19 deletions
```diff
@@ -96,7 +96,7 @@ Open the [Azure ML studio portal](https://ml.azure.com) and sign in using your c
 
 Batch endpoints run on compute clusters. They support both [Azure Machine Learning Compute clusters (AmlCompute)](./how-to-create-attach-compute-cluster.md) and [Kubernetes clusters](./how-to-attach-kubernetes-anywhere.md). Clusters are a shared resource, so one cluster can host one or many batch deployments (along with other workloads if desired).
 
-Run the following code to create an Azure Machine Learning compute cluster. The following examples in this article use the compute created here, named `batch-cluster`. Adjust as needed and reference your compute using `azureml:<your-compute-name>`.
+This article uses a compute cluster named `batch-cluster`. Adjust the name as needed and reference your compute using `azureml:<your-compute-name>`, or create one as shown.
 
 # [Azure CLI](#tab/azure-cli)
```
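Under the Azure CLI tab referenced in this hunk, creating the cluster is a one-liner. A sketch, assuming a workspace-attached default resource group and illustrative instance counts:

```shell
# Create the compute cluster the article's examples reference.
# Instance counts are illustrative; adjust for your quota and workload.
az ml compute create \
  --name batch-cluster \
  --type amlcompute \
  --min-instances 0 \
  --max-instances 5
```

With `--min-instances 0`, the cluster scales to zero between jobs, so it costs nothing while idle.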

```diff
@@ -220,7 +220,6 @@ A batch endpoint is an HTTPS endpoint that clients can call to trigger a batch s
 | --- | ----------- |
 | `name` | The name of the batch endpoint. Needs to be unique at the Azure region level. |
 | `description` | The description of the batch endpoint. This property is optional. |
-| `auth_mode` | The authentication method for the batch endpoint. Currently only Azure Active Directory token-based authentication (`aad_token`) is supported. |
 | `defaults.deployment_name` | The name of the deployment that will serve as the default deployment for the endpoint. |
 
 # [Studio](#tab/azure-studio)
```
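The properties in the table above map directly onto the batch endpoint YAML file consumed by the CLI. A minimal sketch with illustrative names (the `$schema` line follows the pattern Azure ML v2 YAML files use; confirm against the current schema):

```yaml
$schema: https://azuremlschemas.azureedge.net/latest/batchEndpoint.schema.json
name: mnist-batch                    # must be unique within the Azure region
description: Batch endpoint that scores images of handwritten digits.
defaults:
  deployment_name: mnist-torch-dpl   # default deployment for the endpoint
```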
```diff
@@ -245,22 +244,6 @@ A batch endpoint is an HTTPS endpoint that clients can call to trigger a batch s
 
 *You'll create the endpoint in the same step you are creating the deployment later.*
 
-## Create a scoring script
-
-Batch deployments require a scoring script that indicates how the given model should be executed and how input data must be processed.
-
-> [!NOTE]
-> For MLflow models, Azure Machine Learning automatically generates the scoring script, so you're not required to provide one. If your model is an MLflow model, you can skip this step. For more information about how batch endpoints work with MLflow models, see the dedicated tutorial [Using MLflow models in batch deployments](how-to-mlflow-batch.md).
-
-> [!WARNING]
-> If you're deploying an Automated ML model under a batch endpoint, notice that the scoring script that Automated ML provides only works for online endpoints and is not designed for batch execution. Please see [Author scoring scripts for batch deployments](how-to-batch-scoring-script.md) to learn how to create one depending on what your model does.
-
-In this case, we're deploying a model that reads image files representing digits and outputs the corresponding digit. The scoring script is as follows:
-
-__mnist/code/batch_driver.py__
-
-:::code language="python" source="~/azureml-examples-main/sdk/python/endpoints/batch/mnist/code/batch_driver.py" :::
-
 ## Create a batch deployment
 
 A deployment is a set of resources required for hosting the model that does the actual inferencing. To create a batch deployment, you need all the following items:
```
```diff
@@ -270,6 +253,18 @@ A deployment is a set of resources required for hosting the model that does the
 * The environment in which the model runs.
 * The pre-created compute and resource settings.
 
+1. Batch deployments require a scoring script that indicates how a given model should be executed and how input data must be processed. Batch endpoints support scripts created in Python. In this case, we're deploying a model that reads image files representing digits and outputs the corresponding digit. The scoring script is as follows:
+
+> [!NOTE]
+> For MLflow models, Azure Machine Learning automatically generates the scoring script, so you're not required to provide one. If your model is an MLflow model, you can skip this step. For more information about how batch endpoints work with MLflow models, see the dedicated tutorial [Using MLflow models in batch deployments](how-to-mlflow-batch.md).
+
+> [!WARNING]
+> If you're deploying an Automated ML model under a batch endpoint, notice that the scoring script that Automated ML provides only works for online endpoints and is not designed for batch execution. See [Author scoring scripts for batch deployments](how-to-batch-scoring-script.md) to learn how to create one depending on what your model does.
+
+__mnist/code/batch_driver.py__
+
+:::code language="python" source="~/azureml-examples-main/sdk/python/endpoints/batch/mnist/code/batch_driver.py" :::
+
 1. Create an environment where your batch deployment will run. The environment needs to include the packages `azureml-core` and `azureml-dataset-runtime[fuse]`, which are required by batch endpoints, plus any dependencies your code requires for running. In this case, the dependencies have been captured in a `conda.yml`:
 
 __mnist/environment/conda.yml__
```
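The `batch_driver.py` referenced in this hunk follows the batch scoring script contract: an `init()` called once per worker and a `run()` called once per mini-batch of input file paths. A minimal sketch of that contract only; the model loading and the prediction itself are placeholders, not the actual MNIST driver from azureml-examples:

```python
import os
from typing import List

model = None

def init():
    # Called once when each worker starts. AZUREML_MODEL_DIR points at the
    # registered model's folder; real loading code depends on your framework.
    global model
    model_dir = os.environ.get("AZUREML_MODEL_DIR", ".")
    model = os.path.join(model_dir, "model")  # placeholder for a real load

def run(mini_batch: List[str]) -> List[str]:
    # Called once per mini-batch with a list of input file paths.
    # Return one result row per processed file.
    results = []
    for file_path in mini_batch:
        prediction = 0  # placeholder: score the file with the loaded model
        results.append(f"{os.path.basename(file_path)},{prediction}")
    return results
```

Returning one row per input file lets the runtime assemble the per-file results into the job's aggregate output.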
```diff
@@ -480,7 +475,7 @@ A deployment is a set of resources required for hosting the model that does the
 
 :::image type="content" source="./media/how-to-use-batch-endpoints-studio/batch-endpoint-details.png" alt-text="Screenshot of the check batch endpoints and deployment details.":::
 
-## Invoke the batch endpoint to start a batch job
+## Run endpoint and configure inputs and outputs
 
 Invoking a batch endpoint triggers a batch scoring job. A job `name` is returned from the invoke response and can be used to track the batch scoring progress. The batch scoring job runs for some time: it splits the entire input into multiple `mini_batch` chunks and processes them in parallel on the compute cluster. The batch scoring job outputs are stored in cloud storage, either in the workspace's default blob storage or in the storage you specified.
```
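The invoke-then-track flow that paragraph describes can be sketched from the CLI. Endpoint name and input path are placeholders; verify the flags against your `az ml` extension version:

```shell
# Start the scoring job and capture the job name from the invoke response.
# All names here are illustrative.
JOB_NAME=$(az ml batch-endpoint invoke \
  --name my-batch-endpoint \
  --input azureml://datastores/workspaceblobstore/paths/input-data \
  --query name -o tsv)

# Use the captured name to track the batch scoring progress.
az ml job show --name "$JOB_NAME"
az ml job stream --name "$JOB_NAME"
```

`az ml job stream` follows the job logs until it finishes, which is convenient for scripting a wait-for-completion step.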
