articles/machine-learning/how-to-use-batch-endpoint.md (63 additions & 59 deletions)
:::image type="content" source="./media/how-to-use-batch-endpoints-studio/batch-endpoint-details.png" alt-text="Screenshot of checking batch endpoint and deployment details.":::
## Run batch endpoints and access results
Invoking a batch endpoint triggers a batch scoring job. The invoke response returns a job `name` that you can use to track the batch scoring progress. The job splits the entire input into multiple `mini_batch` sets and processes them in parallel on the compute cluster. The job outputs are stored in cloud storage, either in the workspace's default blob storage or in the storage you specified. Batch scoring jobs usually take some time to process the entire set of inputs.
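As an illustration only, the following minimal sketch shows how an endpoint might be invoked and the returned job name captured with the Azure Machine Learning Python SDK v2. The workspace values, endpoint name, and input path are placeholders, and the `input=` keyword is an assumption (some SDK versions expect an `inputs` dictionary instead). The tabs that follow show how to monitor the resulting job.

```python
from azure.ai.ml import Input, MLClient
from azure.ai.ml.constants import AssetTypes
from azure.identity import DefaultAzureCredential

# Placeholder workspace details; replace with your own values.
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<WORKSPACE_NAME>",
)

# Invoke the batch endpoint with a folder of files to score.
# `input=` is an assumption; check your SDK version for the exact signature.
job = ml_client.batch_endpoints.invoke(
    endpoint_name="<ENDPOINT_NAME>",
    input=Input(type=AssetTypes.URI_FOLDER, path="<PATH_TO_INPUT_DATA>"),
)

# The returned job name is what you use to track the scoring progress.
print(f"Started batch scoring job: {job.name}")
```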
# [Azure CLI](#tab/azure-cli)
You can use the CLI command `az ml job show` to view the job and check its status after invoking the endpoint. To learn more about job commands, run `az ml job -h`.

# [Python](#tab/python)

The following code checks the job status and outputs a link to the Azure Machine Learning studio for further details.
```python
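# Retrieve the current state of the batch scoring job started by the invoke call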
ml_client.jobs.get(job.name)
```
# [Studio](#tab/azure-studio)
1. Navigate to the __Endpoints__ tab on the side menu.
1. Select the tab __Batch endpoints__.
1. Select the batch endpoint you want to monitor.
1. Select the tab __Jobs__.
:::image type="content" source="media/how-to-use-batch-endpoints-studio/summary-jobs.png" alt-text="Screenshot of summary of jobs submitted to a batch endpoint.":::
1. You'll see a list of the jobs created for the selected endpoint.
1. Select the last job that is running.
1. You'll be redirected to the job monitoring page.
---
### Check batch scoring results
Follow these steps to view the scoring results in Azure Storage Explorer when the job is completed:
1. Run the following code to open the batch scoring job in Azure Machine Learning studio. The job's studio link is also included in the response of `invoke`, as the value of `interactionEndpoints.Studio.endpoint`.
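    One way to do this with the Python SDK v2 is sketched below. It assumes the `ml_client` and `job` objects from the invoke step and that the returned job object exposes a `studio_url` property; treat both as assumptions rather than the article's exact command.

    ```python
    import webbrowser

    # Look up the job created by the invoke call.
    scoring_job = ml_client.jobs.get(job.name)

    # `studio_url` is assumed to be available on the job object; the same link
    # appears in the invoke response under interactionEndpoints.Studio.endpoint.
    webbrowser.open(scoring_job.studio_url)
    ```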
The scoring results in Storage Explorer are similar to the following sample page:
:::image type="content" source="media/how-to-use-batch-endpoint/scoring-view.png" alt-text="Screenshot of the scoring output." lightbox="media/how-to-use-batch-endpoint/scoring-view.png":::
## Configure job's inputs, outputs, and execution
Batch endpoints require only one data input: the data you want to score. However, you can also configure the outputs and other parameters of the execution.
### Configure job's inputs
Batch endpoints support reading files or folders that are located in different locations. To learn more about the supported types and how to specify them, read [Accessing data from batch endpoints jobs](how-to-access-data-batch-endpoints-jobs.md).
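As a rough illustration of a few of those locations, the sketch below builds `Input` objects with the `azure-ai-ml` SDK v2. The datastore path, URL, and data asset name are hypothetical placeholders rather than values from this article, and any one of these objects would be passed to `invoke` as shown earlier.

```python
from azure.ai.ml import Input
from azure.ai.ml.constants import AssetTypes

# A folder in a registered datastore (hypothetical datastore path).
folder_in_datastore = Input(
    type=AssetTypes.URI_FOLDER,
    path="azureml://datastores/workspaceblobstore/paths/<FOLDER>/",
)

# A single file reachable over HTTPS (hypothetical URL).
file_over_https = Input(
    type=AssetTypes.URI_FILE,
    path="https://<ACCOUNT>.blob.core.windows.net/<CONTAINER>/<FILE>.csv",
)

# A registered data asset, referenced by name and version (hypothetical asset).
registered_data_asset = Input(
    type=AssetTypes.URI_FOLDER,
    path="azureml:<DATA_ASSET_NAME>:<VERSION>",
)
```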
## Adding deployments to an endpoint
Once you have a batch endpoint with a deployment, you can continue to refine your model and add new deployments. Batch endpoints continue serving the default deployment while you develop and deploy new models under the same endpoint. Deployments don't affect one another.
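As a hedged sketch, assuming the `azure-ai-ml` SDK v2, the `ml_client` from the earlier sketches, and a hypothetical deployment name, you can target a specific, non-default deployment at invocation time with the `deployment_name` parameter while the default deployment keeps serving other callers:

```python
from azure.ai.ml import Input
from azure.ai.ml.constants import AssetTypes

# The deployment name and input path are placeholders for illustration.
job = ml_client.batch_endpoints.invoke(
    endpoint_name="<ENDPOINT_NAME>",
    deployment_name="<NEW_DEPLOYMENT_NAME>",
    # `input=` is an assumption; some SDK versions expect an `inputs` dictionary.
    input=Input(type=AssetTypes.URI_FOLDER, path="<PATH_TO_INPUT_DATA>"),
)
print(f"Scoring job against the new deployment: {job.name}")
```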