
Commit eefb5fa — last set of studio edits

1 parent 794d5f6

File tree

5 files changed: +88 −56 lines


articles/machine-learning/how-to-use-batch-model-deployments.md

Lines changed: 88 additions & 56 deletions
@@ -337,27 +337,44 @@ A model deployment is a set of resources required for hosting the model that doe
In the studio, follow these steps:

1. Navigate to the __Endpoints__ tab on the side menu.

1. Select the tab __Batch endpoints__ > __Create__.

1. Give the endpoint a name, in this case `mnist-batch`. You can configure the rest of the fields or leave them blank.

1. Select __Next__ to go to the "Model" section.

1. Select the model __mnist-classifier-torch__.

1. Select __Next__ to go to the "Deployment" page.

1. Give the deployment a name.

1. For __Output action__, ensure __Append row__ is selected.

1. For __Output file name__, ensure the batch scoring output file is the one you need. The default is `predictions.csv`.

1. For __Mini batch size__, adjust the number of files that will be included in each mini-batch. This size controls the amount of data your scoring script receives per batch.

1. For __Scoring timeout (seconds)__, ensure you're giving enough time for your deployment to score a given batch of files. If you increase the number of files, you usually have to increase the timeout value too. More expensive models (like those based on deep learning) may require high values in this field.

1. For __Max concurrency per instance__, configure the number of executors you want for each compute instance in the deployment. A higher number here guarantees a higher degree of parallelization, but it also increases the memory pressure on the compute instance. Tune this value together with __Mini batch size__.

1. Once done, select __Next__ to go to the "Code + environment" page.

1. For "Select a scoring script for inferencing", browse to find and select the scoring script file *deployment-torch/code/batch_driver.py*.

1. In the "Select environment" section, select the environment you created previously, _torch-batch-env_.

1. Select __Next__ to go to the "Compute" page.

1. Select the compute cluster you created in a previous step.

    > [!WARNING]
    > Azure Kubernetes clusters are supported in batch deployments, but only when created using the Azure Machine Learning CLI or Python SDK.

1. For __Instance count__, enter the number of compute instances you want for the deployment. In this case, use 2.

1. Select __Next__.
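To build intuition for how __Mini batch size__, __Max concurrency per instance__, and __Instance count__ interact, here's a minimal sketch in plain Python (no Azure dependencies; the function name and returned fields are illustrative, not part of any SDK) of how a set of input files gets partitioned into mini-batches and spread across parallel workers:

```python
import math

def plan_batch_job(total_files: int, mini_batch_size: int,
                   instance_count: int, max_concurrency_per_instance: int) -> dict:
    """Illustrative sketch of how batch scoring work is partitioned."""
    # Each mini-batch is a group of files handed to one call of the scoring script.
    mini_batches = math.ceil(total_files / mini_batch_size)
    # Total parallel executors across the whole deployment.
    parallel_workers = instance_count * max_concurrency_per_instance
    # Rounds of work if every worker processes one mini-batch at a time.
    rounds = math.ceil(mini_batches / parallel_workers)
    return {"mini_batches": mini_batches,
            "parallel_workers": parallel_workers,
            "rounds": rounds}

# Example: 1,000 input files, 10 files per mini-batch,
# 2 instances with 2 executors each.
plan = plan_batch_job(total_files=1000, mini_batch_size=10,
                      instance_count=2, max_concurrency_per_instance=2)
print(plan)  # {'mini_batches': 100, 'parallel_workers': 4, 'rounds': 25}
```

The trade-off the steps describe falls out of this arithmetic: larger mini-batches mean fewer scoring calls but more data (and memory) per call, while more executors per instance shortens the number of rounds at the cost of memory pressure.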

363380
1. Create the deployment:
@@ -382,10 +399,10 @@ A model deployment is a set of resources required for hosting the model that doe
[!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/mnist-classifier/mnist-batch.ipynb?name=set_default_deployment)]

# [Studio](#tab/azure-studio)

In the wizard, select __Create__ to start the deployment process.

:::image type="content" source="./media/how-to-use-batch-model-deployments/review-batch-wizard.png" alt-text="Screenshot of batch endpoints deployment review screen." lightbox="media/how-to-use-batch-model-deployments/review-batch-wizard.png":::

---

@@ -404,14 +421,16 @@ A model deployment is a set of resources required for hosting the model that doe
[!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/mnist-classifier/mnist-batch.ipynb?name=query_deployment)]

# [Studio](#tab/azure-studio)

After creating the batch endpoint, the endpoint's details page opens. You can also find this page by following these steps:

1. Navigate to the __Endpoints__ tab on the side menu.

1. Select the tab __Batch endpoints__.

1. Select the batch endpoint you want to view.

1. The endpoint's **Details** page shows the details of the endpoint along with all the deployments available in the endpoint.

:::image type="content" source="./media/how-to-use-batch-model-deployments/batch-endpoint-details.png" alt-text="Screenshot of the batch endpoint and deployment details page.":::

@@ -447,19 +466,28 @@ You can run and invoke a batch endpoint using Azure CLI, Azure Machine Learning

1. Select __Create job__.

    :::image type="content" source="./media/how-to-use-batch-model-deployments/create-batch-job.png" alt-text="Screenshot of the create job option to start batch scoring." lightbox="media/how-to-use-batch-model-deployments/create-batch-job.png":::

1. For __Deployment__, select the deployment to execute.

    :::image type="content" source="./media/how-to-use-batch-model-deployments/job-setting-batch-scoring.png" alt-text="Screenshot of using the deployment to submit a batch job." lightbox="media/how-to-use-batch-model-deployments/job-setting-batch-scoring.png":::

1. Select __Next__ to go to the "Select data source" page.

1. For "Data source type", select __Datastore__.

1. For "Datastore", select __workspaceblobstore__ from the dropdown menu.

1. For "Path", enter the full URL `https://azuremlexampledata.blob.core.windows.net/data/mnist/sample`.

    > [!TIP]
    > This path works only because it has public access enabled. In general, you'll need to register the data source as a __Datastore__. See [Accessing data from batch endpoints jobs](how-to-access-data-batch-endpoints-jobs.md) for details.

    :::image type="content" source="./media/how-to-use-batch-model-deployments/select-datastore-job.png" alt-text="Screenshot of selecting datastore as an input option." lightbox="media/how-to-use-batch-model-deployments/select-datastore-job.png":::

1. Select __Next__.

1. Select __Create__ to start the job.

---
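When the job runs, each worker invokes your scoring script over mini-batches of input files. The actual contents of *batch_driver.py* are in the examples repo; as a rough, illustrative sketch (not the real file, and with a stubbed-out model), a batch scoring script exposes an `init()` called once per worker and a `run(mini_batch)` that receives a list of file paths and returns one result row per file:

```python
import os

model = None  # loaded once per worker inside init()

def init():
    """Called once when the worker starts; a real script loads the model here."""
    global model
    model = lambda path: 7  # placeholder "prediction" standing in for real inference

def run(mini_batch):
    """Called once per mini-batch with a list of file paths.

    With the "Append row" output action, every returned row is appended
    to a single output file (predictions.csv by default).
    """
    results = []
    for file_path in mini_batch:
        prediction = model(file_path)
        results.append(f"{os.path.basename(file_path)},{prediction}")
    return results

# Local simulation of what the driver does with two mini-batches:
init()
rows = run(["digit_1.png", "digit_2.png"]) + run(["digit_3.png"])
print(rows)  # ['digit_1.png,7', 'digit_2.png,7', 'digit_3.png,7']
```

This is why __Mini batch size__ directly shapes the script's workload: `run` sees exactly one mini-batch's worth of files per call.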

@@ -489,9 +517,9 @@ The following code checks the job status and outputs a link to the Azure Machine

1. Select the batch endpoint you want to monitor.

1. Select the __Jobs__ tab.

    :::image type="content" source="media/how-to-use-batch-model-deployments/summary-jobs.png" alt-text="Screenshot of summary of jobs submitted to a batch endpoint." lightbox="media/how-to-use-batch-model-deployments/summary-jobs.png":::

1. You'll see a list of the jobs created for the selected endpoint.
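Programmatically, monitoring boils down to polling the job until it reaches a terminal state. A minimal, dependency-free sketch of that loop (the `get_status` callable is a stand-in for a real SDK or CLI status query, and the state names mirror the ones shown in studio):

```python
import time

TERMINAL_STATES = {"Completed", "Failed", "Canceled"}

def wait_for_job(get_status, poll_seconds=0.0, max_polls=100):
    """Poll get_status() until the job reaches a terminal state (illustrative)."""
    for _ in range(max_polls):
        status = get_status()
        if status in TERMINAL_STATES:
            return status
        time.sleep(poll_seconds)  # back off between polls
    raise TimeoutError("job did not finish within the polling budget")

# Simulated job that completes after a few polls:
states = iter(["NotStarted", "Running", "Running", "Completed"])
print(wait_for_job(lambda: next(states)))  # Completed
```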

@@ -551,23 +579,25 @@ Once you've identified the data store you want to use, configure the output as f

1. Select __Create job__.

    :::image type="content" source="./media/how-to-use-batch-model-deployments/create-batch-job.png" alt-text="Screenshot of the create job option to start batch scoring." lightbox="media/how-to-use-batch-model-deployments/create-batch-job.png":::

1. For __Deployment__, select the deployment you want to execute.

1. Select the option __Override deployment settings__.

    :::image type="content" source="./media/how-to-use-batch-model-deployments/overwrite-setting.png" alt-text="Screenshot of the overwrite setting when starting a batch job.":::

1. You can now configure __Output file name__ and some extra properties of the deployment execution. Only this execution is affected.

1. Select __Next__.

1. On the "Select data source" page, select the data input you want to use.

1. Select __Next__.

1. On the "Configure output location" page, select the option __Enable output configuration__.

    :::image type="content" source="./media/how-to-use-batch-model-deployments/configure-output-location.png" alt-text="Screenshot of optionally configuring output location." lightbox="media/how-to-use-batch-model-deployments/configure-output-location.png":::

1. Configure the __Blob datastore__ where the outputs should be placed.

@@ -605,17 +635,21 @@ When you invoke a batch endpoint, some settings can be overwritten to make best

1. Select __Create job__.

1. For __Deployment__, select the deployment you want to execute.

1. Select the option __Override deployment settings__.

1. Configure the job parameters. Only the current job execution will be affected by this configuration.

1. Select __Next__.

1. On the "Select data source" page, select the data input you want to use.

1. Select __Next__.

1. On the "Configure output location" page, select the option __Enable output configuration__.

1. Configure the __Blob datastore__ where the outputs should be placed.

---
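Conceptually, __Override deployment settings__ applies a per-job overlay on top of the deployment's stored defaults; the deployment itself is left unchanged. A tiny sketch of that merge (the field names are illustrative, not a real SDK schema):

```python
def effective_settings(deployment_defaults: dict, job_overrides: dict) -> dict:
    """Per-job settings = deployment defaults overridden by job-level values."""
    merged = dict(deployment_defaults)   # copy, so stored defaults stay untouched
    merged.update(job_overrides)         # job-level values win for this run only
    return merged

defaults = {"output_file_name": "predictions.csv",
            "mini_batch_size": 10,
            "max_concurrency_per_instance": 1}
overrides = {"output_file_name": "run-42-predictions.csv",
             "mini_batch_size": 20}

print(effective_settings(defaults, overrides))
# {'output_file_name': 'run-42-predictions.csv', 'mini_batch_size': 20,
#  'max_concurrency_per_instance': 1}
print(defaults["output_file_name"])  # predictions.csv (unchanged)
```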

@@ -644,20 +678,14 @@ In this example, you add a second deployment that uses a __model built with Kera
# [Studio](#tab/azure-studio)

1. Navigate to the __Environments__ tab on the side menu.

1. Select the tab __Custom environments__ > __Create__.

1. Enter the name of the environment, in this case `keras-batch-env`.

1. For __Select environment source__, select __Use existing docker image with optional conda file__.

1. For __Container registry image path__, enter `mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04`.

1. Select __Next__ to go to the "Customize" section.

1. Copy the content of the file _deployment-keras/environment/conda.yaml_ from the GitHub repo into the portal.

1. Select __Next__ until you get to the "Review" page.

1. Select __Create__ and wait until the environment is ready for use.

---
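For orientation, a conda file for this kind of environment generally has the following shape. This is an illustrative example with assumed packages and pins, not the actual contents of _deployment-keras/environment/conda.yaml_; copy the real file from the repo as the step above says:

```yaml
# Illustrative only — use the repo's conda.yaml for the real package list.
name: keras-batch-env
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pip
  - pip:
      - tensorflow        # assumed: the Keras model's framework
      - pandas            # assumed: tabular handling in the scoring script
```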

@@ -697,35 +725,39 @@ In this example, you add a second deployment that uses a __model built with Kera

:::image type="content" source="./media/how-to-use-batch-model-deployments/add-deployment-option.png" alt-text="Screenshot of the add new deployment option.":::

1. Select __Next__ to go to the "Model" page.

1. From the model list, select the model `mnist` and select __Next__.

1. On the deployment configuration page, give the deployment a name.

1. Clear the option __Make this new deployment the default for batch jobs__.

1. For __Output action__, ensure __Append row__ is selected.

1. For __Output file name__, ensure the batch scoring output file is the one you need. The default is `predictions.csv`.

1. For __Mini batch size__, adjust the number of files that will be included in each mini-batch. This controls the amount of data your scoring script receives for each batch.

1. For __Scoring timeout (seconds)__, ensure you're giving enough time for your deployment to score a given batch of files. If you increase the number of files, you usually have to increase the timeout value too. More expensive models (like those based on deep learning) may require high values in this field.

1. For __Max concurrency per instance__, configure the number of executors you want for each compute instance in the deployment. A higher number here guarantees a higher degree of parallelization, but it also increases the memory pressure on the compute instance. Tune this value together with __Mini batch size__.

1. Select __Next__ to go to the "Code + environment" page.

1. For __Select a scoring script for inferencing__, browse to select the scoring script file *deployment-keras/code/batch_driver.py*.

1. For __Select environment__, select the environment you created in a previous step.

1. Select __Next__.

1. On the __Compute__ page, select the compute cluster you created in a previous step.

1. For __Instance count__, enter the number of compute instances you want for the deployment. In this case, use 2.

1. Select __Next__.

1. Create the deployment:

# [Azure CLI](#tab/cli)
@@ -804,7 +836,7 @@ Although you can invoke a specific deployment inside an endpoint, you'll typical

:::image type="content" source="./media/how-to-use-batch-model-deployments/update-default-deployment.png" alt-text="Screenshot of updating default deployment.":::

1. On __Select default deployment__, select the name of the deployment you want to set as the default.

1. Select __Update__.
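The default matters because a job submitted to the endpoint without naming a deployment is routed to the default one. A minimal sketch of that routing rule (plain Python; the class and deployment names are made up for illustration, not the real SDK):

```python
class BatchEndpointRouter:
    """Illustrative model of an endpoint routing jobs to its deployments."""

    def __init__(self, default_deployment, deployments):
        self.deployments = set(deployments)
        self.default_deployment = default_deployment

    def route(self, requested=None):
        # An explicitly requested deployment wins; otherwise use the default.
        target = requested or self.default_deployment
        if target not in self.deployments:
            raise ValueError(f"unknown deployment: {target}")
        return target

endpoint = BatchEndpointRouter(
    default_deployment="mnist-torch-dpl",
    deployments={"mnist-torch-dpl", "mnist-keras-dpl"},
)
print(endpoint.route())                   # mnist-torch-dpl
print(endpoint.route("mnist-keras-dpl"))  # mnist-keras-dpl
```

Updating the default, as in the steps above, only changes which deployment unnamed invocations land on; existing deployments are untouched.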
