Commit 3655bb5

Merge pull request #216740 from santiagxf/santiagxf/azureml-batch-patch

Update how-to-use-batch-endpoint.md

2 parents: 3e51a97 + 20fa7c9

File tree

1 file changed: +13 −6 lines


articles/machine-learning/batch-inference/how-to-use-batch-endpoint.md

Lines changed: 13 additions & 6 deletions
@@ -652,9 +652,11 @@ The scoring results in Storage Explorer are similar to the following sample page
 
 Once you have a batch endpoint with a deployment, you can continue to refine your model and add new deployments. Batch endpoints will continue serving the default deployment while you develop and deploy new models under the same endpoint. Deployments can't affect one another.
 
+In this example, you will learn how to add a second deployment __that solves the same MNIST problem but using a model built with Keras and TensorFlow__.
+
 ### Adding a second deployment
 
-1. Create an environment where your batch deployment will run. Include any dependencies your code requires to run. You will also need to add the library `azureml-core`, as it is required for batch deployments to work.
+1. Create an environment where your batch deployment will run. Include any dependencies your code requires to run. You will also need to add the library `azureml-core`, as it is required for batch deployments to work. The following environment definition has the required libraries to run a model with TensorFlow.
 
 # [Azure ML CLI](#tab/cli)
 
@@ -678,9 +680,9 @@ Once you have a batch endpoint with a deployment, you can continue to refine you
 1. Enter the name of the environment, in this case `keras-batch-env`.
 1. On __Select environment type__ select __Use existing docker image with conda__.
 1. On __Container registry image path__, enter `mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04`.
-1. On the __Customize__ section, copy the content of the file `./mnist/environment/conda.yml` included in the repository into the portal. The conda file looks as follows:
+1. On the __Customize__ section, copy the content of the file `./mnist-keras/environment/conda.yml` included in the repository into the portal. The conda file looks as follows:
 
-   :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/batch/mnist/environment/conda.yml":::
+   :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/batch/mnist-keras/environment/conda.yml":::
 
 1. Click on __Next__ and then on __Create__.
 1. The environment is ready to be used.
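The conda file itself is not reproduced in this diff (it lives at `./mnist-keras/environment/conda.yml` in the azureml-examples repository). A plausible minimal version, assuming current package names; the pinned Python version and package list are illustrative, not the repository's actual pins:

```yaml
# Illustrative conda.yml sketch; versions and channels are assumptions.
name: tensorflow-env
channels:
  - conda-forge
dependencies:
  - python=3.8
  - pip
  - pip:
      - tensorflow
      - pillow
      - azureml-core   # required by the batch deployment executor
```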
@@ -693,8 +695,13 @@ Once you have a batch endpoint with a deployment, you can continue to refine you
 > [!IMPORTANT]
 > Do not forget to include the library `azureml-core` in your deployment as it is required by the executor.
 
-
-1. Create a deployment definition
+1. Create a scoring script for the model:
+
+   __batch_driver.py__
+
+   :::code language="python" source="~/azureml-examples-main/sdk/python/endpoints/batch/mnist-keras/code/batch_driver.py" :::
+
+3. Create a deployment definition
 
 # [Azure ML CLI](#tab/cli)
 
@@ -711,7 +718,7 @@ Once you have a batch endpoint with a deployment, you can continue to refine you
     endpoint_name=batch_endpoint_name,
     model=model,
     code_path="./mnist-keras/code/",
-    scoring_script="digit_identification.py",
+    scoring_script="batch_driver.py",
     environment=env,
     compute=compute_name,
     instance_count=2,

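On the CLI tab, the deployment definition shown above as a Python SDK fragment is usually expressed as a YAML file instead. A hedged sketch of such a file; the names `mnist-keras-dpl`, `mnist-batch`, and `batch-cluster`, plus the model path, are assumptions for illustration, while `batch_driver.py`, the code path, and the image appear in this diff:

```yaml
# Hypothetical batch deployment YAML; only scoring_script, code path, and
# image are taken from the diff, everything else is a placeholder.
$schema: https://azuremlschemas.azureedge.net/latest/batchDeployment.schema.json
name: mnist-keras-dpl
endpoint_name: mnist-batch
model:
  path: ./mnist-keras/model/
code_configuration:
  code: ./mnist-keras/code/
  scoring_script: batch_driver.py
environment:
  image: mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04
  conda_file: ./mnist-keras/environment/conda.yml
compute: azureml:batch-cluster
resources:
  instance_count: 2
```

Such a file could then be applied with `az ml batch-deployment create --file deployment.yml` (Azure CLI v2; verify the exact invocation against the repository's scripts).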