articles/machine-learning/how-to-use-batch-endpoint.md (30 additions & 20 deletions)
@@ -192,6 +192,8 @@ A batch endpoint is an HTTPS endpoint that clients can call to trigger a batch s
# [Azure CLI](#tab/azure-cli)

The following YAML file defines a batch endpoint, which you can include in the CLI command for [batch endpoint creation](#create-a-batch-endpoint). In the repository, this file is located at `/cli/endpoints/batch/batch-endpoint.yml`.
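As a rough sketch of what such an endpoint definition contains (the `name`, `description`, and schema reference below are illustrative assumptions, not a copy of the repository file):

```yaml
# Hypothetical batch endpoint definition; the values are placeholders and the
# repository file /cli/endpoints/batch/batch-endpoint.yml may differ in detail.
$schema: https://azuremlschemas.azureedge.net/latest/batchEndpoint.schema.json
name: mnist-batch
description: A batch endpoint to score images of handwritten digits.
auth_mode: aad_token
```

Creating the endpoint then amounts to passing a file of this shape to the batch endpoint creation command referenced above.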
@@ -254,6 +256,8 @@ Batch deployments require a scoring script that indicates how the given model sh
> [!TIP]
> For more information about how to write scoring scripts and best practices for them, see [Author scoring scripts for batch deployments](how-to-batch-scoring-script.md).
@@ -265,11 +269,13 @@ A deployment is a set of resources required for hosting the model that does the
* The environment in which the model runs.
* The pre-created compute and resource settings.

- 1. Create an environment where your batch deployment will run. Include in the environment any dependency your code requires for running. You will also need to add the library `azureml-core` as it is required for batch deployments to work.
+ 1. Create an environment where your batch deployment will run. Include in the environment any dependency your code requires for running. In this case, the dependencies have been captured in a `conda.yml`.

# [Azure CLI](#tab/azure-cli)

- *No extra step is required for the Azure ML CLI. The environment definition will be included in the deployment file as an anonymous environment.*
+ The environment definition will be included in the deployment definition itself as an anonymous environment. You will see the following lines in the deployment:
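Those lines would look roughly like the sketch below, reusing the conda file path and base image that appear in the Studio steps later in this diff (the exact keys depend on the deployment YAML schema version):

```yaml
# Sketch of the anonymous environment section inside the deployment YAML.
# The conda file path and image are taken from elsewhere in this article;
# the surrounding deployment keys are omitted here.
environment:
  conda_file: ./mnist/environment/conda.yml
  image: mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04
```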
@@ -284,25 +290,30 @@ A deployment is a set of resources required for hosting the model that does the
# [Studio](#tab/azure-studio)

+ On [Azure ML studio portal](https://ml.azure.com), follow these steps:
+
1. Navigate to the __Environments__ tab on the side menu.
1. Select the tab __Custom environments__ > __Create__.
1. Enter the name of the environment, in this case `torch-batch-env`.
1. On __Select environment type__ select __Use existing docker image with conda__.
1. On __Container registry image path__, enter `mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04`.
- 1. On __Customize__ section copy the content of the file `./mnist/environment/conda.yml` included in the repository into the portal. The conda file looks as follows:

> Curated environments are not supported in batch deployments. You will need to indicate your own environment. You can always use the base image of a curated environment as yours to simplify the process.

> [!IMPORTANT]
- > Do not forget to include the library `azureml-core` in your deployment as it is required by the executor.
+ > The packages `azureml-core` and `azureml-dataset-runtime[fuse]` are required by batch deployments and should be included in the environment dependencies.

1. Create a deployment definition
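The conda file referenced in the __Customize__ step above is not reproduced in this diff. A hypothetical environment file of that shape, honoring the IMPORTANT note about required packages, might look like this (package versions and most of the dependency list are assumptions):

```yaml
# Hypothetical conda.yml for the torch batch deployment. Only azureml-core and
# azureml-dataset-runtime[fuse] are stated requirements; the rest is illustrative.
name: torch-batch-env
channels:
  - conda-forge
dependencies:
  - python=3.8
  - pip
  - pip:
      - torch
      - torchvision
      - pandas
      - azureml-core
      - azureml-dataset-runtime[fuse]
```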
@@ -376,7 +387,9 @@ A deployment is a set of resources required for hosting the model that does the
* `logging_level` - The log verbosity level. Allowed values are `warning`, `info`, `debug`. Default is `info`.

# [Studio](#tab/azure-studio)

+ On [Azure ML studio portal](https://ml.azure.com), follow these steps:
+
1. Navigate to the __Endpoints__ tab on the side menu.
1. Select the tab __Batch endpoints__ > __Create__.
1. Give the endpoint a name, in this case `mnist-batch`. You can configure the rest of the fields or leave them blank.
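For the CLI tab, a batch deployment definition that sets `logging_level` alongside the other parameters listed above might look like the following sketch (the file paths, compute name, and most values are placeholders, not the repository's exact deployment file):

```yaml
# Hypothetical batch deployment YAML; names, paths, and compute are placeholders.
$schema: https://azuremlschemas.azureedge.net/latest/batchDeployment.schema.json
name: mnist-torch-dpl
endpoint_name: mnist-batch
model:
  path: ./mnist/model/
code_configuration:
  code: ./mnist/code/
  scoring_script: batch_driver.py
environment:
  conda_file: ./mnist/environment/conda.yml
  image: mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04
compute: azureml:batch-cluster
resources:
  instance_count: 1
max_concurrency_per_instance: 2
mini_batch_size: 10
output_action: append_row
output_file_name: predictions.csv
logging_level: info
```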
@@ -701,32 +714,29 @@ In this example, you will learn how to add a second deployment __that solves the
1. Enter the name of the environment, in this case `keras-batch-env`.
1. On __Select environment type__ select __Use existing docker image with conda__.
1. On __Container registry image path__, enter `mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04`.
- 1. On __Customize__ section copy the content of the file `./mnist-keras/environment/conda.yml` included in the repository into the portal. The conda file looks as follows:
+ 1. On __Customize__ section copy the content of the file `./mnist-keras/environment/conda.yml` included in the repository into the portal.
1. Click on __Next__ and then on __Create__.
1. The environment is ready to be used.

---

-
- > [!WARNING]
- > Curated environments are not supported in batch deployments. You will need to indicate your own environment. You can always use the base image of a curated environment as yours to simplify the process.
-
- > [!IMPORTANT]
- > Do not forget to include the library `azureml-core` in your deployment as it is required by the executor.
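To register this second deployment against the same endpoint from the CLI, only a few fields of the deployment definition need to differ from the first one; a sketch under the same assumptions as the earlier deployment example:

```yaml
# Hypothetical second deployment against the same endpoint; names and paths are placeholders.
$schema: https://azuremlschemas.azureedge.net/latest/batchDeployment.schema.json
name: mnist-keras-dpl
endpoint_name: mnist-batch
model:
  path: ./mnist-keras/model/
code_configuration:
  code: ./mnist-keras/code/
  scoring_script: batch_driver.py
environment:
  conda_file: ./mnist-keras/environment/conda.yml
  image: mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04
compute: azureml:batch-cluster
```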