* (Optional) To deploy locally, you must [install Docker Engine](https://docs.docker.com/engine/install/) on your local computer. We *highly recommend* this option because it makes debugging issues easier.
# [ARM template](#tab/arm)
> [!NOTE]
> While the Azure CLI and the CLI extension for machine learning are used in these steps, they aren't the main focus. They're used as utilities to pass templates to Azure and to check the status of template deployments.
* Azure role-based access control (Azure RBAC) is used to grant access to operations in Azure Machine Learning. To perform the steps in this article, your user account must be assigned the __owner__ or __contributor__ role for the Azure Machine Learning workspace, or a custom role allowing `Microsoft.MachineLearningServices/workspaces/onlineEndpoints/*`. For more information, see [Manage access to an Azure Machine Learning workspace](how-to-assign-roles.md).
* If you haven't already set the defaults for the Azure CLI, save your default settings. To avoid passing in the values for your subscription, workspace, and resource group multiple times, run this code:
```azurecli
az account set --subscription <subscription ID>
az configure --defaults workspace=<Azure Machine Learning workspace name> group=<resource group>
```
> [!IMPORTANT]
> The examples in this document assume that you're using the Bash shell, for example, from a Linux system or [Windows Subsystem for Linux](/windows/wsl/about).
---
## Prepare your system
# [ARM template](#tab/arm)
### Clone the sample repository
To follow along with this article, first clone the [samples repository (azureml-examples)](https://github.com/azure/azureml-examples). Then, run the following code to go to the samples directory:
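The exact commands were not captured above; a minimal sketch, assuming the CLI examples live under the `cli` directory of the repository:

```azurecli
# Clone only the latest commit to keep the download small,
# then change into the samples directory used by the CLI examples
git clone --depth 1 https://github.com/Azure/azureml-examples
cd azureml-examples/cli
```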
> Endpoint names must be unique within an Azure region. For example, in the Azure `westus2` region, there can be only one endpoint with the name `my-endpoint`.
Also set the following environment variables, as they are used in the examples in this article. Replace the values with your Azure subscription ID, the Azure region where your workspace is located, the resource group that contains the workspace, and the workspace name:
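For example (a sketch; the variable names shown here are assumptions, so match them to whatever names the later examples use):

```azurecli
# Substitute your own values before running
export SUBSCRIPTION_ID="<subscription ID>"
export LOCATION="<Azure region, for example westus2>"
export RESOURCE_GROUP="<resource group>"
export WORKSPACE="<Azure Machine Learning workspace name>"
```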

A couple of the template examples require you to upload files to the Azure Blob store for your workspace. The following steps query the workspace and store that information in environment variables used in the examples:
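One way to do this, sketched with `az ml workspace show` and `az storage account keys list` (the variable names and query paths are assumptions):

```azurecli
# Get the resource ID of the workspace's default storage account,
# then extract the account name from the end of the ID
STORAGE_ID=$(az ml workspace show --name $WORKSPACE --query storage_account -o tsv)
export AZURE_STORAGE_ACCOUNT=$(basename $STORAGE_ID)

# Retrieve a key so that later `az storage blob` commands can authenticate
export AZURE_STORAGE_KEY=$(az storage account keys list \
  --account-name $AZURE_STORAGE_ACCOUNT --query "[0].value" -o tsv)
```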
# [ARM template](#tab/arm)
The Azure Resource Manager templates [online-endpoint.json](https://github.com/Azure/azureml-examples/tree/main/arm-templates/online-endpoint.json) and [online-endpoint-deployment.json](https://github.com/Azure/azureml-examples/tree/main/arm-templates/online-endpoint-deployment.json) are used by the steps in this article.
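The templates are passed to Azure with the standard Resource Manager deployment command. A sketch of deploying the endpoint template (the parameter names here are assumptions; check the template file for the actual parameters):

```azurecli
# Pass the endpoint template to Azure; Resource Manager reports the
# deployment status, which can also be checked with `az deployment group show`
az deployment group create \
  --resource-group $RESOURCE_GROUP \
  --template-file arm-templates/online-endpoint.json \
  --parameters workspaceName=$WORKSPACE onlineEndpointName=$ENDPOINT_NAME
```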
---
### Register your model and environment separately
For more information on creating an environment, see [Manage Azure Machine Learning environments with the CLI & SDK (v2)](how-to-manage-environments-v2.md#create-an-environment).
# [ARM template](#tab/arm)
1. To register the model using a template, you must first upload the model file to an Azure Blob store. The following example uses the `az storage blob upload-batch` command to upload a file to the default storage for your workspace:
1. After uploading the file, use the template to create a model registration. In the following example, the `modelUri` parameter contains the path to the model:
1. Part of the environment is a conda file that specifies the model dependencies needed to host the model. The following example demonstrates how to read the contents of the conda file into an environment variable:
1. The following example demonstrates how to use the template to register the environment. The contents of the conda file from the previous step are passed to the template using the `condaFile` parameter:
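The first two steps above might look like the following sketch; the container name, source path, template filename, and parameter names are all assumptions based on the repository layout:

```azurecli
# Step 1: upload the local model directory to the workspace's default
# blob container
az storage blob upload-batch -d azureml \
  -s cli/endpoints/online/model-1/model \
  --account-name $AZURE_STORAGE_ACCOUNT

# Step 2: register the uploaded model through a template, pointing
# modelUri at the blob path created by the upload above
az deployment group create \
  --resource-group $RESOURCE_GROUP \
  --template-file arm-templates/model-version.json \
  --parameters workspaceName=$WORKSPACE modelAssetName=sklearn \
  modelUri="azureml://datastores/workspaceblobstore/paths/model-1/model"
```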
# [Python](#tab/python)
As noted earlier, the script specified in `CodeConfiguration(scoring_script="score.py")` must have an `init()` function and a `run()` function. This example uses the [score.py file](https://github.com/Azure/azureml-examples/blob/main/sdk/python/endpoints/online/model-1/onlinescoring/score.py).
# [ARM template](#tab/arm)
As noted earlier, the script specified in `code_configuration.scoring_script` must have an `init()` function and a `run()` function. This example uses the [score.py file](https://github.com/Azure/azureml-examples/blob/main/cli/endpoints/online/model-1/onlinescoring/score.py).
When using a template for deployment, you must first upload the scoring file(s) to an Azure Blob store and then register them:
1. The following example uses the Azure CLI command `az storage blob upload-batch` to upload the scoring file(s):
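A sketch of that upload (the destination container and source path are assumptions):

```azurecli
# Upload everything under the local onlinescoring directory to the
# workspace's default blob container
az storage blob upload-batch -d azureml \
  -s cli/endpoints/online/model-1/onlinescoring \
  --account-name $AZURE_STORAGE_ACCOUNT
```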
The `init()` function is called when the container is initialized or started. Initialization typically occurs shortly after the deployment is created or updated. Write logic here for global initialization operations like caching the model in memory (as we do in this example). The `run()` function is called for every invocation of the endpoint and should do the actual scoring and prediction. In the example, we extract the data from the JSON input, call the scikit-learn model's `predict()` method, and then return the result.
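A minimal sketch of that shape, with a trivial stand-in instead of a real scikit-learn model so the structure is visible:

```python
import json

model = None  # populated once by init(), then reused across requests

def init():
    """Called when the container starts; cache expensive resources here.

    A real score.py would load the scikit-learn model from disk
    instead of this stand-in."""
    global model
    model = lambda rows: [sum(row) for row in rows]  # stand-in "predict"

def run(raw_data):
    """Called on every invocation; parse the JSON payload, score, return."""
    data = json.loads(raw_data)["data"]
    return model(data)
```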
By default, logs are pulled from the inference server. To see the logs from the storage initializer container (it mounts assets like the model and code into the container), add the `--container storage-initializer` flag.
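For example, assuming a deployment named `blue` on the endpoint created earlier (the names here are illustrative):

```azurecli
# Pull logs from the storage-initializer container instead of the
# default inference server container
az ml online-deployment get-logs --name blue \
  --endpoint-name $ENDPOINT_NAME --container storage-initializer
```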
---
For more information on deployment logs, see [Get container logs](how-to-troubleshoot-online-endpoints.md#get-container-logs).