Commit 909a0ae

Merge pull request #217021 from Blackmist/revert-215410-endpoint-arm
Revert "arm examples"
2 parents f9f9747 + 3cf43fd commit 909a0ae

File tree: 1 file changed (+1, -174 lines)

articles/machine-learning/how-to-deploy-managed-online-endpoints.md

Lines changed: 1 addition & 174 deletions
@@ -8,7 +8,7 @@ ms.subservice: mlops
 author: dem108
 ms.author: sehan
 ms.reviewer: mopeakande
-ms.date: 11/01/2022
+ms.date: 10/06/2022
 ms.topic: how-to
 ms.custom: how-to, devplatv2, ignite-fall-2021, cliv2, event-tier1-build-2022, sdkv2
 ---
@@ -59,25 +59,6 @@ The main example in this doc uses managed online endpoints for deployment. To us
 
 * (Optional) To deploy locally, you must [install Docker Engine](https://docs.docker.com/engine/install/) on your local computer. We *highly recommend* this option, so it's easier to debug issues.
 
-# [ARM template](#tab/arm)
-
-> [!NOTE]
-> While the Azure CLI and CLI extension for machine learning are used in these steps, they are not the main focus. They are used more as utilities, passing templates to Azure and checking the status of template deployments.
-
-[!INCLUDE [basic prereqs cli](../../includes/machine-learning-cli-prereqs.md)]
-
-* Azure role-based access controls (Azure RBAC) are used to grant access to operations in Azure Machine Learning. To perform the steps in this article, your user account must be assigned the __owner__ or __contributor__ role for the Azure Machine Learning workspace, or a custom role allowing `Microsoft.MachineLearningServices/workspaces/onlineEndpoints/*`. For more information, see [Manage access to an Azure Machine Learning workspace](how-to-assign-roles.md).
-
-* If you haven't already set the defaults for the Azure CLI, save your default settings. To avoid passing in the values for your subscription, workspace, and resource group multiple times, run this code:
-
-```azurecli
-az account set --subscription <subscription ID>
-az configure --defaults workspace=<Azure Machine Learning workspace name> group=<resource group>
-```
-
-> [!IMPORTANT]
-> The examples in this document assume that you are using the Bash shell. For example, from a Linux system or [Windows Subsystem for Linux](/windows/wsl/about).
-
 ---
 
 ## Prepare your system
@@ -161,54 +142,6 @@ The [workspace](concept-workspace.md) is the top-level resource for Azure Machin
 )
 ```
 
-# [ARM template](#tab/arm)
-
-### Clone the sample repository
-
-To follow along with this article, first clone the [samples repository (azureml-examples)](https://github.com/azure/azureml-examples). Then, run the following code to go to the samples directory:
-
-```azurecli
-git clone --depth 1 https://github.com/Azure/azureml-examples
-cd azureml-examples
-```
-
-> [!TIP]
-> Use `--depth 1` to clone only the latest commit to the repository, which reduces time to complete the operation.
-
-### Set an endpoint name
-
-To set your endpoint name, run the following command (replace `YOUR_ENDPOINT_NAME` with a unique name).
-
-For Unix, run this command:
-
-:::code language="azurecli" source="~/azureml-examples-main/deploy-arm-templates-az-cli.sh" ID="set_endpoint_name":::
-
-> [!NOTE]
-> Endpoint names must be unique within an Azure region. For example, in the Azure `westus2` region, there can be only one endpoint with the name `my-endpoint`.
-
-Also set the following environment variables, as they are used in the examples in this article. Replace the values with your Azure subscription ID, the Azure region where your workspace is located, the resource group that contains the workspace, and the workspace name:
-
-```bash
-export SUBSCRIPTION_ID="your Azure subscription ID"
-export LOCATION="Azure region where your workspace is located"
-export RESOURCE_GROUP="Azure resource group that contains your workspace"
-export WORKSPACE="Azure Machine Learning workspace name"
-```
-
-A couple of the template examples require you to upload files to the Azure Blob store for your workspace. The following steps will query the workspace and store this information in environment variables used in the examples:
-
-1. Get an access token:
-
-:::code language="azurecli" source="~/azureml-examples-main/deploy-arm-templates-az-cli.sh" id="get_access_token":::
-
-1. Set the REST API version:
-
-:::code language="azurecli" source="~/azureml-examples-main/deploy-arm-templates-az-cli.sh" id="api_version":::
-
-1. Get the storage information:
-
-:::code language="azurecli" source="~/azureml-examples-main/deploy-arm-templates-az-cli.sh" id="get_storage_details":::
-
 ---
 
 ## Define the endpoint and deployment
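For context, the removed "Set an endpoint name" step points at a `set_endpoint_name` include whose body doesn't appear in this diff. A minimal bash sketch of such a step, with the `endpt-` prefix and timestamp suffix as assumptions rather than the include's actual contents:

```shell
# Hypothetical stand-in for the set_endpoint_name include.
# Endpoint names must be unique within an Azure region, so append
# a varying suffix; the naming scheme here is illustrative only.
export ENDPOINT_NAME="endpt-$(date +%s)"
echo "Using endpoint name: $ENDPOINT_NAME"
```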
@@ -321,10 +254,6 @@ In this article, we first define names of online endpoint and deployment for deb
 )
 ```
 
-# [ARM template](#tab/arm)
-
-The Azure Resource Manager templates [online-endpoint.json](https://github.com/Azure/azureml-examples/tree/main/arm-templates/online-endpoint.json) and [online-endpoint-deployment.json](https://github.com/Azure/azureml-examples/tree/main/arm-templates/online-endpoint-deployment.json) are used by the steps in this article.
-
 ---
 
 ### Register your model and environment separately
@@ -344,24 +273,6 @@ For more information on registering your model as an asset, see [Register your m
 For more information on creating an environment, see
 [Manage Azure Machine Learning environments with the CLI & SDK (v2)](how-to-manage-environments-v2.md#create-an-environment)
 
-# [ARM template](#tab/arm)
-
-1. To register the model using a template, you must first upload the model file to an Azure Blob store. The following example uses the `az storage blob upload-batch` command to upload a file to the default storage for your workspace:
-
-:::code language="{language}" source="~/azureml-examples-main/deploy-arm-templates-az-cli.sh" id="upload_model":::
-
-1. After uploading the file, use the template to create a model registration. In the following example, the `modelUri` parameter contains the path to the model:
-
-:::code language="azurecli" source="~/azureml-examples-main/deploy-arm-templates-az-cli.sh" id="create_model":::
-
-1. Part of the environment is a conda file that specifies the model dependencies needed to host the model. The following example demonstrates how to read the contents of the conda file into an environment variables:
-
-:::code language="azurecli" source="~/azureml-examples-main/deploy-arm-templates-az-cli.sh" id="read_condafile":::
-
-1. The following example demonstrates how to use the template to register the environment. The contents of the conda file from the previous step are passed to the template using the `condaFile` parameter:
-
-:::code language="azurecli" source="~/azureml-examples-main/deploy-arm-templates-az-cli.sh" id="create_environment":::
-
 ---
 
 ### Use different CPU and GPU instance types
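The removed registration steps read the conda file into a shell variable before passing it through the template's `condaFile` parameter; the `read_condafile` include body isn't shown in this diff. A sketch of that pattern, using a throwaway file because the real path isn't in the diff:

```shell
# Create a small conda file for illustration; in the real flow the file
# already exists in the cloned azureml-examples repository.
cat > /tmp/conda.yml <<'EOF'
name: model-env
dependencies:
  - python=3.8
  - scikit-learn
EOF

# Read the whole file into a variable so it can be handed to the
# template's condaFile parameter as a single argument.
CONDA_FILE=$(cat /tmp/conda.yml)
echo "$CONDA_FILE"
```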
@@ -388,20 +299,6 @@ As noted earlier, the script specified in `code_configuration.scoring_script` mu
 # [Python](#tab/python)
 As noted earlier, the script specified in `CodeConfiguration(scoring_script="score.py")` must have an `init()` function and a `run()` function. This example uses the [score.py file](https://github.com/Azure/azureml-examples/blob/main/sdk/python/endpoints/online/model-1/onlinescoring/score.py).
 
-# [ARM template](#tab/arm)
-
-As noted earlier, the script specified in `code_configuration.scoring_script` must have an `init()` function and a `run()` function. This example uses the [score.py file](https://github.com/Azure/azureml-examples/blob/main/cli/endpoints/online/model-1/onlinescoring/score.py).
-
-When using a template for deployment, you must first upload the scoring file(s) to an Azure Blob store and then register it:
-
-1. The following example uses the Azure CLI command `az storage blob upload-batch` to upload the scoring file(s):
-
-:::code language="azurecli" source="~/azureml-examples-main/deploy-arm-templates-az-cli.sh" id="upload_code":::
-
-1. The following example demonstrates hwo to register the code using a template:
-
-:::code language="azurecli" source="~/azureml-examples-main/deploy-arm-templates-az-cli.sh" id="create_code":::
-
 ---
 
 The `init()` function is called when the container is initialized or started. Initialization typically occurs shortly after the deployment is created or updated. Write logic here for global initialization operations like caching the model in memory (as we do in this example). The `run()` function is called for every invocation of the endpoint and should do the actual scoring and prediction. In the example, we extract the data from the JSON input, call the scikit-learn model's `predict()` method, and then return the result.
@@ -433,10 +330,6 @@ First create an endpoint. Optionally, for a local endpoint, you can skip this st
 ml_client.online_endpoints.begin_create_or_update(endpoint, local=True)
 ```
 
-# [ARM template](#tab/arm)
-
-The template doesn't support local endpoints. See the Azure CLI or Python tabs for steps to test the endpoint locally.
-
 ---
 
 Now, create a deployment named `blue` under the endpoint.
@@ -457,10 +350,6 @@ ml_client.online_deployments.begin_create_or_update(
 
 The `local=True` flag directs the SDK to deploy the endpoint in the Docker environment.
 
-# [ARM template](#tab/arm)
-
-The template doesn't support local endpoints. See the Azure CLI or Python tabs for steps to test the endpoint locally.
-
 ---
 
 > [!TIP]
@@ -501,10 +390,6 @@ The method returns [`ManagedOnlineEndpoint` entity](/python/api/azure-ai-ml/azur
 ManagedOnlineEndpoint({'public_network_access': None, 'provisioning_state': 'Succeeded', 'scoring_uri': 'http://localhost:49158/score', 'swagger_uri': None, 'name': 'local-10061534497697', 'description': 'this is a sample local endpoint', 'tags': {}, 'properties': {}, 'id': None, 'Resource__source_path': None, 'base_path': '/path/to/your/working/directory', 'creation_context': None, 'serialize': <msrest.serialization.Serializer object at 0x7ffb781bccd0>, 'auth_mode': 'key', 'location': 'local', 'identity': None, 'traffic': {}, 'mirror_traffic': {}, 'kind': None})
 ```
 
-# [ARM template](#tab/arm)
-
-The template doesn't support local endpoints. See the Azure CLI or Python tabs for steps to test the endpoint locally.
-
 ---
 
 The following table contains the possible values for `provisioning_state`:
@@ -546,10 +431,6 @@ endpoint = ml_client.online_endpoints.get(endpoint_name)
 scoring_uri = endpoint.scoring_uri
 ```
 
-# [ARM template](#tab/arm)
-
-The template doesn't support local endpoints. See the Azure CLI or Python tabs for steps to test the endpoint locally.
-
 ---
 
 ### Review the logs for output from the invoke operation
@@ -572,10 +453,6 @@ ml_client.online_deployments.get_logs(
 )
 ```
 
-# [ARM template](#tab/arm)
-
-The template doesn't support local endpoints. See the Azure CLI or Python tabs for steps to test the endpoint locally.
-
 ---
 
 ## Deploy your online endpoint to Azure
@@ -676,16 +553,6 @@ This deployment might take up to 15 minutes, depending on whether the underlying
 ml_client.online_endpoints.begin_create_or_update(endpoint)
 ```
 
-# [ARM template](#tab/arm)
-
-1. The following example demonstrates using the template to create an online endpoint:
-
-:::code language="azurecli" source="~/azureml-examples-main/deploy-arm-templates-az-cli.sh" id="create_endpoint":::
-
-1. After the endpoint has been created, the following example demonstrates how to deploy the model to the endpoint:
-
-:::code language="azurecli" source="~/azureml-examples-main/deploy-arm-templates-az-cli.sh" id="create_deployment":::
-
 ---
 
 > [!TIP]
@@ -731,18 +598,6 @@ for endpoint in ml_client.online_endpoints.list():
 print(f"{endpoint.kind}\t{endpoint.location}\t{endpoint.name}")
 ```
 
-# [ARM template](#tab/arm)
-
-The `show` command contains information in `provisioning_status` for endpoint and deployment:
-
-::: code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint.sh" ID="get_status" :::
-
-You can list all the endpoints in the workspace in a table format by using the `list` command:
-
-```azurecli
-az ml online-endpoint list --output table
-```
-
 ---
 
 ### Check the status of the online deployment
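The removed `show` example surfaces `provisioning_state` from the command's JSON output. Since `az ml online-endpoint show` can't run without a live workspace, the parsing half can be sketched locally against a sample payload (the JSON below is illustrative, not real command output):

```shell
# Stand-in for: az ml online-endpoint show --name $ENDPOINT_NAME
# The real command emits a JSON document; here we parse a sample one.
STATUS_JSON='{"name": "my-endpoint", "provisioning_state": "Succeeded"}'
STATE=$(echo "$STATUS_JSON" | python3 -c 'import json,sys; print(json.load(sys.stdin)["provisioning_state"])')
echo "Endpoint state: $STATE"
```

The same extraction is what `--query provisioning_state` would do server-side of the CLI's JMESPath filter.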
@@ -773,13 +628,6 @@ ml_client.online_deployments.get_logs(
 name="blue", endpoint_name=online_endpoint_name, lines=50, container_type="storage-initializer"
 )
 ```
-
-# [ARM template](#tab/arm)
-
-:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint.sh" ID="get_logs" :::
-
-By default, logs are pulled from inference-server. To see the logs from storage-initializer (it mounts assets like model and code to the container), add the `--container storage-initializer` flag.
-
 ---
 
 For more information on deployment logs, see [Get container logs](how-to-troubleshoot-online-endpoints.md#get-container-logs).
@@ -829,19 +677,6 @@ ml_client.online_endpoints.invoke(
 )
 ```
 
-# [ARM template](#tab/arm)
-
-You can use either the `invoke` command or a REST client of your choice to invoke the endpoint and score some data:
-
-::: code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint.sh" ID="test_endpoint" :::
-
-The following example shows how to get the key used to authenticate to the endpoint:
-
-> [!TIP]
-> You can control which Azure Active Directory security principals can get the authentication key by assigning them to a custom role that allows `Microsoft.MachineLearningServices/workspaces/onlineEndpoints/token/action` and `Microsoft.MachineLearningServices/workspaces/onlineEndpoints/listkeys/action`. For more information, see [Manage access to an Azure Machine Learning workspace](how-to-assign-roles.md).
-
-:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint.sh" ID="test_endpoint_using_curl_get_key":::
-
 ---
 
 ### (Optional) Update the deployment
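The removed tab scores data with `invoke` or a REST client and fetches the auth key via an include whose body isn't shown. A hedged curl-style sketch of the REST path; the two `az` lines are commented out because they need a live endpoint, and the placeholder values are not from this diff:

```shell
# Hypothetical flow; fetch key and URI from a live endpoint like so:
# ENDPOINT_KEY=$(az ml online-endpoint get-credentials --name "$ENDPOINT_NAME" --query primaryKey -o tsv)
# SCORING_URI=$(az ml online-endpoint show --name "$ENDPOINT_NAME" --query scoring_uri -o tsv)
ENDPOINT_KEY="example-key"                   # placeholder value
SCORING_URI="https://example.invalid/score"  # placeholder value

# Assemble the authenticated scoring request (printed, not sent).
AUTH_HEADER="Authorization: Bearer $ENDPOINT_KEY"
echo curl --request POST "$SCORING_URI" \
  --header "$AUTH_HEADER" \
  --header "Content-Type: application/json" \
  --data @sample-request.json
```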
@@ -904,10 +739,6 @@ To understand how `begin_create_or_update` works:
 
 The `begin_create_or_update` method also works with local deployments. Use the same method with the `local=True` flag.
 
-# [ARM template](#tab/arm)
-
-There currently is not an option to update the deployment using an ARM template.
-
 ---
 
 > [!Note]
@@ -944,10 +775,6 @@ If you aren't going use the deployment, you should delete it by running the foll
 ml_client.online_endpoints.begin_delete(name=online_endpoint_name)
 ```
 
-# [ARM template](#tab/arm)
-
-::: code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint.sh" ID="delete_endpoint" :::
-
 ---
 
 ## Next steps
