Commit aac169b

Merge pull request #179737 from Blackmist/11891786-verbatims
validating and fixing CLI 1.0 commands
2 parents 70b3f85 + 1e5858d commit aac169b

File tree: 1 file changed, +107 -53 lines

articles/machine-learning/how-to-deploy-and-where.md

Lines changed: 107 additions & 53 deletions
@@ -6,7 +6,9 @@ services: machine-learning
 ms.service: machine-learning
 ms.subservice: core
 ms.reviewer: larryfr
-ms.date: 04/21/2021
+ms.author: ssambare
+author: shivanissambare
+ms.date: 11/12/2021
 ms.topic: how-to
 ms.custom: devx-track-python, deploy, devx-track-azurecli, contperf-fy21q2, contperf-fy21q4, mktng-kw-nov2021
 adobe-target: true
@@ -16,8 +18,7 @@ adobe-target: true

 Learn how to deploy your machine learning or deep learning model as a web service in the Azure cloud.

-> [!TIP]
-> Managed online endpoints (preview) provide a way to deploy your trained model without your having to create and manage the underlying infrastructure. For more information, see [Deploy and score a machine learning model with a managed online endpoint (preview)](how-to-deploy-managed-online-endpoints.md).
+[!INCLUDE [endpoints-option](../../includes/machine-learning-endpoints-preview-note.md)]

 ## Workflow for deploying a model

@@ -28,26 +29,25 @@ The workflow is similar no matter where you deploy your model:
 1. Prepare an inference configuration.
 1. Deploy the model locally to ensure everything works.
 1. Choose a compute target.
-1. Re-deploy the model to the cloud.
+1. Deploy the model to the cloud.
 1. Test the resulting web service.

 For more information on the concepts involved in the machine learning deployment workflow, see [Manage, deploy, and monitor models with Azure Machine Learning](concept-model-management-and-deployment.md).

-[!INCLUDE [endpoints-option](../../includes/machine-learning-endpoints-preview-note.md)]
-
 ## Prerequisites

 # [Azure CLI](#tab/azcli)

+[!INCLUDE [cli10-only](../../includes/machine-learning-cli-version-1-only.md)]
+
 - An Azure Machine Learning workspace. For more information, see [Create an Azure Machine Learning workspace](how-to-manage-workspace.md).
-- A model. If you don't have a trained model, you can use the model and dependency files provided in [this tutorial](https://aka.ms/azml-deploy-cloud).
-- The [Azure Command Line Interface (CLI) extension for the Machine Learning service](reference-azure-machine-learning-cli.md).
+- A model. The examples in this article use a pre-trained model.
 - A machine that can run Docker, such as a [compute instance](how-to-create-manage-compute-instance.md).

 # [Python](#tab/python)

 - An Azure Machine Learning workspace. For more information, see [Create an Azure Machine Learning workspace](how-to-manage-workspace.md).
-- A model. If you don't have a trained model, you can use the model and dependency files provided in [this tutorial](https://aka.ms/azml-deploy-cloud).
+- A model. The examples in this article use a pre-trained model.
 - The [Azure Machine Learning software development kit (SDK) for Python](/python/api/overview/azure/ml/intro).
 - A machine that can run Docker, such as a [compute instance](how-to-create-manage-compute-instance.md).
 ---
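
The CLI tab now assumes the v1 machine learning extension for the Azure CLI. If it isn't already installed, it can typically be added with the following command (an assumed setup step, not part of this commit):

```azurecli-interactive
# Install the v1 (azure-cli-ml) machine learning extension for the Azure CLI.
az extension add -n azure-cli-ml
```
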
@@ -56,16 +56,14 @@ For more information on the concepts involved in the machine learning deployment

 # [Azure CLI](#tab/azcli)

-Do
+To see the workspaces that you have access to, use the following commands:

 ```azurecli-interactive
 az login
-az account set -s <my subscription>
-az ml workspace list --resource-group=<my resource group>
+az account set -s <subscription>
+az ml workspace list --resource-group=<resource-group>
 ```

-to see the workspaces you have access to.
-
 # [Python](#tab/python)

 ```python
@@ -84,8 +82,8 @@ For more information on using the SDK to connect to a workspace, see the [Azure

 A typical situation for a deployed machine learning service is that you need the following components:

-+ resources representing the specific model that you want deployed (for example: a pytorch model file)
-+ code that you will be running in the service, that executes the model on a given input
++ Resources representing the specific model that you want to deploy (for example, a PyTorch model file).
++ Code that runs in the service and executes the model on a given input.

 Azure Machine Learning allows you to separate the deployment into two components, so that you can keep the same code and merely update the model. We define the mechanism by which you upload a model _separately_ from your code as "registering the model".

@@ -97,33 +95,39 @@ The following examples demonstrate how to register a model.

 # [Azure CLI](#tab/azcli)

-### Register a model from a local file
+The following commands download a model and then register it with your Azure Machine Learning workspace:

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/2.deploy-local-cli.ipynb?name=register-model-from-local-file-code)]
+```azurecli-interactive
+wget https://aka.ms/bidaf-9-model -O model.onnx --show-progress
+az ml model register -n bidaf_onnx \
+    -p ./model.onnx \
+    -g <resource-group> \
+    -w <workspace-name>
+```

 Set `-p` to the path of a folder or a file that you want to register.

-For more information on `az ml model register`, consult the [reference documentation](/cli/azure/ext/azure-cli-ml/ml/model).
+For more information on `az ml model register`, see the [reference documentation](/cli/azure/ml(v1)/model).

 ### Register a model from an Azure ML training run

+If you need to register a model that was created previously through an Azure Machine Learning training job, you can specify the experiment, run, and path to the model:
+
 ```azurecli-interactive
 az ml model register -n bidaf_onnx --asset-path outputs/model.onnx --experiment-name myexperiment --run-id myrunid --tag area=qna
 ```

-[!INCLUDE [install extension](../../includes/machine-learning-service-install-extension.md)]
-
 The `--asset-path` parameter refers to the cloud location of the model. In this example, the path of a single file is used. To include multiple files in the model registration, set `--asset-path` to the path of a folder that contains the files.

-For more information on `az ml model register`, consult the [reference documentation](/cli/azure/ml/model).
+For more information on `az ml model register`, see the [reference documentation](/cli/azure/ml(v1)/model).

 # [Python](#tab/python)

 ### Register a model from a local file

 You can register a model by providing the local path of the model. You can provide the path of either a folder or a single file on your local machine.
 <!-- python nb call -->
-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=register-model-from-local-file-code)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=register-model-from-local-file-code)]


 To include multiple files in the model registration, set `model_path` to the path of a folder that contains the files.
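
For comparison with the CLI registration above, registering the same model with the Python SDK generally looks like the following sketch (the workspace `config.json`, model name, and path are assumptions, not taken from this commit):

```python
from azureml.core import Workspace
from azureml.core.model import Model

# Assumes a config.json for your workspace is available locally.
ws = Workspace.from_config()

# Register a local ONNX file; the name and path mirror the CLI example above.
model = Model.register(workspace=ws,
                       model_name="bidaf_onnx",
                       model_path="./model.onnx")
print(model.name, model.version)
```
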
@@ -195,7 +199,7 @@ Save this file with the name `dummyinferenceconfig.json`.

 The following example demonstrates how to create a minimal environment with no pip dependencies, using the dummy scoring script you defined above.

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=inference-configuration-code)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=inference-configuration-code)]

 For more information on environments, see [Create and manage environments for training and deployment](how-to-use-environments.md).
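
The hunk above builds an inference configuration around a dummy scoring script. Azure ML v1 entry scripts follow an `init()`/`run()` contract; a minimal echo-style sketch (the file name and echo behavior are assumptions, not from this commit) could look like this:

```python
# echo_score.py -- hypothetical entry script used only to verify deployment plumbing.
import json

def init():
    # Runs once when the service container starts; load models or other state here.
    print("This is init")

def run(data):
    # `data` is the raw JSON request body; echo it back instead of scoring.
    parsed = json.loads(data)
    print(f"received data {parsed}")
    return f"test is {parsed}"
```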

@@ -206,43 +210,69 @@ For more information on inference configuration, see the [InferenceConfig](/pyth

 ## Define a deployment configuration

-A deployment configuration specifies the amount of memory and cores to reserve for your webservice will require in order to run, as well as configuration details of the underlying webservice. For example, a deployment configuration lets you specify that your service needs 2 gigabytes of memory, 2 CPU cores, 1 GPU core, and that you want to enable autoscaling.
+A deployment configuration specifies the amount of memory and cores your webservice needs in order to run. It also provides configuration details of the underlying webservice. For example, a deployment configuration lets you specify that your service needs 2 gigabytes of memory, 2 CPU cores, 1 GPU core, and that you want to enable autoscaling.

 The options available for a deployment configuration differ depending on the compute target you choose. In a local deployment, all you can specify is which port your webservice will be served on.

 # [Azure CLI](#tab/azcli)

 [!INCLUDE [aml-local-deploy-config](../../includes/machine-learning-service-local-deploy-config.md)]

-For more information, see [this reference](./reference-azure-machine-learning-cli.md#deployment-configuration-schema).
+For more information, see the [deployment schema](./reference-azure-machine-learning-cli.md#deployment-configuration-schema).

 # [Python](#tab/python)

-To create a local deployment configuration, do the following:
+The following Python code demonstrates how to create a local deployment configuration:

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deployment-configuration-code)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deployment-configuration-code)]

 ---

 ## Deploy your machine learning model

 You are now ready to deploy your model.

-[!INCLUDE [aml-deploy-service](../../includes/machine-learning-deploy-service.md)]
+# [Azure CLI](#tab/azcli)
+
+Replace `bidaf_onnx:1` with the name of your model and its version number.
+
+```azurecli-interactive
+az ml model deploy -n myservice \
+    -m bidaf_onnx:1 \
+    --overwrite \
+    --ic dummyinferenceconfig.json \
+    --dc deploymentconfig.json \
+    -g <resource-group> \
+    -w <workspace-name>
+```
+
+# [Python](#tab/python)
+

+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deploy-model-code)]
+
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deploy-model-print-logs)]
+
+For more information, see the documentation for [Model.deploy()](/python/api/azureml-core/azureml.core.model.model#deploy-workspace--name--models--inference-config-none--deployment-config-none--deployment-target-none--overwrite-false-) and [Webservice](/python/api/azureml-core/azureml.core.webservice.webservice).
+
+---

 ## Call into your model

 Let's check that your echo model deployed successfully. You should be able to make a simple liveness request, as well as a scoring request:

 # [Azure CLI](#tab/azcli)
-<!-- cli nb call -->

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/2.deploy-local-cli.ipynb?name=call-into-model-code)]
+```azurecli-interactive
+curl -v http://localhost:32267
+curl -v -X POST -H "content-type:application/json" \
+    -d '{"query": "What color is the fox", "context": "The quick brown fox jumped over the lazy dog."}' \
+    http://localhost:32267/score
+```

 # [Python](#tab/python)
 <!-- python nb call -->
-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=call-into-model-code)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=call-into-model-code)]

 ---
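
The `--dc deploymentconfig.json` argument above points at a deployment configuration file. For a local deployment it typically contains just a compute type and a port; a sketch consistent with the port 32267 used in the `curl` calls above (illustrative, not from this commit) might be:

```json
{
    "computeType": "local",
    "port": 32267
}
```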

@@ -282,19 +312,25 @@ For more information, see the documentation for [LocalWebservice](/python/api/az

 Deploy your service again:

----
-
 # [Azure CLI](#tab/azcli)

 Replace `bidaf_onnx:1` with the name of your model and its version number.

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/2.deploy-local-cli.ipynb?name=re-deploy-model-code)]
+```azurecli-interactive
+az ml model deploy -n myservice \
+    -m bidaf_onnx:1 \
+    --overwrite \
+    --ic inferenceconfig.json \
+    --dc deploymentconfig.json \
+    -g <resource-group> \
+    -w <workspace-name>
+```

 # [Python](#tab/python)

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-model-code)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-model-code)]

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-model-print-logs)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-model-print-logs)]

 For more information, see the documentation for [Model.deploy()](/python/api/azureml-core/azureml.core.model.model#deploy-workspace--name--models--inference-config-none--deployment-config-none--deployment-target-none--overwrite-false-) and [Webservice](/python/api/azureml-core/azureml.core.webservice.webservice).
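
The Python tab above pulls its deployment cells from a notebook include. As a rough SDK equivalent (variable names such as `ws`, `model`, and `env` are assumptions, not from this commit), a local deployment generally looks like:

```python
from azureml.core.model import InferenceConfig, Model
from azureml.core.webservice import LocalWebservice

# Assumes `ws` (Workspace), `model` (registered Model), and `env` (Environment) already exist.
inference_config = InferenceConfig(environment=env,
                                   source_directory="./source_dir",
                                   entry_script="./score.py")
deployment_config = LocalWebservice.deploy_configuration(port=32267)

service = Model.deploy(ws, "myservice", [model],
                       inference_config, deployment_config, overwrite=True)
service.wait_for_deployment(show_output=True)
print(service.get_logs())
```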

@@ -303,23 +339,23 @@ Then ensure you can send a post request to the service:

 # [Azure CLI](#tab/azcli)

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/2.deploy-local-cli.ipynb?name=send-post-request-code)]
+```azurecli-interactive
+curl -v -X POST -H "content-type:application/json" \
+    -d '{"query": "What color is the fox", "context": "The quick brown fox jumped over the lazy dog."}' \
+    http://localhost:32267/score
+```

 # [Python](#tab/python)

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=send-post-request-code)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=send-post-request-code)]

 ---

 ## Choose a compute target

-Refer to the below diagram when choosing a compute target.
-
-[![How to choose a compute target](./media/how-to-deploy-and-where/how-to-choose-target.png)](././media/how-to-deploy-and-where/how-to-choose-target.png#lightbox)
-
 [!INCLUDE [aml-deploy-target](../../includes/aml-compute-target-deploy.md)]

-## Re-deploy to cloud
+## Deploy to cloud

 Once you've confirmed your service works locally and chosen a remote compute target, you are ready to deploy to the cloud.
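
The `curl` scoring call in the hunk above can also be reproduced from Python with the `requests` library; a small sketch (the URI and port are assumed to match the local deployment) is:

```python
import requests

uri = "http://localhost:32267/score"  # assumed local scoring endpoint
headers = {"Content-Type": "application/json"}
payload = {
    "query": "What color is the fox",
    "context": "The quick brown fox jumped over the lazy dog.",
}

response = requests.post(uri, json=payload, headers=headers)
print(response.status_code, response.json())
```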

@@ -337,7 +373,7 @@ For more information, see [this reference](./reference-azure-machine-learning-cl

 # [Python](#tab/python)

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deploy-model-on-cloud-code)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deploy-model-on-cloud-code)]

 ---

@@ -348,16 +384,30 @@ Deploy your service again:

 Replace `bidaf_onnx:1` with the name of your model and its version number.

+```azurecli-interactive
+az ml model deploy -n myservice \
+    -m bidaf_onnx:1 \
+    --overwrite \
+    --ic inferenceconfig.json \
+    --dc re-deploymentconfig.json \
+    -g <resource-group> \
+    -w <workspace-name>
+```

+To view the service logs, use the following command:

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/2.deploy-local-cli.ipynb?name=deploy-model-on-cloud-code)]
+```azurecli-interactive
+az ml service get-logs -n myservice \
+    -g <resource-group> \
+    -w <workspace-name>
+```

 # [Python](#tab/python)


-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-service-code)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-service-code)]

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-service-print-logs)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-service-print-logs)]

 For more information, see the documentation for [Model.deploy()](/python/api/azureml-core/azureml.core.model.model#deploy-workspace--name--models--inference-config-none--deployment-config-none--deployment-target-none--overwrite-false-) and [Webservice](/python/api/azureml-core/azureml.core.webservice.webservice).
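
The cloud redeployment above passes `--dc re-deploymentconfig.json`. Per the deployment configuration schema linked earlier, a configuration targeting Azure Container Instances might look roughly like this (field values are illustrative assumptions, not from this commit):

```json
{
    "computeType": "aci",
    "containerResourceRequirements": {
        "cpu": 0.5,
        "memoryInGB": 1.0
    },
    "authEnabled": true,
    "sslEnabled": false,
    "appInsightsEnabled": false
}
```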

@@ -368,9 +418,9 @@ For more information, see the documentation for [Model.deploy()](/python/api/azu

 When you deploy remotely, you may have key authentication enabled. The example below shows how to get your service key with Python in order to make an inference request.

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=call-remote-web-service-code)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=call-remote-web-service-code)]

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=call-remote-webservice-print-logs)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=call-remote-webservice-print-logs)]


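
The notebook cells referenced above retrieve the service key and call the remote endpoint. A hedged sketch of that pattern (assuming `service` is the deployed `Webservice` object and key authentication is enabled) looks like:

```python
import json
import requests

# Assumes `service` is the Webservice returned by Model.deploy() with auth enabled.
primary_key, secondary_key = service.get_keys()

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {primary_key}",
}
payload = {
    "query": "What color is the fox",
    "context": "The quick brown fox jumped over the lazy dog.",
}

response = requests.post(service.scoring_uri, data=json.dumps(payload), headers=headers)
print(response.json())
```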

@@ -406,9 +456,13 @@ The following table describes the different service states:
 # [Azure CLI](#tab/azcli)


-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/2.deploy-local-cli.ipynb?name=delete-resource-code)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/2.deploy-local-cli.ipynb?name=delete-resource-code)]

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/2.deploy-local-cli.ipynb?name=delete-your-resource-code)]
+```azurecli-interactive
+az ml service delete -n myservice
+az ml service delete -n myaciservice
+az ml model delete --model-id=<MODEL_ID>
+```

 To delete a deployed webservice, use `az ml service delete <name of webservice>`.

@@ -418,7 +472,7 @@ Read more about [deleting a webservice](/cli/azure/ml(v1)/computetarget/create#a

 # [Python](#tab/python)

-[!notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=delete-resource-code)]
+[!Notebook-python[] (~/azureml-examples-main/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=delete-resource-code)]

 To delete a deployed web service, use `service.delete()`.
 To delete a registered model, use `model.delete()`.
