
Commit c0dd9e5

Merge pull request #77494 from j-martens/patch-481
Update how-to-deploy-and-where.md
2 parents c593e6b + 1b8239b commit c0dd9e5


articles/machine-learning/service/how-to-deploy-and-where.md

Lines changed: 69 additions & 68 deletions
@@ -43,45 +43,45 @@ For more information on the concepts involved in the deployment workflow, see [M

- The [Azure CLI extension for Machine Learning service](reference-azure-machine-learning-cli.md), or the [Azure Machine Learning Python SDK](https://aka.ms/aml-sdk).

## <a id="registermodel"></a> Register your model

Register your machine learning models in your Azure Machine Learning workspace. The model can come from Azure Machine Learning or from somewhere else. The following examples demonstrate how to register a model from a file:

### Register a model from an experiment run

**Scikit-Learn example using the SDK**
```python
# 'run' is the experiment Run that produced outputs/sklearn_mnist_model.pkl
model = run.register_model(model_name='sklearn_mnist', model_path='outputs/sklearn_mnist_model.pkl')
print(model.name, model.id, model.version, sep='\t')
```

**Using the CLI**
```azurecli-interactive
az ml model register -n sklearn_mnist --asset-path outputs/sklearn_mnist_model.pkl --experiment-name myexperiment
```

### Register an externally created model

[!INCLUDE [trusted models](../../../includes/machine-learning-service-trusted-model.md)]

You can register an externally created model by providing a **local path** to the model. You can provide either a folder or a single file.
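
When the model is spread across several files, you can point the registration at the folder that contains them instead of at a single file. The following is a minimal sketch using the SDK; the folder name `models/sklearn_mnist` and the workspace object `ws` are placeholders for illustration:

```python
from azureml.core.model import Model

# Register the entire local folder as one model version.
# "models/sklearn_mnist" is an example path; point it at your own model folder.
model = Model.register(workspace=ws,
                       model_path="models/sklearn_mnist",
                       model_name="sklearn_mnist",
                       description="Model files registered from a local folder")
print(model.name, model.version)
```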

**ONNX example with the Python SDK:**
```python
import urllib.request
from azureml.core.model import Model

onnx_model_url = "https://www.cntk.ai/OnnxModels/mnist/opset_7/mnist.tar.gz"
urllib.request.urlretrieve(onnx_model_url, filename="mnist.tar.gz")
# The "!" prefix runs a shell command; execute this line in a notebook cell.
!tar xvzf mnist.tar.gz

model = Model.register(workspace = ws,
                       model_path = "mnist/model.onnx",
                       model_name = "onnx_mnist",
                       tags = {"onnx": "demo"},
                       description = "MNIST image classification CNN from ONNX Model Zoo")
```

**Using the CLI**
```azurecli-interactive
az ml model register -n onnx_mnist -p mnist/model.onnx
```

**Time estimate:** Approximately 10 seconds.

@@ -218,79 +218,80 @@ The following sections demonstrate how to create the deployment configuration, a

## Deploy to target

### <a id="local"></a> Local deployment

To deploy locally, you need to have **Docker installed** on your local machine.

The examples in this section use [deploy_from_image](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice(class)?view=azure-ml-py#deploy-from-model-workspace--name--models--image-config--deployment-config-none--deployment-target-none-), which requires you to register the model and image before doing a deployment. For more information on other deployment methods, see [deploy](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice(class)?view=azure-ml-py#deploy-workspace--name--model-paths--image-config--deployment-config-none--deployment-target-none-) and [deploy_from_model](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice(class)?view=azure-ml-py#deploy-from-model-workspace--name--models--image-config--deployment-config-none--deployment-target-none-).
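
The SDK examples below also use an `inference_config` object that isn't defined in this excerpt. As a rough sketch, assuming an entry script named `score.py` and a conda environment file named `myenv.yml` (both placeholder names), it could be created like this:

```python
from azureml.core.model import InferenceConfig

# The entry script and conda dependencies file are assumed to exist locally;
# the file names below are placeholders.
inference_config = InferenceConfig(runtime="python",
                                   entry_script="score.py",
                                   conda_file="myenv.yml")
```
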
**Using the SDK**
```python
from azureml.core.model import Model
from azureml.core.webservice import LocalWebservice

# Deploy the registered model as a web service in a local Docker container
deployment_config = LocalWebservice.deploy_configuration(port=8890)
service = Model.deploy(ws, "myservice", [model], inference_config, deployment_config)
service.wait_for_deployment(show_output = True)
print(service.state)
```
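
Once the local service reports a healthy state, you can call it through the same object to smoke-test your scoring script. A sketch, assuming the entry script expects a JSON payload with a `data` field (the payload shape depends entirely on your own `score.py`):

```python
import json

# The "data" key and the 784-value vector are only an example payload;
# match whatever input format your scoring script expects.
test_sample = json.dumps({"data": [[0.0] * 784]})
prediction = service.run(input_data=test_sample)
print(prediction)
```
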
**Using the CLI**
```azurecli-interactive
az ml model deploy -m sklearn_mnist:1 -ic inferenceconfig.json -dc deploymentconfig.json
```

### <a id="aci"></a> Azure Container Instances (DEVTEST)

Use Azure Container Instances for deploying your models as a web service if one or more of the following conditions is true:
- You need to quickly deploy and validate your model.
- You are testing a model that is under development.

To see quota and region availability for ACI, see the [Quotas and region availability for Azure Container Instances](https://docs.microsoft.com/azure/container-instances/container-instances-quotas) article.

**Using the SDK**
```python
from azureml.core.model import Model
from azureml.core.webservice import AciWebservice

# Deploy to Azure Container Instances with 1 CPU core and 1 GB of memory
deployment_config = AciWebservice.deploy_configuration(cpu_cores = 1, memory_gb = 1)
service = Model.deploy(ws, "aciservice", [model], inference_config, deployment_config)
service.wait_for_deployment(show_output = True)
print(service.state)
```

**Using the CLI**
```azurecli-interactive
az ml model deploy -m sklearn_mnist:1 -n aciservice -ic inferenceconfig.json -dc deploymentconfig.json
```

For more information, see the reference documentation for the [AciWebservice](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice.aciwebservice?view=azure-ml-py) and [Webservice](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice.webservice?view=azure-ml-py) classes.

### <a id="aks"></a> Azure Kubernetes Service (PRODUCTION)

You can use an existing AKS cluster or create a new one using the Azure Machine Learning SDK, CLI, or the Azure portal.

<a id="deploy-aks"></a>

If you already have an AKS cluster attached, you can deploy to it. If you haven't created or attached an AKS cluster, follow the process to <a href="#create-attach-aks">create a new AKS cluster</a>.

**Using the SDK**
```python
from azureml.core.compute import AksCompute
from azureml.core.model import Model
from azureml.core.webservice import AksWebservice

# Deploy to the attached AKS cluster named "myaks"
aks_target = AksCompute(ws, "myaks")
deployment_config = AksWebservice.deploy_configuration(cpu_cores = 1, memory_gb = 1)
service = Model.deploy(ws, "aksservice", [model], inference_config, deployment_config, aks_target)
service.wait_for_deployment(show_output = True)
print(service.state)
print(service.get_logs())
```

Learn more about AKS deployment and autoscale in the [AksWebservice.deploy_configuration](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice.akswebservice) reference.
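
As an illustrative sketch, autoscaling can be enabled through the same `deploy_configuration` call; the replica counts below are arbitrary example values, not recommendations:

```python
from azureml.core.webservice import AksWebservice

# Let the service scale between 1 and 4 replicas based on utilization.
deployment_config = AksWebservice.deploy_configuration(cpu_cores=1,
                                                       memory_gb=1,
                                                       autoscale_enabled=True,
                                                       autoscale_min_replicas=1,
                                                       autoscale_max_replicas=4)
```
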
**Using the CLI**
```azurecli-interactive
az ml model deploy -ct myaks -m mymodel:1 -n aksservice -ic inferenceconfig.json -dc deploymentconfig.json
```

#### Create a new AKS cluster<a id="create-attach-aks"></a>
**Time estimate:** Approximately 5 minutes.
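
As a rough sketch, a new cluster can be provisioned from the SDK with default settings; the cluster name `myaks` and workspace object `ws` are placeholders:

```python
from azureml.core.compute import AksCompute, ComputeTarget

# Provision an AKS cluster with the default configuration.
# Creation typically takes several minutes.
prov_config = AksCompute.provisioning_configuration()
aks_target = ComputeTarget.create(workspace=ws,
                                  name="myaks",
                                  provisioning_configuration=prov_config)
aks_target.wait_for_completion(show_output=True)
```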
