articles/machine-learning/service/how-to-deploy-and-where.md

- The [Azure CLI extension for Machine Learning service](reference-azure-machine-learning-cli.md), or the [Azure Machine Learning Python SDK](https://aka.ms/aml-sdk).

## <a id="registermodel"></a> Register your model

Register your machine learning models in your Azure Machine Learning workspace. The model can come from Azure Machine Learning or from somewhere else. The following examples demonstrate how to register a model from a file:

### Register a model from an Experiment Run

**Scikit-Learn example using the SDK**

```python
# Register the model file that the run wrote to its outputs folder
model = run.register_model(model_name='sklearn_mnist', model_path='outputs/sklearn_mnist_model.pkl')
print(model.name, model.id, model.version, sep='\t')
```

**Using the CLI**

```azurecli-interactive
az ml model register -n sklearn_mnist --asset-path outputs/sklearn_mnist_model.pkl --experiment-name myexperiment
```

### Register an externally created model

[!INCLUDE [trusted models](../../../includes/machine-learning-service-trusted-model.md)]

You can register an externally created model by providing a **local path** to the model. You can provide either a folder or a single file.

**ONNX example with the Python SDK:**

```python
import tarfile
import urllib.request

from azureml.core.model import Model

# Download and extract the pretrained MNIST model from the ONNX Model Zoo
onnx_model_url = "https://www.cntk.ai/OnnxModels/mnist/opset_7/mnist.tar.gz"
urllib.request.urlretrieve(onnx_model_url, filename="mnist.tar.gz")
with tarfile.open("mnist.tar.gz") as tar:
    tar.extractall()

# Register the extracted ONNX model file in the workspace
model = Model.register(workspace=ws,
                       model_path="mnist/model.onnx",
                       model_name="onnx_mnist",
                       tags={"onnx": "demo"},
                       description="MNIST image classification CNN from ONNX Model Zoo")
```

**Using the CLI**

```azurecli-interactive
az ml model register -n onnx_mnist -p mnist/model.onnx
```

**Time estimate**: Approximately 10 seconds.

## Deploy to target

### <a id="local"></a> Local

The examples in this section use [deploy_from_image](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice(class)?view=azure-ml-py#deploy-from-model-workspace--name--models--image-config--deployment-config-none--deployment-target-none-), which requires you to register the model and image before doing a deployment. For more information on other deployment methods, see [deploy](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice(class)?view=azure-ml-py#deploy-workspace--name--model-paths--image-config--deployment-config-none--deployment-target-none-) and [deploy_from_model](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice(class)?view=azure-ml-py#deploy-from-model-workspace--name--models--image-config--deployment-config-none--deployment-target-none-).
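For readers who build and register an image explicitly before deploying, a minimal sketch of that flow with the SDK follows. The scoring script `score.py`, conda file `myenv.yml`, image name, and service name are hypothetical placeholders, and the sketch assumes the `ws` workspace and the `model` registered earlier; treat it as an illustration of the image-based pattern rather than the exact code used later on this page.

```python
from azureml.core.image import ContainerImage
from azureml.core.webservice import AciWebservice, Webservice

# Build a Docker image that packages the registered model with a scoring script.
# "score.py" and "myenv.yml" are placeholder names for your entry script and conda file.
image_config = ContainerImage.image_configuration(execution_script="score.py",
                                                  runtime="python",
                                                  conda_file="myenv.yml")
image = ContainerImage.create(name="myimage",
                              models=[model],
                              image_config=image_config,
                              workspace=ws)
image.wait_for_creation(show_output=True)

# Deploy the built image; an ACI configuration is used here as one example target.
deployment_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)
service = Webservice.deploy_from_image(workspace=ws,
                                       name="imageservice",
                                       image=image,
                                       deployment_config=deployment_config)
service.wait_for_deployment(show_output=True)
```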

**To deploy locally, you need to have Docker installed on your local machine.**

**Using the SDK**

```python
from azureml.core.model import Model
from azureml.core.webservice import LocalWebservice

# Deploy the registered model as a local Docker web service on port 8890
deployment_config = LocalWebservice.deploy_configuration(port=8890)
service = Model.deploy(ws, "myservice", [model], inference_config, deployment_config)
service.wait_for_deployment(show_output=True)
print(service.state)
```

**Using the CLI**

```azurecli-interactive
az ml model deploy -m sklearn_mnist:1 -ic inferenceconfig.json -dc deploymentconfig.json
```

### <a id="aci"></a> Azure Container Instances (DEVTEST)

Use Azure Container Instances for deploying your models as a web service if one or more of the following conditions is true:
- You need to quickly deploy and validate your model.
- You are testing a model that is under development.

To see quota and region availability for ACI, see the [Quotas and region availability for Azure Container Instances](https://docs.microsoft.com/azure/container-instances/container-instances-quotas) article.

**Using the SDK**

```python
from azureml.core.model import Model
from azureml.core.webservice import AciWebservice

# Deploy the registered model to Azure Container Instances with 1 CPU core and 1 GB of memory
deployment_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)
service = Model.deploy(ws, "aciservice", [model], inference_config, deployment_config)
service.wait_for_deployment(show_output=True)
print(service.state)
```

**Using the CLI**

```azurecli-interactive
az ml model deploy -m sklearn_mnist:1 -n aciservice -ic inferenceconfig.json -dc deploymentconfig.json
```

For more information, see the reference documentation for the [AciWebservice](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice.aciwebservice?view=azure-ml-py) and [Webservice](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice.webservice?view=azure-ml-py) classes.

### <a id="aks"></a> Azure Kubernetes Service (PRODUCTION)

You can use an existing AKS cluster or create a new one using the Azure Machine Learning SDK, CLI, or the Azure portal.

If you already have an AKS cluster attached, you can deploy to it. If you haven't yet created or attached an AKS cluster, go <a href="#create-attach-aks">here</a>.
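If your cluster already exists in your subscription but isn't attached to the workspace yet, a minimal sketch of attaching it with the SDK might look like the following; the resource group and cluster names are placeholder values.

```python
from azureml.core.compute import AksCompute, ComputeTarget

# Attach an existing AKS cluster (placeholder names) to the workspace as "myaks"
attach_config = AksCompute.attach_configuration(resource_group="myresourcegroup",
                                                cluster_name="myakscluster")
aks_target = ComputeTarget.attach(ws, "myaks", attach_config)
aks_target.wait_for_completion(show_output=True)
```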

**Using the SDK**

```python
from azureml.core.compute import AksCompute
from azureml.core.model import Model
from azureml.core.webservice import AksWebservice

# Deploy the registered model to the AKS cluster that is attached as "myaks"
aks_target = AksCompute(ws, "myaks")
deployment_config = AksWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)
service = Model.deploy(ws, "aksservice", [model], inference_config, deployment_config, aks_target)
service.wait_for_deployment(show_output=True)
print(service.state)
print(service.get_logs())
```

Learn more about AKS deployment and autoscale in the [AksWebservice.deploy_configuration](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice.akswebservice) reference.
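As a rough illustration of those autoscale settings, a deployment configuration might enable autoscaling as shown below; the replica bounds and target utilization are arbitrary example values, not recommendations.

```python
from azureml.core.webservice import AksWebservice

# Example autoscale settings; tune the replica bounds and target utilization for your workload
deployment_config = AksWebservice.deploy_configuration(autoscale_enabled=True,
                                                       autoscale_min_replicas=1,
                                                       autoscale_max_replicas=4,
                                                       autoscale_target_utilization=70,
                                                       cpu_cores=1,
                                                       memory_gb=1)
```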

**Using the CLI**

```azurecli-interactive
az ml model deploy -ct myaks -m mymodel:1 -n aksservice -ic inferenceconfig.json -dc deploymentconfig.json
```

#### Create a new AKS cluster<a id="create-attach-aks"></a>
**Time estimate:** Approximately 5 minutes.
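As a minimal sketch of creating a cluster with the SDK, assuming the default provisioning configuration and "myaks" as a placeholder cluster name:

```python
from azureml.core.compute import AksCompute, ComputeTarget

# Provision a new AKS cluster with the default configuration (placeholder name "myaks")
prov_config = AksCompute.provisioning_configuration()
aks_target = ComputeTarget.create(workspace=ws,
                                  name="myaks",
                                  provisioning_configuration=prov_config)
aks_target.wait_for_completion(show_output=True)
```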
