
Commit 41ee76c (2 parents: ec7cc9c + bc0fe3b)

Merge pull request #77531 from j-martens/patch-482

Update how-to-deploy-and-where.md

File tree: 1 file changed, +16 −11 lines


articles/machine-learning/service/how-to-deploy-and-where.md

Lines changed: 16 additions & 11 deletions
@@ -18,17 +18,7 @@ ms.custom: seoapril2019
 
 Learn how to deploy your machine learning model as a web service in the Azure cloud, or to IoT Edge devices.
 
-The following compute targets, or compute resources, can be used to host your service deployment.
-
-| Compute target | Deployment type | Description |
-| ----- | ----- | ----- |
-| [Local web service](#local) | Test/debug | Good for limited testing and troubleshooting.
-| [Azure Kubernetes Service (AKS)](#aks) | Real-time inference | Good for high-scale production deployments. Provides autoscaling, and fast response times. |
-| [Azure Container Instances (ACI)](#aci) | Testing | Good for low scale, CPU-based workloads. |
-| [Azure Machine Learning Compute](how-to-run-batch-predictions.md) | (Preview) Batch inference | Run batch scoring on serverless compute. Supports normal and low-priority VMs. |
-| [Azure IoT Edge](#iotedge) | (Preview) IoT module | Deploy & serve ML models on IoT devices. |
-
-The workflow is similar for all compute targets:
+The workflow is similar regardless of [where you deploy](#target) your model:
 
 1. Register the model.
 1. Prepare to deploy (specify assets, usage, compute target)
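The numbered workflow above maps onto a handful of azureml-core SDK calls. A minimal sketch follows; it assumes an existing workspace `config.json`, and all file paths, model names, and service names (`outputs/model.pkl`, `my-model`, `score.py`, `myenv.yml`, `my-service`) are placeholders, not values from this commit. It deploys to ACI, which the table in this commit recommends for testing:

```python
from azureml.core import Workspace
from azureml.core.model import InferenceConfig, Model
from azureml.core.webservice import AciWebservice

# Connect to the workspace described by a local config.json.
ws = Workspace.from_config()

# 1. Register the model (placeholder path and name).
model = Model.register(workspace=ws,
                       model_path="outputs/model.pkl",
                       model_name="my-model")

# 2. Prepare to deploy: the entry script and dependencies (inference
#    config) plus resources on the compute target (deployment config).
inference_config = InferenceConfig(runtime="python",
                                   entry_script="score.py",
                                   conda_file="myenv.yml")
deployment_config = AciWebservice.deploy_configuration(cpu_cores=1,
                                                       memory_gb=1)

# 3. Deploy the model to the compute target (ACI here, for testing).
service = Model.deploy(workspace=ws,
                       name="my-service",
                       models=[model],
                       inference_config=inference_config,
                       deployment_config=deployment_config)
service.wait_for_deployment(show_output=True)

# 4. Test the deployed model by calling the web service.
print(service.run(input_data='{"data": [[1, 2, 3]]}'))
```

Running this end to end requires an Azure subscription and a provisioned workspace, so treat it as a shape of the workflow rather than a copy-paste script.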
@@ -87,6 +77,21 @@ You can register an externally created model by providing a **local path** to th
 
 For more information, see the reference documentation for the [Model class](https://docs.microsoft.com/python/api/azureml-core/azureml.core.model.model?view=azure-ml-py).
 
+<a name="target"></a>
+
+## Choose a compute target
+
+The following compute targets, or compute resources, can be used to host your web service deployment.
+
+| Compute target | Usage | Description |
+| ----- | ----- | ----- |
+| [Local web service](#local) | Testing/debug | Good for limited testing and troubleshooting.
+| [Azure Kubernetes Service (AKS)](#aks) | Real-time inference | Good for high-scale production deployments. Provides autoscaling, and fast response times. |
+| [Azure Container Instances (ACI)](#aci) | Testing | Good for low scale, CPU-based workloads. |
+| [Azure Machine Learning Compute](how-to-run-batch-predictions.md) | (Preview) Batch inference | Run batch scoring on serverless compute. Supports normal and low-priority VMs. |
+| [Azure IoT Edge](#iotedge) | (Preview) IoT module | Deploy & serve ML models on IoT devices. |
+
 ## Prepare to deploy
 
 To deploy as a web service, you must create an inference configuration (`InferenceConfig`) and a deployment configuration. Inference, or model scoring, is the phase where the deployed model is used for prediction, most commonly on production data. In the inference config, you specify the scripts and dependencies needed to serve your model. In the deployment config you specify details of how to serve the model on the compute target.
