Commit 7880cdf

Broken link fixed
1 parent 708b4ab commit 7880cdf

File tree

1 file changed (+3 -3 lines changed)


articles/machine-learning/v1/how-to-attach-compute-targets.md

Lines changed: 3 additions & 3 deletions
@@ -54,14 +54,14 @@ To use compute targets managed by Azure Machine Learning, see:

## What's a compute target?

-With Azure Machine Learning, you can train your model on various resources or environments, collectively referred to as [__compute targets__](concept-azure-machine-learning-architecture.md#compute-targets). A compute target can be a local machine or a cloud resource, such as an Azure Machine Learning Compute, Azure HDInsight, or a remote virtual machine. You also use compute targets for model deployment as described in ["Where and how to deploy your models"](../how-to-deploy-and-where.md).
+With Azure Machine Learning, you can train your model on various resources or environments, collectively referred to as [__compute targets__](concept-azure-machine-learning-architecture.md#compute-targets). A compute target can be a local machine or a cloud resource, such as an Azure Machine Learning Compute, Azure HDInsight, or a remote virtual machine. You also use compute targets for model deployment as described in ["Where and how to deploy your models"](../how-to-deploy-managed-online-endpoints.md).


## Local computer

When you use your local computer for **training**, there is no need to create a compute target. Just [submit the training run](../how-to-set-up-training-targets.md) from your local machine.

-When you use your local computer for **inference**, you must have Docker installed. To perform the deployment, use [LocalWebservice.deploy_configuration()](/python/api/azureml-core/azureml.core.webservice.local.localwebservice#deploy-configuration-port-none-) to define the port that the web service will use. Then use the normal deployment process as described in [Deploy models with Azure Machine Learning](../how-to-deploy-and-where.md).
+When you use your local computer for **inference**, you must have Docker installed. To perform the deployment, use [LocalWebservice.deploy_configuration()](/python/api/azureml-core/azureml.core.webservice.local.localwebservice#deploy-configuration-port-none-) to define the port that the web service will use. Then use the normal deployment process as described in [Deploy models with Azure Machine Learning](../how-to-deploy-managed-online-endpoints.md).

## Remote virtual machines
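The **training** paragraph in the hunk above links to submitting the run from your local machine. As a minimal sketch (assuming the azureml-core v1 SDK, a hypothetical `./src/train.py` script, and an experiment name chosen here for illustration), that submission might look like:

```python
# Sketch only: submit a run that executes on the local computer (no compute target created).
# The source directory, script name, and experiment name are illustrative placeholders.
from azureml.core import Workspace, Experiment, ScriptRunConfig

ws = Workspace.from_config()                 # existing workspace described by config.json

src = ScriptRunConfig(
    source_directory="./src",                # folder containing train.py
    script="train.py",
    compute_target="local",                  # run on the local machine
)

run = Experiment(ws, name="local-training").submit(src)
run.wait_for_completion(show_output=True)    # stream logs until the local run finishes
```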
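The **inference** paragraph only defines the port through LocalWebservice.deploy_configuration(); the rest follows the normal deployment process linked there. A minimal sketch with the azureml-core v1 SDK, assuming Docker is running locally and using a placeholder model name, entry script, and conda file:

```python
# Sketch only: deploy a registered model to a local Docker web service.
# Model name, entry script, and conda file are illustrative placeholders.
from azureml.core import Environment, Workspace
from azureml.core.model import InferenceConfig, Model
from azureml.core.webservice import LocalWebservice

ws = Workspace.from_config()                 # existing workspace described by config.json
model = Model(ws, name="my-model")           # a model already registered in the workspace

env = Environment.from_conda_specification(
    name="local-inference-env",
    file_path="conda_dependencies.yml",      # packages needed by the scoring script
)
inference_config = InferenceConfig(entry_script="score.py", environment=env)

# Define the port that the local web service (a Docker container) will use.
deployment_config = LocalWebservice.deploy_configuration(port=8890)

service = Model.deploy(ws, "local-svc", [model], inference_config, deployment_config)
service.wait_for_deployment(show_output=True)
print(service.scoring_uri)                   # typically http://localhost:<port>/score
```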

@@ -370,5 +370,5 @@ See these notebooks for examples of training with various compute targets:
* Use the compute resource to [configure and submit a training run](../how-to-set-up-training-targets.md).
* [Tutorial: Train and deploy a model](../tutorial-train-deploy-notebook.md) uses a managed compute target to train a model.
* Learn how to [efficiently tune hyperparameters](../how-to-tune-hyperparameters.md) to build better models.
-* Once you have a trained model, learn [how and where to deploy models](../how-to-deploy-and-where.md).
+* Once you have a trained model, learn [how and where to deploy models](../how-to-deploy-managed-online-endpoints.md).
* [Use Azure Machine Learning with Azure Virtual Networks](../how-to-network-security-overview.md)
