
Commit 1c35584

Author: Larry Franks
Message: acrolinx and broken links
Parent: 0635f43

15 files changed, 19 additions and 14 deletions

.openpublishing.redirection.json

Lines changed: 5 additions & 0 deletions
@@ -170,6 +170,11 @@
     "redirect_url": "/azure/machine-learning/service/how-to-debug-parallel-run-step",
     "redirect_document_id": false
   },
+  {
+    "source_path": "articles/machine-learning/service/how-to-run-batch-predictions.md",
+    "redirect_url": "/azure/machine-learning/service/how-to-use-parallel-run-step",
+    "redirect_document_id": false
+  },
   {
     "source_path": "articles/machine-learning/service/quickstart-run-local-notebook.md",
     "redirect_url": "/azure/machine-learning/service/how-to-configure-environment#local",

articles/machine-learning/azure-machine-learning-release-notes.md

Lines changed: 1 addition & 1 deletion
@@ -1486,7 +1486,7 @@ Azure Machine Learning Compute can be created in Python, using Azure portal, or
 + ML Pipelines
 + New and updated notebooks for getting started with pipelines, batch scoring, and style transfer examples: https://aka.ms/aml-pipeline-notebooks
 + Learn how to [create your first pipeline](how-to-create-your-first-pipeline.md)
-+ Learn how to [run batch predictions using pipelines](how-to-run-batch-predictions.md)
++ Learn how to [run batch predictions using pipelines](how-to-use-parallel-run-step.md)
 + Azure Machine Learning compute target
 + [Sample notebooks](https://aka.ms/aml-notebooks) are now updated to use the new managed compute.
 + [Learn about this compute](how-to-set-up-training-targets.md#amlcompute)

articles/machine-learning/concept-enterprise-security.md

Lines changed: 1 addition & 1 deletion
@@ -330,7 +330,7 @@ Here are the details:
 
 * [Secure Azure Machine Learning web services with SSL](how-to-secure-web-service.md)
 * [Consume a Machine Learning model deployed as a web service](how-to-consume-web-service.md)
-* [How to run batch predictions](how-to-run-batch-predictions.md)
+* [How to run batch predictions](how-to-use-parallel-run-step.md)
 * [Monitor your Azure Machine Learning models with Application Insights](how-to-enable-app-insights.md)
 * [Collect data for models in production](how-to-enable-data-collection.md)
 * [Azure Machine Learning SDK](https://docs.microsoft.com/python/api/overview/azure/ml/intro?view=azure-ml-py)

articles/machine-learning/concept-model-management-and-deployment.md

Lines changed: 1 addition & 1 deletion
@@ -85,7 +85,7 @@ You also provide the configuration of the target deployment platform. For exampl
 When the image is created, components required by Azure Machine Learning are also added. For example, assets needed to run the web service and interact with IoT Edge.
 
 #### Batch scoring
-Batch scoring is supported through ML pipelines. For more information, see [Batch predictions on big data](how-to-run-batch-predictions.md).
+Batch scoring is supported through ML pipelines. For more information, see [Batch predictions on big data](how-to-use-parallel-run-step.md).
 
 #### Real-time web services
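
Not part of this diff, but since several pages now point at the ParallelRunStep article, here is a minimal sketch of batch scoring through an ML pipeline with the v1 Python SDK. The dataset name, compute name, curated environment, and score.py are placeholders, and on older SDK releases these classes shipped under azureml.contrib.pipeline.steps:

```python
# Minimal sketch of batch scoring with ParallelRunStep (Python SDK v1).
# "batch-data", "cpu-cluster", and score.py are illustrative placeholders.
from azureml.core import Workspace, Dataset, Environment
from azureml.pipeline.core import Pipeline, PipelineData
from azureml.pipeline.steps import ParallelRunConfig, ParallelRunStep

ws = Workspace.from_config()
input_ds = Dataset.get_by_name(ws, "batch-data")          # registered input dataset
output = PipelineData("scores", datastore=ws.get_default_datastore())

parallel_run_config = ParallelRunConfig(
    source_directory=".",
    entry_script="score.py",             # must define init() and run(mini_batch)
    mini_batch_size="5",
    error_threshold=10,
    output_action="append_row",
    environment=Environment.get(ws, "AzureML-Minimal"),
    compute_target=ws.compute_targets["cpu-cluster"],
    node_count=2,
)

step = ParallelRunStep(
    name="batch-score",
    parallel_run_config=parallel_run_config,
    inputs=[input_ds.as_named_input("batch_data")],
    output=output,
)

run = Pipeline(workspace=ws, steps=[step]).submit(experiment_name="batch-scoring")
```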

articles/machine-learning/how-to-access-data.md

Lines changed: 1 addition & 1 deletion
@@ -340,7 +340,7 @@ Azure Machine Learning provides several ways to use your models for scoring. Som
 
 | Method | Datastore access | Description |
 | ----- | :-----: | ----- |
-| [Batch prediction](how-to-run-batch-predictions.md) | ✔ | Make predictions on large quantities of data asynchronously. |
+| [Batch prediction](how-to-use-parallel-run-step.md) | ✔ | Make predictions on large quantities of data asynchronously. |
 | [Web service](how-to-deploy-and-where.md) |   | Deploy models as a web service. |
 | [Azure IoT Edge module](how-to-deploy-and-where.md) |   | Deploy models to IoT Edge devices. |

articles/machine-learning/how-to-debug-parallel-run-step.md

Lines changed: 1 addition & 1 deletion
@@ -77,7 +77,7 @@ def run(mini_batch):
 
 ### How could I pass a side input, such as a file or file(s) containing a lookup table, to all my workers?
 
-Construct a [Dataset](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.dataset.dataset?view=azure-ml-py) object containing the side input and register it with your workspace. After that, you can access it in your inference script (for example, in your init() method) as follows:
+Construct a [Dataset](https://docs.microsoft.com/python/api/azureml-core/azureml.core.dataset.dataset?view=azure-ml-py) object containing the side input and register it with your workspace. After that, you can access it in your inference script (for example, in your init() method) as follows:
 
 ```python
 from azureml.core.run import Run
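
The hunk ends at the top of the article's snippet. As a rough sketch of the pattern the answer describes, assuming the side input was registered as a TabularDataset named "lookup_table" (the name and the scoring logic are hypothetical):

```python
# Sketch only -- the article's actual snippet is truncated above.
# Assumes a registered TabularDataset named "lookup_table" (placeholder name).
from azureml.core import Dataset
from azureml.core.run import Run

lookup = None

def init():
    global lookup
    current_run = Run.get_context()                # run this worker belongs to
    ws = current_run.experiment.workspace          # workspace that owns the run
    ds = Dataset.get_by_name(ws, "lookup_table")   # registered side-input dataset
    lookup = ds.to_pandas_dataframe()              # small table, loaded once per worker

def run(mini_batch):
    # Each worker can now consult `lookup` while scoring its mini-batch.
    return [len(lookup)] * len(mini_batch)         # placeholder scoring logic
```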

articles/machine-learning/how-to-deploy-and-where.md

Lines changed: 1 addition & 1 deletion
@@ -181,7 +181,7 @@ To deploy the model, you need the following items:
 >
 > * The Azure Machine Learning SDK doesn't provide a way for web services or IoT Edge deployments to access your data store or datasets. If your deployed model needs to access data stored outside the deployment, like data in an Azure storage account, you must develop a custom code solution by using the relevant SDK. For example, the [Azure Storage SDK for Python](https://github.com/Azure/azure-storage-python).
 >
-> An alternative that might work for your scenario is [batch prediction](how-to-run-batch-predictions.md), which does provide access to data stores during scoring.
+> An alternative that might work for your scenario is [batch prediction](how-to-use-parallel-run-step.md), which does provide access to data stores during scoring.
 
 * **Dependencies**, like helper scripts or Python/Conda packages required to run the entry script or model.
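
For illustration, a minimal sketch of the kind of "custom code solution" the note refers to, using the legacy Azure Storage SDK for Python linked above; the account, key, container, and blob names are placeholders:

```python
# Sketch: pulling a blob into the deployment from inside an entry script,
# using the legacy azure-storage SDK (github.com/Azure/azure-storage-python).
# Account name/key and paths are placeholders; prefer secrets to hard-coded keys.
from azure.storage.blob import BlockBlobService

def init():
    global data_path
    blob_service = BlockBlobService(account_name="mystorageacct",
                                    account_key="<access-key>")
    blob_service.get_blob_to_path(container_name="data",
                                  blob_name="lookup.csv",
                                  file_path="/tmp/lookup.csv")
    data_path = "/tmp/lookup.csv"   # ready for use in run()
```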

articles/machine-learning/how-to-deploy-app-service.md

Lines changed: 1 addition & 1 deletion
@@ -62,7 +62,7 @@ Before deploying, you must define what is needed to run the model as a web servi
 > [!IMPORTANT]
 > The Azure Machine Learning SDK does not provide a way for the web service to access your datastore or datasets. If you need the deployed model to access data stored outside the deployment, such as in an Azure Storage account, you must develop a custom code solution using the relevant SDK. For example, the [Azure Storage SDK for Python](https://github.com/Azure/azure-storage-python).
 >
-> Another alternative that may work for your scenario is [batch predictions](how-to-run-batch-predictions.md), which does provide access to datastores when scoring.
+> Another alternative that may work for your scenario is [batch predictions](how-to-use-parallel-run-step.md), which does provide access to datastores when scoring.
 
 For more information on entry scripts, see [Deploy models with Azure Machine Learning](service/how-to-deploy-and-where.md).

articles/machine-learning/how-to-deploy-inferencing-gpus.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ This article teaches you how to use Azure Machine Learning to deploy a GPU-enabl
 Inference, or model scoring, is the phase where the deployed model is used to make predictions. Using GPUs instead of CPUs offers performance advantages on highly parallelizable computation.
 
 > [!IMPORTANT]
-> For web service deployments, GPU inference is only supported on Azure Kubernetes Service. For inference using a __machine learning pipeline__, GPUs are only supported on Azure Machine Learning Compute. For more information on using ML pipelines, see [Run batch predictions](how-to-run-batch-predictions.md).
+> For web service deployments, GPU inference is only supported on Azure Kubernetes Service. For inference using a __machine learning pipeline__, GPUs are only supported on Azure Machine Learning Compute. For more information on using ML pipelines, see [Run batch predictions](how-to-use-parallel-run-step.md).
 
 > [!TIP]
 > Although the code snippets in this article use a TensorFlow model, you can apply the information to any machine learning framework that supports GPUs.
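
For reference, a minimal sketch of provisioning the GPU-enabled AKS cluster that the IMPORTANT note requires for web-service inference, using SDK v1; the cluster name and NC-series VM size are illustrative choices:

```python
# Sketch: provisioning a GPU-enabled AKS cluster for inference (SDK v1).
# "gpu-aks" and the NC-series VM size are illustrative.
from azureml.core import Workspace
from azureml.core.compute import AksCompute, ComputeTarget

ws = Workspace.from_config()
prov_config = AksCompute.provisioning_configuration(vm_size="Standard_NC6")
aks_target = ComputeTarget.create(ws, name="gpu-aks",
                                  provisioning_configuration=prov_config)
aks_target.wait_for_completion(show_output=True)   # blocks until the cluster is ready
```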

articles/machine-learning/how-to-run-batch-predictions-designer.md

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ ms.custom: Ignite2019
 
 In this how-to, you learn how to use the designer to train a model and set up a batch prediction pipeline and web service. Batch prediction allows for continuous and on-demand scoring of trained models on large data sets, optionally configured as a web service that can be triggered from any HTTP library.
 
-For setting up batch scoring services using the SDK, see the accompanying [how-to](how-to-run-batch-predictions.md).
+For setting up batch scoring services using the SDK, see the accompanying [how-to](how-to-use-parallel-run-step.md).
 
 In this how-to, you learn the following tasks:
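
As a sketch of "triggered from any HTTP library": a published pipeline exposes a REST endpoint that accepts a POST with a bearer token. The endpoint URL and experiment name below are placeholders, not values from this commit:

```python
# Sketch: triggering a published batch-prediction pipeline over plain HTTP.
# Use the published pipeline's rest_endpoint value; the URL here is a placeholder.
import requests
from azureml.core.authentication import InteractiveLoginAuthentication

auth = InteractiveLoginAuthentication()
headers = auth.get_authentication_header()   # {"Authorization": "Bearer <token>"}

rest_endpoint = "<rest-endpoint-of-published-pipeline>"
response = requests.post(rest_endpoint,
                         headers=headers,
                         json={"ExperimentName": "batch-scoring-designer"})
response.raise_for_status()
print("Submitted pipeline run:", response.json().get("Id"))
```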
