
Commit df5b054

new title
1 parent b87cf7d commit df5b054

File tree

1 file changed: +2 −5 lines changed


articles/machine-learning/concept-distributed-training.md

Lines changed: 2 additions & 5 deletions
@@ -11,13 +11,11 @@ ms.topic: conceptual
 ms.date: 03/27/2020
 ---
 
-# Distributed training with Azure Machine Learning
-
-## What is distributed training?
+# What is distributed training?
 
 Distributed training refers to the ability to share and parallelize data loads and training tasks across multiple GPUs to accelerate model training. The typical use case for distributed training is for training deep neural networks and [deep learning](concept-deep-learning-vs-machine-learning.md) models.
 
-Deep neural networks are often compute intensive, as they require large learning workloads in order to process millions of examples and parameters across multiple layers. This deep learning lends itself well to distributed training, since running tasks in parallel instead of serially, saves time and compute resources.
+Deep neural networks are often compute intensive, as they require large learning workloads in order to process millions of examples and parameters across multiple layers. This deep learning lends itself well to distributed training, since running tasks in parallel, instead of serially, saves time and compute resources.
 
 ## Distributed training in Azure Machine Learning
 
@@ -27,7 +25,6 @@ Azure Machine Learning supports distributed training via integrations with popul
 
 * [Distributed training with TensorFlow](how-to-train-pytorch.md#distributed-training)
 
-
 For training traditional ML models, see [Azure Machine Learning SDK for Python](concept-train-machine-learning-model.md#python-sdk) for the different ways to train models using the Python SDK.
 
 ## Types of distributed training
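The article text in this diff describes data-parallel training: each worker computes a gradient on its own shard of the data, and the gradients are combined before a shared weight update. A minimal sketch of that idea in plain Python (no GPUs, no Azure ML or framework APIs; `gradient`, `data_parallel_step`, and all parameters are illustrative, with the per-worker loop run serially to stand in for parallel execution):

```python
def gradient(w, shard):
    # Gradient of mean squared error for a 1-D linear model y = w * x.
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def data_parallel_step(w, batch, num_workers, lr=0.1):
    # Split the batch into equal shards, one per worker.
    shard_size = len(batch) // num_workers
    shards = [batch[i * shard_size:(i + 1) * shard_size]
              for i in range(num_workers)]
    # Each worker computes a gradient on its shard; on real hardware
    # these run in parallel on separate GPUs.
    local_grads = [gradient(w, shard) for shard in shards]
    # Average the local gradients -- the role an all-reduce plays in
    # real distributed training -- then apply one shared update.
    avg_grad = sum(local_grads) / num_workers
    return w - lr * avg_grad

batch = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]
w = 0.0
for _ in range(50):
    w = data_parallel_step(w, batch, num_workers=2)
print(round(w, 3))  # converges toward the true slope 2.0
```

With equal-sized shards, averaging the per-worker gradients reproduces the full-batch gradient exactly, which is why a data-parallel step gives the same update a single worker would compute over the whole batch, just faster.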

0 commit comments
