Commit 3d72e5e

Author: Sandeep Kumar
Commit message: fix
1 parent 0afe31d commit 3d72e5e

File tree

1 file changed: +3 −3 lines changed

samples/04_gis_analysts_data_scientists/training_a_wind_turbine_detection_model_using_large_volume_of_training_data.ipynb

Lines changed: 3 additions & 3 deletions
@@ -21,8 +21,8 @@
      "source": [
      "## Table of Contents\n",
      "* [Introduction](#Introduction)\n",
-     "* [Export training data](#Export-Training-Data) or [Download training data](#Download-sample-training-data-optional)\n",
-     "* [Model training](#Model-training)\n",
+     "* [Export training data](#Export-Training-Data) or [Download sample training data](#Download-sample-training-data-optional)\n",
+     "* [Model training](#Model-Training)\n",
      " * [Executing model training script](#Executing-model-training-script)\n",
      " * [Monitor model training](#Monitor-model-training)\n",
      "* [Model inference](#Model-inference)"
@@ -39,7 +39,7 @@
      "cell_type": "markdown",
      "metadata": {},
      "source": [
-     "When training robust deep learning models, large amounts of training data is usually required. Unfortunately, the large volume of data can often be difficult to manage and process. To reduce the time required to export training data and train a model, we can distribute the workload to different processes, or even different machines altogether."
+     "When training robust deep learning models, large amounts of training data is usually required. Unfortunately, large volumes of data can often be difficult to manage and process. To reduce the effort required to export training data and train a model, we can distribute the workload to different processes, or even different machines altogether."
      ]
     },
     {
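The reworded cell describes distributing the training-data export across processes. The commit itself only changes markdown, but the idea can be sketched with Python's standard `multiprocessing` module: split the study area into sub-extents and let each worker process export its own slice. This is a minimal illustration, not the notebook's actual implementation — `export_chips` is a hypothetical stand-in for a real export routine (e.g. a GIS export API call), and the extents are placeholder integer ranges.

```python
from multiprocessing import Pool

def export_chips(extent):
    # Hypothetical worker: export training chips for one sub-extent.
    # A real version would call an export routine on its slice of the
    # study area; here we just return the chip indices in the range.
    start, stop = extent
    return list(range(start, stop))

def parallel_export(extents, workers=4):
    # Distribute one export job per sub-extent across worker processes.
    with Pool(processes=workers) as pool:
        per_worker = pool.map(export_chips, extents)
    # Flatten the per-worker chip lists into a single training set.
    return [chip for chunk in per_worker for chip in chunk]

if __name__ == "__main__":
    extents = [(0, 5), (5, 10), (10, 15)]
    chips = parallel_export(extents, workers=3)
    print(len(chips))  # 15 chips exported across 3 processes
```

The same fan-out pattern extends to multiple machines by replacing the process pool with a job queue or cluster scheduler, which is the "different machines altogether" case the cell mentions.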

0 commit comments