articles/machine-learning/service/tutorial-train-models-with-aml.md
35 additions & 33 deletions
@@ -9,7 +9,7 @@ ms.topic: tutorial
author: hning86
ms.author: haining
ms.reviewer: sgilley
-ms.date: 09/24/2018
+ms.date: 11/16/2018
#Customer intent: As a professional data scientist, I can build an image classification model with Azure Machine Learning using Python in a Jupyter notebook.
---
@@ -39,7 +39,7 @@ For your convenience, this tutorial is available as a [Jupyter notebook](https:/
-Azure Batch AI is a managed service that enables data scientists to train machine learning models on clusters of Azure virtual machines, including VMs with GPU support. In this tutorial, you create an Azure Batch AI cluster as your training environment. This code creates a cluster for you if it does not already exist in your workspace.
+Azure ML Managed Compute is a managed service that enables data scientists to train machine learning models on clusters of Azure virtual machines, including VMs with GPU support. In this tutorial, you create an Azure ML Managed Compute cluster as your training environment. This code creates a cluster for you if it does not already exist in your workspace.

**Creation of the cluster takes approximately 5 minutes.** If the cluster is already in the workspace, this code uses it and skips the creation process.
```python
-from azureml.core.compute import ComputeTarget, BatchAiCompute
-from azureml.core.compute_target import ComputeTargetException
```
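As a standalone sketch (not part of this diff), creating or reusing a Managed Compute cluster with the Azure ML SDK might look like the following. The workspace object `ws` and the cluster name are assumptions; VM size and node count are illustrative:

```python
from azureml.core.compute import ComputeTarget, AmlCompute
from azureml.core.compute_target import ComputeTargetException

cluster_name = "cpucluster"  # hypothetical name; pick any valid cluster name

try:
    # Reuse the cluster if it already exists in the workspace `ws`
    compute_target = ComputeTarget(workspace=ws, name=cluster_name)
    print("Found existing compute target; reusing it.")
except ComputeTargetException:
    # Otherwise provision a new CPU cluster (takes about 5 minutes)
    config = AmlCompute.provisioning_configuration(vm_size="STANDARD_D2_V2",
                                                   max_nodes=4)
    compute_target = ComputeTarget.create(ws, cluster_name, config)
    compute_target.wait_for_completion(show_output=True)
```

This runs only against a live Azure ML workspace, so it is shown for orientation rather than local execution.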
-Load the compressed files into `numpy` arrays. Then use `matplotlib` to plot 30 random images from the dataset with their labels above them. Note this step requires a `load_data` function that's included in the `util.py` file. This file is included in the sample folder. Please make sure it is placed in the same folder as this notebook. The `load_data` function parses the compressed files into numpy arrays.
+Load the compressed files into `numpy` arrays. Then use `matplotlib` to plot 30 random images from the dataset with their labels above them. Note that this step requires a `load_data` function that's included in the `utils.py` file. This file is included in the sample folder. Make sure it is placed in the same folder as this notebook. The `load_data` function simply parses the compressed files into numpy arrays.
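A minimal sketch of what such a `load_data` helper could do for gzipped MNIST-style idx files. The real implementation ships in `utils.py` with the sample; the signature and parsing details here are assumptions for illustration:

```python
import gzip
import struct
import numpy as np

def load_data(path, label=False):
    """Parse a gzipped MNIST-style idx file into a numpy array (sketch)."""
    with gzip.open(path, "rb") as gz:
        if label:
            # label file header: magic number and item count (big-endian)
            struct.unpack(">II", gz.read(8))
            return np.frombuffer(gz.read(), dtype=np.uint8).reshape(-1, 1)
        # image file header: magic, image count, row count, column count
        _, n, rows, cols = struct.unpack(">IIII", gz.read(16))
        pixels = np.frombuffer(gz.read(), dtype=np.uint8)
        # one flattened row per image
        return pixels.reshape(n, rows * cols)
```

Plotting the 30 random samples then only needs `matplotlib.pyplot.imshow` on rows reshaped back to 28 × 28.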
+The training script reads an argument to find the directory containing the data. When you submit the job later, you point to the datastore for this argument:

-`parser.add_argument('--data-folder', type = str, dest = 'data_folder', help = 'data directory mounting point')`
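As a standalone sketch, the argument handling inside the training script might look like this. The argument name and `dest` come from the tutorial; the sample value passed here is an assumption:

```python
import argparse

# Parse the mount point of the training data, as the training script would
parser = argparse.ArgumentParser()
parser.add_argument('--data-folder', type=str, dest='data_folder',
                    help='data directory mounting point')
args = parser.parse_args(['--data-folder', '/mnt/data'])  # illustrative value
print(args.data_folder)  # → /mnt/data
```

At run time the value is supplied by the submitted job rather than hard-coded.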
Anything written in this directory is automatically uploaded into your workspace. You'll access your model from this directory later in the tutorial.
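A sketch of saving a trained model into the `outputs` directory. The filename and the use of `pickle` are assumptions for illustration (the tutorial's script may serialize differently); the key point is that files under `./outputs` are uploaded automatically:

```python
import os
import pickle

# Anything written under ./outputs is uploaded to the workspace run record
os.makedirs('outputs', exist_ok=True)

model = {'coef': [0.1, 0.2]}  # stand-in for a trained model object
with open(os.path.join('outputs', 'sklearn_mnist_model.pkl'), 'wb') as f:
    pickle.dump(model, f)
```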
The file `utils.py` is referenced from the training script to load the dataset correctly. Copy this script into the script folder so that it can be accessed along with the training script on the remote resource.
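Copying the helper next to the training script can be sketched as follows. The folder name `sklearn-mnist` is an assumption; in the notebook, `utils.py` already exists alongside your code (a placeholder is created here only so the sketch runs standalone):

```python
import os
import shutil

script_folder = 'sklearn-mnist'  # assumed script folder name
os.makedirs(script_folder, exist_ok=True)

# In the real notebook utils.py ships with the sample; create a stub so
# this standalone sketch is runnable
if not os.path.exists('utils.py'):
    with open('utils.py', 'w') as f:
        f.write('# helper functions used by the training script\n')

# Everything in script_folder is uploaded to the remote compute target
shutil.copy('utils.py', script_folder)
```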
@@ -339,7 +341,7 @@ An estimator object is used to submit the run. Create your estimator by running
* Parameters required from the training script
* Python packages needed for training

-In this tutorial, this target is the Batch AI cluster. All files in the project directory are uploaded into the cluster nodes for execution. The data_folder is set to use the datastore (`ds.as_mount()`).
+In this tutorial, this target is the Azure ML Managed Compute cluster. All files in the script folder are uploaded into the cluster nodes for execution. The data_folder is set to use the datastore (`ds.as_mount()`).
```python
from azureml.train.estimator import Estimator
```
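A sketch (not part of this diff) of how the estimator described above might be configured. Variable names such as `script_folder`, `compute_target`, and `ds` are assumed from earlier steps of the tutorial; the entry-script name and package list are illustrative:

```python
from azureml.train.estimator import Estimator

# Point the training job at the mounted datastore
script_params = {'--data-folder': ds.as_mount()}

# Bundle script folder, compute target, entry script, and dependencies
est = Estimator(source_directory=script_folder,
                script_params=script_params,
                compute_target=compute_target,
                entry_script='train.py',
                conda_packages=['scikit-learn'])
```

Submitting `est` to the experiment then uploads the script folder to the cluster nodes and starts the run; like the cluster-creation snippet, this requires a live Azure ML workspace, so it is shown for orientation rather than local execution.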
@@ -421,7 +423,7 @@ The output shows the remote model has an accuracy slightly higher than the local