
Commit e7ba160

mxnet_mnist.ipynb fix (#1597)
* Update mxnet_mnist.ipynb: set notebook to default to CPU training
* Update mxnet_mnist.ipynb
1 parent 51b6d6b commit e7ba160

File tree

1 file changed: +4 −6 lines


advanced_functionality/mxnet_mnist_byom/mxnet_mnist.ipynb

Lines changed: 4 additions & 6 deletions
@@ -28,7 +28,7 @@
 "## Introduction\n",
 "In this notebook, we will train a neural network locally on the location from where this notebook is run using MXNet. We will then see how to create an endpoint from the trained MXNet model and deploy it on SageMaker. We will then inference from the newly created SageMaker endpoint. \n",
 "\n",
-"The neural network that we will use is a simple fully-connected neural network. The definition of the neural network can be found in the accompanying [mnist.py](mnist.py) file. The ``build_graph`` method contains the model defnition (shown below).\n",
+"The neural network that we will use is a simple fully-connected neural network. The definition of the neural network can be found in the accompanying [mnist.py](mnist.py) file. The ``build_graph`` method contains the model definition (shown below).\n",
 "\n",
 "```python\n",
 "def build_graph():\n",
@@ -98,10 +98,10 @@
 "source": [
 "### Training\n",
 "\n",
-"It is time to train the network. Since we are training the network locally, we can make use of mxnet training tools. The training method is also in the accompanying [mnist.py](mnist.py) file. The notebook assumes that this instance is a `p2.xlarge`. If running this in a non-GPU notebook instance, please adjust num_gpus=0 and num_cpu=1 The method is shown below. \n",
+"It is time to train the network. Since we are training the network locally, we can make use of mxnet training tools. The training method is also in the accompanying [mnist.py](mnist.py) file. The method is as follows. \n",
 "\n",
 "```python \n",
-"def train(data, hyperparameters= {'learning_rate': 0.11}, num_cpus=0, num_gpus =1 , **kwargs):\n",
+"def train(data, hyperparameters= {'learning_rate': 0.11}, num_cpus=1, num_gpus =0 , **kwargs):\n",
 " train_labels = data['train_label']\n",
 " train_images = data['train_data']\n",
 " test_labels = data['test_label']\n",
@@ -133,15 +133,13 @@
 "outputs": [],
 "source": [
 "from mnist import train\n",
-"model = train(data = data, num_cpus=0, num_gpus=1)"
+"model = train(data = data, num_cpus=1, num_gpus=0)"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"If you want to run the training on a cpu or if you are on an instance with cpus only, pass appropriate arguments. \n",
-"\n",
 "## Set up hosting for the model\n",
 "\n",
 "### Export the model from mxnet\n",
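The substance of this commit is flipping the defaults to `num_cpus=1, num_gpus=0`, so the notebook trains on CPU out of the box instead of assuming a GPU instance. A common MXNet idiom for turning those counts into training devices is sketched below; the helper name `get_context` is hypothetical (it is not claimed to be in the notebook's mnist.py), and the returned strings stand in for the `mx.gpu(i)` / `mx.cpu(i)` context objects real MXNet code would use.

```python
def get_context(num_cpus=1, num_gpus=0):
    """Pick training devices from CPU/GPU counts.

    Hypothetical sketch of the context-selection pattern a train()
    like the notebook's relies on: prefer GPUs when any are
    requested, otherwise fall back to CPU. In real MXNet code the
    strings below would be mx.gpu(i) / mx.cpu(i) context objects.
    """
    if num_gpus > 0:
        return ["gpu(%d)" % i for i in range(num_gpus)]
    return ["cpu(%d)" % i for i in range(num_cpus)]

# CPU-only default, matching the committed change:
print(get_context(num_cpus=1, num_gpus=0))  # ['cpu(0)']
```

With the old defaults (`num_cpus=0, num_gpus=1`) this selection fails on a CPU-only notebook instance, which is why the commit makes CPU the default and leaves GPU training as an explicit opt-in.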

0 commit comments
