From 77debaf71dff3aae4541b8d300aebab433baaf56 Mon Sep 17 00:00:00 2001
From: Philip J Briggs
Date: Mon, 24 Feb 2020 10:29:32 -0800
Subject: [PATCH 1/2] Update iwd_2020.ipynb

---
 iwd_2020.ipynb | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/iwd_2020.ipynb b/iwd_2020.ipynb
index 0e41fcf..14e03a3 100644
--- a/iwd_2020.ipynb
+++ b/iwd_2020.ipynb
@@ -605,7 +605,7 @@
         "Change:\n",
         "\n",
         "```\n",
-        "history = model.fit(train_images, train_labels, epochs=EPOCHS)\n",
+        "model.fit(train_images, train_labels, epochs=EPOCHS)\n",
         "```\n",
         "\n",
         "to:\n",
@@ -2445,4 +2445,4 @@
       ]
     }
   ]
-}
\ No newline at end of file
+}

From c1ed41422007939dd141a60ca83839b891839bb7 Mon Sep 17 00:00:00 2001
From: Philip J Briggs
Date: Mon, 24 Feb 2020 11:14:36 -0800
Subject: [PATCH 2/2] Update iwd_2020.ipynb

---
 iwd_2020.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/iwd_2020.ipynb b/iwd_2020.ipynb
index 14e03a3..c425ed0 100644
--- a/iwd_2020.ipynb
+++ b/iwd_2020.ipynb
@@ -598,7 +598,7 @@
         "\n",
         "**3) Add plots to observe overfitting**\n",
         "\n",
-        "If trained for too long, a NN may begin to memorize the training data (rather than learning patterns that generalize to unseen data). This is called overfitting. Of all the hyperparmeters in the design of your network (the number and width of layers, the optimizer, etc) - the most important to set properly is ```epochs```. You will learn more about this in exercise two.\n",
+        "If trained for too long, a NN may begin to memorize the training data (rather than learning patterns that generalize to unseen data). This is called overfitting. Of all the hyperparameters in the design of your network (the number and width of layers, the optimizer, etc) - the most important to set properly is ```epochs```. You will learn more about this in exercise two.\n",
         "\n",
         "To create plots to observe overfitting, modify your training loop as follows.\n",
         "\n",
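For context on why the first patch's change matters: in Keras, `model.fit` returns a `History` object whose `.history` attribute is a plain dict mapping metric names (`'loss'`, `'val_loss'`, ...) to per-epoch value lists — capturing it as `history = model.fit(...)` is what makes the overfitting plots possible. A minimal framework-free sketch of how such a dict reveals overfitting (the numeric values below are invented for illustration, not from the notebook):

```python
# A dict shaped like Keras's history.history after
# model.fit(..., validation_split=0.2); values are invented for illustration.
history = {
    "loss":     [0.90, 0.55, 0.40, 0.30, 0.22, 0.16, 0.12, 0.09],
    "val_loss": [0.92, 0.60, 0.48, 0.43, 0.42, 0.44, 0.49, 0.55],
}

def best_epoch(val_losses):
    """Return the 1-based epoch with the lowest validation loss.

    Training loss keeps falling past this point while validation loss
    rises -- the signature of overfitting.
    """
    return min(range(len(val_losses)), key=val_losses.__getitem__) + 1

print(f"validation loss bottoms out at epoch {best_epoch(history['val_loss'])}")
```

With real training output one would plot both curves (e.g. with matplotlib) rather than just locating the minimum, but the underlying data access is the same `history.history` dict.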