diff --git a/iwd_2020.ipynb b/iwd_2020.ipynb
index 0e41fcf..c425ed0 100644
--- a/iwd_2020.ipynb
+++ b/iwd_2020.ipynb
@@ -598,14 +598,14 @@
 "\n",
 "**3) Add plots to observe overfitting**\n",
 "\n",
- "If trained for too long, a NN may begin to memorize the training data (rather than learning patterns that generalize to unseen data). This is called overfitting. Of all the hyperparmeters in the design of your network (the number and width of layers, the optimizer, etc) - the most important to set properly is ```epochs```. You will learn more about this in exercise two.\n",
+ "If trained for too long, a NN may begin to memorize the training data (rather than learning patterns that generalize to unseen data). This is called overfitting. Of all the hyperparameters in the design of your network (the number and width of layers, the optimizer, etc) - the most important to set properly is ```epochs```. You will learn more about this in exercise two.\n",
 "\n",
 "To create plots to observe overfitting, modify your training loop as follows.\n",
 "\n",
 "Change:\n",
 "\n",
 "```\n",
- "history = model.fit(train_images, train_labels, epochs=EPOCHS)\n",
+ "model.fit(train_images, train_labels, epochs=EPOCHS)\n",
 "```\n",
 "\n",
 "to:\n",
@@ -2445,4 +2445,4 @@
 ]
 }
 ]
-}
\ No newline at end of file
+}
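
The replacement the notebook's "to:" step refers to falls outside this hunk. The sketch below is a minimal, self-contained illustration (not the notebook's exact code) of the pattern the cell describes: capture the `History` object returned by `model.fit` and plot training vs. validation accuracy to spot overfitting. The Fashion MNIST dataset, the model architecture, `EPOCHS = 10`, and the use of `validation_data` are assumptions made here for illustration only.

```python
# A hypothetical sketch of recording and plotting training history with Keras.
# Dataset, model, and EPOCHS value are assumptions; substitute the notebook's own.
import matplotlib.pyplot as plt
import tensorflow as tf

# Assumed dataset for illustration.
(train_images, train_labels), (test_images, test_labels) = \
    tf.keras.datasets.fashion_mnist.load_data()
train_images, test_images = train_images / 255.0, test_images / 255.0

# Assumed model architecture for illustration.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

EPOCHS = 10  # assumed value; overfitting becomes visible as this grows

# Capture the History object returned by fit(); validation data is needed
# to see the gap between training and validation accuracy.
history = model.fit(train_images, train_labels,
                    epochs=EPOCHS,
                    validation_data=(test_images, test_labels))

# A widening gap between the two curves is the signature of overfitting.
plt.plot(history.history["accuracy"], label="train accuracy")
plt.plot(history.history["val_accuracy"], label="validation accuracy")
plt.xlabel("epoch")
plt.ylabel("accuracy")
plt.legend()
plt.show()
```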