
Commit c841a6e

Merge pull request #2325 from sanskarmodi8:issue#75194-fix
PiperOrigin-RevId: 674988489
2 parents 7d04d5d + f8341de commit c841a6e

File tree

1 file changed (+4 additions, -5 deletions)


site/en/tutorials/keras/overfit_and_underfit.ipynb

Lines changed: 4 additions & 5 deletions
@@ -543,10 +543,10 @@
 " model.summary()\n",
 "\n",
 " history = model.fit(\n",
-" train_ds,\n",
+" train_ds.map(lambda x, y: (x, tf.expand_dims(y, axis=-1))),\n",
 " steps_per_epoch = STEPS_PER_EPOCH,\n",
 " epochs=max_epochs,\n",
-" validation_data=validate_ds,\n",
+" validation_data=validate_ds.map(lambda x, y: (x, tf.expand_dims(y, axis=-1))),\n",
 " callbacks=get_callbacks(name),\n",
 " verbose=0)\n",
 " return history"
@@ -977,7 +977,7 @@
 "source": [
 "`l2(0.001)` means that every coefficient in the weight matrix of the layer will add `0.001 * weight_coefficient_value**2` to the total **loss** of the network.\n",
 "\n",
-"That is why we're monitoring the `binary_crossentropy` directly. Because it doesn't have this regularization component mixed in.\n",
+"That is why you need to monitor the `binary_crossentropy` directly. Because it doesn't have this regularization component mixed in.\n",
 "\n",
 "So, that same `\"Large\"` model with an `L2` regularization penalty performs much better:\n"
 ]
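
The notebook text above explains that the L2 penalty is folded into the logged total loss, which is why the raw `binary_crossentropy` is tracked on its own. A small sketch of how that penalty accumulates, assuming a toy single-unit layer (the layer and variable names here are illustrative, not from the tutorial):

import tensorflow as tf

# A layer with an L2 kernel regularizer: each weight w contributes
# 0.001 * w**2 to the model's regularization losses.
layer = tf.keras.layers.Dense(
    1, kernel_regularizer=tf.keras.regularizers.l2(0.001))

x = tf.ones((2, 4))
_ = layer(x)  # build the layer so its kernel exists

manual_penalty = 0.001 * tf.reduce_sum(tf.square(layer.kernel))
print(float(manual_penalty))          # penalty computed by hand
print(float(tf.add_n(layer.losses)))  # same value reported via layer.losses

# During training, Keras adds these regularization losses to the data loss,
# so the logged `loss` includes the penalty while the `binary_crossentropy`
# metric does not.
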
@@ -1228,10 +1228,9 @@
 }
 ],
 "metadata": {
-"accelerator": "GPU",
 "colab": {
 "name": "overfit_and_underfit.ipynb",
-"toc_visible": true
+"toc_visible": true
 },
 "kernelspec": {
 "display_name": "Python 3",
