
Commit 8aafb15

revert

1 parent bab3160 commit 8aafb15

File tree

1 file changed: +4 additions, -2 deletions

site/en/guide/core/optimizers_core.ipynb

Lines changed: 4 additions & 2 deletions
@@ -13,7 +13,6 @@
       "cell_type": "code",
       "execution_count": null,
       "metadata": {
-        "cellView": "form",
         "id": "AwOEIRJC6Une"
       },
       "outputs": [],
@@ -592,7 +591,10 @@
     "This notebook introduced the basics of writing and comparing optimizers with the [TensorFlow Core APIs](https://www.tensorflow.org/guide/core). Although prebuilt optimizers like Adam are generalizable, they may not always be the best choice for every model or dataset. Having fine-grained control over the optimization process can help streamline ML training workflows and improve overall performance. Refer to the following documentation for more examples of custom optimizers:\n",
     "\n",
     "* This Adam optimizer is used in the [Multilayer perceptrons](https://www.tensorflow.org/guide/core/mlp_core) tutorial and the [Distributed training]()\n",
-    "* [Model Garden](https://blog.tensorflow.org/2020/03/introducing-model-garden-for-tensorflow-2.html) has a variety of [custom optimizers](https://github.com/tensorflow/models/tree/master/official/modeling/optimization) written with the Core APIs.\n"
+    "* [Model Garden](https://blog.tensorflow.org/2020/03/introducing-model-garden-for-tensorflow-2.html) has a variety of [custom optimizers](https://github.com/tensorflow/models/tree/master/official/modeling/optimization) written with the Core APIs.\n",
+    "\n",
+    "\n",
+    "\n"
     ]
   }
 ],
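The notebook text in the diff above mentions writing custom optimizers such as Adam with the Core APIs. As a rough illustration of what an Adam update does under the hood, here is a minimal, framework-free Python sketch of a single-parameter Adam step; the class and method names are illustrative assumptions, not TensorFlow APIs, and the hyperparameter defaults follow the common Adam conventions:

```python
# Minimal sketch of the Adam update rule in plain Python.
# AdamSketch and apply_gradient are hypothetical names for
# illustration, not part of any TensorFlow API.
class AdamSketch:
    def __init__(self, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-7):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m = 0.0  # first-moment (mean) estimate
        self.v = 0.0  # second-moment (uncentered variance) estimate
        self.t = 0    # step counter

    def apply_gradient(self, var, grad):
        self.t += 1
        # Exponential moving averages of the gradient and squared gradient.
        self.m = self.beta1 * self.m + (1 - self.beta1) * grad
        self.v = self.beta2 * self.v + (1 - self.beta2) * grad * grad
        # Bias correction for the zero-initialized moment estimates.
        m_hat = self.m / (1 - self.beta1 ** self.t)
        v_hat = self.v / (1 - self.beta2 ** self.t)
        # Return the updated parameter value.
        return var - self.lr * m_hat / (v_hat ** 0.5 + self.eps)

# Usage: minimize f(x) = x^2, whose gradient is 2x.
opt = AdamSketch()
x = 5.0
for _ in range(500):
    x = opt.apply_gradient(x, 2 * x)
```

A real Core-API optimizer, like the ones in the Model Garden link above, would hold per-variable `tf.Variable` slots for `m` and `v` and apply this same rule inside a `tf.function`, but the arithmetic is the same.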
