Commit f680eee

committed
nbfmt
1 parent 8aafb15 commit f680eee

File tree

1 file changed: +2 −4 lines changed


site/en/guide/core/optimizers_core.ipynb

Lines changed: 2 additions & 4 deletions
@@ -13,6 +13,7 @@
       "cell_type": "code",
       "execution_count": null,
       "metadata": {
+        "cellView": "form",
         "id": "AwOEIRJC6Une"
       },
       "outputs": [],
@@ -591,10 +592,7 @@
     "This notebook introduced the basics of writing and comparing optimizers with the [TensorFlow Core APIs](https://www.tensorflow.org/guide/core). Although prebuilt optimizers like Adam are generalizable, they may not always be the best choice for every model or dataset. Having fine-grained control over the optimization process can help streamline ML training workflows and improve overall performance. Refer to the following documentation for more examples of custom optimizers:\n",
     "\n",
     "* This Adam optimizer is used in the [Multilayer perceptrons](https://www.tensorflow.org/guide/core/mlp_core) tutorial and the [Distributed training]()\n",
-    "* [Model Garden](https://blog.tensorflow.org/2020/03/introducing-model-garden-for-tensorflow-2.html) has a variety of [custom optimizers](https://github.com/tensorflow/models/tree/master/official/modeling/optimization) written with the Core APIs.\n",
-    "\n",
-    "\n",
-    "\n"
+    "* [Model Garden](https://blog.tensorflow.org/2020/03/introducing-model-garden-for-tensorflow-2.html) has a variety of [custom optimizers](https://github.com/tensorflow/models/tree/master/official/modeling/optimization) written with the Core APIs.\n"
     ]
    }
   ],
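The conclusion cell in the diff above describes writing custom optimizers with the Core APIs rather than relying on prebuilt ones. As an illustration of what such an optimizer computes, here is a minimal NumPy sketch of the Adam update rule (this is an assumption-laden stand-in, not the notebook's actual TensorFlow implementation; the function name `adam_step` and the toy objective are hypothetical):

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-7):
    """One Adam update. m and v are running moment estimates; t is the 1-based step.

    Hypothetical helper for illustration only, not the tutorial's API.
    """
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction for zero-initialized m
    v_hat = v / (1 - beta2 ** t)              # bias correction for zero-initialized v
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 1.0
x, m, v = 1.0, 0.0, 0.0
for t in range(1, 201):
    grad = 2 * x          # analytic gradient of x^2
    x, m, v = adam_step(x, grad, m, v, t)
```

A real Core-API version would hold `m` and `v` as `tf.Variable`s inside a `tf.Module` and take gradients from `tf.GradientTape`, but the arithmetic per step is the same.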

0 commit comments