
Commit 9d2f54b

Merge pull request #2129 from jancervenka:patch-1
PiperOrigin-RevId: 483313036
2 parents e715708 + f680eee commit 9d2f54b

File tree

1 file changed: +3 −7 lines


site/en/guide/core/optimizers_core.ipynb

Lines changed: 3 additions & 7 deletions
@@ -13,6 +13,7 @@
 "cell_type": "code",
 "execution_count": null,
 "metadata": {
+"cellView": "form",
 "id": "AwOEIRJC6Une"
 },
 "outputs": [],
@@ -343,7 +344,7 @@
 "\n",
 "Gradient descent with momentum not only uses the gradient to update a variable but also involves the change in position of a variable based on its previous update. The momentum parameter determines the level of influence the update at timestep $t-1$ has on the update at timestep $t$. Accumulating momentum helps to move variables past plateau regions faster than basic gradient descent. The momentum update rule is as follows:\n",
 "\n",
-"$$\\Delta_x^{[t]} = lr \\cdot L^\\prime(x^{[t]}) + p \\cdot \\Delta_x^{[t-1]}$$\n",
+"$$\\Delta_x^{[t]} = lr \\cdot L^\\prime(x^{[t-1]}) + p \\cdot \\Delta_x^{[t-1]}$$\n",
 "\n",
 "$$x^{[t]} = x^{[t-1]} - \\Delta_x^{[t]}$$\n",
 "\n",
@@ -591,21 +592,16 @@
 "This notebook introduced the basics of writing and comparing optimizers with the [TensorFlow Core APIs](https://www.tensorflow.org/guide/core). Although prebuilt optimizers like Adam are generalizable, they may not always be the best choice for every model or dataset. Having fine-grained control over the optimization process can help streamline ML training workflows and improve overall performance. Refer to the following documentation for more examples of custom optimizers:\n",
 "\n",
 "* This Adam optimizer is used in the [Multilayer perceptrons](https://www.tensorflow.org/guide/core/mlp_core) tutorial and the [Distributed training]()\n",
-"* [Model Garden](https://blog.tensorflow.org/2020/03/introducing-model-garden-for-tensorflow-2.html) has a variety of [custom optimizers](https://github.com/tensorflow/models/tree/master/official/modeling/optimization) written with the Core APIs.\n",
-"\n",
-"\n",
-"\n"
+"* [Model Garden](https://blog.tensorflow.org/2020/03/introducing-model-garden-for-tensorflow-2.html) has a variety of [custom optimizers](https://github.com/tensorflow/models/tree/master/official/modeling/optimization) written with the Core APIs.\n"
 ]
 }
 ],
 "metadata": {
 "colab": {
 "collapsed_sections": [],
 "name": "optimizers_core.ipynb",
-"provenance": [],
 "toc_visible": true
 },
-"gpuClass": "standard",
 "kernelspec": {
 "display_name": "Python 3",
 "name": "python3"
