Commit c0cf619

Fix some typos in the Uncertainty-aware Deep Learning with SNGP tutorial colab.
PiperOrigin-RevId: 379287145
1 parent a029568 commit c0cf619

File tree

1 file changed (+5, -5 lines)


site/en/tutorials/understanding/sngp.ipynb

Lines changed: 5 additions & 5 deletions
@@ -499,7 +499,7 @@
 },
 "outputs": [],
 "source": [
-"resnet_model.compile(optimizer=optimizer, loss=loss, metrics=metrics)\n",
+"resnet_model.compile(**train_config)\n",
 "resnet_model.fit(train_examples, train_labels, **fit_config)"
 ]
 },
@@ -838,7 +838,7 @@
 "id": "P13X7Adt-c2d"
 },
 "source": [
-"Note: The momentum-based update method can be sensitive to batch size. Therefore it is generally recommended to set `gp_cov_momentum=-1` to compute the covariance exactly. For this to work properly, the covariance matrix estimator needs to be reset at the begining of a new epoch in order to avoid counting the same data twice. For `RandomFeatureGaussianProcess`, this is can be done by calling its `rest_covariance_matrix()`. The next section shows an easy implementation of this using Keras' built-in API.\n"
+"Note: The momentum-based update method can be sensitive to batch size. Therefore it is generally recommended to set `gp_cov_momentum=-1` to compute the covariance exactly. For this to work properly, the covariance matrix estimator needs to be reset at the begining of a new epoch in order to avoid counting the same data twice. For `RandomFeatureGaussianProcess`, this is can be done by calling its `reset_covariance_matrix()`. The next section shows an easy implementation of this using Keras' built-in API.\n"
 ]
 },
 {
@@ -871,7 +871,7 @@
 "source": [
 "Note: Notice that under this implementation of the SNGP model, the predictive logits $logit(x_{test})$ for all classes share the same covariance matrix $var(x_{test})$, which describes the distance between $x_{test}$ from the training data. \n",
 "\n",
-"Theoretically, it is possible to extend the algorithm to compute different variance values for different classes (as introduced in the [original SNGP paper](https://arxiv.org/abs/2006.10108)). However, this is diffcult to scale to problems with large output spaces (e.g., ImageNet or language modeling)."
+"Theoretically, it is possible to extend the algorithm to compute different variance values for different classes (as introduced in the [original SNGP paper](https://arxiv.org/abs/2006.10108)). However, this is difficult to scale to problems with large output spaces (e.g., ImageNet or language modeling)."
 ]
 },
 {
@@ -1512,7 +1512,7 @@
 "In this tutorial, you have:\n",
 "* Implemented a SNGP model on a deep classifier to improve its distance awareness.\n",
 "* Trained the SNGP model end-to-end using Keras `model.fit()` API.\n",
-"* Visualized the uncertainty behavior of SNGP\n",
+"* Visualized the uncertainty behavior of SNGP.\n",
 "* Compared the uncertainty behavior between SNGP, Monte Carlo dropout and deep ensemble models."
 ]
 },
@@ -1531,7 +1531,7 @@
 "id": "HoIikRybke-b"
 },
 "source": [
-"* See the [SNGP-BERT tutorial](https://www.tensorflow.org/official_models/tutorials/uncertainty_quantification_with_sngp_bert) for an example of applying SNGP on a BERT model for uncertainty-aware natural language understanding. \n",
+"* See the [SNGP-BERT tutorial](https://www.tensorflow.org/text/tutorials/uncertainty_quantification_with_sngp_bert) for an example of applying SNGP on a BERT model for uncertainty-aware natural language understanding.\n",
 "* See [Uncertainty Baselines](https://github.com/google/uncertainty-baselines) for the implementation of SNGP model (and many other uncertainty methods) on a wide variety of benchmark datasets (e.g., [CIFAR](https://www.tensorflow.org/datasets/catalog/cifar100), [ImageNet](https://www.tensorflow.org/datasets/catalog/imagenet2012), [Jigsaw toxicity detection](https://www.tensorflow.org/datasets/catalog/wikipedia_toxicity_subtypes), etc).\n",
 "* For a deeper understanding of the SNGP method, check out the paper [Simple and Principled Uncertainty Estimation with Deterministic Deep Learning via Distance Awareness](https://arxiv.org/abs/2006.10108).\n"
 ]
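The note in the second hunk describes resetting the covariance estimator at the start of each new epoch when `gp_cov_momentum=-1`, so that no training example is counted twice. A minimal sketch of that callback pattern is below; `MockGPLayer` is a hypothetical stand-in for `RandomFeatureGaussianProcess`, and the callback wiring is illustrative, not the tutorial's actual TensorFlow code.

```python
class MockGPLayer:
    """Hypothetical stand-in for RandomFeatureGaussianProcess (illustration only)."""

    def __init__(self):
        self.resets = 0

    def reset_covariance_matrix(self):
        # In the real layer, this discards the accumulated covariance statistics.
        self.resets += 1


class ResetCovarianceCallback:
    """Keras-style callback sketch: with gp_cov_momentum=-1, reset the
    covariance estimator at the start of each epoch after the first."""

    def __init__(self, gp_layer):
        self.gp_layer = gp_layer

    def on_epoch_begin(self, epoch, logs=None):
        if epoch > 0:  # epoch 0 already starts from a fresh estimator
            self.gp_layer.reset_covariance_matrix()


# Simulated training loop: 3 epochs trigger resets at epochs 1 and 2.
layer = MockGPLayer()
callback = ResetCovarianceCallback(layer)
for epoch in range(3):
    callback.on_epoch_begin(epoch)
print(layer.resets)  # -> 2
```

In the real tutorial this logic would live in a `tf.keras.callbacks.Callback` subclass passed to `model.fit(..., callbacks=[...])`, which is what the note means by "Keras' built-in API".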

0 commit comments
