
Commit 74e67dd

Updating the typos in TF document (#10883)
* Updating the typos in TF document: I have updated the typos in customize_encoder.ipynb. Thank you!
* Update customize_encoder.ipynb
* Update customize_encoder.ipynb
1 parent 29d5353 commit 74e67dd

File tree

1 file changed: +5 -5 lines changed


docs/nlp/customize_encoder.ipynb

Lines changed: 5 additions & 5 deletions
@@ -70,7 +70,7 @@
   "source": [
    "## Learning objectives\n",
    "\n",
-   "The [TensorFlow Models NLP library](https://github.com/tensorflow/models/tree/master/official/nlp/modeling) is a collection of tools for building and training modern high performance natural language models.\n",
+   "The [TensorFlow Models NLP library](https://github.com/tensorflow/models/tree/master/official/nlp/modeling) is a collection of tools for building and training modern high-performance natural language models.\n",
    "\n",
    "The `tfm.nlp.networks.EncoderScaffold` is the core of this library, and lots of new network architectures are proposed to improve the encoder. In this Colab notebook, we will learn how to customize the encoder to employ new network architectures."
   ]
@@ -151,7 +151,7 @@
   "source": [
    "## Canonical BERT encoder\n",
    "\n",
-   "Before learning how to customize the encoder, let's firstly create a canonical BERT enoder and use it to instantiate a `bert_classifier.BertClassifier` for classification task."
+   "Before learning how to customize the encoder, let's first create a canonical BERT encoder and use it to instantiate a `bert_classifier.BertClassifier` for the classification task."
   ]
  },
  {
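The cell changed in the hunk above builds a canonical BERT encoder and wraps it in a classifier. A minimal sketch of what that looks like with the TF-NLP modeling API follows; the tiny hyperparameters and the `tensorflow_models` / `tfm.nlp` import path are illustrative assumptions, not the notebook's actual values:

import numpy as np
import tensorflow as tf
import tensorflow_models as tfm

nlp = tfm.nlp

# A deliberately tiny BERT encoder so the sketch runs quickly; the notebook
# uses its own hyperparameters.
bert_encoder = nlp.networks.BertEncoder(
    vocab_size=100,
    hidden_size=32,
    num_layers=3,
    num_attention_heads=4,
    max_sequence_length=16)

# Wrap the encoder with a classification head for a two-class task.
bert_classifier = nlp.models.BertClassifier(network=bert_encoder, num_classes=2)

# Run a random batch through the classifier; the inputs are word ids, an
# attention mask, and token type ids.
word_ids = np.random.randint(100, size=(3, 16))
mask = np.ones((3, 16), dtype=np.int32)
type_ids = np.zeros((3, 16), dtype=np.int32)
logits = bert_classifier([word_ids, mask, type_ids], training=False)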
@@ -256,9 +256,9 @@
   "source": [
    "#### Without Customization\n",
    "\n",
-   "Without any customization, `networks.EncoderScaffold` behaves the same the canonical `networks.BertEncoder`.\n",
+   "Without any customization, `networks.EncoderScaffold` behaves the same as the canonical `networks.BertEncoder`.\n",
    "\n",
-   "As shown in the following example, `networks.EncoderScaffold` can load `networks.BertEncoder`'s weights and output the same values:"
+   "As shown in the following example, `networks.EncoderScaffold` can load `networks.BertEncoder`'s weights and output are the same values:"
   ]
  },
  {
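The hunk above edits the cell describing how a default `networks.EncoderScaffold` reproduces `networks.BertEncoder`. Continuing the sketch from the previous hunk, a hedged illustration of the weight-copy-and-compare pattern could look like the following; the exact `hidden_cfg` / `embedding_cfg` keys are assumptions recalled from the tutorial and may differ across Model Garden versions:

# Configuration intended to mirror the tiny BertEncoder above. These key names
# are assumptions and may need adjusting for your library version.
default_hidden_cfg = dict(
    num_attention_heads=4,
    intermediate_size=3072,  # matches BertEncoder's default feed-forward size
    intermediate_activation='gelu',
    dropout_rate=0.1,
    attention_dropout_rate=0.1,
    kernel_initializer=tf.keras.initializers.TruncatedNormal(stddev=0.02))
default_embedding_cfg = dict(
    vocab_size=100,
    type_vocab_size=16,
    hidden_size=32,
    max_seq_length=16,
    initializer=tf.keras.initializers.TruncatedNormal(stddev=0.02),
    dropout_rate=0.1)

encoder_scaffold = nlp.networks.EncoderScaffold(
    num_hidden_instances=3,
    pooled_output_dim=32,
    hidden_cfg=default_hidden_cfg,
    embedding_cfg=default_embedding_cfg)

scaffold_classifier = nlp.models.BertClassifier(
    network=encoder_scaffold, num_classes=2)

# Copy the canonical classifier's weights into the scaffold-based classifier
# and check that both produce the same logits on the same batch.
scaffold_classifier.set_weights(bert_classifier.get_weights())
np.testing.assert_allclose(
    bert_classifier([word_ids, mask, type_ids], training=False),
    scaffold_classifier([word_ids, mask, type_ids], training=False),
    rtol=1e-5, atol=1e-5)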
@@ -564,7 +564,7 @@
    "id": "MeidDfhlHKSO"
   },
   "source": [
-    "Inspecting the `albert_encoder`, we see it stacks the same `Transformer` layer multiple times (note the loop-back on the \"Transformer\" block below.."
+    "Inspecting the `albert_encoder`, we see it stacks the same `Transformer` layer multiple times (note the loop-back on the \"Transformer\" block below."
   ]
  },
  {
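The final hunk tidies the cell about inspecting the ALBERT-style encoder, whose defining trait is that one `Transformer` layer's weights are reused for every hidden layer. A small hedged sketch, reusing `nlp` and `tf` from the sketches above and assuming the library's `AlbertEncoder` network (parameter names may differ by version):

# A small ALBERT-style encoder; ALBERT shares a single Transformer layer's
# weights across all hidden layers.
albert_encoder = nlp.networks.AlbertEncoder(
    vocab_size=100,
    embedding_width=8,
    hidden_size=32,
    num_layers=3,
    num_attention_heads=4,
    max_sequence_length=16)

# Plotting the network makes the sharing visible as a loop-back on the single
# Transformer block, which is what the cell above refers to.
tf.keras.utils.plot_model(albert_encoder, show_shapes=True, dpi=48)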

0 commit comments
