|
70 | 70 | "source": [
|
71 | 71 | "## Learning objectives\n",
|
72 | 72 | "\n",
|
73 |
| - "The [TensorFlow Models NLP library](https://github.com/tensorflow/models/tree/master/official/nlp/modeling) is a collection of tools for building and training modern high performance natural language models.\n", |
| 73 | + "The [TensorFlow Models NLP library](https://github.com/tensorflow/models/tree/master/official/nlp/modeling) is a collection of tools for building and training modern high-performance natural language models.\n", |
74 | 74 | "\n",
|
75 | 75 | "The `tfm.nlp.networks.EncoderScaffold` is the core of this library, and lots of new network architectures are proposed to improve the encoder. In this Colab notebook, we will learn how to customize the encoder to employ new network architectures."
|
76 | 76 | ]
|
|
151 | 151 | "source": [
|
152 | 152 | "## Canonical BERT encoder\n",
|
153 | 153 | "\n",
|
154 |
| - "Before learning how to customize the encoder, let's firstly create a canonical BERT enoder and use it to instantiate a `bert_classifier.BertClassifier` for classification task." |
| 154 | + "Before learning how to customize the encoder, let's first create a canonical BERT encoder and use it to instantiate a `bert_classifier.BertClassifier` for the classification task." |
155 | 155 | ]
|
156 | 156 | },
|
157 | 157 | {
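A minimal sketch of what such a cell might look like. The `tensorflow_models` import alias, the toy hyperparameters, and the `[word_ids, mask, type_ids]` input format are illustrative assumptions rather than the notebook's exact code:

```python
import numpy as np
import tensorflow_models as tfm

# Toy configuration (assumed values) so the encoder builds quickly.
bert_encoder = tfm.nlp.networks.BertEncoder(
    vocab_size=100,
    hidden_size=32,
    num_layers=3,
    num_attention_heads=4)

# Wrap the canonical encoder with a classification head for a 2-class task.
bert_classifier = tfm.nlp.models.BertClassifier(
    network=bert_encoder, num_classes=2)

# Run a dummy batch through the classifier. The input format
# [word_ids, mask, type_ids] is assumed here for illustration.
batch_size, seq_length = 3, 16
word_ids = np.random.randint(100, size=(batch_size, seq_length))
mask = np.ones((batch_size, seq_length), dtype=np.int32)
type_ids = np.zeros((batch_size, seq_length), dtype=np.int32)
logits = bert_classifier([word_ids, mask, type_ids], training=False)
print(logits.shape)  # (3, 2)
```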
|
|
256 | 256 | "source": [
|
257 | 257 | "#### Without Customization\n",
|
258 | 258 | "\n",
|
259 |
| - "Without any customization, `networks.EncoderScaffold` behaves the same the canonical `networks.BertEncoder`.\n", |
| 259 | + "Without any customization, `networks.EncoderScaffold` behaves the same as the canonical `networks.BertEncoder`.\n", |
260 | 260 | "\n",
|
261 |
| - "As shown in the following example, `networks.EncoderScaffold` can load `networks.BertEncoder`'s weights and output the same values:" |
| 261 | + "As shown in the following example, `networks.EncoderScaffold` can load `networks.BertEncoder`'s weights and output are the same values:" |
262 | 262 | ]
|
263 | 263 | },
|
264 | 264 | {
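The notebook cell that follows this text configures an `EncoderScaffold` to mirror the `BertEncoder` and demonstrates the equivalence. Below is a hedged sketch of that check only; it assumes `bert_classifier` (built on `networks.BertEncoder`), `scaffold_classifier` (an identically configured classifier built on `networks.EncoderScaffold`), and the dummy `word_ids`/`mask`/`type_ids` inputs already exist from earlier cells, and the variable names are placeholders rather than the notebook's:

```python
import numpy as np

# Copy the canonical encoder's weights into the scaffold-based classifier.
# Both models must have been built with matching configurations so their
# weight lists line up one-to-one.
scaffold_classifier.set_weights(bert_classifier.get_weights())

# With identical weights, the two models should produce identical logits.
inputs = [word_ids, mask, type_ids]
np.testing.assert_allclose(
    bert_classifier(inputs, training=False),
    scaffold_classifier(inputs, training=False))
```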
|
|
564 | 564 | "id": "MeidDfhlHKSO"
|
565 | 565 | },
|
566 | 566 | "source": [
|
567 |
| - "Inspecting the `albert_encoder`, we see it stacks the same `Transformer` layer multiple times (note the loop-back on the \"Transformer\" block below.." |
| 567 | + "Inspecting the `albert_encoder`, we see it stacks the same `Transformer` layer multiple times (note the loop-back on the \"Transformer\" block below." |
568 | 568 | ]
|
569 | 569 | },
|
570 | 570 | {
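One way to make that layer sharing visible, as a hedged sketch: it assumes `albert_encoder` was built in the preceding notebook cells, and `tf.keras.utils.plot_model` additionally requires `pydot` and Graphviz to be installed:

```python
import tensorflow as tf

# Render the functional graph. With cross-layer parameter sharing, a single
# Transformer block appears once, with an edge that loops back into itself.
tf.keras.utils.plot_model(albert_encoder, show_shapes=True, dpi=48)

# The sharing also shows up in the parameter count: the number of trainable
# variables does not grow with the number of stacked layers.
print(len(albert_encoder.trainable_variables))
```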
|
|