official/README.md
1 addition & 1 deletion
@@ -61,7 +61,7 @@ In the near future, we will add:
 |[ALBERT (A Lite BERT)](nlp/MODEL_GARDEN.md#available-model-configs)|[ALBERT: A Lite BERT for Self-supervised Learning of Language Representations](https://arxiv.org/abs/1909.11942)|
 |[BERT (Bidirectional Encoder Representations from Transformers)](nlp/MODEL_GARDEN.md#available-model-configs)|[BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805)|
 |[NHNet (News Headline generation model)](projects/nhnet)|[Generating Representative Headlines for News Stories](https://arxiv.org/abs/2001.09386)|
-|[Transformer](nlp/transformer)|[Attention Is All You Need](https://arxiv.org/abs/1706.03762)|
+|[Transformer](nlp/MODEL_GARDEN.md#available-model-configs)|[Attention Is All You Need](https://arxiv.org/abs/1706.03762)|
 |[XLNet](nlp/xlnet)|[XLNet: Generalized Autoregressive Pretraining for Language Understanding](https://arxiv.org/abs/1906.08237)|
 |[MobileBERT](projects/mobilebert)|[MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices](https://arxiv.org/abs/2004.02984)|