@@ -84,6 +84,8 @@ PaddleNLP's Transformer pretrained models include those directly converted from `huggingface.co`_
    ChineseBert <transformers/ChineseBert/contents >
    ConvBert <transformers/ConvBert/contents >
    CTRL <transformers/CTRL/contents >
+   Deberta <transformers/Deberta/contents >
+   DebertaV2 <transformers/DebertaV2/contents >
    DistilBert <transformers/DistilBert/contents >
    ELECTRA <transformers/ELECTRA/contents >
    ERNIE <transformers/ERNIE/contents >
@@ -145,6 +147,10 @@ Summary of tasks supported by Transformer pretrained models
 +--------------------+-------------------------+----------------------+--------------------+-----------------+-----------------+
 | CTRL_              | ✅                      | ❌                   | ❌                 | ❌              | ❌              |
 +--------------------+-------------------------+----------------------+--------------------+-----------------+-----------------+
+| Deberta_           | ✅                      | ✅                   | ✅                 | ❌              | ✅              |
++--------------------+-------------------------+----------------------+--------------------+-----------------+-----------------+
+| DebertaV2_         | ✅                      | ✅                   | ✅                 | ❌              | ✅              |
++--------------------+-------------------------+----------------------+--------------------+-----------------+-----------------+
 | DistilBert_        | ✅                      | ✅                   | ✅                 | ❌              | ❌              |
 +--------------------+-------------------------+----------------------+--------------------+-----------------+-----------------+
 | ELECTRA_           | ✅                      | ✅                   | ✅                 | ❌              | ✅              |
@@ -220,6 +226,8 @@ Summary of tasks supported by Transformer pretrained models
 .. _ChineseBert : https://arxiv.org/abs/2106.16038
 .. _ConvBert : https://arxiv.org/abs/2008.02496
 .. _CTRL : https://arxiv.org/abs/1909.05858
+.. _DeBERTa : https://arxiv.org/abs/2006.03654
+.. _DebertaV2 : https://arxiv.org/abs/2111.09543
 .. _DistilBert : https://arxiv.org/abs/1910.01108
 .. _ELECTRA : https://arxiv.org/abs/2003.10555
 .. _ERNIE : https://arxiv.org/abs/1904.09223