|[Distill-LSTM](./examples/model_compression/distill_lstm/)| An implementation of the strategy from [Distilling Task-Specific Knowledge from BERT into Simple Neural Networks](https://arxiv.org/abs/1903.12136): the knowledge of fine-tuned BERT classifiers for Chinese and English text classification is transferred by distillation into a small LSTM model, which then outperforms the same LSTM trained on its own (see the sketch below). |
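
As a rough illustration of how such knowledge distillation works, the NumPy sketch below combines a temperature-softened soft-target loss against the teacher's predictions with the ordinary hard-label cross-entropy. The function names, the temperature of 2.0, and the equal weighting are illustrative assumptions; the paper and the example script may formulate or weight the objective differently, e.g. by matching the teacher's logits directly.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    scaled = logits / temperature
    scaled = scaled - scaled.max(axis=-1, keepdims=True)  # numerical stability
    exp = np.exp(scaled)
    return exp / exp.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Soft-target loss (match the teacher's softened distribution)
    plus hard-label cross-entropy on the gold labels."""
    teacher_probs = softmax(teacher_logits, temperature)
    student_log_probs = np.log(softmax(student_logits, temperature) + 1e-12)
    soft_loss = -(teacher_probs * student_log_probs).sum(axis=-1).mean()

    hard_log_probs = np.log(softmax(student_logits) + 1e-12)
    hard_loss = -hard_log_probs[np.arange(len(labels)), labels].mean()

    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Toy batch of 2 examples with 2 classes: the teacher (e.g. a fine-tuned BERT)
# is confident, while the student (a small LSTM) is still close to uniform.
teacher = np.array([[4.0, -2.0], [-1.5, 3.0]])
student = np.array([[0.5, 0.2], [0.1, 0.4]])
print(distillation_loss(student, teacher, labels=np.array([0, 1])))
```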
#### Few-Shot Learning
| Algorithm | Description |
| :--------------- | ------- |
|[PET](./examples/few_shot/pet/)| An implementation of the strategy from [Exploiting Cloze Questions for Few Shot Text Classification and Natural Language Inference](https://arxiv.org/abs/2001.07676): manually designed prompts recast the downstream task as a cloze (fill-in-the-blank) task, so the knowledge stored in the pretrained model is exploited more fully, which significantly improves model performance (see the sketch after the table). |
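
The PET row above describes the mechanism only at a high level; the minimal Python sketch below shows how a hand-written pattern and a "verbalizer" turn a classification example into a cloze question whose masked word decides the label. The pattern string, verbalizer words, and the `build_cloze`/`predict` helpers are hypothetical illustrations, not the prompts or API of the PaddleNLP example.

```python
# A hand-written cloze pattern and a verbalizer mapping labels to words.
# Both are assumed for illustration; real PET tasks define their own.
PATTERN = "{text} All in all, it was [MASK]."
VERBALIZER = {"positive": "great", "negative": "terrible"}

def build_cloze(text: str) -> str:
    """Rewrite a raw classification input as a cloze question."""
    return PATTERN.format(text=text)

def predict(mask_word_scores: dict) -> str:
    """Pick the label whose verbalizer word scores highest at the [MASK]
    position; in PET these scores come from the pretrained MLM head."""
    return max(VERBALIZER, key=lambda label: mask_word_scores[VERBALIZER[label]])

print(build_cloze("The food was cold and the staff ignored us."))
# -> The food was cold and the staff ignored us. All in all, it was [MASK].

# Stand-in scores for the mask position (a real run would query the model).
scores = {"great": 1.4, "terrible": 7.9}
print(predict(scores))  # -> negative
```

Because the pretrained masked-language-model head already does most of the work of filling the blank, only a handful of labeled examples is needed to calibrate the verbalizer, which is what makes the cloze reformulation effective in the few-shot setting.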