## Getting Started
An [AI Studio](https://aistudio.baidu.com/aistudio/index) project for XCiT has been published; you can open it [here](https://aistudio.baidu.com/aistudio/projectdetail/3449604) and run the training and evaluation commands directly.
For knowledge distillation, simply replace `${XCIT_ARCH}.yaml` with the corresponding distillation config file, `${XCIT_ARCH}_dist.yaml`, in the commands above. We provide pretrained weights of the Teacher model `RegNetY_160`, which can be downloaded [here](https://passl.bj.bcebos.com/vision_transformers/xcit/regnety_160.pdparams).
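
For example, the swap looks like this (a minimal sketch; `tools/train.py` is assumed to be the training entry point, so adjust it to match the commands above):

```shell
# Standard supervised training (entry point is an assumption)
python tools/train.py -c configs/xcit/${XCIT_ARCH}.yaml

# Knowledge distillation: same command, distillation config instead
python tools/train.py -c configs/xcit/${XCIT_ARCH}_dist.yaml
```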
Checkpoints saved during distillation training contain both the Teacher's and the Student's weights, so you need to extract the Student's weights before using them on their own.
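
As a minimal sketch of that extraction (the checkpoint path and the `Student.` parameter-key prefix are assumptions for illustration; the repository may ship its own extraction script):

```shell
python - <<'EOF'
import paddle

# A distillation checkpoint stores both models' parameters in one state dict.
state = paddle.load('output/checkpoint.pdparams')  # path is an assumption

# Keep only the Student's parameters, stripping the (assumed) key prefix.
prefix = 'Student.'
student = {k[len(prefix):]: v for k, v in state.items() if k.startswith(prefix)}

paddle.save(student, 'output/xcit_student.pdparams')
EOF
```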