Commit a2ee8eb

add on-going work
1 parent 7ef1df1 commit a2ee8eb

File tree: 1 file changed (+8, -3)

README.md (8 additions, 3 deletions)
@@ -56,9 +56,14 @@ Please refer to [Tutorial.md](Tutorial.md) and [Code documentation](https://micr
 
 
 # Contribute
-NeuronBlocks operates in an open model. It is designed and developed by **STCA NLP Group, Microsoft**. Contributions from academia and industry are also highly welcome.
-
-For more details, please refer to [Contributing.md](Contributing.md).
+NeuronBlocks operates in an open model. It is designed and developed by **STCA NLP Group, Microsoft**. Contributions from academia and industry are also highly welcome. For more details, please refer to [Contributing.md](Contributing.md).
+
+## Ongoing Work and Call for Contributions
+Anyone who is familiar with these areas is highly encouraged to contribute code.
+* Knowledge Distillation for Model Compression: distill heavy models such as BERT and the OpenAI Transformer; teacher-student knowledge distillation is a common method for model compression (see the sketch after the diff).
+* Multi-Lingual Support
+* NER Model Support
+* Multi-Task Training Support
 
 # Reference
 **NeuronBlocks -- Building Your NLP DNN Models Like Playing Lego**, at https://arxiv.org/abs/1904.09535.
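
For the teacher-student knowledge distillation item in the new list, here is a minimal sketch of the usual combined soft-target/hard-label loss. This is not NeuronBlocks code; it assumes PyTorch (the framework NeuronBlocks builds on), and the function name `distillation_loss`, the temperature `T`, and the mixing weight `alpha` are illustrative choices only.

```python
# Minimal teacher-student distillation sketch (illustrative, not NeuronBlocks code).
# Assumes a frozen, already-trained `teacher` and a smaller trainable `student`,
# both returning class logits of shape [batch, num_classes].
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with the usual hard-label loss."""
    # Soften both distributions with temperature T; scale by T^2 so gradient
    # magnitudes stay comparable to the hard-label term.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Typical use in a training step (teacher frozen, student trainable):
# with torch.no_grad():
#     teacher_logits = teacher(batch_inputs)
# student_logits = student(batch_inputs)
# loss = distillation_loss(student_logits, teacher_logits, batch_labels)
# loss.backward()
```

A higher temperature spreads the teacher's probability mass over more classes, which is what lets the student learn from the teacher's "dark knowledge" rather than only the argmax label.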
