Commit c4dd7d1

Add Chinese Support in README.md
1 parent f6891b0 commit c4dd7d1

File tree

1 file changed: +2 -1 lines changed

README.md

Lines changed: 2 additions & 1 deletion
@@ -1,11 +1,13 @@
 # ***NeuronBlocks*** - Building Your NLP DNN Models Like Playing Lego
 
+![language](https://img.shields.io/badge/language-en%7C中文-brightgreen.svg)
 [![python](https://img.shields.io/badge/python-3.6%20%7C%203.7-blue.svg)](https://www.python.org)
 [![pytorch](https://img.shields.io/badge/pytorch-0.4%20%7C%201.x-orange.svg)](https://pytorch.org)
 [![license](https://img.shields.io/badge/license-MIT-green.svg)](https://opensource.org/licenses/MIT)
 
 [简体中文](README_zh_CN.md)
 
+
 # Table of Contents
 * [Overview](#Overview)
 * [Get Started in 60 Seconds](#Get-Started-in-60-Seconds)
@@ -114,7 +116,6 @@ NeuronBlocks operates in an open model. It is designed and developed by **STCA N
 Anyone who are familiar with are highly encouraged to contribute code.
 * Knowledge Distillation for Model Compression. Knowledge distillation for heavy models such as BERT, OpenAI Transformer. Teacher-Student based knowledge distillation is one common method for model compression.
 * Multi-Lingual Support
-* Chinese Language Support
 * NER Model Support
 * Multi-Task Training Support
 
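For reference, the badge line added in this diff appears to follow the shields.io static badge pattern, where the URL path encodes label, message, and color separated by hyphens, and the "|" character is percent-encoded as %7C. A minimal sketch of that pattern (placeholder values, not part of the commit) is:

![label](https://img.shields.io/badge/label-message-color.svg)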
0 commit comments
