@@ -47,14 +47,14 @@ in order to keep consistent with BERT paper.
Model | Configuration | Training Data | Checkpoint & Vocabulary | TF-HUB SavedModels
---------------------------------------- | :--------------------------: | ------------: | ----------------------: | ------:
- BERT-base uncased English | uncased_L-12_H-768_A-12 | Wiki + Books | [uncased_L-12_H-768_A-12](https://storage.googleapis.com/cloud-tpu-checkpoints/bert/v3/uncased_L-12_H-768_A-12.tar.gz) | [`BERT-Base, Uncased`](https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/)
- BERT-base cased English | cased_L-12_H-768_A-12 | Wiki + Books | [cased_L-12_H-768_A-12](https://storage.googleapis.com/cloud-tpu-checkpoints/bert/v3/cased_L-12_H-768_A-12.tar.gz) | [`BERT-Base, Cased`](https://tfhub.dev/tensorflow/bert_en_cased_L-12_H-768_A-12/)
- BERT-large uncased English | uncased_L-24_H-1024_A-16 | Wiki + Books | [uncased_L-24_H-1024_A-16](https://storage.googleapis.com/cloud-tpu-checkpoints/bert/v3/uncased_L-24_H-1024_A-16.tar.gz) | [`BERT-Large, Uncased`](https://tfhub.dev/tensorflow/bert_en_uncased_L-24_H-1024_A-16/)
- BERT-large cased English | cased_L-24_H-1024_A-16 | Wiki + Books | [cased_L-24_H-1024_A-16](https://storage.googleapis.com/cloud-tpu-checkpoints/bert/v3/cased_L-24_H-1024_A-16.tar.gz) | [`BERT-Large, Cased`](https://tfhub.dev/tensorflow/bert_en_cased_L-24_H-1024_A-16/)
- BERT-large, Uncased (Whole Word Masking) | wwm_uncased_L-24_H-1024_A-16 | Wiki + Books | [wwm_uncased_L-24_H-1024_A-16](https://storage.googleapis.com/cloud-tpu-checkpoints/bert/v3/wwm_uncased_L-24_H-1024_A-16.tar.gz) | [`BERT-Large, Uncased (Whole Word Masking)`](https://tfhub.dev/tensorflow/bert_en_wwm_uncased_L-24_H-1024_A-16/)
- BERT-large, Cased (Whole Word Masking) | wwm_cased_L-24_H-1024_A-16 | Wiki + Books | [wwm_cased_L-24_H-1024_A-16](https://storage.googleapis.com/cloud-tpu-checkpoints/bert/v3/wwm_cased_L-24_H-1024_A-16.tar.gz) | [`BERT-Large, Cased (Whole Word Masking)`](https://tfhub.dev/tensorflow/bert_en_wwm_cased_L-24_H-1024_A-16/)
- BERT-base MultiLingual | multi_cased_L-12_H-768_A-12 | Wiki + Books | [multi_cased_L-12_H-768_A-12](https://storage.googleapis.com/cloud-tpu-checkpoints/bert/v3/multi_cased_L-12_H-768_A-12.tar.gz) | [`BERT-Base, Multilingual Cased`](https://tfhub.dev/tensorflow/bert_multi_cased_L-12_H-768_A-12/)
- BERT-base Chinese | chinese_L-12_H-768_A-12 | Wiki + Books | [chinese_L-12_H-768_A-12](https://storage.googleapis.com/cloud-tpu-checkpoints/bert/v3/chinese_L-12_H-768_A-12.tar.gz) | [`BERT-Base, Chinese`](https://tfhub.dev/tensorflow/bert_zh_L-12_H-768_A-12/)
+ BERT-base uncased English | uncased_L-12_H-768_A-12 | Wiki + Books | [uncased_L-12_H-768_A-12](https://storage.googleapis.com/tf_model_garden/nlp/bert/v3/uncased_L-12_H-768_A-12.tar.gz) | [`BERT-Base, Uncased`](https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/)
+ BERT-base cased English | cased_L-12_H-768_A-12 | Wiki + Books | [cased_L-12_H-768_A-12](https://storage.googleapis.com/tf_model_garden/nlp/bert/v3/cased_L-12_H-768_A-12.tar.gz) | [`BERT-Base, Cased`](https://tfhub.dev/tensorflow/bert_en_cased_L-12_H-768_A-12/)
+ BERT-large uncased English | uncased_L-24_H-1024_A-16 | Wiki + Books | [uncased_L-24_H-1024_A-16](https://storage.googleapis.com/tf_model_garden/nlp/bert/v3/uncased_L-24_H-1024_A-16.tar.gz) | [`BERT-Large, Uncased`](https://tfhub.dev/tensorflow/bert_en_uncased_L-24_H-1024_A-16/)
+ BERT-large cased English | cased_L-24_H-1024_A-16 | Wiki + Books | [cased_L-24_H-1024_A-16](https://storage.googleapis.com/tf_model_garden/nlp/bert/v3/cased_L-24_H-1024_A-16.tar.gz) | [`BERT-Large, Cased`](https://tfhub.dev/tensorflow/bert_en_cased_L-24_H-1024_A-16/)
+ BERT-large, Uncased (Whole Word Masking) | wwm_uncased_L-24_H-1024_A-16 | Wiki + Books | [wwm_uncased_L-24_H-1024_A-16](https://storage.googleapis.com/tf_model_garden/nlp/bert/v3/wwm_uncased_L-24_H-1024_A-16.tar.gz) | [`BERT-Large, Uncased (Whole Word Masking)`](https://tfhub.dev/tensorflow/bert_en_wwm_uncased_L-24_H-1024_A-16/)
+ BERT-large, Cased (Whole Word Masking) | wwm_cased_L-24_H-1024_A-16 | Wiki + Books | [wwm_cased_L-24_H-1024_A-16](https://storage.googleapis.com/tf_model_garden/nlp/bert/v3/wwm_cased_L-24_H-1024_A-16.tar.gz) | [`BERT-Large, Cased (Whole Word Masking)`](https://tfhub.dev/tensorflow/bert_en_wwm_cased_L-24_H-1024_A-16/)
+ BERT-base MultiLingual | multi_cased_L-12_H-768_A-12 | Wiki + Books | [multi_cased_L-12_H-768_A-12](https://storage.googleapis.com/tf_model_garden/nlp/bert/v3/multi_cased_L-12_H-768_A-12.tar.gz) | [`BERT-Base, Multilingual Cased`](https://tfhub.dev/tensorflow/bert_multi_cased_L-12_H-768_A-12/)
+ BERT-base Chinese | chinese_L-12_H-768_A-12 | Wiki + Books | [chinese_L-12_H-768_A-12](https://storage.googleapis.com/tf_model_garden/nlp/bert/v3/chinese_L-12_H-768_A-12.tar.gz) | [`BERT-Base, Chinese`](https://tfhub.dev/tensorflow/bert_zh_L-12_H-768_A-12/)

You may explore more in the TF-Hub BERT collection:
https://tfhub.dev/google/collections/bert/1
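As a side note, the `Configuration` column encodes each model's size in its name, following the `L`/`H`/`A` convention from the original BERT release (`L` = Transformer layers, `H` = hidden size, `A` = attention heads). A minimal sketch decoding these names (the function name is illustrative, not part of any release):

```python
import re


def parse_bert_config_name(name: str) -> dict:
    """Decode a BERT configuration name such as 'uncased_L-12_H-768_A-12'.

    Assumes the standard naming convention: L = number of Transformer
    layers, H = hidden size, A = number of attention heads.
    """
    m = re.search(r"L-(\d+)_H-(\d+)_A-(\d+)", name)
    if m is None:
        raise ValueError(f"unrecognized configuration name: {name!r}")
    layers, hidden, heads = (int(g) for g in m.groups())
    return {
        "num_layers": layers,
        "hidden_size": hidden,
        "num_attention_heads": heads,
    }


print(parse_bert_config_name("uncased_L-12_H-768_A-12"))
print(parse_bert_config_name("wwm_cased_L-24_H-1024_A-16"))
```

This makes the table's two size classes explicit: BERT-Base variants are 12-layer/768-hidden/12-head, and BERT-Large variants are 24-layer/1024-hidden/16-head.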