2 files changed, +8 −8

The academic paper which describes BERT in detail and provides full results on a
number of tasks can be found here: https://arxiv.org/abs/1810.04805.

- This repository contains TensorFlow 2 implementation for BERT.
-
- N.B. This repository is under active development. Though we intend
- to keep the top-level BERT Keras model interface stable, expect continued
- changes to the training code, utility function interface and flags.
+ This repository contains a TensorFlow 2.x implementation of BERT.

## Contents
* [Contents](#contents)

@@ -110,8 +106,8 @@ pip install tf-nightly
```
Warning: More detailed TPU-specific set-up instructions and a tutorial should come
- along with official TF 2.x release for TPU. Note that this repo is not officially
- supported by Google Cloud TPU team yet.
+ along with the official TF 2.x release for TPU. Note that this repo is not
+ officially supported by the Google Cloud TPU team until TF 2.1 is released.

## Process Datasets

---

# Transformer Translation Model
This is an implementation of the Transformer translation model as described in
the [Attention is All You Need](https://arxiv.org/abs/1706.03762) paper. The
- implementation leverages tf.keras and makes sure it is compatible with TF 2.0.
+ implementation leverages tf.keras and makes sure it is compatible with TF 2.x.
+
+ **Note: this transformer folder will be integrated into the official/nlp
+ folder. Due to its dependencies, we will finish the refactoring after the
+ model garden 2.1 release.**

## Contents
* [Contents](#contents)
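The Transformer model described in the paper linked above is built around scaled dot-product attention, softmax(QKᵀ/√d_k)·V. As a rough, dependency-free illustration of that formula (a plain-Python sketch under our own naming, not the repo's tf.keras implementation):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are lists of row vectors (lists of floats); d_k is the key width.
    """
    d_k = len(K[0])
    outputs = []
    for q in Q:
        # Similarity of this query against every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        # Each output row is a weight-blended combination of the value rows.
        outputs.append([sum(w * v[j] for w, v in zip(weights, V))
                        for j in range(len(V[0]))])
    return outputs
```

In the actual model this operation runs per head over learned projections of the inputs, with tf.keras layers supplying those projections; the sketch only shows the attention arithmetic itself.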