@@ -180,10 +181,9 @@ class MultiHeadAttention(tf.keras.layers.Layer):
     * Value (source) attention axes shape (S), the rank must match the target.

   Args:
-    inputs: List of the following tensors:
-      * query: Query `Tensor` of shape `[B, T, dim]`.
-      * value: Value `Tensor` of shape `[B, S, dim]`.
-      * key: Optional key `Tensor` of shape `[B, S, dim]`. If not given, will
+    query: Query `Tensor` of shape `[B, T, dim]`.
+    value: Value `Tensor` of shape `[B, S, dim]`.
+    key: Optional key `Tensor` of shape `[B, S, dim]`. If not given, will
       use `value` for both `key` and `value`, which is the most common case.
     attention_mask: a boolean mask of shape `[B, T, S]`, that prevents
       attention to certain positions.
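The hunk above changes the `call` docstring from a single `inputs` list to named `query`/`value`/`key` arguments. As a minimal sketch of how that documented signature is exercised, assuming the layer as it later shipped in `tf.keras.layers.MultiHeadAttention` (the `num_heads`/`key_dim` constructor arguments are from that released API, not from this diff):

```python
import tensorflow as tf

B, T, S, dim = 2, 4, 6, 16  # batch, target length, source length, feature dim

layer = tf.keras.layers.MultiHeadAttention(num_heads=2, key_dim=8)

query = tf.random.normal([B, T, dim])  # [B, T, dim]
value = tf.random.normal([B, S, dim])  # [B, S, dim]

# `key` is optional; when omitted, `value` is used for both key and value,
# which the docstring calls the most common case.
output = layer(query, value)  # -> [B, T, dim]

# attention_mask is boolean with shape [B, T, S]; True marks positions
# that may be attended to.
mask = tf.ones([B, T, S], dtype=tf.bool)
masked_output = layer(query, value, attention_mask=mask)
```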
@@ -242,14 +242,15 @@ we would like to introduce an optional argument `attention_mask` for
 the shape is (batch_size, target_length, source_length). Whenever
 `attention_mask` is specified, the `mask` argument is OK to be skipped.

-* TFA `MultiHeadAttention` Deprecation and Re-mapping
-
-[MultiHeadAttention](https://github.com/tensorflow/addons/blob/master/tensorflow_addons/layers/multihead_attention.py) has been released. The proposed `MultiHeadAttention` has similar `__init__` arguments
-and `call` interface, where the minor differences are argument names and the attention `mask` shape.
-We expect the new `MultiHeadAttention` keras layer will
-cover the functionalities. Once the implementation are merged as experimental layers,
-we will work with TF Addons team to design the deprecation and re-mapping procedure.
+* TFA `MultiHeadAttention` Deprecation and Re-mapping
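The context lines of this hunk describe the `attention_mask` shape, (batch_size, target_length, source_length). A hedged sketch of one way to build such a mask from a per-token source padding mask; `padding_to_attention_mask` is an illustrative helper, not part of the proposal:

```python
import tensorflow as tf

def padding_to_attention_mask(source_pad_mask, target_length):
    # source_pad_mask: bool [B, S], True where the source token is real
    # (not padding). Returns bool [B, T, S]: every target position is
    # allowed to attend only to valid source positions.
    return tf.tile(source_pad_mask[:, tf.newaxis, :], [1, target_length, 1])

source_pad = tf.constant([[True, True, False],
                          [True, False, False]])  # [B=2, S=3]
attn_mask = padding_to_attention_mask(source_pad, target_length=4)  # [2, 4, 3]
```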