Commit 1762f0d

Slight doc change
1 parent a6a6630 commit 1762f0d

1 file changed (+1, -1 lines)


bayesflow/attention.py

Lines changed: 1 addition & 1 deletion
@@ -34,7 +34,7 @@ class MultiHeadAttentionBlock(tf.keras.Model):
 
     def __init__(self, input_dim, attention_settings, num_dense_fc, dense_settings, use_layer_norm, **kwargs):
         """Creates a multihead attention block which will typically be used as part of a
-        set transformer architecture according to [1].
+        set transformer architecture according to [1]. Corresponds to standard cross-attention.
 
         Parameters
         ----------
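
The docstring change clarifies that this block performs standard cross-attention: a set of queries attends to a separate set of keys/values. As a hedged illustration only (this is not BayesFlow's actual implementation; the learned projection matrices and the block's dense layers and layer norm are omitted for brevity), multihead cross-attention can be sketched in NumPy as:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys_values, num_heads=2):
    """Minimal multihead cross-attention sketch.

    queries:     array of shape (n_q, d)
    keys_values: array of shape (n_kv, d), used as both keys and values
    Learned W_q/W_k/W_v/W_o projections are omitted for simplicity.
    """
    n_q, d = queries.shape
    assert d % num_heads == 0, "embedding dim must divide evenly across heads"
    d_h = d // num_heads
    out = np.zeros_like(queries)
    for h in range(num_heads):
        sl = slice(h * d_h, (h + 1) * d_h)
        q, k, v = queries[:, sl], keys_values[:, sl], keys_values[:, sl]
        # Scaled dot-product attention: each query row attends over all n_kv rows
        scores = softmax(q @ k.T / np.sqrt(d_h))  # shape (n_q, n_kv)
        out[:, sl] = scores @ v
    return out

rng = np.random.default_rng(0)
q = rng.normal(size=(3, 8))    # 3 queries, embedding dim 8
kv = rng.normal(size=(5, 8))   # 5 key/value elements
print(cross_attention(q, kv).shape)  # (3, 8)
```

The output has one row per query, which is what makes this cross- rather than self-attention: the query set and the key/value set need not be the same size.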

0 commit comments
