
Commit eb28ca5 (1 parent: a3eecab)

Change default of SetTransformer to not use LayerNorm -> better performance found

1 file changed (+3, -3)

bayesflow/summary_networks.py

@@ -185,7 +185,7 @@ def __init__(
         input_dim,
         attention_settings=None,
         dense_settings=None,
-        use_layer_norm=True,
+        use_layer_norm=False,
         num_dense_fc=2,
         summary_dim=10,
         num_attention_blocks=2,
@@ -224,8 +224,8 @@ def __init__(
 
         For more details and arguments, see:
         https://www.tensorflow.org/api_docs/python/tf/keras/layers/Dense
-    use_layer_norm : boolean, optional, default: True
-        Whether layer normalization before and after attention + feedforward
+    use_layer_norm : boolean, optional, default: False
+        Whether to use layer normalization before and after attention + feedforward
     num_dense_fc : int, optional, default: 2
         The number of hidden layers for the internal feedforward network
     summary_dim : int
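
The change is backward compatible at the call site; only the default flips. A minimal usage sketch, assuming the constructor signature shown in the diff (input_dim positional, the rest keyword arguments); the input_dim value here is illustrative:

from bayesflow.summary_networks import SetTransformer

# As of this commit, layer normalization is disabled by default.
summary_net = SetTransformer(input_dim=4)

# The previous behavior remains available by passing the flag explicitly.
summary_net_with_ln = SetTransformer(input_dim=4, use_layer_norm=True)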

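
For orientation, a hedged sketch of what the flag toggles, going only by the docstring wording ("layer normalization before and after attention + feedforward"); the class and layer arrangement below are illustrative assumptions, not BayesFlow's actual internals:

import tensorflow as tf

class AttentionBlockSketch(tf.keras.Model):
    """Illustrative only: optional LayerNorm around attention + feedforward."""

    def __init__(self, dim, use_layer_norm=False, num_heads=4):
        super().__init__()
        self.use_layer_norm = use_layer_norm
        self.attention = tf.keras.layers.MultiHeadAttention(num_heads=num_heads, key_dim=dim)
        self.feedforward = tf.keras.Sequential(
            [tf.keras.layers.Dense(dim, activation="relu"), tf.keras.layers.Dense(dim)]
        )
        if use_layer_norm:
            self.ln_pre = tf.keras.layers.LayerNormalization()
            self.ln_post = tf.keras.layers.LayerNormalization()

    def call(self, x):
        # Optional normalization before attention.
        h = self.ln_pre(x) if self.use_layer_norm else x
        h = x + self.attention(h, h)       # residual self-attention
        out = h + self.feedforward(h)      # residual feedforward
        # Optional normalization after the feedforward.
        return self.ln_post(out) if self.use_layer_norm else out

With use_layer_norm=False this reduces to plain residual attention + feedforward, which is what the commit makes the default.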