Commit 5d43376

Fix unused weights warning in coupling flow
The projector weight was created even when no residual connection was required, so it never influenced the loss, received no gradients, and triggered an unused-weights warning. Fixes #331
1 parent 051f998 commit 5d43376

File tree

1 file changed (+1, −1)

bayesflow/networks/mlp/hidden_block.py

Lines changed: 1 addition & 1 deletion
@@ -51,7 +51,7 @@ def call(self, inputs: Tensor, training: bool = False, **kwargs) -> Tensor:
     def build(self, input_shape):
         self.dense.build(input_shape)

-        if input_shape[-1] != self.units:
+        if input_shape[-1] != self.units and self.residual:
             self.projector = self.add_weight(
                 shape=(input_shape[-1], self.units), initializer="glorot_uniform", trainable=True, name="projector"
             )
