
Commit 381452e

ffn: move layer initialization to setup method
1 parent: 9a48fdf

1 file changed (+7, -2 lines)


chebai/models/ffn.py

Lines changed: 7 additions & 2 deletions
```diff
@@ -20,10 +20,15 @@ def __init__(
         **kwargs
     ):
         super().__init__(**kwargs)
+        self.input_size = input_size
+        self.hidden_layers = hidden_layers
+
+    def setup(self, stage: str) -> None:
+        super().setup(stage)
 
         layers = []
-        current_layer_input_size = input_size
-        for hidden_dim in hidden_layers:
+        current_layer_input_size = self.input_size
+        for hidden_dim in self.hidden_layers:
             layers.append(MLPBlock(current_layer_input_size, hidden_dim))
             layers.append(Residual(MLPBlock(hidden_dim, hidden_dim)))
             current_layer_input_size = hidden_dim
```
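
For context, here is a minimal sketch of the pattern this commit introduces, assuming the module is a PyTorch Lightning `LightningModule`. The `MLPBlock` and `Residual` classes below are hypothetical stand-ins for whatever `chebai/models/ffn.py` actually defines, and the final `nn.Sequential` wiring is an assumption, since the hunk ends before `layers` is consumed:

```python
import torch.nn as nn
import pytorch_lightning as pl


class MLPBlock(nn.Module):
    """Hypothetical stand-in for chebai's MLPBlock: a linear layer plus ReLU."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.block = nn.Sequential(nn.Linear(in_features, out_features), nn.ReLU())

    def forward(self, x):
        return self.block(x)


class Residual(nn.Module):
    """Hypothetical stand-in for chebai's Residual: a skip connection around a module."""

    def __init__(self, module: nn.Module):
        super().__init__()
        self.module = module

    def forward(self, x):
        return x + self.module(x)


class FFN(pl.LightningModule):
    def __init__(self, input_size: int, hidden_layers: list, **kwargs):
        super().__init__(**kwargs)
        # After this commit, __init__ only records the configuration ...
        self.input_size = input_size
        self.hidden_layers = hidden_layers

    def setup(self, stage: str) -> None:
        # ... and the layers are built here. Lightning calls setup(stage)
        # on every process right before fit/validate/test/predict, so the
        # modules exist by the time training starts.
        super().setup(stage)
        layers = []
        current_layer_input_size = self.input_size
        for hidden_dim in self.hidden_layers:
            layers.append(MLPBlock(current_layer_input_size, hidden_dim))
            layers.append(Residual(MLPBlock(hidden_dim, hidden_dim)))
            current_layer_input_size = hidden_dim
        # Assumed final wiring; not shown in the hunk.
        self.model = nn.Sequential(*layers)


# Usage sketch; in practice the Trainer invokes setup(), not the caller.
model = FFN(input_size=128, hidden_layers=[256, 256])
model.setup("fit")
```

Deferring layer construction to `setup` is a common Lightning pattern when the network shape depends on information only available after the trainer and data module are attached (for example, the number of output labels); the commit itself does not state the motivation.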
