Commit fd6dca0

removed outdated comments
Parent: 47b8c10

1 file changed: 0 additions, 3 deletions

src/aging_gan/train.py

Lines changed: 0 additions & 3 deletions
@@ -138,9 +138,6 @@ def initialize_optimizers(
     cfg, G, F, DX, DY
 ) -> tuple[optim.Optimizer, optim.Optimizer, optim.Optimizer, optim.Optimizer]:
     """Create Adam optimizers for all models."""
-    # track all generator params (even frozen encoder params during initial training).
-    # This would allow us to transition easily to the full fine-tuning later on by simply toggling requires_grad=True
-    # since the optimizers already track all the parameters from the start.
     opt_G = optim.Adam(
         G.parameters(),
         lr=cfg.gen_lr,
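
Note: the deleted comments described a real pattern, even if they were outdated here. The optimizer is built over all of G's parameters, including encoder parameters that are frozen at first, so that full fine-tuning can later be enabled by flipping requires_grad=True without rebuilding the optimizer (PyTorch optimizers skip parameters whose .grad is None). A minimal sketch of that pattern follows; TinyGenerator, its layer names, and the learning rate are illustrative stand-ins, not this repository's actual code.

    import torch
    from torch import nn, optim

    # Hypothetical stand-in for the project's generator G.
    class TinyGenerator(nn.Module):
        def __init__(self):
            super().__init__()
            self.encoder = nn.Linear(8, 16)   # pretrained, frozen at first
            self.decoder = nn.Linear(16, 8)   # trained from the start

        def forward(self, x):
            return self.decoder(self.encoder(x))

    G = TinyGenerator()

    # Freeze the encoder for the initial training phase.
    for p in G.encoder.parameters():
        p.requires_grad = False

    # Register ALL parameters with the optimizer, frozen ones included.
    # Adam skips any parameter whose .grad is None, so frozen weights stay put.
    opt_G = optim.Adam(G.parameters(), lr=2e-4)

    # Initial phase: only the decoder receives gradients and is updated.
    x = torch.randn(4, 8)
    loss = G(x).pow(2).mean()
    loss.backward()
    opt_G.step()

    # Later, switch to full fine-tuning by just re-enabling gradients;
    # no need to rebuild the optimizer or migrate its state.
    for p in G.encoder.parameters():
        p.requires_grad = True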
