Commit 1284694

Add warmup to default optimizer
1 parent 0d64899 commit 1284694

File tree

1 file changed: +4 −2 lines


bayesflow/workflows/basic_workflow.py

Lines changed: 4 additions & 2 deletions
@@ -844,9 +844,11 @@ def build_optimizer(self, epochs: int, num_batches: int, strategy: str) -> keras
 
         # Default case
         learning_rate = keras.optimizers.schedules.CosineDecay(
-            initial_learning_rate=self.initial_learning_rate,
+            initial_learning_rate=0.5 * self.initial_learning_rate,
+            warmup_target=self.initial_learning_rate,
+            warmup_steps=num_batches,
             decay_steps=epochs * num_batches,
-            alpha=self.initial_learning_rate**2,
+            alpha=0,
         )
 
         # Use adam for online learning, apply weight decay otherwise
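The change above replaces the plain cosine schedule with one that warms up linearly over the first epoch (`warmup_steps=num_batches`) and then decays all the way to zero (`alpha=0`, instead of the previous floor of `initial_learning_rate**2` times the initial rate). Below is a minimal pure-Python sketch of what the new schedule computes, assuming Keras's documented `CosineDecay` warmup semantics (linear ramp from `initial_learning_rate` to `warmup_target`, then cosine decay over `decay_steps` toward a floor of `alpha * warmup_target`). The helper `warmup_cosine_lr` is hypothetical, written only to illustrate the shape of the schedule; it is not part of BayesFlow or Keras:

```python
import math

def warmup_cosine_lr(step, base_lr, warmup_steps, decay_steps, alpha=0.0):
    """Illustrative schedule matching the commit's configuration:
    linear warmup from 0.5 * base_lr to base_lr over warmup_steps,
    then cosine decay from base_lr toward alpha * base_lr over decay_steps."""
    initial = 0.5 * base_lr   # initial_learning_rate in the diff
    target = base_lr          # warmup_target in the diff
    if step < warmup_steps:
        # Linear interpolation from the initial rate up to the warmup target
        return initial + (target - initial) * (step / warmup_steps)
    # Cosine decay phase; progress is clipped so the rate stays at the floor
    progress = min((step - warmup_steps) / decay_steps, 1.0)
    cosine = 0.5 * (1.0 + math.cos(math.pi * progress))
    return target * ((1.0 - alpha) * cosine + alpha)
```

With `base_lr=1e-3`, `warmup_steps=100` and `decay_steps=10_000`, the rate starts at `5e-4`, reaches `1e-3` at step 100, and decays to 0 by step 10,100. The half-rate starting point avoids a cold start at a near-zero learning rate while still giving the first epoch a gentle ramp.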
