
How to use WarmupReduceLROnPlateauScheduler in one epoch train part? #3

@pikaliov

Description

In your example, warmup_steps=30000 and steps_in_epoch=10000, so the condition timestep < warmup_steps is always true. I guess the constants in the provided example are a mistake.

So I want to use your LR scheduler, but I can't figure out how to use it properly. When should I call scheduler.step() inside the per-epoch training loop? On every batch? Or should I increment timestep across epochs, and once timestep > warmup_steps, stop calling scheduler.step()?

scheduler = WarmupReduceLROnPlateauScheduler(
    optimizer,
    init_lr=1e-10,
    peak_lr=1e-4,
    warmup_steps=30000,
    patience=1,
    factor=0.3,
)

for epoch in range(max_epochs):
    for timestep in range(steps_in_epoch):
        ...
        if timestep < warmup_steps:
            scheduler.step()

    val_loss = validate()
    scheduler.step(val_loss)
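For reference, here is a minimal self-contained sketch (plain Python, no torch, and a toy stand-in rather than the library's actual implementation) of the pattern I believe is intended: keep one global step counter that is not reset between epochs, call step() on every batch so the warmup can span multiple epochs, and call step(val_loss) once per epoch after validation for the plateau logic. The class name and internals below are my assumption, not the real API.

```python
class ToyWarmupPlateauScheduler:
    """Toy stand-in for WarmupReduceLROnPlateauScheduler (assumption: not the real API).

    Linearly warms the LR from init_lr to peak_lr over warmup_steps calls to
    step(), then multiplies the LR by `factor` whenever the validation loss
    fails to improve for more than `patience` consecutive epochs.
    """

    def __init__(self, init_lr, peak_lr, warmup_steps, patience, factor):
        self.init_lr = init_lr
        self.peak_lr = peak_lr
        self.warmup_steps = warmup_steps
        self.patience = patience
        self.factor = factor
        self.lr = init_lr
        self.global_step = 0          # NOT reset between epochs
        self.best_loss = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss=None):
        if val_loss is None:
            # Per-batch call: advance the warmup using the global counter.
            self.global_step += 1
            if self.global_step <= self.warmup_steps:
                frac = self.global_step / self.warmup_steps
                self.lr = self.init_lr + frac * (self.peak_lr - self.init_lr)
        else:
            # Per-epoch call after validation: plateau logic,
            # active only once warmup has finished.
            if self.global_step < self.warmup_steps:
                return
            if val_loss < self.best_loss:
                self.best_loss = val_loss
                self.bad_epochs = 0
            else:
                self.bad_epochs += 1
                if self.bad_epochs > self.patience:
                    self.lr *= self.factor
                    self.bad_epochs = 0


# Usage: warmup_steps (30000) spans three 10000-step epochs, so step() is
# called unconditionally on every batch and the counter carries over.
scheduler = ToyWarmupPlateauScheduler(
    init_lr=1e-10, peak_lr=1e-4, warmup_steps=30000, patience=1, factor=0.3
)
steps_in_epoch = 10000
for epoch in range(4):
    for _ in range(steps_in_epoch):
        scheduler.step()          # per-batch: warmup progresses across epochs
    val_loss = 1.0                # placeholder for validate()
    scheduler.step(val_loss)      # per-epoch: plateau logic
```

With these numbers the LR reaches peak_lr only at the end of the third epoch, which is why an `if timestep < warmup_steps` check against a per-epoch counter never fires correctly.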
