Problems when using custom learning rate scheduler for NeuralForecast Models #1096

@MLfreakPy

Description
What happened + What you expected to happen

Hi everyone,

First, thank you so much for implementing support for custom LR schedulers!

I ran into problems using a ReduceLROnPlateau scheduler.

Currently I am passing the lr_scheduler and lr_scheduler_kwargs parameters. However, I get the error below when using ReduceLROnPlateau. Other schedulers, like OneCycle, work perfectly though! I think some of your earlier discussion touched on this, but I couldn't make it work yet.
(https://github.com//pull/998)

MisconfigurationException('The lr scheduler dict must include a monitor when a ReduceLROnPlateau scheduler is used. For example: {"optimizer": optimizer, "lr_scheduler": {"scheduler": scheduler, "monitor": "your_loss"}}').
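For context, the error message describes the Lightning-style scheduler config that ReduceLROnPlateau requires: the "lr_scheduler" entry must be a dict that names a monitored metric, not just the scheduler itself. A minimal sketch of that shape, assuming "train_loss" as the monitored metric (that metric name is an assumption, not something NeuralForecast documents here):

```python
def make_scheduler_config(optimizer, scheduler, monitor="train_loss"):
    """Build the dict shape the error message asks for.

    `monitor` must be a metric the training module actually logs;
    "train_loss" is a placeholder assumption for illustration.
    """
    return {
        "optimizer": optimizer,
        "lr_scheduler": {
            "scheduler": scheduler,
            "monitor": monitor,  # required when scheduler is ReduceLROnPlateau
        },
    }

# Placeholder strings stand in for real optimizer/scheduler objects
# so the sketch stays self-contained.
config = make_scheduler_config("<optimizer>", "<scheduler>")
print(config["lr_scheduler"]["monitor"])  # → train_loss
```

The open question in this issue is that NeuralForecast builds this dict internally, so there is currently no obvious way to pass the "monitor" key through lr_scheduler_kwargs.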

Versions / Dependencies

Google Colab
neuralforecast
Version: 1.7.4

Reproduction script

See above - happy to supply more details if needed.

Issue Severity

Medium: It is a significant difficulty but I can work around it.
