-
The number of training steps, set here https://github.com/amazon-science/chronos-forecasting/blob/9d59057b72a1ddd52e163c536d0f99631eea4857/scripts/training/train.py#L510, is independent of the size of the dataset. For 1M series you probably want to set it to a value that, together with the batch size, allows the training loop to go over the dataset a sufficient number of times ("epochs").
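A quick back-of-the-envelope sketch of the relationship described above: with a fixed number of steps, each step consumes one batch, so the number of passes over the data depends on dataset size and batch size. The function and all numeric values below are hypothetical, not part of the Chronos training script.

```python
import math

def steps_for_epochs(num_series: int, batch_size: int, epochs: int) -> int:
    """Training steps needed for `epochs` full passes over `num_series`
    samples, given that each step consumes `batch_size` samples."""
    return math.ceil(num_series * epochs / batch_size)

# Hypothetical example: 1M series, batch size 32, targeting 10 epochs.
print(steps_for_epochs(num_series=1_000_000, batch_size=32, epochs=10))  # 312500

# With only 20 series, a single epoch fits in one step, which is why a
# fixed (and comparatively large) step count dominates the runtime.
print(steps_for_epochs(num_series=20, batch_size=32, epochs=1))  # 1
```

This also explains why runs on 20 and 1,000,000 series can take about the same wall-clock time: the step count, not the dataset size, bounds the loop.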
Answer selected by abdulfatir
-
I ran the fine-tuning process twice as a test: once with 20 time series, as in the example, and once with 1,000,000 time series. Both runs took less than 8 minutes.
Shouldn't the fine-tuning process take longer with more data?