Description & Motivation
I have a custom batch sampler that yields an undetermined number of batches per epoch (it packs variable-sized samples to maximize GPU memory usage). Vanilla PyTorch supports this behavior. In Lightning, however, the number of batches is precalculated, as shown here:
num_batches = _parse_num_batches(stage, length, trainer.limit_train_batches)
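For illustration, here is a minimal sketch of the kind of batch sampler in question: it packs variable-sized samples into batches under a per-batch budget, so the number of batches is only known after iterating a full epoch. The names (`BudgetBatchSampler`, `sample_sizes`, `max_budget`) are hypothetical, not my actual implementation.

```python
from torch.utils.data import Sampler


class BudgetBatchSampler(Sampler):
    """Yields batches of indices until a per-batch size budget is exhausted."""

    def __init__(self, sample_sizes, max_budget):
        self.sample_sizes = sample_sizes  # e.g. sequence length per sample
        self.max_budget = max_budget      # rough per-batch memory budget

    def __iter__(self):
        batch, used = [], 0
        for idx, size in enumerate(self.sample_sizes):
            if batch and used + size > self.max_budget:
                yield batch
                batch, used = [], 0
            batch.append(idx)
            used += size
        if batch:
            yield batch

    # __len__ is intentionally omitted: the batch count depends on how the
    # samples happen to pack, so it cannot be precomputed reliably.


# Vanilla PyTorch accepts this via DataLoader(dataset, batch_sampler=...);
# iteration simply stops when the sampler is exhausted.
sizes = [3, 5, 2, 7, 1, 4]
print(list(BudgetBatchSampler(sizes, max_budget=8)))  # [[0, 1], [2], [3, 4], [5]]
```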
Pitch
Support a variable number of batches by computing the batch limit at the start of each epoch (or at the end of the previous epoch) instead of once up front.
Alternatives
No response
Additional context
No response