
Support A Variable Number of Batches #20330

@e-yi


Description & Motivation

I have a custom batch sampler whose number of batches is not known in advance (it packs variable-sized samples to maximize the use of GPU memory). I believe this behavior is supported by vanilla PyTorch. In Lightning, however, the number of batches is precalculated, as shown here:

num_batches = _parse_num_batches(stage, length, trainer.limit_train_batches)
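For illustration, here is a minimal sketch of such a sampler, assuming a per-sample cost list and a fixed per-batch budget; the class name, the budget parameter, and the greedy packing heuristic are hypothetical, not the actual implementation. Because batches are formed greedily under the budget, the batch count depends on the shuffle order and is only known once an epoch has been iterated.

import random

from torch.utils.data import Sampler

class BudgetBatchSampler(Sampler):
    def __init__(self, sizes, budget, shuffle=True):
        self.sizes = sizes      # hypothetical per-sample cost, e.g. sequence length
        self.budget = budget    # max total cost per batch (a GPU-memory proxy)
        self.shuffle = shuffle

    def __iter__(self):
        order = list(range(len(self.sizes)))
        if self.shuffle:
            random.shuffle(order)
        batch, used = [], 0
        for idx in order:
            # Start a new batch once this sample would exceed the budget.
            if batch and used + self.sizes[idx] > self.budget:
                yield batch
                batch, used = [], 0
            batch.append(idx)
            used += self.sizes[idx]
        if batch:
            yield batch

    # Intentionally no __len__: the batch count varies from epoch to epoch,
    # which is exactly what the precalculation above cannot accommodate.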

Pitch

Support a variable number of batches by calculating the batch limit at the beginning of each epoch, or at the end of the previous epoch.
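For reference, plain PyTorch never needs the batch count up front: a DataLoader built on the sketch above can simply be iterated each epoch and the batches counted afterwards. This is a hypothetical demonstration, not Lightning code; the sizes and budget values are made up.

import random
import torch
from torch.utils.data import DataLoader, TensorDataset

sizes = [random.randint(1, 512) for _ in range(1000)]  # made-up per-sample costs
dataset = TensorDataset(torch.arange(1000))
loader = DataLoader(dataset, batch_sampler=BudgetBatchSampler(sizes, budget=1024))

for epoch in range(3):
    num_batches = sum(1 for _ in loader)  # may differ every epoch under shuffling
    print(f"epoch {epoch}: {num_batches} batches")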

Alternatives

No response

Additional context

No response

cc @lantiga @Borda @tchaton

Metadata


Labels

data handling (Generic data-related topic), feature (Is an improvement or enhancement)
