Bug description
I might be wrong, but when using a custom BatchSampler with DDP, Lightning never wraps it in a DistributedSamplerWrapper. This is due to the following condition in its code:
```python
if isinstance(dataloader.sampler, (RandomSampler, SequentialSampler)):
    return DistributedSampler(dataloader.dataset, **kwargs)
```
Since PyTorch assigns a default SequentialSampler even when only a batch_sampler is passed, this condition is always met:
```python
>>> DataLoader(ds, sampler=None, batch_sampler=MNSampler(ds.labels)).sampler
<torch.utils.data.sampler.SequentialSampler at 0x7deb78e01eb0>
```
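For a self-contained check (not from the original report), the same behavior shows up with a toy list dataset and a hand-rolled batch sampler standing in for MNSampler, whose implementation isn't shown above:

```python
# Minimal sketch (assumptions: a toy list dataset and a hypothetical
# batch sampler standing in for MNSampler).
from torch.utils.data import DataLoader, RandomSampler, SequentialSampler

ds = list(range(8))  # toy map-style dataset

# Any iterable of index lists is accepted as a batch_sampler.
dl = DataLoader(ds, sampler=None, batch_sampler=[[0, 1], [2, 3], [4, 5], [6, 7]])

# DataLoader still assigns a default SequentialSampler, so the quoted
# condition is satisfied and the custom batching is never detected.
print(type(dl.sampler))                                            # SequentialSampler
print(isinstance(dl.sampler, (RandomSampler, SequentialSampler)))  # True
```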
What version are you seeing the problem on?
v2.4
Reproduced in studio
No response
How to reproduce the bug
Error messages and logs
# Error messages and logs here please
Environment
Current environment
#- PyTorch Lightning Version (e.g., 2.5.0):
#- PyTorch Version (e.g., 2.5):
#- Python version (e.g., 3.12):
#- OS (e.g., Linux):
#- CUDA/cuDNN version:
#- GPU models and configuration:
#- How you installed Lightning(`conda`, `pip`, source):
More info
No response