Commit b198435

added single gpu train doc
1 parent: cca6d2c

File tree

1 file changed: +4 −1 lines changed

docs/Trainer/Distributed training.md

Lines changed: 4 additions & 1 deletion
````diff
@@ -28,9 +28,12 @@ For a deeper understanding of what lightning is doing, feel free to read [this g
 Due to an issue with apex and DistributedDataParallel (PyTorch and NVIDIA issue), Lightning does
 not allow 16-bit and DP training. We tried to get this to work, but it's an issue on their end.
 
+Below are the possible configurations we support.
+
 | 1 GPU | 1+ GPUs | DP | DDP | 16-bit | command |
 |---|---|---|---|---|---|
-| Y | | | | Y | ```Trainer(gpus=[0])``` |
+| Y | | | | | ```Trainer(gpus=[0])``` |
+| Y | | | | Y | ```Trainer(gpus=[0], use_amp=True)``` |
 | | Y | Y | | | ```Trainer(gpus=[0, ...])``` |
 | | Y | | Y | | | ```Trainer(gpus=[0, ...], distributed_backend='ddp')``` |
 | | Y | | Y | Y | ```Trainer(gpus=[0, ...], distributed_backend='ddp', use_amp=True)``` |
````
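
For illustration, a minimal sketch of how the table's configurations would be invoked, assuming the pre-1.0 Lightning API used in this diff (`gpus`, `use_amp`, `distributed_backend`) and its `Trainer.fit` entry point; `MyModel` is a hypothetical `LightningModule` subclass standing in for a real model and is not part of this commit.

```python
# Minimal sketch, assuming the pre-1.0 Lightning API shown in the diff.
# MyModel is a hypothetical LightningModule subclass, not part of this commit.
from pytorch_lightning import Trainer

model = MyModel()

# Single GPU, full precision (the corrected first row of the table).
trainer = Trainer(gpus=[0])
trainer.fit(model)

# Single GPU with 16-bit training via apex AMP (the row this commit adds).
trainer = Trainer(gpus=[0], use_amp=True)
trainer.fit(model)

# Multi-GPU 16-bit requires the DDP backend; DP with 16-bit is not allowed
# due to the apex/DistributedDataParallel issue noted in the doc.
trainer = Trainer(gpus=[0, 1], distributed_backend='ddp', use_amp=True)
trainer.fit(model)
```

Note that in every 16-bit row of the table, `use_amp=True` is paired with either a single GPU or the DDP backend, never with DP, consistent with the apex limitation described above.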
