2 files changed (+2, −2):
source-fabric/advanced/model_parallel
source-pytorch/advanced/model_parallel

--- source-fabric/advanced/model_parallel
+++ source-fabric/advanced/model_parallel
@@ -2,7 +2,7 @@
 Training models with billions of parameters
 ###########################################
 
-Use Fully Shared Data Parallel (FSDP) to train large models with billions of parameters efficiently on multiple GPUs and across multiple machines.
+Use Fully Sharded Data Parallel (FSDP) to train large models with billions of parameters efficiently on multiple GPUs and across multiple machines.
 
 .. note :: This is an experimental feature.
 

--- source-pytorch/advanced/model_parallel
+++ source-pytorch/advanced/model_parallel
@@ -6,7 +6,7 @@
 Train models with billions of parameters using FSDP
 ###################################################
 
-Use Fully Shared Data Parallel (FSDP) to train large models with billions of parameters efficiently on multiple GPUs and across multiple machines.
+Use Fully Sharded Data Parallel (FSDP) to train large models with billions of parameters efficiently on multiple GPUs and across multiple machines.
 
 .. note :: This is an experimental feature.
 