2 files changed: +2 −2 lines

Files changed:
source-fabric/advanced/model_parallel
source-pytorch/advanced/model_parallel

source-fabric/advanced/model_parallel

 Training models with billions of parameters
 ###########################################

-Use Fully Shared Data Parallel (FSDP) to train large models with billions of parameters efficiently on multiple GPUs and across multiple machines.
+Use Fully Sharded Data Parallel (FSDP) to train large models with billions of parameters efficiently on multiple GPUs and across multiple machines.

 .. note :: This is an experimental feature.

source-pytorch/advanced/model_parallel

 Train models with billions of parameters using FSDP
 ###################################################

-Use Fully Shared Data Parallel (FSDP) to train large models with billions of parameters efficiently on multiple GPUs and across multiple machines.
+Use Fully Sharded Data Parallel (FSDP) to train large models with billions of parameters efficiently on multiple GPUs and across multiple machines.

 .. note :: This is an experimental feature.

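The corrected term matters: in FSDP the parameters are *sharded* (each worker stores only a slice of every parameter), not merely shared. As a rough illustration of that idea only — this is plain Python, not the actual PyTorch FSDP implementation, and the helper names `shard_param` and `all_gather` are invented for this sketch — each rank keeps one shard and the full parameter is reassembled on demand:

```python
import math

def shard_param(param, world_size):
    """Split a flat parameter list into equal-sized shards, one per rank,
    zero-padding the last shard. Conceptually mirrors how FSDP flattens
    and partitions each parameter across workers."""
    shard_len = math.ceil(len(param) / world_size)
    padded = param + [0.0] * (shard_len * world_size - len(param))
    return [padded[r * shard_len:(r + 1) * shard_len] for r in range(world_size)]

def all_gather(shards, orig_len):
    """Reassemble the full parameter from all ranks' shards, as done
    just before a forward or backward pass needs the whole tensor."""
    flat = [x for shard in shards for x in shard]
    return flat[:orig_len]

weights = [0.1, 0.2, 0.3, 0.4, 0.5]          # a toy 5-element parameter
shards = shard_param(weights, world_size=2)  # each of 2 ranks stores one shard
assert all_gather(shards, len(weights)) == weights
```

Because each rank persistently holds only `1/world_size` of the parameters (plus optimizer state and gradients for that slice), peak memory per GPU drops roughly linearly with the number of workers, which is what makes billion-parameter models fit.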