Commit 509b2ca

Docs: fix FSDP acronym (#19384)
1 parent 8280519 commit 509b2ca

File tree

2 files changed: +2 -2 lines changed

  • docs/source-fabric/advanced/model_parallel/fsdp.rst
  • docs/source-pytorch/advanced/model_parallel/fsdp.rst

docs/source-fabric/advanced/model_parallel/fsdp.rst

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@
 Training models with billions of parameters
 ###########################################

-Use Fully Shared Data Parallel (FSDP) to train large models with billions of parameters efficiently on multiple GPUs and across multiple machines.
+Use Fully Sharded Data Parallel (FSDP) to train large models with billions of parameters efficiently on multiple GPUs and across multiple machines.

 .. note:: This is an experimental feature.

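The touched Fabric page documents the FSDP strategy named in the corrected sentence. As a minimal sketch of what that page describes (not part of this commit; assumes the public lightning.fabric API, with a small stand-in model):

# Hypothetical sketch, not part of this commit: enabling FSDP in Lightning Fabric.
import torch
from lightning.fabric import Fabric

fabric = Fabric(accelerator="cuda", devices=4, strategy="fsdp")  # shard params across 4 GPUs
fabric.launch()

model = torch.nn.Linear(1024, 1024)  # stand-in for a billion-parameter model
model = fabric.setup_module(model)  # wraps the model with FSDP

# Create the optimizer from the wrapped parameters, then register it with Fabric.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
optimizer = fabric.setup_optimizers(optimizer)

batch = torch.randn(8, 1024, device=fabric.device)
loss = model(batch).sum()
fabric.backward(loss)  # replaces loss.backward() so sharded gradients sync correctly
optimizer.step()
optimizer.zero_grad()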
docs/source-pytorch/advanced/model_parallel/fsdp.rst

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@
 Train models with billions of parameters using FSDP
 ###################################################

-Use Fully Shared Data Parallel (FSDP) to train large models with billions of parameters efficiently on multiple GPUs and across multiple machines.
+Use Fully Sharded Data Parallel (FSDP) to train large models with billions of parameters efficiently on multiple GPUs and across multiple machines.

 .. note:: This is an experimental feature.

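Likewise, the touched PyTorch Lightning page documents the same strategy through the Trainer. A minimal sketch (not part of this commit; the LightningModule below is a hypothetical stand-in):

# Hypothetical sketch, not part of this commit: enabling FSDP via the Trainer.
import torch
import lightning.pytorch as pl
from torch.utils.data import DataLoader, TensorDataset


class ToyModel(pl.LightningModule):  # stand-in for a billion-parameter model
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(1024, 1024)

    def training_step(self, batch, batch_idx):
        (x,) = batch
        return self.layer(x).sum()

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=1e-4)


# strategy="fsdp" selects the FSDP strategy; devices controls how many GPUs shard the model.
trainer = pl.Trainer(accelerator="cuda", devices=4, strategy="fsdp", max_epochs=1)
train_data = DataLoader(TensorDataset(torch.randn(64, 1024)), batch_size=8)
trainer.fit(ToyModel(), train_data)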