Commit 7acc2a0

wz337 and svekars authored
Update recipes_source/distributed_device_mesh.rst
Co-authored-by: Svetlana Karslioglu <[email protected]>
1 parent d0806c8 · commit 7acc2a0

File tree: 1 file changed, +1 addition, -1 deletion


recipes_source/distributed_device_mesh.rst

Lines changed: 1 addition & 1 deletion
@@ -150,7 +150,7 @@ Then, run the following `torch elastic/torchrun <https://pytorch.org/docs/stable
 How to use DeviceMesh for your custom parallel solutions
 --------------------------------------------------------
-When working with large scale training, one can have a more complicated custom parallel training composition. For example, one may need to slice out submeshes for different parallelism solutions.
+When working with large scale training, you might have more complex custom parallel training composition. For example, you may need to slice out submeshes for different parallelism solutions.
 DeviceMesh allows users to slice child mesh from the parent mesh and re-use the NCCL communicators already created when the parent mesh is initialized.

 .. code-block:: python
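For context on the sentence being edited: slicing a child mesh out of a parent mesh partitions the device grid along a named dimension, and the child mesh reuses the NCCL communicators the parent already created. The rank grouping this produces can be modeled with a small standalone sketch. Note this is a hypothetical illustration only (the `SketchMesh` class and its behavior are invented for this example, not the real `torch.distributed` DeviceMesh API):

```python
# Hypothetical standalone sketch (NOT the real torch.distributed
# DeviceMesh API): models how slicing a 2D device mesh by a named
# dimension yields the rank groups a child mesh communicates over.

class SketchMesh:
    def __init__(self, shape, dim_names):
        rows, cols = shape
        self.dim_names = dim_names
        # Full rank grid, e.g. shape (2, 4) -> [[0, 1, 2, 3], [4, 5, 6, 7]].
        self.grid = [list(range(r * cols, (r + 1) * cols)) for r in range(rows)]

    def __getitem__(self, name):
        # Slicing by the second dim name groups ranks along rows;
        # slicing by the first groups them along columns.
        if name == self.dim_names[1]:
            return [row[:] for row in self.grid]
        if name == self.dim_names[0]:
            return [list(col) for col in zip(*self.grid)]
        raise KeyError(name)

mesh_2d = SketchMesh((2, 4), dim_names=("dp", "tp"))
print(mesh_2d["tp"])  # [[0, 1, 2, 3], [4, 5, 6, 7]]
print(mesh_2d["dp"])  # [[0, 4], [1, 5], [2, 6], [3, 7]]
```

In the real API, the analogous operation is indexing an initialized device mesh by one of its `mesh_dim_names`; the sketch above only shows the rank partitioning that such a slice corresponds to.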

0 commit comments