
Commit b9e2918

Merge branch 'main' into triton_kernel
2 parents: 2d074b1 + df285cf

File tree

2 files changed: +2 −2 lines changed


beginner_source/dist_overview.rst

Lines changed: 1 addition & 1 deletion
@@ -70,7 +70,7 @@ When deciding what parallelism techniques to choose for your model, use these co
 #. Use `DistributedDataParallel (DDP) <https://pytorch.org/docs/stable/notes/ddp.html>`__,
    if your model fits in a single GPU but you want to easily scale up training using multiple GPUs.

-   * Use `torchrun <https://pytorch.org/docs/stable/elastic/run.html>`__, to launch multiple pytorch processes if you are you using more than one node.
+   * Use `torchrun <https://pytorch.org/docs/stable/elastic/run.html>`__, to launch multiple pytorch processes if you are using more than one node.

   * See also: `Getting Started with Distributed Data Parallel <../intermediate/ddp_tutorial.html>`__
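The corrected line recommends ``torchrun`` for launching multiple PyTorch processes when training with DDP across more than one node. As a rough sketch of the kind of script such a launch targets (the model, optimizer, batch, and launch flags below are placeholders, not part of this commit), assuming ``torchrun`` exports the usual RANK/WORLD_SIZE/LOCAL_RANK variables:

    # Minimal DDP training script of the kind torchrun would launch; the model,
    # optimizer, and batch are placeholders, not taken from the tutorial.
    import os

    import torch
    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP


    def main():
        # torchrun exports RANK, WORLD_SIZE, LOCAL_RANK, MASTER_ADDR, MASTER_PORT,
        # so init_process_group can read everything from the environment.
        dist.init_process_group(backend="nccl")
        local_rank = int(os.environ["LOCAL_RANK"])
        torch.cuda.set_device(local_rank)

        model = torch.nn.Linear(10, 10).to(local_rank)    # placeholder model
        ddp_model = DDP(model, device_ids=[local_rank])
        optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)

        inputs = torch.randn(32, 10, device=local_rank)   # placeholder batch
        loss = ddp_model(inputs).sum()                     # placeholder loss
        loss.backward()
        optimizer.step()

        dist.destroy_process_group()


    if __name__ == "__main__":
        main()

On each participating node this could be started with something like ``torchrun --nnodes=2 --nproc_per_node=8 --rdzv_backend=c10d --rdzv_endpoint=<host>:<port> train.py``, where the node count, process count, endpoint, and script name are illustrative.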

intermediate_source/dist_tuto.rst

Lines changed: 1 addition & 1 deletion
@@ -569,7 +569,7 @@ finally handshake with them.
 - ``WORLD_SIZE``: The total number of processes, so that the master
   knows how many workers to wait for.
 - ``RANK``: Rank of each process, so they will know whether it is the
-  master of a worker.
+  master or a worker.

 **Shared File System**
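The corrected sentence is part of the tutorial's description of environment-variable initialization, where each process is given MASTER_ADDR, MASTER_PORT, WORLD_SIZE, and RANK before joining the group. A minimal sketch of that scheme, with an illustrative address, port, world size, and rank (not values from the tutorial):

    # Environment-variable initialization: the master's address and port plus
    # WORLD_SIZE and RANK let every process find and join the group.
    # The concrete values below are illustrative.
    import os

    import torch.distributed as dist

    os.environ["MASTER_ADDR"] = "10.0.0.1"   # where the rank-0 (master) process listens
    os.environ["MASTER_PORT"] = "29500"      # a free port on the master
    os.environ["WORLD_SIZE"] = "4"           # total number of processes the master waits for
    os.environ["RANK"] = "0"                 # this process's rank; 0 is the master, the rest are workers

    dist.init_process_group(backend="gloo", init_method="env://")
    print(f"rank {dist.get_rank()} of {dist.get_world_size()} is ready")
    dist.destroy_process_group()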
