Commit bc489dd

Apply suggestions from code review
Minor editorial fixes
1 parent fd9ef0f commit bc489dd

File tree

1 file changed: +5 −3


intermediate_source/pipelining_tutorial.rst

@@ -5,7 +5,7 @@ Introduction to Distributed Pipeline Parallelism
 .. note::
    |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/main/intermediate_source/pipelining_tutorial.rst>`__.
 
-This tutorial uses a gpt-style transformer model to demonstrate implementing distributed
+This tutorial uses a GPT-style transformer model to demonstrate implementing distributed
 pipeline parallelism with `torch.distributed.pipelining <https://pytorch.org/docs/main/distributed.pipelining.html>`__
 APIs.
 
@@ -25,7 +25,7 @@ APIs.
 Setup
 -----
 
-With ``torch.distributed.pipelining`` we will be partitioning the execution of a model and scheduling computation on micro-batches. We will be using a simplified version
+With ``torch.distributed.pipelining`` we will be partitioning the execution of a model and scheduling computation on microbatches. We will be using a simplified version
 of a transformer decoder model. The model architecture is for educational purposes and has multiple transformer decoder layers as we want to demonstrate how to split the model into different
 chunks. First, let us define the model:
 
@@ -218,7 +218,9 @@ Finally, we are ready to run the script. We will use ``torchrun`` to create a si
 Our script is already written in a way rank 0 that performs the required logic for pipeline stage 0, and rank 1
 performs the logic for pipeline stage 1.
 
-``torchrun --standalone --nnodes 1 --nproc_per_node 2 pipelining_tutorial.py``
+.. code-block:: bash
+
+   torchrun --standalone --nnodes 1 --nproc_per_node 2 pipelining_tutorial.py
 
 Conclusion
 ----------
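
For reference, the workflow the amended Setup paragraph describes (partitioning a model with ``torch.distributed.pipelining`` and scheduling computation on microbatches) looks roughly like the sketch below. This is a hypothetical minimal example, not the tutorial's actual script: ``ToyModel``, the ``layers.2`` split point, and the ``gloo`` backend are illustrative assumptions.

.. code-block:: python

   # Hypothetical two-stage pipeline sketch (not the tutorial's actual script).
   # It follows the workflow the diff describes: trace a model, split it into
   # stages, and run a GPipe schedule over microbatches.
   import os

   import torch
   import torch.distributed as dist
   import torch.nn as nn
   from torch.distributed.pipelining import ScheduleGPipe, SplitPoint, pipeline


   class ToyModel(nn.Module):
       """Stand-in for the tutorial's transformer decoder: a stack of layers
       deep enough to be split into two pipeline stages."""

       def __init__(self):
           super().__init__()
           self.layers = nn.Sequential(*[nn.Linear(64, 64) for _ in range(4)])

       def forward(self, x):
           return self.layers(x)


   def main():
       # torchrun sets RANK and WORLD_SIZE in the environment.
       rank = int(os.environ["RANK"])
       dist.init_process_group(backend="gloo")
       device = torch.device("cpu")

       model = ToyModel()
       full_batch = torch.randn(8, 64)
       n_microbatches = 4
       example_microbatch = full_batch.chunk(n_microbatches)[0]

       # Trace the model and split it before layers.2, so each of the two
       # ranks owns one pipeline stage.
       pipe = pipeline(
           model,
           mb_args=(example_microbatch,),
           split_spec={"layers.2": SplitPoint.BEGINNING},
       )
       stage = pipe.build_stage(rank, device)

       # GPipe schedule: rank 0 feeds the full batch (split into microbatches
       # internally); the last rank receives the assembled output.
       schedule = ScheduleGPipe(stage, n_microbatches=n_microbatches)
       if rank == 0:
           schedule.step(full_batch)
       else:
           output = schedule.step()
           print(output.shape)

       dist.destroy_process_group()


   if __name__ == "__main__":
       main()

Launched with two ranks, matching the ``torchrun`` invocation shown in the diff: ``torchrun --standalone --nnodes 1 --nproc_per_node 2 sketch.py`` (the file name here is hypothetical).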
