Commit 1b41087

Merge branch 'main' into varlen_tutorial
2 parents: 25db45f + 4fa1fa8

4 files changed (+12, -10 lines)

.github/workflows/build-tutorials-nightly.yml

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@ name: Build tutorials (nightly/test)
 # download the binaries in .jenkins/build.sh.
 on:
   # Only main branch for now. Uncomment the below line to enable it on PRs.
-  # pull_request:
+  pull_request:

   # Comment out the below line to disable on the main branch
   push:

beginner_source/understanding_leaf_vs_nonleaf_tutorial.py

Lines changed: 7 additions & 5 deletions
@@ -265,8 +265,10 @@
 #
 # Computational graph after backward pass
 #
-# If you call ``retain_grad()`` on a non-leaf node, it results in a no-op.
-# If we call ``retain_grad()`` on a node that has ``requires_grad=False``,
+# If you call ``retain_grad()`` on a leaf tensor, it results in a no-op
+# since leaf tensors already retain their gradients by default (when
+# ``requires_grad=True``).
+# If we call ``retain_grad()`` on a tensor that has ``requires_grad=False``,
 # PyTorch actually throws an error, since it can’t store the gradient if
 # it is never calculated.
 #
@@ -298,13 +300,13 @@
 # +----------------+------------------------+------------------------+---------------------------------------------------+-------------------------------------+
 # | ``is_leaf`` | ``requires_grad`` | ``retains_grad`` | ``require_grad()`` | ``retain_grad()`` |
 # +================+========================+========================+===================================================+=====================================+
-# | ``True`` | ``False`` | ``False`` | sets ``requires_grad`` to ``True`` or ``False`` | no-op |
+# | ``True`` | ``False`` | ``False`` | sets ``requires_grad`` to ``True`` or ``False`` | throws error |
 # +----------------+------------------------+------------------------+---------------------------------------------------+-------------------------------------+
-# | ``True`` | ``True`` | ``False`` | sets ``requires_grad`` to ``True`` or ``False`` | no-op |
+# | ``True`` | ``True`` | ``False`` | sets ``requires_grad`` to ``True`` or ``False`` | no-op (already retains) |
 # +----------------+------------------------+------------------------+---------------------------------------------------+-------------------------------------+
 # | ``False`` | ``True`` | ``False`` | no-op | sets ``retains_grad`` to ``True`` |
 # +----------------+------------------------+------------------------+---------------------------------------------------+-------------------------------------+
-# | ``False`` | ``True`` | ``True`` | no-op | no-op |
+# | ``False`` | ``True`` | ``True`` | no-op | no-op (already retains) |
 # +----------------+------------------------+------------------------+---------------------------------------------------+-------------------------------------+
 #
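
The corrected semantics above can be checked directly in PyTorch. A minimal
sketch (illustrative only, not part of this commit; the tensor names are
chosen here for the example):

import torch

# Leaf tensor with requires_grad=True: retain_grad() is a no-op,
# since leaf tensors already retain their gradients by default.
a = torch.tensor([2.0], requires_grad=True)
a.retain_grad()

# Tensor with requires_grad=False: retain_grad() raises a RuntimeError,
# since a gradient for it is never calculated.
b = torch.tensor([3.0])
try:
    b.retain_grad()
except RuntimeError as err:
    print(f"error: {err}")

# Non-leaf tensor: retain_grad() sets retains_grad to True, so .grad
# is populated during the backward pass.
c = a * 2
c.retain_grad()
c.sum().backward()
print(a.grad)  # retained by default (leaf)
print(c.grad)  # retained because of retain_grad() (non-leaf)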

intermediate_source/torch_compile_tutorial.py

Lines changed: 1 addition & 1 deletion
@@ -163,7 +163,7 @@ def timed(fn):
     result = fn()
     end.record()
     torch.cuda.synchronize()
-    return result, start.elapsed_time(end) / 1024
+    return result, start.elapsed_time(end) / 1000


 inp = torch.randn(4096, 4096).cuda()
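
The fix reflects that ``torch.cuda.Event.elapsed_time()`` returns
milliseconds, so dividing by 1000 converts to seconds (1024 was presumably a
binary-prefix mix-up). A minimal sketch of such a CUDA-event timing helper,
assuming a CUDA device is available (illustrative, mirroring the tutorial's
``timed``):

import torch

def timed(fn):
    """Run fn on the GPU and return (result, elapsed time in seconds)."""
    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    start.record()
    result = fn()
    end.record()
    torch.cuda.synchronize()  # wait for both events to be recorded
    # elapsed_time() reports milliseconds; divide by 1000 for seconds
    return result, start.elapsed_time(end) / 1000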

unstable_source/flight_recorder_tutorial.rst

Lines changed: 3 additions & 3 deletions
@@ -44,7 +44,7 @@ Flight Recorder consists of two core parts:

 - The collection portion: when enabled, information about collectives is recorded in an in-memory circular buffer. Upon job timeout, or on demand, the in-memory buffer can be retrieved or dumped to file.

-- An analyzer script is available in the `tools/flight_recorder <https://github.com/pytorch/pytorch/tree/main/tools/flight_recorder>`__ directory (details below).
+- An analyzer script is available in the `torch/distributed/flight_recorder <https://github.com/pytorch/pytorch/tree/main/torch/distributed/flight_recorder>`__ directory (details below).
   The analyzer script runs known heuristics using the collected data and attempts to automatically identify the underlying issue that caused the job to stall.

 Enabling Flight Recorder
@@ -169,7 +169,7 @@ The contents of a Flight Recorder ``unpickled`` file are shown below:
 Analyzing Flight Recorder Dumps
 -------------------------------

-We have convenient scripts available in `pytorch/tools/flight_recorder` directory for analyzing captured
+We have convenient scripts available in `pytorch/torch/distributed/flight_recorder <https://github.com/pytorch/pytorch/tree/main/torch/distributed/flight_recorder>`__ directory for analyzing captured
 data.

 To run the convenience script, follow these steps:
@@ -300,5 +300,5 @@ Conclusion
 In this tutorial, we have learned about a new PyTorch diagnostic tool called Flight Recorder.
 We have discussed how to enable Flight Recorder to collect diagnostic data from a machine.
 Additionally, we explored how to analyze the data captured from the Flight Recorder using a
-convenience script located in the `tools/flight_recorder <https://github.com/pytorch/pytorch/tree/main/tools/flight_recorder>`__
+convenience script located in the `torch/distributed/flight_recorder <https://github.com/pytorch/pytorch/tree/main/torch/distributed/flight_recorder>`__
 directory of the PyTorch repository.
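
For context on the relocated analyzer: Flight Recorder collection is switched
on through environment variables before the process group is created, and the
analyzer under torch/distributed/flight_recorder is then run over the dump
files. A rough sketch in Python, assuming the environment-variable names
documented in the tutorial this commit edits (the values and the minimal
setup are illustrative):

import os

# Must be set before the NCCL process group is initialized.
# Variable names as documented in the Flight Recorder tutorial.
os.environ["TORCH_NCCL_TRACE_BUFFER_SIZE"] = "2000"  # entries in the circular buffer
os.environ["TORCH_NCCL_DUMP_ON_TIMEOUT"] = "true"    # dump the buffer when a collective times out

import torch.distributed as dist

dist.init_process_group("nccl")  # assumes rank/world-size env vars are set
# ... run the distributed job; on a timeout, each rank writes a dump file
# that the analyzer script in torch/distributed/flight_recorder can parse
# to identify the collective on which the job stalled.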
