
Commit b5ec299

Fixing Broken Links (pytorch#3540)

Fixes pytorch#3533

Parent: 763699c

File tree

5 files changed: +7, -8 lines


CONTRIBUTING.md

Lines changed: 2 additions & 3 deletions
@@ -71,8 +71,7 @@ There are three types of tutorial content that we host on
    reStructuredText files. The build system only converts them into HTML;
    the code in them does not run on build. These tutorials are easier to
    create and maintain but they do not provide an interactive experience.
-   An example is the [Dynamic Quantization
-   tutorial](https://pytorch.org/tutorials/recipes/recipes/dynamic_quantization.html).
+
 
 * **Recipes** are tutorials that provide bite-sized, actionable
   examples of how to use specific features, which differentiates them
@@ -265,7 +264,7 @@ search, you need to include it in `index.rst`, or for recipes, in
 1. Open the relevant file
    [`index.rst`](https://github.com/pytorch/tutorials/blob/main/index.rst)
    or
-   [`recipes_index.rst`](https://github.com/pytorch/tutorials/blob/main/recipes_source/recipes_index.rst)
+   [`recipes_index.rst`](https://github.com/pytorch/tutorials/blob/main/recipes_index.rst)
 1. Add a _card_ in reStructuredText format similar to the following:
 
    ```
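
For orientation, a card of the kind this CONTRIBUTING.md hunk refers to follows the `customcarditem` directive pattern used in the tutorials index. The sketch below is a hypothetical example; the title, description, image path, and tags are placeholders, not content from this commit:

```rst
.. customcarditem::
   :header: Your Tutorial Title
   :card_description: A one-sentence summary shown on the tutorial index card.
   :image: _static/img/thumbnails/cropped/your-thumbnail.png
   :link: beginner/your_tutorial.html
   :tags: Getting-Started
```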

README.md

Lines changed: 1 addition & 1 deletion
@@ -27,7 +27,7 @@ NOTE: Before submitting a new tutorial, read [PyTorch Tutorial Submission Policy
 1. Create a Python file. If you want it executed while inserted into documentation, save the file with the suffix `tutorial` so that the file name is `your_tutorial.py`.
 2. Put it in one of the `beginner_source`, `intermediate_source`, `advanced_source` directory based on the level of difficulty. If it is a recipe, add it to `recipes_source`. For tutorials demonstrating unstable prototype features, add to the `prototype_source`.
 3. For Tutorials (except if it is a prototype feature), include it in the `toctree` directive and create a `customcarditem` in [index.rst](./index.rst).
-4. For Tutorials (except if it is a prototype feature), create a thumbnail in the [index.rst file](https://github.com/pytorch/tutorials/blob/main/index.rst) using a command like `.. customcarditem:: beginner/your_tutorial.html`. For Recipes, create a thumbnail in the [recipes_index.rst](https://github.com/pytorch/tutorials/blob/main/recipes_source/recipes_index.rst)
+4. For Tutorials (except if it is a prototype feature), create a thumbnail in the [index.rst file](https://github.com/pytorch/tutorials/blob/main/index.rst) using a command like `.. customcarditem:: beginner/your_tutorial.html`. For Recipes, create a thumbnail in the [recipes_index.rst](https://github.com/pytorch/tutorials/blob/main/recipes_index.rst)
 
 If you are starting off with a Jupyter notebook, you can use [this script](https://gist.github.com/chsasank/7218ca16f8d022e02a9c0deb94a310fe) to convert the notebook to Python file. After conversion and addition to the project, please make sure that section headings and other things are in logical order.

advanced_source/generic_join.rst

Lines changed: 2 additions & 2 deletions
@@ -369,7 +369,7 @@ of inputs across all ranks.
     def join_hook(self, **kwargs) -> JoinHook:
         r"""
         Return a join hook that shadows the all-reduce in :meth:`__call__`.
-
+
         This join hook supports the following keyword arguments:
             sync_max_count (bool, optional): whether to synchronize the maximum
                 count across all ranks once all ranks join; default is ``False``.
@@ -446,5 +446,5 @@ Some key points to highlight:
 .. _Getting Started with Distributed Data Parallel - Basic Use Case: https://pytorch.org/tutorials/intermediate/ddp_tutorial.html#basic-use-case
 .. _Shard Optimizer States with ZeroRedundancyOptimizer: https://pytorch.org/tutorials/recipes/zero_redundancy_optimizer.html
 .. _DistributedDataParallel: https://pytorch.org/docs/stable/generated/torch.nn.parallel.DistributedDataParallel.html
-.. _join(): https://pytorch.org/docs/stable/_modules/torch/nn/parallel/distributed.html#DistributedDataParallel.join
+.. _join(): https://docs.pytorch.org/docs/stable/generated/torch.nn.parallel.DistributedDataParallel.html#torch.nn.parallel.DistributedDataParallel.join
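
As a plain-Python aside on what the `join_hook` docstring in this hunk describes: with uneven inputs, ranks exhaust their data at different times, and a joined rank "shadows" the all-reduce by contributing zero so the collective still completes. The sketch below contains no PyTorch; every name in it is illustrative only (the real API lives in `torch.distributed.algorithms`):

```python
# Plain-Python sketch of the join-hook idea: ranks that have exhausted
# their inputs still participate in the collective, contributing zero.
# All names here are illustrative, not the tutorial's actual API.

def all_reduce_sum(contributions):
    """Stand-in for a sum all-reduce across ranks."""
    return sum(contributions)

def train_step(counts, grads):
    """One collective step: ranks with no batches left shadow with 0.0."""
    return all_reduce_sum(g if c > 0 else 0.0 for c, g in zip(counts, grads))

# Rank 0 has 3 batches left; rank 1 already joined (0 batches left):
total = train_step(counts=[3, 0], grads=[5.0, 7.0])
print(total)  # 5.0 -- only the active rank's gradient is summed

# What sync_max_count=True would additionally share after all ranks join:
max_count = max([3, 0])
print(max_count)  # 3
```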

intermediate_source/rpc_tutorial.rst

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@ Source code of the two examples can be found in
 Previous tutorials,
 `Getting Started With Distributed Data Parallel <ddp_tutorial.html>`__
 and `Writing Distributed Applications With PyTorch <dist_tuto.html>`__,
-described `DistributedDataParallel <https://pytorch.org/docs/stable/_modules/torch/nn/parallel/distributed.html>`__
+described `DistributedDataParallel <https://docs.pytorch.org/docs/stable/generated/torch.nn.parallel.DistributedDataParallel.html>`__
 which supports a specific training paradigm where the model is replicated across
 multiple processes and each process handles a split of the input data.
 Sometimes, you might run into scenarios that require different training
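
The hunk above summarizes the DDP paradigm: the model is replicated, and each process handles a split of the input data. A plain-Python sketch of such a split is below; the round-robin scheme and function name are illustrative, similar in spirit to (but not taken from) `torch.utils.data.DistributedSampler`:

```python
# Plain-Python illustration of "each process handles a split of the
# input data": a round-robin shard of sample indices per rank.
# Names are illustrative only; no torch is involved.

def shard_indices(num_samples, world_size, rank):
    """Indices of the samples that process `rank` would handle."""
    return list(range(rank, num_samples, world_size))

# With 10 samples and 2 replicated processes, each sees half the data:
print(shard_indices(10, 2, 0))  # [0, 2, 4, 6, 8]
print(shard_indices(10, 2, 1))  # [1, 3, 5, 7, 9]
```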

unstable_source/context_parallel.rst

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@ Introduction to Context Parallel
 **Authors**: `Xilun Wu <https://github.com/XilunWu>`_, `Chien-Chin Huang <https://github.com/fegin>`__
 
 .. note::
-   |edit| View and edit this tutorial in `GitHub <https://github.com/pytorch/tutorials/blob/main/prototype_source/context_parallel.rst>`__.
+   |edit| View and edit this tutorial in `GitHub <https://github.com/pytorch/tutorials/blob/main/unstable_source/context_parallel.rst>`__.
 
 .. grid:: 2