
Commit 1b9ef30

marios1861 authored and lantiga committed
fix links in DeepSpeed docs (#21153)
(cherry picked from commit f61713a)
1 parent 70ad4fa commit 1b9ef30

File tree

1 file changed

+2
-3
lines changed


src/lightning/pytorch/strategies/deepspeed.py

Lines changed: 2 additions & 3 deletions
@@ -125,14 +125,13 @@ def __init__(
         exclude_frozen_parameters: bool = False,
     ) -> None:
         """Provides capabilities to run training using the DeepSpeed library, with training optimizations for large
-        billion parameter models. `For more information: https://pytorch-
-        lightning.readthedocs.io/en/stable/advanced/model_parallel.html#deepspeed`.
+        billion parameter models. *For more information:* :ref:`deepspeed_advanced`.

         .. warning:: This is an :ref:`experimental <versioning:Experimental API>` feature.

         Defaults have been set to enable ZeRO-Offload and some have been taken from the link below.
         These defaults have been set generally, but may require tuning for optimum performance based on your model size.
-        `For more information: https://www.deepspeed.ai/docs/config-json/#zero-optimizations-for-fp16-training`.
+        *For more information:* https://www.deepspeed.ai/docs/config-json/#zero-optimizations-for-fp16-training.

         Arguments:
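For context, the ZeRO-Offload defaults the docstring links to live in a DeepSpeed config. A minimal sketch of such a config as a Python dict follows; the field names come from the DeepSpeed config-json docs, but the specific values here are illustrative, not the defaults Lightning actually ships:

```python
# Sketch of a DeepSpeed config covering the ZeRO options described at the
# linked config-json page. Values are illustrative, not Lightning's defaults.
ds_config = {
    "fp16": {"enabled": True},
    "zero_optimization": {
        "stage": 2,                              # ZeRO stage 2: shard optimizer state and gradients
        "offload_optimizer": {"device": "cpu"},  # ZeRO-Offload: keep optimizer state on CPU
        "allgather_bucket_size": 200_000_000,
        "reduce_bucket_size": 200_000_000,
    },
}

# In Lightning, a dict like this can be passed via DeepSpeedStrategy(config=ds_config);
# that call is omitted here since instantiating the strategy requires deepspeed installed.
print(ds_config["zero_optimization"]["stage"])
```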
