
Commit 9cd6d0f

Add HPU Accelerator column to the precision doc (#12499)
1 parent 486f07b commit 9cd6d0f

File tree: 1 file changed (+7 −2 lines)

docs/source/advanced/precision.rst

Lines changed: 7 additions & 2 deletions
@@ -20,34 +20,39 @@ Higher precision, such as the 64-bit floating-point, can be used for highly sens
 Following are the precisions available in Lightning along with their supported Accelerator:
 
 .. list-table:: Precision with Accelerators
-   :widths: 20 20 20 20 20
+   :widths: 20 20 20 20 20 20
    :header-rows: 1
 
    * - Precision
      - CPU
      - GPU
      - TPU
      - IPU
+     - HPU
    * - 16
      - No
      - Yes
      - No
      - Yes
+     - No
    * - BFloat16
      - Yes
      - Yes
      - Yes
      - No
+     - Yes
    * - 32
      - Yes
      - Yes
      - Yes
      - Yes
+     - Yes
    * - 64
      - Yes
      - Yes
      - No
      - No
+     - No
 
 
 ***************
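
Usage note (not part of this commit): the table above maps onto the ``precision`` and ``accelerator`` flags of the Trainer. The snippet below is a minimal sketch assuming the ``pytorch_lightning.Trainer`` API current at the time of this change; it only illustrates combinations the table lists as supported.

    from pytorch_lightning import Trainer

    # 16-bit mixed precision on GPU (listed as supported in the table)
    trainer = Trainer(accelerator="gpu", devices=1, precision=16)

    # BFloat16 precision on CPU (also listed as supported)
    trainer = Trainer(accelerator="cpu", precision="bf16")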
@@ -224,4 +229,4 @@ You can also customize and pass your own Precision Plugin by subclassing the :cl
 ***************
 
 It is possible to further reduce the precision using third-party libraries like `bitsandbytes <https://github.com/facebookresearch/bitsandbytes>`_. Although,
-Lightning doesn't support it out of the box yet but you can still use it by configuring it in your LightningModule and setting ``Trainer(precision=32)``.
+Lightning doesn't support it out of the box yet but you can still use it by configuring it in your :class:`~pytorch_lightning.core.lightning.LightningModule` and setting ``Trainer(precision=32)``.
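
Usage note (not part of this commit): the bitsandbytes sentence touched in the second hunk suggests configuring the third-party optimizer inside the LightningModule while keeping ``Trainer(precision=32)``. The sketch below is one hedged illustration of that, assuming ``bitsandbytes`` is installed and using its ``Adam8bit`` optimizer; the class name ``LitModel`` is hypothetical.

    import bitsandbytes as bnb
    import torch
    from pytorch_lightning import LightningModule, Trainer

    class LitModel(LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(32, 2)

        def forward(self, x):
            return self.layer(x)

        def configure_optimizers(self):
            # 8-bit Adam from bitsandbytes; the model weights stay in full precision
            return bnb.optim.Adam8bit(self.parameters(), lr=1e-3)

    # Lightning itself stays at the default 32-bit precision, as the docs suggest
    trainer = Trainer(precision=32)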
