
Commit b4e884d

update
1 parent 7caaad8 commit b4e884d

2 files changed: +24 -5 lines

advanced_source/python_extension_autoload.rst

Lines changed: 23 additions & 4 deletions
@@ -23,7 +23,7 @@ see `[RFC] Autoload Device Extension <https://github.com/pytorch/pytorch/issues/
 and ask the out-of-tree extension maintainer for help.
 
 How to apply this mechanism to out-of-tree extensions?
---------------------------------------------
+------------------------------------------------------
 
 For example, if you have a backend named ``foo`` and a package named
 ``torch_foo``. Make sure your package is based on PyTorch 2.5+ and includes
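
For reference, the entry point this hunk alludes to would be declared in
``torch_foo``'s ``setup.py`` roughly as follows. This is a minimal sketch built
from the tutorial's placeholder names (``torch_foo``, ``_autoload``), not code
from the commit itself:

    # setup.py of the hypothetical torch_foo package (sketch).
    # PyTorch 2.5+ scans the 'torch.backends' entry-point group while
    # running `import torch` and invokes each registered callable, so
    # torch_foo._autoload() runs without an explicit `import torch_foo`.
    from setuptools import setup

    setup(
        name="torch_foo",
        version="1.0",
        entry_points={
            "torch.backends": [
                "torch_foo = torch_foo:_autoload",
            ],
        },
    )
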
@@ -61,7 +61,9 @@ Now the ``torch_foo`` module can be imported when running import torch:
 Examples
 ^^^^^^^^
 
-TODO: take HPU and NPU as examples
+Here we take Intel HPU and Huawei Ascend NPU as examples to show how to
+integrate your out-of-tree extension with PyTorch based on the autoloading
+mechanism.
 
 `habana_frameworks.torch`_ is a Python package that enables users to run
 PyTorch programs on Intel Gaudi via the PyTorch ``HPU`` device key.
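
With autoloading in effect, the HPU example shown in the next hunk needs no
explicit ``import habana_frameworks.torch``. A minimal sketch of the resulting
user code, assuming a Gaudi machine with the package installed:

    # PyTorch 2.5+ imports habana_frameworks.torch automatically during
    # `import torch` via its 'torch.backends' entry point, which
    # registers the "hpu" device.
    import torch

    x = torch.rand(4, 4).to("hpu")  # works without a manual import
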
@@ -79,13 +81,30 @@ is applied.
 
     input = torch.rand(128, 3, 224, 224).to("hpu")
     output = model(input)
 
-`torch_npu`_ enables users to run PyTorch program on Huawei Ascend NPU, it
+`torch_npu`_ enables users to run PyTorch programs on Huawei Ascend NPU; it
 leverages the ``PrivateUse1`` device key and exposes the device name
 as ``npu`` to the end users.
-``import torch_npu`` is also no longer needed after applying this mechanism.
 
 .. _torch_npu: https://github.com/Ascend/pytorch
 
+Define an entry point in `torch_npu/setup.py`_:
+
+.. _torch_npu/setup.py: https://github.com/Ascend/pytorch/blob/c164fbd5bb74790191ff8496b77d620fddf806d8/setup.py#L618
+
+.. code-block:: diff
+
+     setup(
+         name="torch_npu",
+         version="2.5",
+    +    entry_points={
+    +        'torch.backends': [
+    +            'torch_npu = torch_npu:_autoload',
+    +        ],
+    +    }
+     )
+
+``import torch_npu`` is also no longer needed after applying this mechanism:
+
 .. code-block:: diff
 
     import torch
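
The entry point above hands PyTorch an ``_autoload`` callable defined inside
the package. What that hook does is up to the backend; the following is an
illustrative sketch of such a hook in ``torch_npu/__init__.py``, not the actual
torch_npu implementation (``rename_privateuse1_backend`` is a public PyTorch
utility for out-of-tree backends):

    # torch_npu/__init__.py (sketch; the real package does far more).
    import torch

    # Expose the PrivateUse1 dispatch key to users under the name "npu",
    # so tensors can be created with device="npu".
    torch.utils.rename_privateuse1_backend("npu")

    def _autoload():
        # Invoked by PyTorch through the 'torch.backends' entry point
        # during `import torch`. Importing this module has already run
        # the registration above, so the hook can be a no-op or perform
        # any extra initialization the backend needs.
        pass
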

index.rst

Lines changed: 1 addition & 1 deletion
@@ -514,7 +514,7 @@ Welcome to PyTorch Tutorials
    :card_description: Learn how to improve the seamless integration of out-of-tree extension with PyTorch based on the autoloading mechanism.
    :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
    :link: advanced/python_extension_autoload.html
-   :tags: Extending-PyTorch
+   :tags: Extending-PyTorch,Frontend-APIs
 
 .. customcarditem::
    :header: Custom Function Tutorial: Double Backward
