@@ -12,9 +12,7 @@ programming model without needing to explicitly load or import device-specific
extensions. On the other hand, it facilitates effortless
adoption of existing PyTorch applications with zero-code changes on
out-of-tree devices. For more information,
- see `[RFC] Autoload Device Extension <rfc>`_.
-
- .. _rfc: https://github.com/pytorch/pytorch/issues/122468
+ see `[RFC] Autoload Device Extension <https://github.com/pytorch/pytorch/issues/122468>`_.

Examples
^^^^^^^^
@@ -31,8 +29,8 @@ is applied.
import torch
import torchvision.models as models
- import habana_frameworks.torch # <-- extra import
- model = models.resnet50().eval().to(“hpu”)
- input = torch.rand(128, 3, 224, 224).to(“hpu”)
+ model = models.resnet50().eval().to("hpu")
+ input = torch.rand(128, 3, 224, 224).to("hpu")
output = model(input)
`torch_npu`_ enables users to run PyTorch programs on Huawei Ascend NPU; it
@@ -47,8 +45,8 @@ as ``npu`` to the end users.
import torch
import torchvision.models as models
- import torch_npu # <-- extra import
- model = models.resnet50().eval().to(“npu”)
- input = torch.rand(128, 3, 224, 224).to(“npu”)
+ model = models.resnet50().eval().to("npu")
+ input = torch.rand(128, 3, 224, 224).to("npu")
output = model(input)
How it works
@@ -58,14 +56,12 @@ How it works
:alt: Autoloading implementation
:align: center

- This mechanism is implemented based on Python's `entry_points`_ mechanism.
- We discover and load all of the specific entry points in ``torch/__init__.py``
- that are defined by out-of-tree extensions.
- Its implementation is in `[RFC] Add support for device extension autoloading <impl>`_
-
- .. _entry_points: https://packaging.python.org/en/latest/specifications/entry-points/
-
- .. _impl: https://github.com/pytorch/pytorch/pull/127074
+ This mechanism is implemented based on Python's `Entry point
+ <https://packaging.python.org/en/latest/specifications/entry-points/>`_
+ mechanism. We discover and load all of the specific entry points
+ in ``torch/__init__.py`` that are defined by out-of-tree extensions.
+ Its implementation is in `[RFC] Add support for device extension autoloading
+ <https://github.com/pytorch/pytorch/pull/127074>`_
How to apply this to out-of-tree extensions?
--------------------------------------------
@@ -87,8 +83,8 @@ package.
name = "torch_foo",
version = "1.0",
entry_points = {
- 'torch.backends': [
- 'torch_foo = torch_foo:_autoload',
+ "torch.backends": [
+ "torch_foo = torch_foo:_autoload",
],
}
)