You may wish to author a custom operator from Python (as opposed to C++) if:
  respect to ``torch.compile`` and ``torch.export``.
- you have some Python bindings to C++/CUDA kernels and want those to compose with PyTorch
  subsystems (like ``torch.compile`` or ``torch.autograd``)
- you are using Python (and not a C++-only environment like AOTInductor).

Integrating custom C++ and/or CUDA code with PyTorch
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  into the function).
- Adding training support to an arbitrary Python function

Use :func:`torch.library.custom_op` to create Python custom operators.
Use the C++ ``TORCH_LIBRARY`` APIs to create C++ custom operators (these
work in Python-less environments).
See the `Custom Operators Landing Page <https://pytorch.org/tutorials/advanced/custom_ops_landing_page.html>`_
for more details.

Please note that if your operation can be expressed as a composition of
existing PyTorch operators, then there is usually no need to use the custom operator
API -- everything (for example ``torch.compile``, training support) should just work.