@@ -23,6 +23,7 @@ You may wish to author a custom operator from Python (as opposed to C++) if:
  respect to ``torch.compile`` and ``torch.export``.
- you have some Python bindings to C++/CUDA kernels and want those to compose with PyTorch
  subsystems (like ``torch.compile`` or ``torch.autograd``)
+ - you are using Python (and not a C++-only environment like AOTInductor).
Integrating custom C++ and/or CUDA code with PyTorch
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
into the function).
- Adding training support to an arbitrary Python function
+ Use :func:`torch.library.custom_op` to create Python custom operators.
+ Use the C++ ``TORCH_LIBRARY`` APIs to create C++ custom operators (these
+ work in Python-less environments).
+ See the `Custom Operators Landing Page <https://pytorch.org/tutorials/advanced/custom_ops_landing_page.html>`_
+ for more details.
+
Please note that if your operation can be expressed as a composition of
existing PyTorch operators, then there is usually no need to use the custom operator
API -- everything (for example ``torch.compile``, training support) should
@@ -392,6 +392,8 @@ FlexAttention
fp
frontend
functionalized
+ functionalizes
+ functionalization
functorch
fuser
geomean