Update PG backend extension tutorial for 1.13 release (#2099)
It covers two c10d changes introduced in torch 1.13:
- Extract ProcessGroup::Work into a separate class; refer to pytorch/pytorch#83680
- Install c10d headers with absolute paths; refer to pytorch/pytorch#86933
The tutorial code has also been updated at mrshenli/dummy_collectives#1
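In an existing extension, these two changes surface as (a) the include paths for the c10d headers and (b) the base class of the work handle. Below is a minimal before/after sketch, assuming the extension names its work handle ``WorkDummy`` as the tutorial's dummy backend does; it is illustrative only, not the tutorial's exact code.

.. code-block:: cpp

    // Before torch 1.13 the extension included the c10d headers by their
    // relative path and nested the work handle under ProcessGroup:
    //
    //   #include <c10d/ProcessGroup.hpp>
    //   class WorkDummy : public c10d::ProcessGroup::Work { /* ... */ };
    //
    // With torch 1.13 the headers are installed under their absolute path
    // and Work is a standalone class with its own header:
    #include <torch/csrc/distributed/c10d/ProcessGroup.hpp>
    #include <torch/csrc/distributed/c10d/Work.hpp>

    // The work handle now derives directly from c10d::Work.
    class WorkDummy : public c10d::Work { /* override isCompleted(), wait(), ... */ };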
-**Author**: `Feng Tian <https://github.com/ftian1>`__, `Shen Li <https://mrshenli.github.io/>`__
+**Author**: `Feng Tian <https://github.com/ftian1>`__, `Shen Li <https://mrshenli.github.io/>`__, `Min Si <https://minsii.github.io/>`__
 
 .. note::
    |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/process_group_cpp_extension_tutorial.rst>`__.
@@ -62,7 +62,7 @@ Step 1: Implement a Subclass of ``ProcessGroup``
 This first step is to implement a ``ProcessGroup`` subclass that overrides
 target collective communication APIs and runs the custom communication algorithm.
-The extension also needs to implement a ``ProcessGroup::Work`` subclass, which
+The extension also needs to implement a ``Work`` subclass, which
 serves as a future of communication results and allows asynchronous execution in
 application code. If the extension uses third-party libraries, it can
 include the headers and call into the library APIs from the ``ProcessGroupDummy``
@@ -75,49 +75,49 @@ repository for the full implementation.
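For readers following Step 1, a header-style sketch of how the two classes fit together after this change is shown below. The ``ProcessGroupDummy`` and ``WorkDummy`` names come from the tutorial's dummy backend; the method list is trimmed and illustrative, so check the installed torch 1.13 headers (or mrshenli/dummy_collectives) for the exact signatures to override.

.. code-block:: cpp

    #include <torch/csrc/distributed/c10d/ProcessGroup.hpp>
    #include <torch/csrc/distributed/c10d/Work.hpp>

    namespace c10d {

    // Future-like handle returned by each collective. As of 1.13 it derives
    // from the standalone Work class rather than ProcessGroup::Work.
    class WorkDummy : public Work {
     public:
      bool isCompleted() override;
      bool isSuccess() const override;
      bool wait(std::chrono::milliseconds timeout) override;
      c10::intrusive_ptr<c10::ivalue::Future> getFuture() override;
    };

    // Backend that overrides the target collectives with a custom algorithm
    // and returns a WorkDummy from each of them.
    class ProcessGroupDummy : public ProcessGroup {
     public:
      ProcessGroupDummy(int rank, int size);

      c10::intrusive_ptr<Work> allreduce(
          std::vector<at::Tensor>& tensors,
          const AllreduceOptions& opts = AllreduceOptions()) override;

      c10::intrusive_ptr<Work> allgather(
          std::vector<std::vector<at::Tensor>>& outputTensors,
          std::vector<at::Tensor>& inputTensors,
          const AllgatherOptions& opts = AllgatherOptions()) override;

      // ... the remaining collectives follow the same pattern.
    };

    } // namespace c10d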