`@rpc.functions.async_execution <https://pytorch.org/docs/master/rpc.html#torch.distributed.rpc.functions.async_execution>`__
decorator, which can help speed up inference and training. It uses RL and PS
examples similar to those employed in tutorials 1 and 2 above (a minimal
sketch of the decorator appears after this list).
5. The `Combining Distributed DataParallel with Distributed RPC Framework <../advanced/rpc_ddp_tutorial.html>`__
   tutorial demonstrates how to combine DDP with RPC to train a model using
   distributed data parallelism combined with distributed model parallelism
   (a condensed sketch also follows this list).
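
The ``@rpc.functions.async_execution`` decorator referenced in the item above lets
an RPC target function return a ``torch.futures.Future`` instead of a plain value,
so the callee replies when the future completes rather than tying up an RPC thread
while it waits. The following is a minimal two-worker sketch of that pattern, not
code from the tutorial; the worker names, port, and toy arithmetic are illustrative
assumptions.

.. code-block:: python

    import os

    import torch
    import torch.distributed.rpc as rpc
    import torch.multiprocessing as mp


    @rpc.functions.async_execution
    def async_add_chained(to, x, y, z):
        # Return a Future rather than a value: the RPC thread on this worker is
        # released immediately, and the reply is sent once the Future completes.
        return rpc.rpc_async(to, torch.add, args=(x, y)).then(
            lambda fut: fut.wait() + z
        )


    def run(rank, world_size):
        os.environ["MASTER_ADDR"] = "localhost"
        os.environ["MASTER_PORT"] = "29500"
        rpc.init_rpc(f"worker{rank}", rank=rank, world_size=world_size)
        if rank == 0:
            # worker0 asks worker1 for (x + y) + z; worker1 delegates the add back
            # to worker0 via rpc_async and replies without blocking on the result.
            ret = rpc.rpc_sync("worker1", async_add_chained,
                               args=("worker0", torch.ones(2), 1, 1))
            print(ret)  # tensor([3., 3.])
        rpc.shutdown()


    if __name__ == "__main__":
        mp.spawn(run, args=(2,), nprocs=2, join=True)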
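
The DDP-plus-RPC combination in the last tutorial typically keeps a large or sparse
part of the model (such as an embedding table) on a parameter-server process reached
over RPC, wraps the dense local part in DistributedDataParallel, and drives both
through distributed autograd and a ``DistributedOptimizer``. The sketch below is a
condensed illustration of that structure rather than the tutorial's code: it uses a
single trainer (so DDP has nothing to average across) plus one parameter server, and
the worker names, ports, and layer sizes are assumptions.

.. code-block:: python

    import os

    import torch
    import torch.distributed as dist
    import torch.distributed.autograd as dist_autograd
    import torch.distributed.rpc as rpc
    import torch.multiprocessing as mp
    import torch.optim as optim
    from torch.distributed.optim import DistributedOptimizer
    from torch.nn.parallel import DistributedDataParallel as DDP

    NUM_EMBEDDINGS, EMBEDDING_DIM = 100, 16


    class HybridModel(torch.nn.Module):
        """Remote embedding table (model parallel, via RPC) + local DDP-wrapped head."""

        def __init__(self, emb_rref):
            super().__init__()
            self.emb_rref = emb_rref                            # owned by the "ps" worker
            self.head = DDP(torch.nn.Linear(EMBEDDING_DIM, 4))  # data-parallel local part

        def forward(self, indices, offsets):
            # The embedding lookup runs on "ps" over RPC; the head runs locally under DDP.
            emb = self.emb_rref.rpc_sync().forward(indices, offsets)
            return self.head(emb)


    def _parameter_rrefs(module_rref):
        # Executed on the owner ("ps"): wrap each remote parameter in an RRef so the
        # trainer's DistributedOptimizer can address and update it.
        return [rpc.RRef(p) for p in module_rref.local_value().parameters()]


    def run(rank):
        os.environ["MASTER_ADDR"] = "localhost"
        os.environ["MASTER_PORT"] = "29500"
        if rank == 1:  # parameter server: hosts the embedding table and serves RPCs
            rpc.init_rpc("ps", rank=rank, world_size=2)
        else:          # trainer
            rpc.init_rpc("trainer", rank=rank, world_size=2)
            # Process group for DDP (one trainer here; the tutorial runs several).
            dist.init_process_group("gloo", rank=0, world_size=1,
                                    init_method="tcp://localhost:29501")
            emb_rref = rpc.remote("ps", torch.nn.EmbeddingBag,
                                  args=(NUM_EMBEDDINGS, EMBEDDING_DIM),
                                  kwargs={"mode": "sum"})
            model = HybridModel(emb_rref)
            # The distributed optimizer updates remote and local parameters together.
            param_rrefs = rpc.rpc_sync("ps", _parameter_rrefs, args=(emb_rref,))
            param_rrefs += [rpc.RRef(p) for p in model.head.parameters()]
            opt = DistributedOptimizer(optim.SGD, param_rrefs, lr=0.05)

            indices, offsets = torch.randint(NUM_EMBEDDINGS, (8,)), torch.tensor([0, 4])
            target = torch.randn(2, 4)
            for _ in range(3):
                with dist_autograd.context() as context_id:
                    loss = torch.nn.functional.mse_loss(model(indices, offsets), target)
                    dist_autograd.backward(context_id, [loss])  # backward crosses RPC
                    opt.step(context_id)
        rpc.shutdown()  # blocks until both workers are done


    if __name__ == "__main__":
        mp.spawn(run, nprocs=2, join=True)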

PyTorch Distributed Developers
------------------------------

If you'd like to contribute to PyTorch Distributed, please refer to our
`Developer Guide <https://github.com/pytorch/pytorch/blob/master/torch/distributed/CONTRIBUTING.md>`_.