1 parent 507a473 commit df09dc8
index.rst
@@ -365,6 +365,7 @@ Welcome to PyTorch Tutorials
     :card_description: Walk through a simple example of how to combine distributed data parallelism with distributed model parallelism.
     :image: _static/img/thumbnails/cropped/Combining-Distributed-DataParallel-with-Distributed-RPC-Framework.png
     :link: advanced/rpc_ddp_tutorial.html
+    :tags: Parallel-and-Distributed-Training
 
 .. End of tutorial card section
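
Applied in context, the hunk leaves the tutorial card entry in index.rst looking roughly like this. This is a sketch: the ``.. customcarditem::`` directive line and the ``:header:`` field are not shown in the diff and are assumed here from the PyTorch Tutorials index.rst conventions and the thumbnail filename; only the four fields below them appear in the hunk itself.

```rst
.. customcarditem::
   :header: Combining Distributed DataParallel with Distributed RPC Framework
   :card_description: Walk through a simple example of how to combine distributed data parallelism with distributed model parallelism.
   :image: _static/img/thumbnails/cropped/Combining-Distributed-DataParallel-with-Distributed-RPC-Framework.png
   :link: advanced/rpc_ddp_tutorial.html
   :tags: Parallel-and-Distributed-Training
```

The added ``:tags:`` field is what lets the card be filtered under the Parallel-and-Distributed-Training category on the tutorials landing page.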