1 parent 38aa561 commit 8434f3d

intermediate_source/dist_tuto.rst
@@ -446,7 +446,7 @@ One of the most elegant aspects of ``torch.distributed`` is its ability
 to abstract and build on top of different backends. As mentioned before,
 there are currently three backends implemented in PyTorch: Gloo, NCCL, and
 MPI. They each have different specifications and tradeoffs, depending
-on the desired use-case. A comparative table of supported functions can
+on the desired use case. A comparative table of supported functions can
 be found
 `here <https://pytorch.org/docs/stable/distributed.html#module-torch.distributed>`__.