Commit f036495

wz337 and svekars committed
Update recipes_source/distributed_device_mesh.rst
Co-authored-by: Svetlana Karslioglu <svekars@meta.com>
1 parent 6a90666 commit f036495

1 file changed: +1 -1 lines changed


recipes_source/distributed_device_mesh.rst

Lines changed: 1 addition & 1 deletion
@@ -10,7 +10,7 @@ Prerequisites:
 
 - `Distributed Communication Package - torch.distributed <https://pytorch.org/docs/stable/distributed.html>`__
 
-.. Setting up nccl communicators for distributed communication during distributed training could be challenging. For workloads where users need to compose different parallelisms,
+.. Setting up the NVIDIA Collective Communication Library (NCCL) communicators for distributed communication during distributed training can pose a significant challenge. For workloads where users need to compose different parallelisms,
 .. users would need to manually set up and manage nccl communicators(for example, :class:`ProcessGroup`) for each parallelism solutions. This is fairly complicated and error-proned.
 .. :class:`DeviceMesh` can help make this process much easier.
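For context, the comment block edited above describes :class:`DeviceMesh` as a replacement for hand-managed NCCL communicators. A minimal sketch of that idea, assuming a multi-GPU host launched with torchrun; the 2x4 mesh shape and dimension names below are illustrative and not part of this commit:

# Minimal sketch (not from this commit): DeviceMesh creates the NCCL
# process groups for every parallelism dimension in one call, instead of
# manually constructing a ProcessGroup per parallelism solution.
# Assumes a launch such as: torchrun --nproc-per-node=8 example.py
from torch.distributed.device_mesh import init_device_mesh

# The 2x4 shape and the dimension names are illustrative placeholders.
mesh_2d = init_device_mesh("cuda", (2, 4), mesh_dim_names=("replicate", "shard"))

# Each named mesh dimension exposes its underlying ProcessGroup directly.
replicate_group = mesh_2d.get_group(mesh_dim="replicate")
shard_group = mesh_2d.get_group(mesh_dim="shard")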

0 commit comments
