
Commit 0fb2781

netw0rkf10w, brianjo, and holly1238 authored
Fix typos (#707)
Co-authored-by: Brian Johnson <brianjo@fb.com>
Co-authored-by: holly1238 <77758406+holly1238@users.noreply.github.com>
1 parent 35ae0ea commit 0fb2781

1 file changed: +3 -3 lines changed


intermediate_source/dist_tuto.rst

Lines changed: 3 additions & 3 deletions
@@ -24,18 +24,18 @@ Setup
 The distributed package included in PyTorch (i.e.,
 ``torch.distributed``) enables researchers and practitioners to easily
 parallelize their computations across processes and clusters of
-machines. To do so, it leverages messaging passing semantics
+machines. To do so, it leverages message passing semantics
 allowing each process to communicate data to any of the other processes.
 As opposed to the multiprocessing (``torch.multiprocessing``) package,
 processes can use different communication backends and are not
 restricted to being executed on the same machine.
 
 In order to get started we need the ability to run multiple processes
 simultaneously. If you have access to compute cluster you should check
-with your local sysadmin or use your favorite coordination tool. (e.g.,
+with your local sysadmin or use your favorite coordination tool (e.g.,
 `pdsh <https://linux.die.net/man/1/pdsh>`__,
 `clustershell <https://cea-hpc.github.io/clustershell/>`__, or
-`others <https://slurm.schedmd.com/>`__) For the purpose of this
+`others <https://slurm.schedmd.com/>`__). For the purpose of this
 tutorial, we will use a single machine and fork multiple processes using
 the following template.
 
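For context, the "following template" that the last context line refers to is the tutorial's multiprocessing scaffold, which this commit does not touch. Below is a minimal sketch of that kind of scaffold, assuming the Gloo backend, a loopback master address on port 29500, and a placeholder run function; it illustrates the pattern rather than reproducing the tutorial's exact code.

import os
import torch.distributed as dist
import torch.multiprocessing as mp

def run(rank, size):
    # Placeholder for the distributed routine each process will execute.
    pass

def init_process(rank, size, fn, backend="gloo"):
    # Every process joins the same process group before running its routine.
    # MASTER_ADDR and MASTER_PORT values here are illustrative assumptions.
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"
    dist.init_process_group(backend, rank=rank, world_size=size)
    fn(rank, size)

if __name__ == "__main__":
    world_size = 2
    mp.set_start_method("spawn")
    processes = []
    for rank in range(world_size):
        p = mp.Process(target=init_process, args=(rank, world_size, run))
        p.start()
        processes.append(p)
    for p in processes:
        p.join()

Each spawned process joins the same process group before calling run, so any rank can later exchange tensors with any other.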

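The corrected phrase "message passing semantics" can also be made concrete: once the group is initialized, any rank can address any other directly. Here is a hedged sketch of blocking point-to-point communication, assuming a two-process group and that this run replaces the placeholder above.

import torch
import torch.distributed as dist

def run(rank, size):
    tensor = torch.zeros(1)
    if rank == 0:
        tensor += 1
        # Rank 0 sends the tensor to rank 1.
        dist.send(tensor=tensor, dst=1)
    else:
        # Rank 1 blocks until the tensor arrives from rank 0.
        dist.recv(tensor=tensor, src=0)
    print(f"Rank {rank} has data {tensor[0]}")

Both calls block until the transfer completes, which is the point-to-point flavor of the message passing the patched passage describes.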