1 file changed, 3 insertions(+), 3 deletions(-)

@@ -24,18 +24,18 @@ Setup
 The distributed package included in PyTorch (i.e.,
 ``torch.distributed``) enables researchers and practitioners to easily
 parallelize their computations across processes and clusters of
-machines. To do so, it leverages messaging passing semantics
+machines. To do so, it leverages message passing semantics
 allowing each process to communicate data to any of the other processes.
 As opposed to the multiprocessing (``torch.multiprocessing``) package,
 processes can use different communication backends and are not
 restricted to being executed on the same machine.
 
 In order to get started we need the ability to run multiple processes
 simultaneously. If you have access to compute cluster you should check
-with your local sysadmin or use your favorite coordination tool. (e.g.,
+with your local sysadmin or use your favorite coordination tool (e.g.,
 `pdsh <https://linux.die.net/man/1/pdsh>`__,
 `clustershell <https://cea-hpc.github.io/clustershell/>`__, or
-`others <https://slurm.schedmd.com/>`__) For the purpose of this
+`others <https://slurm.schedmd.com/>`__). For the purpose of this
 tutorial, we will use a single machine and fork multiple processes using
 the following template.
 
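The "following template" referenced at the end of the hunk is not included in this diff. A minimal sketch of such a single-machine, multi-process template, assuming the Gloo backend, a localhost rendezvous address, and a placeholder ``run`` function, might look like::

    import os
    import torch.distributed as dist
    import torch.multiprocessing as mp


    def run(rank, size):
        # Placeholder: the distributed logic each process executes.
        pass


    def init_process(rank, size, fn, backend="gloo"):
        # Every process rendezvous at the same master address/port
        # (assumed values; any free port on the machine works).
        os.environ["MASTER_ADDR"] = "127.0.0.1"
        os.environ["MASTER_PORT"] = "29500"
        dist.init_process_group(backend, rank=rank, world_size=size)
        fn(rank, size)


    if __name__ == "__main__":
        world_size = 2
        # "spawn" is used here for portability; the prose says "fork",
        # which is the POSIX default but unsafe with some backends.
        mp.set_start_method("spawn")
        processes = []
        for rank in range(world_size):
            p = mp.Process(target=init_process, args=(rank, world_size, run))
            p.start()
            processes.append(p)
        for p in processes:
            p.join()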
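To illustrate the message passing semantics that the first changed line refers to, a blocking point-to-point exchange between two ranks could fill in the ``run`` placeholder above. This is a sketch, not the tutorial's own example::

    import torch
    import torch.distributed as dist


    def run(rank, size):
        tensor = torch.zeros(1)
        if rank == 0:
            tensor += 1
            # Rank 0 sends the tensor to rank 1 (blocks until delivered).
            dist.send(tensor=tensor, dst=1)
        else:
            # Rank 1 blocks until the data from rank 0 arrives.
            dist.recv(tensor=tensor, src=0)
        print(f"Rank {rank} has data {tensor[0]}")

Because each process can address any other by rank, the same ``send``/``recv`` pair works unchanged whether the processes share a machine or are spread across a cluster.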