2 files changed, +2 -2 lines changed

@@ -186,7 +186,7 @@ RPC Tutorials are listed below:
    tutorial borrows the spirit of
    `HogWild! training <https://people.eecs.berkeley.edu/~brecht/papers/hogwildTR.pdf >`__
    and applies it to an asynchronous parameter server (PS) training application.
-3. The `Distributed Pipeliine Parallelism Using RPC <../intermediate/dist_pipeline_parallel_tutorial.html >`__
+3. The `Distributed Pipeline Parallelism Using RPC <../intermediate/dist_pipeline_parallel_tutorial.html >`__
    tutorial extends the single-machine pipeline parallel example (presented in
    `Single-Machine Model Parallel Best Practices <../intermediate/model_parallel_tutorial.html >`__)
    to a distributed environment and shows how to implement it using RPC.
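For context on the tutorial whose title is fixed above: it builds a pipeline by placing model shards on separate RPC workers and chaining them through RRefs. The sketch below is not code from this PR or from the tutorial; it only illustrates the torch.distributed.rpc pattern involved. The Shard module, worker names, and tensor sizes are illustrative assumptions, and each role runs as its own process with MASTER_ADDR/MASTER_PORT set.

# Minimal sketch (assumptions noted above), not the tutorial's implementation.
import os
import torch
import torch.nn as nn
import torch.distributed.rpc as rpc

class Shard(nn.Module):
    """One pipeline stage, constructed and executed on its owning worker."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.net = nn.Linear(in_features, out_features)

    def forward(self, x):
        return self.net(x)

def run(rank, world_size=3):
    name = "driver" if rank == 0 else f"worker{rank}"
    rpc.init_rpc(name, rank=rank, world_size=world_size)
    if rank == 0:
        # Build each stage remotely; rpc.remote returns an RRef to the instance.
        stage1 = rpc.remote("worker1", Shard, args=(16, 8))
        stage2 = rpc.remote("worker2", Shard, args=(8, 4))
        x = torch.randn(2, 16)
        # Chain the stages: each call runs forward() on the stage's owning worker.
        out = stage2.rpc_sync().forward(stage1.rpc_sync().forward(x))
        print(out.shape)  # torch.Size([2, 4])
    rpc.shutdown()  # workers block here serving requests until the driver finishes

if __name__ == "__main__":
    run(int(os.environ["RANK"]))  # launch three processes with RANK=0,1,2

The tutorial itself goes further (passing RRefs between stages and using a distributed optimizer); this sketch only shows the remote-module-plus-RRef call chain.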
@@ -6,7 +6,7 @@ Prerequisites:
 
 - `PyTorch Distributed Overview <../beginner/dist_overview.html >`__
 - `DistributedDataParallel API documents <https://pytorch.org/docs/master/generated/torch.nn.parallel.DistributedDataParallel.html >`__
-_ `DistributedDataParallel notes <https://pytorch.org/docs/master/notes/ddp.html >`__
+- `DistributedDataParallel notes <https://pytorch.org/docs/master/notes/ddp.html >`__
 
 
 `DistributedDataParallel <https://pytorch.org/docs/stable/nn.html#torch.nn.parallel.DistributedDataParallel >`__
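For context on the prerequisite whose bullet is fixed here, a minimal DistributedDataParallel sketch follows. It is not code from this PR or the tutorial; the toy Linear model, the gloo backend, and the step count are illustrative assumptions, with one process launched per rank.

# Minimal sketch (assumptions noted above), not the tutorial's implementation.
import torch
import torch.distributed as dist
import torch.nn as nn
import torch.optim as optim
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK/WORLD_SIZE/MASTER_ADDR/MASTER_PORT for env:// rendezvous.
    dist.init_process_group(backend="gloo")

    model = nn.Linear(10, 5)            # toy model standing in for a real network
    ddp_model = DDP(model)              # wraps the model; gradients sync across ranks
    optimizer = optim.SGD(ddp_model.parameters(), lr=0.01)

    for _ in range(3):                  # a few illustrative training steps
        optimizer.zero_grad()
        out = ddp_model(torch.randn(20, 10))
        loss = out.sum()
        loss.backward()                 # DDP all-reduces gradients during backward
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()

Launched with, e.g., torchrun --nproc_per_node=2 on this script, each process trains its own replica and DDP averages gradients during backward() so the replicas stay in sync.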