3 files changed: +654 -1

@@ -203,6 +203,11 @@ Parallel and Distributed Training
     :description: :doc:`/intermediate/dist_tuto`
     :figure: _static/img/distributed/DistPyTorch.jpg
 
+.. customgalleryitem::
+   :tooltip: Getting Started with Distributed RPC Framework
+   :description: :doc:`/intermediate/rpc_tutorial`
+   :figure: _static/img/distributed/DistPyTorch.jpg
+
 .. customgalleryitem::
    :tooltip: PyTorch distributed trainer with Amazon AWS
    :description: :doc:`/beginner/aws_distributed_training_tutorial`
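The gallery entry added above links to the new Distributed RPC Framework tutorial. As a rough sketch of the API that tutorial introduces (not part of this diff; the worker name, port, and single-process setup are illustrative assumptions), a minimal `torch.distributed.rpc` round-trip looks like:

```python
import os
import torch

# Guarded so the sketch degrades gracefully on builds without
# distributed support; on a real cluster each process would call
# init_rpc with its own rank and a shared world_size.
if torch.distributed.is_available():
    import torch.distributed.rpc as rpc
    os.environ.setdefault('MASTER_ADDR', 'localhost')
    os.environ.setdefault('MASTER_PORT', '29507')  # illustrative port
    # A single-process "cluster": worker0 issues an RPC to itself.
    rpc.init_rpc("worker0", rank=0, world_size=1)
    ret = rpc.rpc_sync("worker0", torch.add, args=(torch.ones(2), 3))
    rpc.shutdown()
else:
    # Fallback: run the same computation locally.
    ret = torch.add(torch.ones(2), 3)
print(ret)
```

`rpc_sync` blocks until the remote call returns; the tutorial this diff links also covers the non-blocking `rpc_async` and `remote` variants.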
@@ -377,6 +382,7 @@ PyTorch Fundamentals In-Depth
    intermediate/model_parallel_tutorial
    intermediate/ddp_tutorial
    intermediate/dist_tuto
+   intermediate/rpc_tutorial
    beginner/aws_distributed_training_tutorial
 
 .. toctree::
Original file line number Diff line number Diff line change 1
1
# -*- coding: utf-8 -*-
2
2
"""
3
- Model Parallel Best Practices
3
+ Single-Machine Model Parallel Best Practices
4
4
================================
5
5
**Author**: `Shen Li <https://mrshenli.github.io/>`_
6
6
27
27
of model parallel. It is up to the readers to apply the ideas to real-world
28
28
applications.
29
29
30
+ .. note::
31
+
32
+ For distributed model parallel training where a model spans multiple
33
+ servers, please refer to
34
+ `Getting Started With Distributed RPC Framework <rpc_tutorial.html>`__
35
+ for examples and details.
36
+
30
37
Basic Usage
31
38
-----------
32
39
"""
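The retitled tutorial covers the single-machine case: splitting one model's layers across devices and moving activations between them. As an illustrative sketch of that pattern (not taken from this diff; the layer sizes are arbitrary, and both devices are pinned to CPU here so the example runs anywhere, whereas the tutorial would use 'cuda:0' and 'cuda:1'):

```python
import torch
import torch.nn as nn

# Both halves on CPU so the sketch is portable; on a two-GPU machine
# these would be torch.device('cuda:0') and torch.device('cuda:1').
dev0 = torch.device('cpu')
dev1 = torch.device('cpu')

class ToyModelParallel(nn.Module):
    """A two-layer model with each layer placed on its own device."""
    def __init__(self):
        super().__init__()
        self.net1 = nn.Linear(10, 10).to(dev0)  # first half on device 0
        self.relu = nn.ReLU()
        self.net2 = nn.Linear(10, 5).to(dev1)   # second half on device 1

    def forward(self, x):
        x = self.relu(self.net1(x.to(dev0)))
        # Explicitly move the intermediate activation to the second device.
        return self.net2(x.to(dev1))

model = ToyModelParallel()
out = model(torch.randn(20, 10))
print(out.shape)  # torch.Size([20, 5])
```

Unlike `DistributedDataParallel`, which replicates the whole model on each device, this places *different* sub-modules on different devices, so a model too large for one device's memory can still run.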