Commit c60a6fc

Merge branch 'main' into patch-1
2 parents 802d686 + 1b011fb

File tree

15 files changed: +1010 -159 lines

.devcontainer/requirements.txt

Lines changed: 1 addition & 1 deletion

```diff
@@ -24,7 +24,7 @@ ipython
 # to run examples
 pandas
 scikit-image
-pillow==10.0.1
+pillow==10.2.0
 wget
 
 # for codespaces env
```
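As an aside to the pin bump above, version pins like this have to be compared numerically, not as strings. A minimal sketch (not part of the commit; the helper name is invented here):

```python
# Illustrative helper (not from the repo): check an installed version
# against the new pin, comparing dotted versions numerically, not lexically.
def meets_pin(installed: str, pinned: str = "10.2.0") -> bool:
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(pinned)

print(meets_pin("10.0.1"))   # the old pin no longer satisfies the new one: False
print(meets_pin("10.10.0"))  # lexically "smaller" than 10.2.0, numerically newer: True
```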

CONTRIBUTING.md

Lines changed: 17 additions & 19 deletions

```diff
@@ -65,7 +65,7 @@ There are three types of tutorial content that we host on
 code in these tutorials is run every time they are built. To keep
 these tutorials up and running all their package dependencies need to
 be resolved--which makes it more challenging to maintain this type of
-tutorial.
+tutorial.
 
 * **Non-interactive tutorials** are authored and submitted as
   reStructuredText files. The build system only converts them into HTML;
@@ -80,18 +80,16 @@ There are three types of tutorial content that we host on
 non-interactive.
 
 
-# Managing data that is used by your tutorial
+# Managing data that is used by your tutorial
 
 Your tutorial might depend on external data, such as pre-trained models,
 training data, or test data. We recommend storing this data in a
 commonly-used storage service, such as Amazon S3, and instructing your
-users to download the data at the beginning of your tutorial.
-
-The
-[Makefile](https://github.com/pytorch/tutorials/blob/main/Makefile)
-that we use to build the tutorials contains automation that downloads
-required data files.
+users to download the data at the beginning of your tutorial.
 
+To download your data add a function to the [download.py](https://github.com/pytorch/tutorials/blob/main/.jenkins/download_data.py)
+script. Follow the same pattern as other download functions.
+Please do not add download logic to `Makefile` as it will incur download overhead for all CI shards.
 
 # Python packages used by your tutorial
 
```
````diff
@@ -104,7 +102,7 @@ tutorial fails to build in our Continuous Integration (CI) system, we
 might contact you in order to resolve the issue.
 
 
-# Deprecation of tutorials
+# Deprecation of tutorials
 
 Under some circumstances, we might deprecate--and subsequently
 archive--a tutorial removing it from the site. For example, if the
@@ -137,7 +135,7 @@ end-to-end understanding of how to use PyTorch. Recipes are scoped
 examples of how to use specific features; the goal of a recipe is to
 teach readers how to easily leverage features of PyTorch for their
 needs. Tutorials and recipes are always _actionable_. If the material is
-purely informative, consider adding it to the API docs instead.
+purely informative, consider adding it to the API docs instead.
 
 View our current [full-length tutorials](https://pytorch.org/tutorials/).
 
@@ -165,11 +163,11 @@ Write for a global audience with an instructive and directive voice.
 - PyTorch has a global audience; use clear, easy to understand
   language. Avoid idioms or other figures of speech.
 - To keep your instructions concise, use
-  [active voice](https://writing.wisc.edu/handbook/style/ccs_activevoice/) as much as possible.
-- For a short guide on the essentials of writing style,
+  [active voice](https://writing.wisc.edu/handbook/style/ccs_activevoice/) as much as possible.
+- For a short guide on the essentials of writing style,
   [The Elements of Style](https://www.gutenberg.org/files/37134/37134-h/37134-h.htm)
   is invaluable.
-- For extensive guidance on technical-writing style, the Google developer documentation
+- For extensive guidance on technical-writing style, the Google developer documentation
   [google style](https://developers.google.com/style)
   is a great resource.
 - Think of the process as similar to creating a (really practical)
@@ -195,7 +193,7 @@ We recommend that tutorials use the following structure which guides users throu
 1. Step-by-step instructions. Ideally, the steps in the tutorial should
    map back to the learning objectives. Consider adding comments in the
    code that correspond to these steps and that help to clarify what
-   each section of the code is doing.
+   each section of the code is doing.
 1. Link to relevant [PyTorch
    documentation](https://pytorch.org/docs/stable/index.html). This
    helps readers have context for the tutorial source code and better
@@ -230,7 +228,7 @@ as a guide.
 Submit your tutorial as either a Python (`.py`) file or a
 reStructuredText (`.rst`) file. For Python files, the filename for your
 tutorial should end in "`_tutorial.py`"; for example,
-"`cool_pytorch_feature_tutorial.py`".
+"`cool_pytorch_feature_tutorial.py`".
 
 Do not submit a Jupyter notebook. If you develop your tutorial in
 Jupyter, you'll need to convert it to Python. This
@@ -276,8 +274,8 @@ search, you need to include it in `index.rst`, or for recipes, in
    :header: Learn the Basics # Tutorial title
    :card_description: A step-by-step guide to building a complete ML workflow with PyTorch. # Short description
    :image: _static/img/thumbnails/cropped/60-min-blitz.png # Image that appears with the card
-   :link: beginner/basics/intro.html
-   :tags: Getting-Started
+   :link: beginner/basics/intro.html
+   :tags: Getting-Started
 ```
 
 
@@ -340,7 +338,7 @@ test your tutorial when you submit your PR.
 
 
 ## Submit the PR ##
-
+
 NOTE: Please do not use [ghstack](https://github.com/ezyang/ghstack). We
 do not support ghstack in the [`pytorch/tutorials`](https://github.com/pytorch/tutorials) repo.
 
@@ -368,5 +366,5 @@ build. You can see an example Netlify preview at the following URL:
 
 ## Do not merge the PR yourself ##
 
-Please **DO NOT MERGE** your own PR; the tutorial won't be published. In order to avoid potential build breaks with the tutorials site, only certain maintainers can authorize publishing.
+Please **DO NOT MERGE** your own PR; the tutorial won't be published. In order to avoid potential build breaks with the tutorials site, only certain maintainers can authorize publishing.
````

advanced_source/coding_ddpg.py

Lines changed: 8 additions & 10 deletions

```diff
@@ -58,26 +58,24 @@
 # Imports and setup
 # -----------------
 #
+# .. code-block:: bash
+#
+#    %%bash
+#    pip3 install torchrl mujoco glfw
 
 import torchrl
+import torch
+import tqdm
+from typing import Tuple
 
 # sphinx_gallery_start_ignore
 import warnings
-from typing import Tuple
-
 warnings.filterwarnings("ignore")
 # sphinx_gallery_end_ignore
 
-import torch.cuda
-import tqdm
-
-import torch.multiprocessing
-
 ###############################################################################
 # We will execute the policy on CUDA if available
-device = (
-    torch.device("cpu") if torch.cuda.device_count() == 0 else torch.device("cuda:0")
-)
+device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
 collector_device = torch.device("cpu") # Change the device to ``cuda`` to use CUDA
 
 ###############################################################################
```
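The device-selection change above is a pure simplification. A tiny torch-free sketch (the function names are invented here) showing the old `device_count()`-based condition and the new `is_available()`-based one agree whenever a nonzero device count coincides with CUDA being available, which is the typical case:

```python
# Torch-free model of the two conditions from the diff above.
def old_style(cuda_device_count: int) -> str:
    # old form: "cpu" if torch.cuda.device_count() == 0 else "cuda:0"
    return "cpu" if cuda_device_count == 0 else "cuda:0"

def new_style(cuda_is_available: bool) -> str:
    # new form: "cuda:0" if torch.cuda.is_available() else "cpu"
    return "cuda:0" if cuda_is_available else "cpu"

for count in (0, 1, 8):
    assert old_style(count) == new_style(count > 0)
print("equivalent for device counts", (0, 1, 8))
```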

distributed/home.rst

Lines changed: 19 additions & 1 deletion

```diff
@@ -13,6 +13,7 @@ PyTorch with each method having their advantages in certain use cases:
 
 * `DistributedDataParallel (DDP) <#learn-ddp>`__
 * `Fully Sharded Data Parallel (FSDP) <#learn-fsdp>`__
+* `Device Mesh <#device-mesh>`__
 * `Remote Procedure Call (RPC) distributed training <#learn-rpc>`__
 * `Custom Extensions <#custom-extensions>`__
 
@@ -51,7 +52,7 @@ Learn DDP
       :link: https://pytorch.org/tutorials/advanced/generic_join.html?utm_source=distr_landing&utm_medium=generic_join
       :link-type: url
 
-      This tutorial describes the Join context manager and
+      This tutorial describes the Join context manager and
       demonstrates it's use with DistributedData Parallel.
       +++
       :octicon:`code;1em` Code
@@ -83,6 +84,23 @@ Learn FSDP
       +++
       :octicon:`code;1em` Code
 
+.. _device-mesh:
+
+Learn DeviceMesh
+----------------
+
+.. grid:: 3
+
+   .. grid-item-card:: :octicon:`file-code;1em`
+      Getting Started with DeviceMesh
+      :link: https://pytorch.org/tutorials/recipes/distributed_device_mesh.html?highlight=devicemesh
+      :link-type: url
+
+      In this tutorial you will learn about `DeviceMesh`
+      and how it can help with distributed training.
+      +++
+      :octicon:`code;1em` Code
+
 .. _learn-rpc:
 
 Learn RPC
```
