Commit 83d4665

Minor editorial and formatting fixes
1 parent: 70c7434


intermediate_source/compiled_autograd_tutorial.rst

Lines changed: 10 additions & 10 deletions
@@ -162,11 +162,11 @@ You can use different compiler configs for the two compilations, for example, th

.. code:: python

-    def train(model, x):
-        model = torch.compile(model)
-        loss = model(x).sum()
-        torch._dynamo.config.compiled_autograd = True
-        torch.compile(lambda: loss.backward(), fullgraph=True)()
+    def train(model, x):
+        model = torch.compile(model)
+        loss = model(x).sum()
+        torch._dynamo.config.compiled_autograd = True
+        torch.compile(lambda: loss.backward(), fullgraph=True)()

Or you can use the context manager, which will apply to all autograd calls within its scope.
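
For reference on that last context line, a minimal sketch of the context-manager form, reusing the ``torch._dynamo.compiled_autograd.enable`` call that appears later in this diff; the ``aot_eager`` backend and the tiny model are illustrative choices, not part of this commit.

.. code:: python

    import torch

    model = torch.nn.Linear(4, 4)   # illustrative model
    x = torch.randn(2, 4)

    # every backward() issued inside this scope is handled by Compiled Autograd
    with torch._dynamo.compiled_autograd.enable(torch.compile(backend="aot_eager")):
        loss = model(x).sum()
        loss.backward()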

@@ -213,8 +213,8 @@ Compiled Autograd addresses certain limitations of AOTAutograd
    assert(torch._dynamo.utils.counters["stats"]["unique_graphs"] == 1)


-In the ``1. base torch.compile`` case, we see that 3 backward graphs were produced due to the 2 graph breaks in the compiled function ``fn``.
-Whereas in ``2. torch.compile with compiled autograd``, we see that a full backward graph was traced despite the graph breaks.
+In the first ``torch.compile`` case, we see that 3 backward graphs were produced due to the 2 graph breaks in the compiled function ``fn``.
+Whereas in the second ``torch.compile`` with compiled autograd case, we see that a full backward graph was traced despite the graph breaks.

2. Backward hooks are not captured
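
To make the comparison in those two sentences concrete, a rough sketch of the setup they describe: a compiled function with two explicit graph breaks whose backward is then compiled with Compiled Autograd. Only ``fn``, the config flag, and the backward call mirror the diff; the function body and shapes are illustrative.

.. code:: python

    import torch

    @torch.compile
    def fn(x):
        # two graph breaks split the compiled forward into three graphs
        temp = x + 10
        torch._dynamo.graph_break()
        temp = temp + 10
        torch._dynamo.graph_break()
        return temp.sum()

    x = torch.randn(10, 10, requires_grad=True)
    loss = fn(x)

    # with compiled autograd, a single full backward graph is traced
    # despite the graph breaks in the forward
    torch._dynamo.config.compiled_autograd = True
    torch.compile(lambda: loss.backward(), fullgraph=True)()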

@@ -231,7 +231,7 @@ Whereas in ``2. torch.compile with compiled autograd``, we see that a full backw
    with torch._dynamo.compiled_autograd.enable(torch.compile(backend="aot_eager")):
        loss.backward()

-There should be a ``call_hook`` node in the graph, which dynamo will later inline into
+There should be a ``call_hook`` node in the graph, which dynamo will later inline into the following:

.. code:: python

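The hook example that this prose refers to is not visible in the hunk, so here is a rough sketch of the scenario (a backward hook, which base AOTAutograd would not capture), assuming a plain tensor hook; the tensor, hook body, and shapes are illustrative.

.. code:: python

    import torch

    x = torch.randn(10, 10, requires_grad=True)

    # a backward hook; compiled autograd records it as a call_hook node
    x.register_hook(lambda grad: grad * 2)

    loss = x.sum()
    with torch._dynamo.compiled_autograd.enable(torch.compile(backend="aot_eager")):
        loss.backward()
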
@@ -249,7 +249,7 @@ There should be a ``call_hook`` node in the graph, which dynamo will later inlin

Common recompilation reasons for Compiled Autograd
--------------------------------------------------
-1. Due to changes in the autograd structure of the loss value
+1. Due to changes in the autograd structure of the loss value:

.. code:: python

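As a rough illustration of recompilation reason 1 (the tutorial's own example sits under the ``.. code:: python`` directive above), here is a sketch in which each iteration calls a different operator, as the next hunk's context describes, so ``loss`` carries a different autograd history each time; the operators and shapes are illustrative.

.. code:: python

    import torch

    torch._dynamo.config.compiled_autograd = True
    x = torch.randn(10, 10, requires_grad=True)

    # a different operator each iteration -> a different autograd graph -> recompile
    for op in [torch.mean, torch.sum]:
        loss = op(x)
        torch.compile(lambda: loss.backward(), fullgraph=True)()
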
@@ -274,7 +274,7 @@ In the example above, we call a different operator on each iteration, leading to
    ...
    """

-2. Due to tensors changing shapes
+2. Due to tensors changing shapes:

.. code:: python
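
Similarly, an illustrative sketch for recompilation reason 2, where only the input shape changes between iterations (the tutorial's real example again follows the directive above); the shapes and the reduction are illustrative.

.. code:: python

    import torch

    torch._dynamo.config.compiled_autograd = True

    # a new input shape changes the backward graph and triggers a recompile
    for shape in [(10, 10), (20, 20)]:
        x = torch.randn(*shape, requires_grad=True)
        loss = x.sum()
        torch.compile(lambda: loss.backward(), fullgraph=True)()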
