
Commit ff0cfa1

rht and holly1238 authored
beginner/blitz/nn: Fix misleading typo on which term to be differentiated against (#726)
Co-authored-by: holly1238 <77758406+holly1238@users.noreply.github.com>
1 parent 424f027 commit ff0cfa1

File tree

1 file changed: +3 −2 lines changed


beginner_source/blitz/neural_networks_tutorial.py

Lines changed: 3 additions & 2 deletions
@@ -176,8 +176,9 @@ def num_flat_features(self, x):
 # -> loss
 #
 # So, when we call ``loss.backward()``, the whole graph is differentiated
-# w.r.t. the loss, and all Tensors in the graph that have ``requires_grad=True``
-# will have their ``.grad`` Tensor accumulated with the gradient.
+# w.r.t. the neural net parameters, and all Tensors in the graph that have
+# ``requires_grad=True`` will have their ``.grad`` Tensor accumulated with the
+# gradient.
 #
 # For illustration, let us follow a few steps backward:
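The corrected comment describes standard PyTorch autograd behavior: `loss.backward()` differentiates the graph and accumulates gradients into the `.grad` attribute of every Tensor with `requires_grad=True`. A minimal sketch of that behavior, using a small `torch.nn.Linear` module rather than the tutorial's full `Net` class:

```python
import torch
import torch.nn.functional as F

# A small hypothetical model; its parameters have requires_grad=True by default.
lin = torch.nn.Linear(4, 1)
x = torch.randn(2, 4)
target = torch.randn(2, 1)

loss = F.mse_loss(lin(x), target)

lin.zero_grad()   # clear any previously accumulated gradients
loss.backward()   # gradients are *accumulated* into each parameter's .grad

# The weight of Linear(4, 1) has shape (1, 4), and so does its gradient.
print(lin.weight.grad.shape)  # torch.Size([1, 4])
```

Because gradients accumulate rather than overwrite, training loops call `zero_grad()` before each backward pass, as the surrounding tutorial text goes on to show.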
