Commit fa83e33

tholop and holly1238 authored
Fix typo. (#1082)
Co-authored-by: holly1238 <77758406+holly1238@users.noreply.github.com>
1 parent f35c04a · commit fa83e33

1 file changed: +1 −1 lines changed

beginner_source/blitz/neural_networks_tutorial.py

Lines changed: 1 addition & 1 deletion
@@ -176,7 +176,7 @@ def num_flat_features(self, x):
 # -> loss
 #
 # So, when we call ``loss.backward()``, the whole graph is differentiated
-# w.r.t. the loss, and all Tensors in the graph that has ``requires_grad=True``
+# w.r.t. the loss, and all Tensors in the graph that have ``requires_grad=True``
 # will have their ``.grad`` Tensor accumulated with the gradient.
 #
 # For illustration, let us follow a few steps backward:
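
For context, the sentence being corrected describes autograd's accumulation behavior. A minimal sketch of that behavior (illustrative only, not part of this commit; the tensor names x and w are made up):

import torch

# Any tensor with requires_grad=True participates in the graph and
# accumulates gradients into its .grad attribute on loss.backward().
x = torch.randn(3, requires_grad=True)
w = torch.randn(3, requires_grad=True)

loss = (w * x).sum()
loss.backward()
print(w.grad)          # d(loss)/dw == x

# .grad is accumulated, not overwritten: a second backward pass over a
# freshly built graph adds to the existing gradient.
(w * x).sum().backward()
print(w.grad)          # now 2 * x; call w.grad.zero_() to reset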
