1 parent 6fa035e commit f2579c6
beginner_source/introyt/autogradyt_tutorial.py
@@ -269,7 +269,7 @@ def forward(self, x):
 ##########################################################################
 # One thing you might notice is that we never specify
 # ``requires_grad=True`` for the model’s layers. Within a subclass of
-# ``torch.nn.module``, it’s assumed that we want to track gradients on the
+# ``torch.nn.Module``, it’s assumed that we want to track gradients on the
 # layers’ weights for learning.
 #
 # If we look at the layers of the model, we can examine the values of the