2 files changed: +3 −3 lines
File 1:

@@ -58,7 +58,7 @@
 # A function that we apply to tensors to construct computational graph is
 # in fact an object of class ``Function``. This object knows how to
 # compute the function in the *forward* direction, and also how to compute
-# it's derivative during the *backward propagation* step. A reference to
+# its derivative during the *backward propagation* step. A reference to
 # the backward propagation function is stored in ``grad_fn`` property of a
 # tensor. You can find more information of ``Function`` `in the
 # documentation <https://pytorch.org/docs/stable/autograd.html#function>`__.
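The context lines above describe how ``grad_fn`` links a tensor back to the ``Function`` that produced it. A minimal sketch of that behavior, assuming PyTorch is installed (the tensor values here are illustrative, not from the tutorial):

```python
import torch

# Tensors with requires_grad=True record the Function that produced them.
x = torch.ones(3, requires_grad=True)
y = x * 2          # produced by a multiplication Function
z = y.sum()        # produced by a summation Function

# grad_fn on each non-leaf tensor references its backward function.
print(y.grad_fn)   # a MulBackward-style Function object
print(z.grad_fn)   # a SumBackward-style Function object

# Backward propagation walks these Function objects to compute gradients.
z.backward()
print(x.grad)      # dz/dx = 2 for each element
```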
File 2:

@@ -67,7 +67,7 @@ def forward(self, x):
 
 ##############################################
 # We create an instance of ``NeuralNetwork``, and move it to the ``device``, and print
-# it's structure.
+# its structure.
 
 model = NeuralNetwork().to(device)
 print(model)
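The ``NeuralNetwork`` class itself is not part of this diff, so its body is not shown here. A hypothetical minimal definition in the same style, just to make the ``model = NeuralNetwork().to(device)`` and ``print(model)`` lines runnable (layer sizes are assumptions, not from the diff):

```python
import torch
from torch import nn

# Hypothetical reconstruction: the diff does not show the class body.
class NeuralNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        self.flatten = nn.Flatten()
        self.linear_relu_stack = nn.Sequential(
            nn.Linear(28 * 28, 512),
            nn.ReLU(),
            nn.Linear(512, 10),
        )

    def forward(self, x):
        return self.linear_relu_stack(self.flatten(x))

device = "cuda" if torch.cuda.is_available() else "cpu"
model = NeuralNetwork().to(device)
print(model)  # prints the nested module structure, layer by layer
```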
@@ -119,7 +119,7 @@ def forward(self, x):
 # nn.Linear
 # ^^^^^^^^^^^^^^^^^^^^^^
 # The `linear layer <https://pytorch.org/docs/stable/generated/torch.nn.Linear.html>`_
-# is a module that applies a linear transformation on the input using it's stored weights and biases.
+# is a module that applies a linear transformation on the input using its stored weights and biases.
 #
 layer1 = nn.Linear(in_features=28 * 28, out_features=20)
 hidden1 = layer1(flat_image)
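The hunk above mentions the stored weights and biases of ``nn.Linear``. A short self-contained sketch of what that layer holds and computes (the ``flat_image`` batch here is random stand-in data, not the tutorial's):

```python
import torch
from torch import nn

# nn.Linear stores a weight matrix and a bias vector as Parameters.
layer1 = nn.Linear(in_features=28 * 28, out_features=20)
print(layer1.weight.shape)  # torch.Size([20, 784])
print(layer1.bias.shape)    # torch.Size([20])

# Applying the layer computes x @ weight.T + bias.
flat_image = torch.rand(3, 28 * 28)  # stand-in batch of 3 flattened images
hidden1 = layer1(flat_image)
print(hidden1.shape)                 # torch.Size([3, 20])
```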