
Commit 669107c

Fix two usage errors: its vs it's (#1484)
Co-authored-by: holly1238 <77758406+holly1238@users.noreply.github.com>
1 parent a06b381 commit 669107c


2 files changed: +3 -3 lines changed


beginner_source/basics/autogradqs_tutorial.py

Lines changed: 1 addition & 1 deletion
@@ -58,7 +58,7 @@
 # A function that we apply to tensors to construct computational graph is
 # in fact an object of class ``Function``. This object knows how to
 # compute the function in the *forward* direction, and also how to compute
-# it's derivative during the *backward propagation* step. A reference to
+# its derivative during the *backward propagation* step. A reference to
 # the backward propagation function is stored in ``grad_fn`` property of a
 # tensor. You can find more information of ``Function`` `in the
 # documentation <https://pytorch.org/docs/stable/autograd.html#function>`__.
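
For reference, the sentence corrected above describes how autograd stores a reference to the backward function in ``grad_fn``. A minimal sketch of inspecting that behaviour might look like the following; it is not part of this commit, and the tensors ``x``, ``w``, ``b``, and ``z`` are stand-ins chosen to mirror the surrounding tutorial.

# Minimal sketch (not from the diff): the object that produced ``z`` is
# recorded in ``z.grad_fn`` and computes the derivative during backward.
import torch

x = torch.ones(5)                          # input tensor
w = torch.randn(5, 3, requires_grad=True)  # weights
b = torch.randn(3, requires_grad=True)     # biases
z = torch.matmul(x, w) + b                 # built by Function objects

print(z.grad_fn)    # reference to the backward function, e.g. AddBackward0
z.sum().backward()  # runs backward propagation
print(w.grad)       # gradients computed by the stored backward functions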

beginner_source/basics/buildmodel_tutorial.py

Lines changed: 2 additions & 2 deletions
@@ -67,7 +67,7 @@ def forward(self, x):
 
 ##############################################
 # We create an instance of ``NeuralNetwork``, and move it to the ``device``, and print
-# it's structure.
+# its structure.
 
 model = NeuralNetwork().to(device)
 print(model)
@@ -119,7 +119,7 @@ def forward(self, x):
 # nn.Linear
 # ^^^^^^^^^^^^^^^^^^^^^^
 # The `linear layer <https://pytorch.org/docs/stable/generated/torch.nn.Linear.html>`_
-# is a module that applies a linear transformation on the input using it's stored weights and biases.
+# is a module that applies a linear transformation on the input using its stored weights and biases.
 #
 layer1 = nn.Linear(in_features=28*28, out_features=20)
 hidden1 = layer1(flat_image)
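
For context, the second corrected sentence describes ``nn.Linear`` applying a transformation with its stored weights and biases. A minimal sketch, not part of this commit, where ``flat_image`` is a random stand-in for the tutorial's flattened batch of three 28x28 images:

# Minimal sketch (not from the diff): a linear layer and its stored parameters.
import torch
from torch import nn

flat_image = torch.rand(3, 28 * 28)                      # batch of 3 flattened images
layer1 = nn.Linear(in_features=28*28, out_features=20)   # y = x @ W.T + b
hidden1 = layer1(flat_image)

print(hidden1.size())        # torch.Size([3, 20])
print(layer1.weight.size())  # torch.Size([20, 784]) -- the stored weights
print(layer1.bias.size())    # torch.Size([20])      -- the stored biases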
