Commit a9656f2

Update Tensor Gradients and Jacobian Products example (#2071)
The shape of the `gradient` argument of `x.backward` should match that of `x`. Modify the example to make the input tensor a 4x5 matrix and transpose it in the output. Fixes #2065
1 parent 1a946b0 commit a9656f2

1 file changed: +5 −5 lines

beginner_source/basics/autogradqs_tutorial.py

Lines changed: 5 additions & 5 deletions
@@ -203,14 +203,14 @@
 # compute the product:
 #
 
-inp = torch.eye(5, requires_grad=True)
-out = (inp+1).pow(2)
-out.backward(torch.ones_like(inp), retain_graph=True)
+inp = torch.eye(4, 5, requires_grad=True)
+out = (inp+1).pow(2).t()
+out.backward(torch.ones_like(out), retain_graph=True)
 print(f"First call\n{inp.grad}")
-out.backward(torch.ones_like(inp), retain_graph=True)
+out.backward(torch.ones_like(out), retain_graph=True)
 print(f"\nSecond call\n{inp.grad}")
 inp.grad.zero_()
-out.backward(torch.ones_like(inp), retain_graph=True)
+out.backward(torch.ones_like(out), retain_graph=True)
 print(f"\nCall after zeroing gradients\n{inp.grad}")
 
 
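For reference, here is the updated example assembled into a standalone, runnable snippet. The code is taken from the new side of the diff; only the `import torch` line and the explanatory comments are added here. It illustrates the point of the fix: the `gradient` tensor passed to `out.backward` must match the shape of `out` (here 5x4 after the transpose), not the shape of `inp` (4x5), and gradients accumulate in `inp.grad` until explicitly zeroed.

import torch

# Input is a 4x5 matrix; the transposed output is 5x4, so the
# `gradient` argument must be shaped like `out`, not `inp`.
inp = torch.eye(4, 5, requires_grad=True)
out = (inp + 1).pow(2).t()

# Vector-Jacobian product: pass a tensor with the same shape as `out`.
out.backward(torch.ones_like(out), retain_graph=True)
print(f"First call\n{inp.grad}")

# Repeated backward calls accumulate into inp.grad.
out.backward(torch.ones_like(out), retain_graph=True)
print(f"\nSecond call\n{inp.grad}")

# Zero the accumulated gradient before computing a fresh one.
inp.grad.zero_()
out.backward(torch.ones_like(out), retain_graph=True)
print(f"\nCall after zeroing gradients\n{inp.grad}")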