In the tutorial AUTOMATIC DIFFERENTIATION WITH TORCH.AUTOGRAD, a working toy example is presented in the section "Optional Reading: Tensor Gradients and Jacobian Products".
However, the shape of the input tensor `inp` is used when specifying the vector passed to `out.backward()`, as highlighted in the image below.
This code runs correctly only because the input and output tensors happen to have the same shape in this example. If the output tensor has a different shape, using the input's shape in the `backward()` call raises an exception. I believe it would be more instructive to use `out` in place of `inp` in the highlighted positions. This would also better match the text above the code snippet.
cc @suraj813