1 file changed: +3 −3 lines

@@ -153,7 +153,7 @@
 #######################################################################
 # This ``grad_fn`` gives us a hint that when we execute the
 # backpropagation step and compute gradients, we’ll need to compute the
-# derivative of :math:`sin(x)` for all this tensor’s inputs.
+# derivative of :math:`\sin(x)` for all this tensor’s inputs.
 #
 # Let’s perform some more computations:
 #
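The ``grad_fn`` hint described in this hunk can be inspected directly. A minimal sketch (the tensor values here are arbitrary, not taken from the tutorial):

```python
import torch

# Any differentiable op on a tensor that requires grad records a grad_fn
a = torch.linspace(0., 3., steps=4, requires_grad=True)
b = torch.sin(a)

# The recorded backward function for sin is named along the lines of "SinBackward"
print(type(b.grad_fn).__name__)
```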
@@ -222,8 +222,8 @@
 # out = d.sum()
 #
 # Adding a constant, as we did to compute ``d``, does not change the
-# derivative. That leaves :math:`c = 2 * b = 2 * sin(a)`, the derivative
-# of which should be :math:`2 * cos(a)`. Looking at the graph above,
+# derivative. That leaves :math:`c = 2 * b = 2 * \sin(a)`, the derivative
+# of which should be :math:`2 * \cos(a)`. Looking at the graph above,
 # that’s just what we see.
 #
 # Be aware that only *leaf nodes* of the computation have their gradients
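The derivative claim in the second hunk can be checked with autograd itself: backpropagating through :math:`d = 2\sin(a) + 1` should leave :math:`2\cos(a)` in ``a.grad``. A minimal sketch (tensor values are arbitrary):

```python
import math
import torch

a = torch.linspace(0., 2. * math.pi, steps=5, requires_grad=True)
b = torch.sin(a)
c = 2 * b
d = c + 1          # adding a constant does not change the derivative
out = d.sum()
out.backward()

# d(2*sin(a) + 1)/da = 2*cos(a)
print(torch.allclose(a.grad, 2 * torch.cos(a).detach()))  # True
```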