
Commit 4b2fff8

Author: Svetlana Karslioglu

Merge branch 'main' into fix-2289

2 parents 17229b4 + 3bb7d5b

File tree

1 file changed: +3 −3 lines changed

beginner_source/introyt/autogradyt_tutorial.py

Lines changed: 3 additions & 3 deletions
@@ -153,7 +153,7 @@
 #######################################################################
 # This ``grad_fn`` gives us a hint that when we execute the
 # backpropagation step and compute gradients, we’ll need to compute the
-# derivative of :math:`sin(x)` for all this tensor’s inputs.
+# derivative of :math:`\sin(x)` for all this tensor’s inputs.
 #
 # Let’s perform some more computations:
 #
@@ -222,8 +222,8 @@
 # out = d.sum()
 #
 # Adding a constant, as we did to compute ``d``, does not change the
-# derivative. That leaves :math:`c = 2 * b = 2 * sin(a)`, the derivative
-# of which should be :math:`2 * cos(a)`. Looking at the graph above,
+# derivative. That leaves :math:`c = 2 * b = 2 * \sin(a)`, the derivative
+# of which should be :math:`2 * \cos(a)`. Looking at the graph above,
 # that’s just what we see.
 #
 # Be aware that only *leaf nodes* of the computation have their gradients
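For context, a minimal sketch of the behavior these tutorial comments describe. The variable names ``a``, ``b``, ``c``, ``d``, and ``out`` mirror the tutorial; the ``linspace`` range and step count here are assumptions, not the tutorial's exact values. It shows ``torch.sin`` recording a ``SinBackward0`` node as ``grad_fn``, and ``backward()`` leaving a gradient of :math:`2 * \cos(a)` on the leaf tensor ``a``:

import math
import torch

# Assumed setup: a 1-D tensor of inputs that tracks gradients.
a = torch.linspace(0.0, 2.0 * math.pi, steps=25, requires_grad=True)
b = torch.sin(a)
print(b.grad_fn)   # <SinBackward0 ...> — the hint the comment refers to

c = 2 * b          # c = 2 * sin(a)
d = c + 1          # adding a constant does not change the derivative
out = d.sum()
out.backward()

# Only leaf nodes (here, `a`) accumulate gradients;
# d/da of 2 * sin(a) + 1 is 2 * cos(a).
print(torch.allclose(a.grad, 2 * torch.cos(a).detach()))  # True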
