1 file changed: +4 −4 lines changed
@@ -9,7 +9,7 @@
 All of deep learning is computations on tensors, which are
 generalizations of a matrix that can be indexed in more than 2
 dimensions. We will see exactly what this means in-depth later. First,
-lets look what we can do with tensors.
+let's look what we can do with tensors.
 """
 # Author: Robert Guthrie
 
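For readers skimming the diff, here is a minimal sketch of what the touched docstring means by a tensor "indexed in more than 2 dimensions" (the shape and the variable name T are illustrative, not taken from the patched file): each successive index peels off one dimension.

    import torch

    # A 3-dimensional tensor: each index removes one dimension.
    T = torch.randn(2, 3, 4)
    print(T[0].shape)        # torch.Size([3, 4])  a matrix
    print(T[0][1].shape)     # torch.Size([4])     a vector
    print(T[0][1][2].shape)  # torch.Size([])      a scalar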
@@ -162,7 +162,7 @@
 # other operation, etc.)
 #
 # If ``requires_grad=True``, the Tensor object keeps track of how it was
-# created. Lets see it in action.
+# created. Let's see it in action.
 #
 
 # Tensor factory methods have a ``requires_grad`` flag
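The comment edited in this hunk is about autograd bookkeeping. A short sketch of the behaviour it describes, using illustrative values (x, y, and z are assumed names; requires_grad, grad_fn, and elementwise + are standard PyTorch API):

    import torch

    # With requires_grad=True, every operation records a grad_fn,
    # so the resulting tensor remembers how it was created.
    x = torch.tensor([1., 2., 3.], requires_grad=True)
    y = torch.tensor([4., 5., 6.], requires_grad=True)
    z = x + y
    print(z.grad_fn)  # e.g. <AddBackward0 object at 0x...>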
@@ -187,7 +187,7 @@
 # But how does that help us compute a gradient?
 #
 
-# Lets sum up all the entries in z
+# Let's sum up all the entries in z
 s = z.sum()
 print(s)
 print(s.grad_fn)
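The diff cuts off before the lines that actually compute the gradient. As context for why the entries are summed first, a hedged continuation of the sketch above (assuming z = x + y with x and y created with requires_grad=True): calling backward() with no arguments requires a scalar output, and since s is a plain sum, the derivative of s with respect to each entry of x is 1.

    # s = sum_i (x_i + y_i), so ds/dx_i = 1 for every entry.
    s = z.sum()
    s.backward()
    print(x.grad)  # tensor([1., 1., 1.])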
@@ -222,7 +222,7 @@
 
 
 ######################################################################
-# Lets have Pytorch compute the gradient, and see that we were right:
+# Let's have Pytorch compute the gradient, and see that we were right:
 # (note if you run this block multiple times, the gradient will increment.
 # That is because Pytorch *accumulates* the gradient into the .grad
 # property, since for many models this is very convenient.)
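The accumulation note in this hunk is easy to demonstrate with a couple of extra lines (continuing the sketch above; a second graph is built because the first one is freed after its backward pass):

    # .grad is accumulated, not overwritten: a second backward pass adds to it.
    s2 = (x + y).sum()
    s2.backward()
    print(x.grad)   # tensor([2., 2., 2.]) after two backward passes
    x.grad.zero_()  # reset in place when fresh gradients are wanted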