2 files changed, +9 -9 lines

File 1 of 2:

@@ -36,9 +36,9 @@
 
 ``Variable`` and ``Function`` are interconnected and build up an acyclic
 graph that encodes a complete history of computation. Each variable has
-a ``.creator`` attribute that references a ``Function`` that has created
+a ``.grad_fn`` attribute that references a ``Function`` that has created
 the ``Variable`` (except for Variables created by the user - their
-``creator is None``).
+``grad_fn is None``).
 
 If you want to compute the derivatives, you can call ``.backward()`` on
 a ``Variable``. If ``Variable`` is a scalar (i.e. it holds a one element
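The rename does not change the graph semantics: ``grad_fn`` is ``None`` exactly where ``creator`` used to be. A minimal sketch, assuming the Variable-era API this tutorial targets (later releases merge ``Variable`` into ``Tensor``, but ``grad_fn`` keeps this behavior)::

    import torch
    from torch.autograd import Variable

    x = Variable(torch.ones(2, 2), requires_grad=True)
    print(x.grad_fn)  # None: x is a leaf Variable created by the user

    y = x + 2
    print(y.grad_fn)  # a Function node: y was produced by an operation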
@@ -61,8 +61,8 @@
 print(y)
 
 ###############################################################
-# ``y`` was created as a result of an operation, so it has a creator.
-print(y.creator)
+# ``y`` was created as a result of an operation, so it has a ``grad_fn``.
+print(y.grad_fn)
 
 ###############################################################
 # Do more operations on y
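Once the graph is recorded through ``grad_fn``, calling ``.backward()`` on a scalar result fills in ``.grad`` on the user-created leaves. Continuing the sketch above::

    z = y * y * 3
    out = z.mean()
    out.backward()  # out is a scalar, so backward() needs no arguments
    print(x.grad)   # d(out)/dx, a 2x2 tensor of gradients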
File 2 of 2:

@@ -157,15 +157,15 @@ def num_flat_features(self, x):
 # For example:
 
 output = net(input)
-target = Variable(torch.range(1, 10))  # a dummy target, for example
+target = Variable(torch.arange(1, 11))  # a dummy target, for example
 criterion = nn.MSELoss()
 
 loss = criterion(output, target)
 print(loss)
 
 ########################################################################
 # Now, if you follow ``loss`` in the backward direction, using its
-# ``.creator`` attribute, you will see a graph of computations that looks
+# ``.grad_fn`` attribute, you will see a graph of computations that looks
 # like this:
 #
 # ::
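The bounds change along with the function name because ``torch.range`` included its end point and is deprecated, while ``torch.arange`` excludes it, like Python's ``range``. A quick sketch of the equivalence::

    import torch

    t = torch.arange(1, 11)  # 10 elements: 1, 2, ..., 10
    print(t.numel())         # 10, the same length torch.range(1, 10) produced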
@@ -181,9 +181,9 @@ def num_flat_features(self, x):
 #
 # For illustration, let us follow a few steps backward:
 
-print(loss.creator)  # MSELoss
-print(loss.creator.previous_functions[0][0])  # Linear
-print(loss.creator.previous_functions[0][0].previous_functions[0][0])  # ReLU
+print(loss.grad_fn)  # MSELoss
+print(loss.grad_fn.next_functions[0][0])  # Linear
+print(loss.grad_fn.next_functions[0][0].next_functions[0][0])  # ReLU
 
 ########################################################################
 # Backprop
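Each entry of ``next_functions`` is a ``(Function, input_index)`` pair, so the manual chain above generalizes to a walk over the whole graph. A small sketch, assuming the renamed API from this change; the node names printed vary with the ops and the PyTorch version::

    def walk(fn, depth=0):
        # Recursively print the backward graph rooted at fn
        if fn is None:  # inputs that don't require grad show up as None
            return
        print('  ' * depth + type(fn).__name__)
        for next_fn, _ in fn.next_functions:
            walk(next_fn, depth + 1)

    walk(loss.grad_fn)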