4 files changed: +8 −7 lines changed

@@ -49,6 +49,7 @@ if [[ "${CIRCLE_JOB}" == *worker_* ]]; then
   python $DIR/remove_runnable_code.py advanced_source/static_quantization_tutorial.py advanced_source/static_quantization_tutorial.py || true
   python $DIR/remove_runnable_code.py beginner_source/hyperparameter_tuning_tutorial.py beginner_source/hyperparameter_tuning_tutorial.py || true
   python $DIR/remove_runnable_code.py beginner_source/audio_preprocessing_tutorial.py beginner_source/audio_preprocessing_tutorial.py || true
+  python $DIR/remove_runnable_code.py beginner_source/dcgan_faces_tutorial.py beginner_source/dcgan_faces_tutorial.py || true
   python $DIR/remove_runnable_code.py intermediate_source/tensorboard_profiler_tutorial.py intermediate_source/tensorboard_profiler_tutorial.py || true
   # Temp remove for mnist download issue. (Re-enabled for 1.8.1)
   # python $DIR/remove_runnable_code.py beginner_source/fgsm_tutorial.py beginner_source/fgsm_tutorial.py || true
@@ -47,7 +47,7 @@
 #
 # In this network, ``w`` and ``b`` are **parameters**, which we need to
 # optimize. Thus, we need to be able to compute the gradients of loss
-# function with respect to those variables. In orded to do that, we set
+# function with respect to those variables. In order to do that, we set
 # the ``requires_grad`` property of those tensors.

 #######################################################################
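The comment corrected in the hunk above describes PyTorch's pattern of marking parameter tensors with ``requires_grad`` so that gradients of the loss can be computed with respect to them. A minimal sketch of that pattern (the tensor shapes and the choice of loss here are illustrative assumptions, not taken from the tutorial file itself):

```python
import torch

# Fixed input and target: no gradients needed for these.
x = torch.ones(5)
y = torch.zeros(3)

# Parameters: we want d(loss)/dw and d(loss)/db,
# so we set requires_grad=True on them.
w = torch.randn(5, 3, requires_grad=True)
b = torch.randn(3, requires_grad=True)

z = torch.matmul(x, w) + b
loss = torch.nn.functional.binary_cross_entropy_with_logits(z, y)

# backward() populates .grad on every tensor with requires_grad=True.
loss.backward()
print(w.grad.shape)  # torch.Size([5, 3])
print(b.grad.shape)  # torch.Size([3])
```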
@@ -9,7 +9,7 @@
 All of deep learning is computations on tensors, which are
 generalizations of a matrix that can be indexed in more than 2
 dimensions. We will see exactly what this means in-depth later. First,
-lets look what we can do with tensors.
+let's look what we can do with tensors.
 """

 # Author: Robert Guthrie
@@ -162,7 +162,7 @@
 # other operation, etc.)
 #
 # If ``requires_grad=True``, the Tensor object keeps track of how it was
-# created. Lets see it in action.
+# created. Let's see it in action.
 #

 # Tensor factory methods have a ``requires_grad`` flag
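The comment fixed above refers to autograd's history tracking: with ``requires_grad=True``, each derived tensor records the operation that produced it in ``grad_fn``. A small sketch of what "keeps track of how it was created" means (the concrete values are illustrative):

```python
import torch

# Tensors created with requires_grad=True participate in autograd.
x = torch.tensor([1., 2., 3.], requires_grad=True)
y = torch.tensor([4., 5., 6.], requires_grad=True)

# z was created by an addition, and it remembers that fact.
z = x + y
print(z.grad_fn)  # a backward node for the add operation
```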
@@ -187,7 +187,7 @@
 # But how does that help us compute a gradient?
 #

-# Lets sum up all the entries in z
+# Let's sum up all the entries in z
 s = z.sum()
 print(s)
 print(s.grad_fn)
@@ -222,7 +222,7 @@


 ######################################################################
-# Lets have Pytorch compute the gradient, and see that we were right:
+# Let's have Pytorch compute the gradient, and see that we were right:
 # (note if you run this block multiple times, the gradient will increment.
 # That is because Pytorch *accumulates* the gradient into the .grad
 # property, since for many models this is very convenient.)
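The tutorial comment above notes that PyTorch *accumulates* gradients into ``.grad`` across repeated backward passes. A short self-contained sketch of that behavior, including the reset step (the tensor values here are illustrative, not from the tutorial):

```python
import torch

x = torch.tensor([1., 2., 3.], requires_grad=True)

# First backward pass: d(sum(2x))/dx = 2 for every entry.
s = (x * 2).sum()
s.backward()
print(x.grad)   # tensor([2., 2., 2.])

# Second backward pass: the new gradient is ADDED to x.grad.
s = (x * 2).sum()
s.backward()
print(x.grad)   # tensor([4., 4., 4.]) -- accumulated, not replaced

# Zero the gradient before the next independent pass.
x.grad.zero_()
print(x.grad)   # tensor([0., 0., 0.])
```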
@@ -497,7 +497,7 @@ Additional Resources
    :header: PyTorch Cheat Sheet
    :description: Quick overview to essential PyTorch elements.
    :button_link: beginner/ptcheat.html
-   :button_text: Download
+   :button_text: Open

 .. customcalloutitem::
    :header: Tutorials on GitHub
@@ -509,7 +509,7 @@ Additional Resources
    :header: Run Tutorials on Google Colab
    :description: Learn how to copy tutorial data into Google Drive so that you can run tutorials on Google Colab.
    :button_link: beginner/colab.html
-   :button_text: Download
+   :button_text: Open

 .. End of callout section