
Commit 8d19c5c

janeyx99 and svekars authored
Update PT Cheat page to stop referencing torchscript (#3139)
Co-authored-by: Svetlana Karslioglu <svekars@meta.com>
1 parent 4e9296e commit 8d19c5c

File tree

1 file changed: +4, -17 lines


beginner_source/ptcheat.rst

Lines changed: 4 additions & 17 deletions
@@ -22,27 +22,12 @@ Neural Network API
     import torch.nn as nn                 # neural networks
     import torch.nn.functional as F       # layers, activations and more
     import torch.optim as optim           # optimizers e.g. gradient descent, ADAM, etc.
-    from torch.jit import script, trace   # hybrid frontend decorator and tracing jit
 
 See `autograd <https://pytorch.org/docs/stable/autograd.html>`__,
 `nn <https://pytorch.org/docs/stable/nn.html>`__,
 `functional <https://pytorch.org/docs/stable/nn.html#torch-nn-functional>`__
 and `optim <https://pytorch.org/docs/stable/optim.html>`__
 
-TorchScript and JIT
--------------------
-
-.. code-block:: python
-
-   torch.jit.trace()   # takes your module or function and an example
-                       # data input, and traces the computational steps
-                       # that the data encounters as it progresses through the model
-
-   @script             # decorator used to indicate data-dependent
-                       # control flow within the code being traced
-
-See `Torchscript <https://pytorch.org/docs/stable/jit.html>`__
-
 ONNX
 ----
 
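For context, the imports kept by this hunk are the ones the cheat sheet continues to demonstrate. Below is a minimal sketch of how nn, F, and optim typically fit together; the two-layer module, its sizes, and the SGD learning rate are illustrative assumptions, not taken from the cheat sheet.

.. code-block:: python

   import torch
   import torch.nn as nn                # neural networks
   import torch.nn.functional as F      # layers, activations and more
   import torch.optim as optim          # optimizers e.g. gradient descent, ADAM, etc.

   class TinyNet(nn.Module):
       """Illustrative two-layer MLP; layer sizes are arbitrary."""
       def __init__(self):
           super().__init__()
           self.fc1 = nn.Linear(4, 8)
           self.fc2 = nn.Linear(8, 2)

       def forward(self, x):
           return self.fc2(F.relu(self.fc1(x)))   # F supplies the activation

   model = TinyNet()
   opt = optim.SGD(model.parameters(), lr=0.1)    # any optimizer from the list below works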

@@ -225,8 +210,10 @@ Optimizers
 
     opt = optim.x(model.parameters(), ...)  # create optimizer
     opt.step()                              # update weights
-    optim.X                                 # where X is SGD, Adadelta, Adagrad, Adam,
-                                            # AdamW, SparseAdam, Adamax, ASGD,
+    opt.zero_grad()                         # clear the gradients
+    optim.X                                 # where X is SGD, AdamW, Adam,
+                                            # Adafactor, NAdam, RAdam, Adadelta,
+                                            # Adagrad, SparseAdam, Adamax, ASGD,
                                             # LBFGS, RMSprop or Rprop
 
 See `optimizers <https://pytorch.org/docs/stable/optim.html>`__
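The newly added opt.zero_grad() line matters because PyTorch accumulates gradients across backward passes. Here is a minimal sketch of one training step under that assumption; the linear model, AdamW learning rate, MSE loss, and random batch are placeholders, not part of the cheat sheet.

.. code-block:: python

   import torch
   import torch.nn as nn
   import torch.nn.functional as F
   import torch.optim as optim

   model = nn.Linear(4, 2)
   opt = optim.AdamW(model.parameters(), lr=1e-3)  # one of the optimizers listed above

   x = torch.randn(16, 4)                          # placeholder input batch
   target = torch.randn(16, 2)                     # placeholder targets

   opt.zero_grad()                                 # clear gradients from the previous step
   loss = F.mse_loss(model(x), target)
   loss.backward()                                 # accumulate fresh gradients
   opt.step()                                      # update weights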
