
Commit a912b81
Merge branch 'master' into recipes-breadcrumbs
2 parents 5427598 + 9dbc8fc

File tree: 1 file changed (+1, −1)

beginner_source/examples_nn/two_layer_net_optim.py (1 addition, 1 deletion)
@@ -33,7 +33,7 @@
 # Use the optim package to define an Optimizer that will update the weights of
 # the model for us. Here we will use Adam; the optim package contains many other
-# optimization algoriths. The first argument to the Adam constructor tells the
+# optimization algorithms. The first argument to the Adam constructor tells the
 # optimizer which Tensors it should update.
 learning_rate = 1e-4
 optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)
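For context, the lines touched by this diff come from a tutorial training loop. A minimal sketch of how that optimizer is used follows; the tensor shapes, model layers, and loss function here are assumptions chosen to match the style of the two-layer-net tutorials, not the exact file contents.

```python
import torch

# Assumed dimensions: batch size, input, hidden, and output sizes.
N, D_in, H, D_out = 64, 1000, 100, 10
x = torch.randn(N, D_in)
y = torch.randn(N, D_out)

# A simple two-layer network (assumed architecture for illustration).
model = torch.nn.Sequential(
    torch.nn.Linear(D_in, H),
    torch.nn.ReLU(),
    torch.nn.Linear(H, D_out),
)
loss_fn = torch.nn.MSELoss(reduction="sum")

# The lines from the diff: Adam is given the Tensors it should update.
learning_rate = 1e-4
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)

for t in range(5):
    y_pred = model(x)                # forward pass
    loss = loss_fn(y_pred, y)        # compute scalar loss
    optimizer.zero_grad()            # clear gradients from the previous step
    loss.backward()                  # compute gradients w.r.t. model parameters
    optimizer.step()                 # let Adam update the parameters it was given
```

The first argument to the `Adam` constructor is an iterable of parameters; passing `model.parameters()` is what ties the optimizer to the model, which is the point the corrected comment makes.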
