Commit 03b8905

Fix log-softmax unused issue
Fixes: #800
1 parent 9e00157 commit 03b8905

File tree: 1 file changed, +4 −1 lines changed


beginner_source/transformer_tutorial.py

Lines changed: 4 additions & 1 deletion
@@ -41,7 +41,10 @@
 # the earlier positions in the sequence. For the language modeling task, any
 # tokens on the future positions should be masked. To produce a probability
 # distribution over output words, the output of the ``nn.TransformerEncoder``
-# model is passed through a linear layer followed by a log-softmax function.
+# model is passed through a linear layer to output unnormalized logits.
+# The log-softmax function isn't applied here due to the later use of
+# `CrossEntropyLoss <https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html>`__,
+# which requires the inputs to be unnormalized logits.
 #
 
 import math
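The rationale behind the change can be sketched in plain Python. This is a hedged, per-example illustration (no batching or reduction), not PyTorch's implementation: ``nn.CrossEntropyLoss`` internally applies log-softmax to the raw logits and then takes the negative log-probability of the target class, so applying log-softmax in the model as well would duplicate the normalization the loss already performs.

```python
import math

def log_softmax(logits):
    # Numerically stable log-softmax: shift by the max before exponentiating.
    m = max(logits)
    lse = m + math.log(sum(math.exp(x - m) for x in logits))
    return [x - lse for x in logits]

def cross_entropy(logits, target):
    # Per-example view of what nn.CrossEntropyLoss computes: log-softmax the
    # raw logits, then negate the target class's log-probability.
    return -log_softmax(logits)[target]

logits = [2.0, 0.5, -1.0]        # unnormalized model outputs (raw logits)
loss = cross_entropy(logits, 0)  # loss when the true class is index 0
```

As a sanity check, with two equal logits the loss is log 2 ≈ 0.693, the cross-entropy of a uniform guess over two classes.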
