Commit f453a23

Author: Youshaa Murhij

Update transformer_tutorial.py
1 parent 51e989f commit f453a23

File tree

1 file changed: +1 -8 lines changed

beginner_source/transformer_tutorial.py

Lines changed: 1 addition & 8 deletions
@@ -112,14 +112,7 @@ def generate_square_subsequent_mask(sz: int) -> Tensor:
 # The ``math.log(10000.0)`` term in the exponent represents the maximum effective
 # input length (in this case, ``10000``). Dividing this term by ``d_model`` scales
 # the values to be within a reasonable range for the exponential function.
-# The negative sign in front of the logarithm ensures that the values decrease exponentially.
-# The reason for writing ``math.log(10000.0)`` instead of ``4`` in the code is to make it clear
-# that this value represents the logarithm of the maximum effective input length
-# (in this case, ``10000``). This makes the code more readable and easier to understand.
-# Using ``math.log(10000.0)`` instead of ``4`` also makes it easier to change the maximum effective
-# input length if needed. If you want to use a different value for the maximum effective
-# input length, you can simply change the argument of the ``math.log``
-# function instead of recalculating the logarithm manually.
+#

 class PositionalEncoding(nn.Module):

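For reference, the ``math.log(10000.0)`` term that the surviving comment describes appears in the ``div_term`` computation inside the tutorial's ``PositionalEncoding`` module, just below this hunk. The module body itself is not shown in the diff, so the following is a minimal sketch following the tutorial's standard sinusoidal encoding; the parameter names (``d_model``, ``dropout``, ``max_len``) are assumed from that convention rather than quoted from this commit:

import math

import torch
import torch.nn as nn
from torch import Tensor


class PositionalEncoding(nn.Module):
    # Sketch of the sinusoidal positional encoding the diff's comments
    # describe; the tutorial's actual module may differ in detail.
    def __init__(self, d_model: int, dropout: float = 0.1, max_len: int = 5000):
        super().__init__()
        self.dropout = nn.Dropout(p=dropout)
        position = torch.arange(max_len).unsqueeze(1)
        # exp(-log(10000.0) * 2i / d_model) == 1 / 10000**(2i / d_model);
        # math.log(10000.0) is the term the removed comments discussed.
        div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, 1, d_model)
        pe[:, 0, 0::2] = torch.sin(position * div_term)  # even dimensions
        pe[:, 0, 1::2] = torch.cos(position * div_term)  # odd dimensions
        self.register_buffer('pe', pe)

    def forward(self, x: Tensor) -> Tensor:
        # x shape: [seq_len, batch_size, embedding_dim]
        x = x + self.pe[:x.size(0)]
        return self.dropout(x)

The ``exp``/``log`` form is mathematically equivalent to computing ``1 / 10000**(2i / d_model)`` directly, but keeps the intermediate values in a range that is well behaved for the exponential function, which is the point the retained comment makes.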
0 commit comments