Commit 3b6d83b

Change paper reference to a paper matching the model used (#2424)
Fixes #1642

Signed-off-by: BJ Hargrave <hargrave@us.ibm.com>
Co-authored-by: sekyondaMeta <127536312+sekyondaMeta@users.noreply.github.com>
1 parent fc7494d commit 3b6d83b

File tree

1 file changed (+3, -3 lines)


intermediate_source/seq2seq_translation_tutorial.py

Lines changed: 3 additions & 3 deletions
@@ -45,7 +45,7 @@
    :alt:
 
 To improve upon this model we'll use an `attention
-mechanism <https://arxiv.org/abs/1409.0473>`__, which lets the decoder
+mechanism <https://arxiv.org/abs/1508.04025>`__, which lets the decoder
 learn to focus over a specific range of the input sequence.
 
 **Recommended Reading:**
@@ -66,8 +66,8 @@
    Statistical Machine Translation <https://arxiv.org/abs/1406.1078>`__
 -  `Sequence to Sequence Learning with Neural
    Networks <https://arxiv.org/abs/1409.3215>`__
--  `Neural Machine Translation by Jointly Learning to Align and
-   Translate <https://arxiv.org/abs/1409.0473>`__
+-  `Effective Approaches to Attention-based Neural Machine
+   Translation <https://arxiv.org/abs/1508.04025>`__
 -  `A Neural Conversational Model <https://arxiv.org/abs/1506.05869>`__
 
 You will also find the previous tutorials on
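For context on what the reference swap means: the previously cited paper (arXiv:1409.0473, Bahdanau et al.) introduces additive attention, while the newly cited one (arXiv:1508.04025, Luong et al.) describes multiplicative scoring, including a simple dot product between the current decoder hidden state and each encoder hidden state. Below is a minimal, hypothetical PyTorch sketch of that dot-product attention, offered only to illustrate the idea behind the new reference; it is not the tutorial's actual implementation, and all names and shapes are assumptions.

# Hypothetical sketch of Luong-style dot-product attention (arXiv:1508.04025);
# names and shapes here are illustrative, not taken from the tutorial.
import torch
import torch.nn.functional as F

def dot_product_attention(decoder_hidden, encoder_outputs):
    # decoder_hidden:  (batch, hidden)          current decoder hidden state
    # encoder_outputs: (batch, seq_len, hidden) all encoder hidden states
    # Score each source position by a dot product with the decoder state.
    scores = torch.bmm(encoder_outputs, decoder_hidden.unsqueeze(2)).squeeze(2)
    weights = F.softmax(scores, dim=1)              # (batch, seq_len)
    # Weighted sum of encoder states gives the context vector.
    context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)
    return context, weights

# Example usage with random tensors:
dec_h = torch.randn(4, 256)        # batch of 4, hidden size 256
enc_out = torch.randn(4, 10, 256)  # 10 source positions
context, weights = dot_product_attention(dec_h, enc_out)
print(context.shape, weights.shape)  # torch.Size([4, 256]) torch.Size([4, 10])

The softmax weights over the source positions are what let the decoder "learn to focus over a specific range of the input sequence," as the changed docstring line puts it.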
