diff --git a/intermediate_source/seq2seq_translation_tutorial.py b/intermediate_source/seq2seq_translation_tutorial.py
index 776197fbbd1..ea583821f85 100644
--- a/intermediate_source/seq2seq_translation_tutorial.py
+++ b/intermediate_source/seq2seq_translation_tutorial.py
@@ -45,7 +45,7 @@
    :alt:
 
 To improve upon this model we'll use an `attention
-mechanism `__, which lets the decoder
+mechanism `__, which lets the decoder
 learn to focus over a specific range of the input sequence.
 
 **Recommended Reading:**
@@ -66,8 +66,8 @@
    Statistical Machine Translation `__
 -  `Sequence to Sequence Learning with Neural Networks `__
--  `Neural Machine Translation by Jointly Learning to Align and
-   Translate `__
+-  `Effective Approaches to Attention-based Neural Machine
+   Translation `__
 -  `A Neural Conversational Model `__
 
 You will also find the previous tutorials on
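
The attention mechanism referenced in the patched text can be sketched in a few lines. This is a minimal, stdlib-only illustration of dot-product (Luong-style) attention, not the tutorial's actual implementation (which uses learned attention layers in PyTorch): the decoder state is scored against each encoder state, the scores are softmax-normalized, and a weighted sum yields the context vector. All function names here are illustrative.

```python
import math

def attention_weights(decoder_state, encoder_states):
    """Score each encoder state against the decoder state (dot product),
    then softmax-normalize so the weights sum to 1."""
    scores = [sum(d * e for d, e in zip(decoder_state, enc))
              for enc in encoder_states]
    m = max(scores)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [x / total for x in exps]

def attend(decoder_state, encoder_states):
    """Context vector: weighted sum of encoder states using the
    attention weights."""
    w = attention_weights(decoder_state, encoder_states)
    dim = len(encoder_states[0])
    return [sum(w[i] * encoder_states[i][j]
                for i in range(len(encoder_states)))
            for j in range(dim)]

# The encoder state most aligned with the decoder state receives
# the largest weight, letting the decoder "focus" on that position.
weights = attention_weights([1.0, 0.0],
                            [[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
```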