The Luong Attention Mechanism

The Luong attention mechanism introduced several improvements over the Bahdanau model for neural machine translation, most notably two new classes of attentional mechanisms: a global approach that attends to all source words and a local approach that attends only to a selected subset of words when predicting the target sentence.
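
To fix ideas before the tutorial objectives, here is a minimal NumPy sketch of the global variant using the dot score from Luong et al. The toy sizes and random vectors are assumptions for illustration only, not part of the original tutorial.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
S, d = 6, 8                               # toy sizes: 6 source words, hidden size 8
encoder_states = rng.normal(size=(S, d))  # h_s: one encoder state per source word
decoder_state = rng.normal(size=d)        # h_t: decoder state at the current step

# Global attention with the dot score: score(h_t, h_s) = h_t . h_s
scores = encoder_states @ decoder_state   # one score per source position, shape (S,)
alignment = softmax(scores)               # weights over ALL source words, summing to 1
context = alignment @ encoder_states      # context vector: weighted sum of the h_s

print(alignment.round(3))                 # how strongly h_t attends to each source word
print(context.shape)                      # (8,)
```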

In this tutorial, you will discover the Luong attention mechanism for neural machine translation. 

After completing this tutorial, you will know:

  • The operations performed by the Luong attention algorithm
  • How the global and local attentional models work (see the local-attention sketch after this list)
  • How the Luong attention compares to the Bahdanau attention
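
For the local model, the sketch below restricts attention to a window of width 2D+1 around an aligned position p_t and modulates the weights with the Gaussian term from the paper (with sigma = D/2). Fixing p_t by hand is an assumption made for brevity; the paper's local-p variant predicts p_t from the decoder state.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(1)
S, d, D = 10, 8, 2                        # 10 source words, hidden size 8, half-window D
encoder_states = rng.normal(size=(S, d))  # h_s: encoder states
decoder_state = rng.normal(size=d)        # h_t: current decoder state

# Aligned position p_t: fixed here for brevity (local-p predicts it from h_t)
p_t = 4
lo, hi = max(0, p_t - D), min(S, p_t + D + 1)

# Score and normalise only the source words inside the window [p_t - D, p_t + D]
window_scores = encoder_states[lo:hi] @ decoder_state
weights = softmax(window_scores)

# Gaussian term favours positions closest to p_t (weights are not renormalised)
sigma = D / 2
positions = np.arange(lo, hi)
weights = weights * np.exp(-((positions - p_t) ** 2) / (2 * sigma**2))

context = weights @ encoder_states[lo:hi] # local context vector, shape (8,)
print(weights.round(3), context.shape)
```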

Kick-start your project with my book Building Transformer Models with Attention. It provides self-study tutorials with working code to guide you into building a fully-working transformer model that can translate sentences from one language to another.
