Encoder-Decoder Long Short-Term Memory Networks

Last Updated on August 14, 2019

A gentle introduction to Encoder-Decoder LSTMs for
sequence-to-sequence prediction, with example Python code.

The Encoder-Decoder LSTM is a recurrent neural network designed to address sequence-to-sequence problems, sometimes called seq2seq.

Sequence-to-sequence prediction problems are challenging because the number of items in the input and output sequences can vary. Text translation and learning to execute programs are two examples of seq2seq problems.
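
For instance, the hypothetical pairs below (illustrative only, not taken from the post) show how input and output lengths can differ both across and within seq2seq tasks:

    # Hypothetical seq2seq pairs: input and output sequence lengths differ.
    translation = (["wie", "geht", "es", "dir"], ["how", "are", "you"])  # 4 tokens -> 3 tokens
    execution = (list("print(2+3)"), list("5"))                          # 10 chars -> 1 char

    for x, y in (translation, execution):
        print(len(x), "items in ->", len(y), "items out")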

In this post, you will discover the Encoder-Decoder LSTM architecture for sequence-to-sequence prediction.

After completing this post, you will know:

  • The challenge of sequence-to-sequence prediction.
  • The Encoder-Decoder architecture and the limitation in LSTMs that it was designed to address.
  • How to implement the Encoder-Decoder LSTM model architecture in Python with Keras (a minimal sketch appears after this list).
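
As a preview of that last point, the sketch below shows one common way to express the architecture in Keras: an LSTM encoder, a RepeatVector bridge, and a TimeDistributed output layer. The layer size and problem dimensions (n_in, n_out, n_features) are illustrative assumptions, not values from the post:

    from keras.models import Sequential
    from keras.layers import LSTM, Dense, RepeatVector, TimeDistributed

    # Hypothetical problem sizes: 5 input steps, 3 output steps, 10 features per step
    n_in, n_out, n_features = 5, 3, 10

    model = Sequential()
    # Encoder: read the whole input sequence into a fixed-length internal vector
    model.add(LSTM(100, input_shape=(n_in, n_features)))
    # Repeat that vector once per required output time step
    model.add(RepeatVector(n_out))
    # Decoder: emit one hidden state per output time step
    model.add(LSTM(100, return_sequences=True))
    # Apply the same output layer to every decoder time step
    model.add(TimeDistributed(Dense(n_features, activation='softmax')))
    model.compile(loss='categorical_crossentropy', optimizer='adam')
    model.summary()

The RepeatVector layer copies the encoder's fixed-length vector once per output time step, which is what allows the input and output sequences to have different lengths.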

Kick-start your project with my new book Long Short-Term Memory Networks With Python, including step-by-step tutorials and the Python source code files for all examples.

Let’s get started.

Encoder-Decoder Long Short-Term Memory Networks
Photo by slashvee, some rights reserved.

Sequence-to-Sequence Prediction Problems

Sequence prediction often involves forecasting the next value in a real-valued sequence or outputting a class label for an input sequence.
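To make the distinction concrete, the toy data below (hypothetical values) illustrates these two framings of sequence prediction:

    # 1) Forecast the next value in a real-valued sequence.
    sequence = [0.1, 0.2, 0.3, 0.4]
    X, y = sequence[:-1], sequence[-1]    # predict 0.4 from [0.1, 0.2, 0.3]

    # 2) Output a class label for an input sequence.
    X2, y2 = [0.5, 0.9, 0.8], "positive"  # whole sequence maps to a single label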