How to Learn to Echo Random Integers with LSTMs in Keras

Last Updated on August 27, 2020

Long Short-Term Memory (LSTM) Recurrent Neural Networks are able to learn the order dependence in long sequence data.

They are a fundamental technique used in a range of state-of-the-art results, such as image captioning and machine translation.

They can also be difficult to understand, specifically how to frame a problem to get the most out of this type of network.

In this tutorial, you will discover how to develop a simple LSTM recurrent neural network that learns to echo back a number from an ad hoc sequence of random integers. Although this is a trivial problem, developing this network will provide the skills needed to apply LSTMs to a range of sequence prediction problems.
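To make the problem concrete, here is a minimal sketch of how such ad hoc sequences of random integers can be generated and one-hot encoded for an LSTM. The function names (`generate_sequence`, `one_hot_encode`, `one_hot_decode`) are illustrative, not taken from the original tutorial.

```python
import numpy as np

def generate_sequence(length=5, n_features=10):
    """Return `length` random integers in [0, n_features)."""
    return [int(np.random.randint(0, n_features)) for _ in range(length)]

def one_hot_encode(sequence, n_features=10):
    """One-hot encode a list of integers as a 2D array of shape (length, n_features)."""
    encoding = np.zeros((len(sequence), n_features), dtype=int)
    for i, value in enumerate(sequence):
        encoding[i, value] = 1
    return encoding

def one_hot_decode(encoded):
    """Invert the one-hot encoding back to a list of integers."""
    return [int(np.argmax(vector)) for vector in encoded]

seq = generate_sequence()
encoded = one_hot_encode(seq)
decoded = one_hot_decode(encoded)
print(seq, decoded)  # decoding recovers the original sequence
```

One-hot encoding turns each integer into a categorical input vector, which lets the network's output layer predict the echoed value with a softmax over the same set of classes.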

After completing this tutorial, you will know:

  • How to develop an LSTM for the simpler problem of echoing any given input.
  • How to avoid the beginner’s mistake when applying LSTMs to sequence problems like echoing integers.
  • How to develop a robust LSTM to echo the last observation from ad hoc sequences of random integers.
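The last of these points can be sketched as follows, assuming Keras with the TensorFlow backend. The model reads a one-hot encoded sequence of random integers and predicts the value of the final time step, i.e. it echoes the last observation; the helper `get_batch` and the layer sizes are illustrative choices, not the original tutorial's code.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

n_features = 10   # integers 0..9
length = 5        # time steps per sequence

def get_batch(n_samples, length, n_features):
    """Random sequences, one-hot encoded, with the last value as the target."""
    X = np.zeros((n_samples, length, n_features))
    y = np.zeros((n_samples, n_features))
    for i in range(n_samples):
        seq = np.random.randint(0, n_features, size=length)
        X[i, np.arange(length), seq] = 1
        y[i, seq[-1]] = 1
    return X, y

# A small LSTM: input shape is (time steps, features), output is a
# softmax over the possible integer values.
model = Sequential([
    LSTM(25, input_shape=(length, n_features)),
    Dense(n_features, activation='softmax'),
])
model.compile(loss='categorical_crossentropy', optimizer='adam')

# Fit on a batch of freshly generated sequences.
X, y = get_batch(1000, length, n_features)
model.fit(X, y, epochs=15, verbose=0)

# Echo the last observation of a new random sequence.
X_new, y_new = get_batch(1, length, n_features)
yhat = model.predict(X_new, verbose=0)
print('expected', int(np.argmax(y_new)), 'predicted', int(np.argmax(yhat)))
```

Because the input is 3D (samples, time steps, features), the same framing carries over directly to the harder variants the tutorial covers, such as echoing an observation from an arbitrary position in the sequence.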

Kick-start your project with my new book Long Short-Term Memory Networks With Python, including step-by-step tutorials and the Python source code files for all examples.