5 Examples of Simple Sequence Prediction Problems for LSTMs

Last Updated on August 14, 2019

Sequence prediction is different from traditional classification and regression problems.

It requires that you account for the order of observations and that you use models, such as Long Short-Term Memory (LSTM) recurrent neural networks, that have internal memory and can learn the temporal dependencies between observations.

To learn how to use LSTMs, you need to apply them to sequence prediction problems, and for that you need a suite of well-defined problems that let you focus on different problem types and framings. Working through such problems builds your intuition for how sequence prediction differs from other predictive modeling tasks and how sophisticated models like LSTMs can be used to address them.

In this tutorial, you will discover a suite of 5 narrowly defined and scalable sequence prediction problems that you can use to apply LSTM recurrent neural networks and learn more about how they behave.

After completing this tutorial, you will know:

  • Simple memorization tasks to test the learned memory capability of LSTMs.
  • Simple echo tasks to test the learned temporal dependence capability of LSTMs (a minimal data-generation sketch follows this list).
  • Simple arithmetic tasks to test the interpretation capability of LSTMs.
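
To make these task types concrete, below is a minimal sketch of how one such problem, the echo task, might be framed as supervised data for an LSTM. The sequence length, the value range, and the generate_echo_example() helper are illustrative assumptions for this sketch, not the exact formulation used later in the tutorial.

from numpy import array
from numpy.random import uniform

def generate_echo_example(length=10, echo_step=2):
    # Random input sequence of the given length (values in [0, 1] are an assumption).
    sequence = uniform(0.0, 1.0, size=length)
    # Reshape to [samples, time steps, features], the 3D input an LSTM layer expects.
    X = sequence.reshape(1, length, 1)
    # Target: echo back the observation at a fixed time step in the sequence.
    y = array([sequence[echo_step]])
    return X, y

X, y = generate_echo_example()
print(X.shape, y)

Scaling the sequence length, or moving the echoed time step further from the end of the sequence, makes the temporal dependence the model must learn correspondingly harder.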
