Difference Between Return Sequences and Return States for LSTMs in Keras

Last Updated on August 14, 2019

The Keras deep learning library provides an implementation of the Long Short-Term Memory, or LSTM, recurrent neural network.

As part of this implementation, the Keras API provides access to both return sequences and return state. The use of, and difference between, these two outputs can be confusing when designing sophisticated recurrent neural network models, such as the encoder-decoder model.

In this tutorial, you will discover the difference between return sequences and return states for LSTM layers in the Keras deep learning library.

After completing this tutorial, you will know:

  • That return sequences (return_sequences=True) returns the hidden state output for each input time step.
  • That return state (return_state=True) returns the hidden state output and cell state for the last input time step.
  • That return sequences and return state can be used at the same time.
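The three points above can be seen directly in the shapes of the tensors an LSTM layer produces. Below is a minimal sketch (assuming TensorFlow 2.x with the bundled Keras API) that enables both return_sequences and return_state on a single LSTM layer and inspects the outputs; the layer sizes are arbitrary choices for illustration.

```python
# Sketch: shapes produced by return_sequences and return_state (assumes TF 2.x).
import numpy as np
from tensorflow.keras.layers import Input, LSTM
from tensorflow.keras.models import Model

timesteps, features, units = 3, 1, 4
inputs = Input(shape=(timesteps, features))

# return_sequences=True: hidden state output for every input time step.
# return_state=True: additionally return the final hidden state and cell state.
seq, state_h, state_c = LSTM(units, return_sequences=True,
                             return_state=True)(inputs)
model = Model(inputs=inputs, outputs=[seq, state_h, state_c])

x = np.random.rand(1, timesteps, features).astype("float32")
seq_out, h_out, c_out = model.predict(x, verbose=0)

print(seq_out.shape)  # (1, 3, 4): one hidden state per time step
print(h_out.shape)    # (1, 4): hidden state at the last time step
print(c_out.shape)    # (1, 4): cell state at the last time step
```

Note that the last slice of the returned sequence equals the returned final hidden state, which is why the two flags can be combined without redundancy elsewhere in the sequence.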

Kick-start your project with my new book Long Short-Term Memory Networks With Python, including step-by-step tutorials and the Python source code files for all examples.

Let’s get started.
