Crash Course in Recurrent Neural Networks for Deep Learning

Last Updated on August 14, 2019

There is another type of neural network, called the recurrent neural network, that is dominating difficult machine learning problems involving sequences of inputs.

Recurrent neural networks contain loops in their connections, adding feedback and memory to the network over time. This memory allows the network to learn and generalize across sequences of inputs rather than individual patterns.
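To make that loop concrete, here is a minimal sketch (not from the original post) of the recurrence inside a simple RNN, written with NumPy. The shapes, the random weights, and the tanh activation are all illustrative assumptions:

```python
import numpy as np

n_input, n_hidden = 3, 4
rng = np.random.default_rng(0)
W = rng.normal(size=(n_hidden, n_input))   # input-to-hidden weights
U = rng.normal(size=(n_hidden, n_hidden))  # hidden-to-hidden weights (the loop)
b = np.zeros(n_hidden)

h = np.zeros(n_hidden)                     # initial memory is empty
sequence = rng.normal(size=(5, n_input))   # 5 time steps of input
for x_t in sequence:
    # each new state depends on the current input AND the previous state
    h = np.tanh(W @ x_t + U @ h + b)
print(h)  # the final state summarizes the whole sequence
```

The hidden-to-hidden weights U are what distinguish this from a feed-forward layer: they feed the previous state back in at every step, which is the memory the paragraph above describes.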

A powerful type of recurrent neural network, called the Long Short-Term Memory network, has been shown to be particularly effective when stacked into a deep configuration, achieving state-of-the-art results on a diverse array of problems, from language translation to the automatic captioning of images and video.
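As a rough sketch of what "stacked into a deep configuration" means in Keras (the layer sizes, sequence length of 10, and feature count of 8 below are placeholder assumptions, not values from the post):

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
# return_sequences=True passes the full sequence of hidden states
# to the next LSTM layer, which is what makes stacking possible
model.add(LSTM(32, return_sequences=True, input_shape=(10, 8)))
model.add(LSTM(16))   # the top LSTM returns only its final state
model.add(Dense(1))   # e.g. one value predicted per input sequence
model.compile(loss='mse', optimizer='adam')
model.summary()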

In this post, you will get a crash course in recurrent neural networks for deep learning, acquiring just enough understanding to start using LSTM networks in Python with Keras.
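For orientation, a hypothetical end-to-end usage sketch is shown below: fitting a single LSTM on random sequence data just to confirm the pieces connect. Every shape and hyperparameter here is a placeholder assumption:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# dummy data: 100 samples, each a sequence of 10 steps with 8 features
X = np.random.random((100, 10, 8))
y = np.random.random((100, 1))

model = Sequential()
model.add(LSTM(16, input_shape=(10, 8)))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
print(model.predict(X[:1]))  # one prediction per input sequence
```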

After reading this post, you will know:

  • The limitations of Multilayer Perceptrons that are addressed by recurrent neural networks.
  • The problems that must be addressed to make recurrent neural networks useful.
  • The details of the Long Short-Term Memory networks used in applied deep learning.

Kick-start your project with my new book Long Short-Term Memory Networks With Python, including step-by-step tutorials and the Python source code files for all examples.