Mini-Course on Long Short-Term Memory Recurrent Neural Networks with Keras

Last Updated on August 14, 2019 Long Short-Term Memory (LSTM) recurrent neural networks are one of the most interesting types of deep learning at the moment. They have been used to demonstrate world-class results in complex problem domains such as language translation, automatic image captioning, and text generation. LSTMs differ from Multilayer Perceptrons and convolutional neural networks in that they are designed specifically for sequence prediction problems. In this mini-course, you will discover how you can quickly bring LSTM […]

Read more
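As a taste of what the mini-course covers, here is a minimal sketch of a vanilla LSTM for one-step sequence prediction. It assumes the tensorflow.keras API (the posts originally used standalone Keras); the toy data and all layer sizes are illustrative, not from the course itself.

```python
# Minimal sketch: a vanilla LSTM for one-step univariate prediction.
# Assumes the tensorflow.keras API; the data here is a toy illustration.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# toy data: predict the next value of the sequence 0.0, 0.1, ..., 0.9
seq = np.arange(0.0, 1.0, 0.1)
X = seq.reshape(1, 10, 1)    # [samples, timesteps, features]
y = np.array([[1.0]])        # the next value in the series

model = Sequential()
model.add(LSTM(10, input_shape=(10, 1)))  # 10 memory units
model.add(Dense(1))                       # single regression output
model.compile(optimizer='adam', loss='mse')
model.fit(X, y, epochs=100, verbose=0)
print(model.predict(X, verbose=0))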

Stacked Long Short-Term Memory Networks

Last Updated on August 14, 2019 Gentle introduction to the Stacked LSTM with example code in Python. The original LSTM model comprises a single hidden LSTM layer followed by a standard feedforward output layer. The Stacked LSTM is an extension to this model that has multiple hidden LSTM layers, where each layer contains multiple memory cells. In this post, you will discover the Stacked LSTM model architecture. After completing this tutorial, you will know: The benefit of deep neural […]

Read more
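A minimal sketch of a Stacked LSTM in Keras, assuming the tensorflow.keras API: every hidden LSTM layer except the last sets return_sequences=True so the next layer receives one output per time step rather than only the final output. Layer sizes and the input shape are illustrative.

```python
# Sketch: two stacked hidden LSTM layers (illustrative sizes and shapes).
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential()
# return_sequences=True emits an output per time step for the next LSTM layer
model.add(LSTM(50, return_sequences=True, input_shape=(25, 1)))
model.add(LSTM(50))   # final LSTM layer returns only the last output
model.add(Dense(1))   # standard feedforward output layer
model.compile(optimizer='adam', loss='mse')
model.summary()
```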

CNN Long Short-Term Memory Networks

Last Updated on August 14, 2019 Gentle introduction to CNN LSTM recurrent neural networks with example Python code. Input with spatial structure, like images, cannot be modeled easily with the standard Vanilla LSTM. The CNN Long Short-Term Memory Network, or CNN LSTM for short, is an LSTM architecture specifically designed for sequence prediction problems with spatial inputs, like images or videos. In this post, you will discover the CNN LSTM architecture for sequence prediction. After completing this post, you will know: […]

Read more
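A minimal sketch of the CNN LSTM pattern in Keras, assuming the tensorflow.keras API and a toy input of short clips of small grayscale frames: a CNN front end wrapped in TimeDistributed extracts a feature vector from each frame, and an LSTM back end interprets the sequence of feature vectors. All shapes are illustrative.

```python
# Sketch: CNN feature extraction per time step, LSTM across time steps.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (TimeDistributed, Conv2D, MaxPooling2D,
                                     Flatten, LSTM, Dense)

model = Sequential()
# input: 10 frames of 64x64 grayscale images -> (timesteps, h, w, channels)
model.add(TimeDistributed(Conv2D(16, (3, 3), activation='relu'),
                          input_shape=(10, 64, 64, 1)))
model.add(TimeDistributed(MaxPooling2D((2, 2))))
model.add(TimeDistributed(Flatten()))  # one feature vector per frame
model.add(LSTM(50))                    # interpret the frame features over time
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()
```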

Encoder-Decoder Long Short-Term Memory Networks

Last Updated on August 14, 2019 Gentle introduction to Encoder-Decoder LSTMs for sequence-to-sequence prediction with example Python code. The Encoder-Decoder LSTM is a recurrent neural network designed to address sequence-to-sequence problems, sometimes called seq2seq. Sequence-to-sequence prediction problems are challenging because the number of items in the input and output sequences can vary. Text translation and learning to execute programs are examples of seq2seq problems. In this post, you will discover the Encoder-Decoder LSTM architecture for sequence-to-sequence prediction. After […]

Read more
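A minimal sketch of the Encoder-Decoder LSTM in Keras, assuming the tensorflow.keras API and the simple RepeatVector formulation (rather than teacher forcing): the encoder compresses the input sequence into a fixed-length vector, RepeatVector presents that vector once per output time step, and a TimeDistributed Dense layer emits one prediction per step. Shapes are illustrative.

```python
# Sketch: encoder-decoder LSTM via RepeatVector (illustrative shapes).
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, RepeatVector, TimeDistributed, Dense

n_in, n_out, n_features = 6, 3, 1
model = Sequential()
model.add(LSTM(100, input_shape=(n_in, n_features)))  # encoder
model.add(RepeatVector(n_out))              # repeat encoding per output step
model.add(LSTM(100, return_sequences=True))           # decoder
model.add(TimeDistributed(Dense(n_features)))         # one output per step
model.compile(optimizer='adam', loss='mse')
model.summary()
```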

Gentle Introduction to Generative Long Short-Term Memory Networks

Last Updated on August 14, 2019 The Long Short-Term Memory recurrent neural network was developed for sequence prediction. In addition to sequence prediction problems, LSTMs can also be used as generative models. In this post, you will discover how LSTMs can be used as generative models. After completing this post, you will know: About generative models, with a focus on generative models for text called language modeling. Examples of applications where LSTM generative models have been used. Examples of […]

Read more
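As an illustration of the generative use case, here is a minimal sketch of a character-level language model in Keras, assuming the tensorflow.keras API and a toy setup: the LSTM is trained to predict the next character, and generation then samples from the softmax output repeatedly. The vocabulary size, window length, and the generate helper are all hypothetical; training on a real corpus is omitted.

```python
# Sketch: character-level language model for text generation (toy setup).
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

vocab_size, window = 27, 10  # illustrative: 26 letters + space, 10-char window

model = Sequential()
model.add(LSTM(128, input_shape=(window, vocab_size)))
model.add(Dense(vocab_size, activation='softmax'))  # next-char distribution
model.compile(optimizer='adam', loss='categorical_crossentropy')
# (training on a real corpus is omitted in this sketch)

def generate(model, seed, n_chars):
    """Feed a one-hot seed window, sample a character, slide the window."""
    out = seed.copy()  # seed shape: (window, vocab_size), one-hot rows
    for _ in range(n_chars):
        probs = model.predict(out[np.newaxis, -window:], verbose=0)[0]
        idx = np.random.choice(vocab_size, p=probs)
        onehot = np.zeros((1, vocab_size))
        onehot[0, idx] = 1.0
        out = np.vstack([out, onehot])
    return out
```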

How to Make Predictions with Long Short-Term Memory Models in Keras

Last Updated on August 14, 2019 The goal of developing an LSTM model is a final model that you can use on your sequence prediction problem. In this post, you will discover how to finalize your model and use it to make predictions on new data. After completing this post, you will know: How to train a final LSTM model. How to save your final LSTM model, and later load it again. How to make predictions on new data. Kick-start […]

Read more
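A minimal sketch of the finalize, save, load, and predict workflow, assuming the tensorflow.keras API and HDF5 saving; the model, data, and filename are illustrative.

```python
# Sketch: train a final model on all data, save it, reload it, and predict.
import numpy as np
from tensorflow.keras.models import Sequential, load_model
from tensorflow.keras.layers import LSTM, Dense

X = np.random.rand(100, 5, 1)  # illustrative: [samples, timesteps, features]
y = np.random.rand(100, 1)

model = Sequential([LSTM(10, input_shape=(5, 1)), Dense(1)])
model.compile(optimizer='adam', loss='mse')
model.fit(X, y, epochs=10, verbose=0)     # final fit on all available data

model.save('final_lstm.h5')               # persist architecture + weights
loaded = load_model('final_lstm.h5')      # later: restore the final model

X_new = np.random.rand(1, 5, 1)           # new observation, same 3D shape
yhat = loaded.predict(X_new, verbose=0)   # point prediction on new data
print(yhat)
```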

How to Reshape Input Data for Long Short-Term Memory Networks in Keras

Last Updated on August 14, 2019 It can be difficult to understand how to prepare your sequence data for input to an LSTM model. Often there is confusion around how to define the input layer for the LSTM model. There is also confusion about how to convert your sequence data, which may be a 1D or 2D matrix of numbers, into the required 3D format of the LSTM input layer. In this tutorial, you will discover how to define the […]

Read more
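A minimal sketch of the reshaping step in NumPy, assuming a univariate series held as a flat 1D array: the LSTM input layer expects a 3D array of [samples, timesteps, features], and the Keras input_shape argument covers only the last two dimensions.

```python
# Sketch: converting 1D sequence data to the 3D shape an LSTM expects.
import numpy as np

data = np.arange(10.0)           # 1D series of 10 observations
print(data.shape)                # (10,)

# one sample, ten time steps, one feature per step
data = data.reshape(1, 10, 1)    # [samples, timesteps, features]
print(data.shape)                # (1, 10, 1)

# the matching Keras input layer covers timesteps and features only:
#   LSTM(32, input_shape=(10, 1))
```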

How to Diagnose Overfitting and Underfitting of LSTM Models

Last Updated on January 8, 2020 It can be difficult to determine whether your Long Short-Term Memory model is performing well on your sequence prediction problem. You may be getting a good model skill score, but it is important to know whether your model is a good fit for your data or if it is underfit or overfit and could do better with a different configuration. In this tutorial, you will discover how you can diagnose the fit of your […]

Read more
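A minimal sketch of the diagnostic, assuming the tensorflow.keras API and matplotlib: fit with a validation split, then plot the train and validation loss curves from the returned History object. Diverging curves (falling train loss, rising validation loss) suggest overfitting; two high, still-falling curves suggest underfitting. The model and data are illustrative.

```python
# Sketch: diagnosing fit from train vs. validation loss curves.
import numpy as np
import matplotlib.pyplot as plt
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

X = np.random.rand(200, 5, 1)  # illustrative data
y = np.random.rand(200, 1)

model = Sequential([LSTM(10, input_shape=(5, 1)), Dense(1)])
model.compile(optimizer='adam', loss='mse')
history = model.fit(X, y, epochs=50, validation_split=0.3, verbose=0)

plt.plot(history.history['loss'], label='train')
plt.plot(history.history['val_loss'], label='validation')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()
```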

Making Predictions with Sequences

Last Updated on August 14, 2019 Sequence prediction is different from other types of supervised learning problems. The sequence imposes an order on the observations that must be preserved when training models and making predictions. Generally, prediction problems that involve sequence data are referred to as sequence prediction problems, although there is a suite of problems that differ based on the input and output sequences. In this tutorial, you will discover the different types of sequence prediction problems. After completing […]

Read more
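The distinctions are easiest to see in the shapes of the data. As a hypothetical NumPy illustration of one common framing, here is a univariate series split into input windows with a one-step target (the split_sequence helper is illustrative, not from the post):

```python
# Sketch: framing a sequence as a supervised prediction problem (illustrative).
import numpy as np

def split_sequence(seq, n_steps):
    """Slide a window of n_steps inputs; the next value is the target."""
    X, y = [], []
    for i in range(len(seq) - n_steps):
        X.append(seq[i:i + n_steps])
        y.append(seq[i + n_steps])
    return np.array(X), np.array(y)

series = np.array([10, 20, 30, 40, 50, 60])
X, y = split_sequence(series, n_steps=3)
print(X)   # [[10 20 30] [20 30 40] [30 40 50]]
print(y)   # [40 50 60]
```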

A Gentle Introduction to RNN Unrolling

Last Updated on August 14, 2019 Recurrent neural networks are a type of neural network where the outputs from previous time steps are fed as input to the current time step. This creates a network graph or circuit diagram with cycles, which can make it difficult to understand how information moves through the network. In this post, you will discover the concept of unrolling or unfolding recurrent neural networks. After reading this post, you will know: The standard conception of […]

Read more
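A minimal NumPy sketch of what unrolling means in computation, assuming a simple Elman-style recurrence h_t = tanh(Wx·x_t + Wh·h_{t-1} + b): the cycle in the network graph becomes a chain of identical steps, one per time step, all sharing the same weights. Sizes and weights are illustrative.

```python
# Sketch: unrolling a simple recurrent cell over time (illustrative weights).
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, T = 1, 4, 3              # feature size, hidden size, time steps
Wx = rng.normal(size=(n_hid, n_in))   # input weights (shared across steps)
Wh = rng.normal(size=(n_hid, n_hid))  # recurrent weights (shared across steps)
b = np.zeros(n_hid)

x = rng.normal(size=(T, n_in))        # a short input sequence
h = np.zeros(n_hid)                   # initial hidden state

# "unrolled" network: the same cell applied once per time step
for t in range(T):
    h = np.tanh(Wx @ x[t] + Wh @ h + b)
    print(f'h_{t+1} =', np.round(h, 3))
```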