Data Preparation for Variable Length Input Sequences

Last Updated on August 14, 2019 Deep learning libraries assume a vectorized representation of your data. In the case of variable length sequence prediction problems, this requires that your data be transformed such that each sequence has the same length. This vectorization allows code to efficiently perform the matrix operations in batch for your chosen deep learning algorithms. In this tutorial, you will discover techniques that you can use to prepare your variable length sequence data for sequence prediction problems […]
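
As a minimal sketch of the kind of preparation involved, the snippet below pads and truncates a handful of variable length sequences to a common length with the Keras pad_sequences() utility; the data and argument choices are illustrative only, not necessarily those used in the post.

from tensorflow.keras.preprocessing.sequence import pad_sequences

# three sequences with different lengths
sequences = [
    [1, 2, 3, 4],
    [1, 2],
    [1],
]

# pad with zeros at the front so every sequence has length 4
padded = pad_sequences(sequences, maxlen=4, padding='pre')
print(padded)

# or truncate longer sequences down to their last 2 values
truncated = pad_sequences(sequences, maxlen=2, truncating='pre')
print(truncated)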

Read more

A Gentle Introduction to Backpropagation Through Time

Last Updated on August 14, 2020 Backpropagation Through Time, or BPTT, is the training algorithm used to update weights in recurrent neural networks like LSTMs. To effectively frame sequence prediction problems for recurrent neural networks, you must have a strong conceptual understanding of what Backpropagation Through Time is doing and how configurable variations like Truncated Backpropagation Through Time will affect the skill, stability, and speed when training your network. In this post, you will get a gentle introduction to Backpropagation Through […]
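
As a rough illustration of what the algorithm operates on, the NumPy sketch below unrolls a simple recurrent update over a few timesteps; BPTT applies the chain rule backwards through exactly this kind of unrolled computation. All sizes and weight names here are made up for the example.

import numpy as np

n_input, n_hidden, n_steps = 3, 5, 4          # toy sizes
rng = np.random.default_rng(1)
Wx = rng.normal(size=(n_hidden, n_input))     # input-to-hidden weights
Wh = rng.normal(size=(n_hidden, n_hidden))    # hidden-to-hidden (recurrent) weights
b = np.zeros(n_hidden)

X = rng.normal(size=(n_steps, n_input))       # one input sequence
h = np.zeros(n_hidden)                        # initial internal state

# forward pass: the recurrence is unrolled across the timesteps;
# BPTT backpropagates errors through this same unrolled graph
states = []
for t in range(n_steps):
    h = np.tanh(Wx @ X[t] + Wh @ h + b)
    states.append(h)
print(len(states), states[-1].shape)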

Read more

Techniques to Handle Very Long Sequences with LSTMs

Last Updated on August 14, 2019 Long Short-Term Memory or LSTM recurrent neural networks are capable of learning and remembering over long sequences of inputs. LSTMs work very well if your problem has one output for every input, like time series forecasting or text translation. But LSTMs can be challenging to use when you have very long input sequences and only one or a handful of outputs. This is often called sequence labeling, or sequence classification. Some examples include: Classification […]
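
Two of the simpler options for this situation can be sketched in a few lines: truncate the sequence to its most recent timesteps, or split it into shorter subsequences. The lengths below are illustrative only.

import numpy as np

long_seq = np.arange(10000)     # one very long input sequence with a single label
max_len = 250

# option 1: truncate, keeping only the most recent max_len timesteps
truncated = long_seq[-max_len:]

# option 2: split into shorter, equal-length subsequences
n_chunks = len(long_seq) // max_len
chunks = long_seq[:n_chunks * max_len].reshape(n_chunks, max_len)

print(truncated.shape, chunks.shape)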

Read more

How to Prepare Sequence Prediction for Truncated BPTT in Keras

Last Updated on August 14, 2019 Recurrent neural networks are able to learn the temporal dependence across multiple timesteps in sequence prediction problems. Modern recurrent neural networks like the Long Short-Term Memory, or LSTM, network are trained with a variation of the Backpropagation algorithm called Backpropagation Through Time. This algorithm has been modified further for efficiency on sequence prediction problems with very long sequences and is called Truncated Backpropagation Through Time. An important configuration parameter when training recurrent neural networks […]
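
In Keras this truncation length is, in effect, set by the number of timesteps in the input shape, since gradients only flow back across the timesteps of each sample. The sketch below is a minimal illustration with made up data and sizes.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

timesteps, features = 50, 1                    # gradients flow back at most 50 steps
X = np.random.rand(200, timesteps, features)   # 200 subsequences cut from a long series
y = np.random.rand(200, 1)                     # one target per subsequence

model = Sequential([
    Input(shape=(timesteps, features)),
    LSTM(10),
    Dense(1),
])
model.compile(optimizer='adam', loss='mse')
model.fit(X, y, epochs=1, batch_size=32, verbose=0)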

Read more

Attention in Long Short-Term Memory Recurrent Neural Networks

Last Updated on August 14, 2019 The Encoder-Decoder architecture is popular because it has demonstrated state-of-the-art results across a range of domains. A limitation of the architecture is that it encodes the input sequence to a fixed length internal representation. This imposes limits on the length of input sequences that can be reasonably learned and results in worse performance for very long input sequences. In this post, you will discover the attention mechanism for recurrent neural networks that seeks to […]
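
At its core, attention scores each encoder state against the current decoder state and builds a weighted context vector rather than relying on a single fixed length encoding. The NumPy sketch below uses simple dot-product scoring for illustration; the mechanism discussed in the post may compute the scores differently.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

encoder_states = np.random.rand(6, 8)    # hidden state for each of 6 input timesteps
decoder_state = np.random.rand(8)        # current decoder hidden state

scores = encoder_states @ decoder_state  # one score per encoder timestep
weights = softmax(scores)                # attention weights, sum to 1
context = weights @ encoder_states       # weighted sum: the context vector

print(weights.round(2), context.shape)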

Read more

A Tour of Recurrent Neural Network Algorithms for Deep Learning

Last Updated on August 14, 2019 Recurrent neural networks, or RNNs, are a type of artificial neural network that adds additional weights to the network to create cycles in the network graph in an effort to maintain an internal state. The promise of adding state to neural networks is that they will be able to explicitly learn and exploit context in sequence prediction problems, such as problems with an order or temporal component. In this post, you are going to take […]
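
As a quick, hands-on counterpart to the tour, the snippet below builds one tiny model for each of the common recurrent layer types available in Keras; the layer sizes are arbitrary and purely illustrative.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, SimpleRNN, LSTM, GRU, Dense

for layer in (SimpleRNN, LSTM, GRU):
    model = Sequential([
        Input(shape=(10, 1)),   # 10 timesteps, 1 feature
        layer(8),               # 8 recurrent units maintaining internal state
        Dense(1),
    ])
    print(layer.__name__, model.count_params())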

Read more

How to One Hot Encode Sequence Data in Python

Last Updated on August 14, 2019 Machine learning algorithms cannot work with categorical data directly. Categorical data must be converted to numbers. This applies when you are working with a sequence classification type problem and plan on using deep learning methods such as Long Short-Term Memory recurrent neural networks. In this tutorial, you will discover how to convert your input or output sequence data to a one hot encoding for use in sequence classification problems with deep learning in Python. […]
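
A minimal sketch of the idea, using the Keras to_categorical() helper on a short integer-encoded sequence (the data and class count are made up for the example):

from numpy import argmax
from tensorflow.keras.utils import to_categorical

sequence = [1, 3, 2, 0, 3]                       # integer encoded categories

# one hot encode: each integer becomes a binary vector with a single 1
encoded = to_categorical(sequence, num_classes=4)
print(encoded)

# invert the encoding by taking the index of the largest value
decoded = [argmax(vector) for vector in encoded]
print(decoded)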

Read more

Gentle Introduction to Models for Sequence Prediction with RNNs

Last Updated on August 25, 2019 Sequence prediction is a problem that involves using historical sequence information to predict the next value or values in the sequence. The sequence may be symbols like letters in a sentence or real values like those in a time series of prices. Sequence prediction may be easiest to understand in the context of time series forecasting as the problem is already generally understood. In this post, you will discover the standard sequence prediction models […]
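
As a small illustration of how two of these model types differ in Keras, the sketch below contrasts a many-to-one model (one prediction per sequence) with a many-to-many model (one prediction per timestep); the sizes are arbitrary and the framing is not necessarily the one used in the post.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense, TimeDistributed

# many-to-one: a single prediction for the whole 10-step input sequence
many_to_one = Sequential([
    Input(shape=(10, 1)),
    LSTM(8),
    Dense(1),
])

# many-to-many: one prediction per timestep via return_sequences=True
many_to_many = Sequential([
    Input(shape=(10, 1)),
    LSTM(8, return_sequences=True),
    TimeDistributed(Dense(1)),
])

print(many_to_one.output_shape, many_to_many.output_shape)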

Read more

5 Examples of Simple Sequence Prediction Problems for LSTMs

Last Updated on August 14, 2019 Sequence prediction is different from traditional classification and regression problems. It requires that you take the order of observations into account and that you use models like Long Short-Term Memory (LSTM) recurrent neural networks that have memory and that can learn any temporal dependence between observations. It is critical to apply LSTMs to learn how to use them on sequence prediction problems, and for that, you need a suite of well-defined problems that allow […]
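
One hypothetical example of such a toy problem (not necessarily one of the five in the post) is an echo task, where the target at each step is the input observed a fixed number of steps earlier, so a model can only succeed by remembering past inputs:

import numpy as np

lag = 2
values = np.random.randint(0, 10, size=20)

# frame as supervised learning: the input is the current value and the target
# is the value seen `lag` steps earlier, forcing the model to use its memory
X = values[lag:].reshape(-1, 1)
y = values[:-lag].reshape(-1, 1)
print(np.hstack([X[:5], y[:5]]))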

Read more

Get the Most out of LSTMs on Your Sequence Prediction Problem

Last Updated on August 14, 2019 Long Short-Term Memory (LSTM) Recurrent Neural Networks are a powerful type of deep learning suited for sequence prediction problems. A possible concern when using LSTMs is whether the added complexity of the model is improving its skill or is in fact resulting in lower skill than simpler models. In this post, you will discover simple experiments you can run to ensure you are getting the most out of LSTMs on your […]
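
One simple check along these lines is to compare the LSTM against a naive baseline; the sketch below computes the error of a persistence forecast on a made up series, which any more complex model should be expected to beat.

import numpy as np

# persistence baseline: predict that the next value equals the current value
series = np.sin(np.arange(200) * 0.1) + np.random.normal(scale=0.1, size=200)
test = series[150:]

predictions = test[:-1]
targets = test[1:]
rmse = np.sqrt(np.mean((predictions - targets) ** 2))
print('persistence RMSE: %.3f' % rmse)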

Read more