Implementation Patterns for the Encoder-Decoder RNN Architecture with Attention

Last Updated on August 14, 2019. The encoder-decoder architecture for recurrent neural networks is proving to be powerful on a host of sequence-to-sequence prediction problems in the field of natural language processing. Attention is a mechanism that addresses a limitation of the encoder-decoder architecture on long sequences, and that in general speeds up learning and lifts the skill of the model on sequence-to-sequence prediction problems. In this post, you will discover patterns for implementing the encoder-decoder model with and […]

Read more
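
As a taste of what the post covers, here is a minimal NumPy sketch of the attention computation itself: additive (Bahdanau-style) scoring of each encoder hidden state against the decoder state, followed by a softmax and a weighted sum. The array names, shapes, and random weights `W1`, `W2`, `v` are illustrative, not taken from the post.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# illustrative shapes: T encoder time steps, hidden size d
T, d = 4, 3
rng = np.random.default_rng(0)
h = rng.normal(size=(T, d))   # encoder hidden states, one per input step
s = rng.normal(size=(d,))     # previous decoder hidden state

# additive (Bahdanau-style) scoring with small learned matrices W1, W2 and vector v
W1, W2, v = rng.normal(size=(d, d)), rng.normal(size=(d, d)), rng.normal(size=(d,))
scores = np.tanh(h @ W1 + s @ W2) @ v   # one alignment score per encoder step
weights = softmax(scores)               # normalized attention weights
context = weights @ h                   # weighted sum fed to the decoder
```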

How to Develop a Deep Learning Bag-of-Words Model for Sentiment Analysis (Text Classification)

Last Updated on September 3, 2020. Movie reviews can be classified as either favorable or not. The evaluation of movie review text is a classification problem often called sentiment analysis. A popular technique for developing sentiment analysis models is to use a bag-of-words model that transforms documents into vectors where each word in the document is assigned a score. In this tutorial, you will discover how you can develop a deep learning predictive model using the bag-of-words representation for movie […]

Read more
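
A minimal sketch of the bag-of-words pipeline described above, assuming the `tensorflow.keras` API; the toy reviews, labels, and layer sizes are illustrative, not values from the tutorial.

```python
import numpy as np
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# toy reviews and labels (1 = favorable, 0 = not)
docs = ['a great film', 'truly awful', 'loved every minute', 'a dull mess']
labels = np.array([1, 0, 1, 0])

# score each word per document; 'binary' marks presence, other modes score differently
tokenizer = Tokenizer()
tokenizer.fit_on_texts(docs)
X = tokenizer.texts_to_matrix(docs, mode='binary')

# a small feed-forward classifier over the document vectors
model = Sequential([
    Dense(8, activation='relu', input_shape=(X.shape[1],)),
    Dense(1, activation='sigmoid'),
])
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, labels, epochs=10, verbose=0)
```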

Best Practices for Text Classification with Deep Learning

Last Updated on August 24, 2020. Text classification describes a general class of problems such as predicting the sentiment of tweets and movie reviews, as well as classifying email as spam or not. Deep learning methods are proving very good at text classification, achieving state-of-the-art results on a suite of standard academic benchmark problems. In this post, you will discover some best practices to consider when developing deep learning models for text classification. After reading this post, you will know: […]

Read more

Difference Between Return Sequences and Return States for LSTMs in Keras

Last Updated on August 14, 2019. The Keras deep learning library provides an implementation of the Long Short-Term Memory, or LSTM, recurrent neural network. As part of this implementation, the Keras API provides access to both return sequences and return state. The use of, and difference between, these outputs can be confusing when designing sophisticated recurrent neural network models, such as the encoder-decoder model. In this tutorial, you will discover the difference between, and results of, return sequences and return states for […]

Read more
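
The distinction is easiest to see on a tiny input. This sketch (assuming the `tensorflow.keras` API; the shapes are illustrative) requests both the per-step outputs and the final states from a single LSTM layer.

```python
import numpy as np
from tensorflow.keras.layers import Input, LSTM
from tensorflow.keras.models import Model

inputs = Input(shape=(3, 1))
lstm_out, state_h, state_c = LSTM(2, return_sequences=True, return_state=True)(inputs)
model = Model(inputs, [lstm_out, state_h, state_c])

data = np.array([0.1, 0.2, 0.3]).reshape(1, 3, 1)
seq, h, c = model.predict(data)
# seq has shape (1, 3, 2): one hidden output per time step (return_sequences)
# h has shape (1, 2): the hidden state after the last step; equals seq[:, -1, :]
# c has shape (1, 2): the internal cell state after the last step (return_state)
```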

How to Index, Slice and Reshape NumPy Arrays for Machine Learning

Last Updated on June 13, 2020. Machine learning data is represented as arrays. In Python, data is almost universally represented as NumPy arrays. If you are new to Python, you may be confused by some of the pythonic ways of accessing data, such as negative indexing and array slicing. In this tutorial, you will discover how to manipulate and access your data correctly in NumPy arrays. After completing this tutorial, you will know: How to convert your list data to […]

Read more
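
A few of the idioms the tutorial covers, in one illustrative snippet (the array values are made up):

```python
import numpy as np

data = np.array([[11, 22], [33, 44], [55, 66]])
print(data[-1, -1])   # 66: negative indices count back from the end
print(data[:, 0])     # first column: [11 33 55]
print(data[1:, :])    # slice: all rows from index 1 onward

# reshape 2D (samples, features) to 3D (samples, timesteps, features) for LSTMs
data3d = data.reshape((data.shape[0], data.shape[1], 1))
print(data3d.shape)   # (3, 2, 1)
```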

How to Develop a Seq2Seq Model for Neural Machine Translation in Keras

Last Updated on August 7, 2019. The encoder-decoder model provides a pattern for using recurrent neural networks to address challenging sequence-to-sequence prediction problems, such as machine translation. Encoder-decoder models can be developed in the Keras Python deep learning library, and an example of a neural machine translation system developed with this model has been described on the Keras blog, with sample code distributed with the Keras project. In this post, you will discover how to define an encoder-decoder sequence-to-sequence prediction […]

Read more
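
The heart of that pattern is passing the encoder's final hidden and cell states in as the decoder's initial state. A minimal sketch, assuming the `tensorflow.keras` functional API, with illustrative vocabulary and layer sizes:

```python
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model

num_encoder_tokens, num_decoder_tokens, latent_dim = 71, 93, 256  # illustrative sizes

# encoder: discard per-step outputs, keep the final hidden and cell states
encoder_inputs = Input(shape=(None, num_encoder_tokens))
_, state_h, state_c = LSTM(latent_dim, return_state=True)(encoder_inputs)

# decoder: condition on the encoder states, emit one token distribution per step
decoder_inputs = Input(shape=(None, num_decoder_tokens))
decoder_lstm = LSTM(latent_dim, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs, initial_state=[state_h, state_c])
decoder_outputs = Dense(num_decoder_tokens, activation='softmax')(decoder_outputs)

model = Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(optimizer='rmsprop', loss='categorical_crossentropy')
```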

How to Use the Keras Functional API for Deep Learning

Last Updated on May 28, 2020. The Keras Python library makes creating deep learning models fast and easy. The sequential API allows you to create models layer-by-layer for most problems. It is limited in that it does not allow you to create models that share layers or have multiple inputs or outputs. The functional API in Keras is an alternate way of creating models that offers a lot more flexibility, including creating more complex models. In this tutorial, you will […]

Read more
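
For example, a model with two inputs, something the sequential API cannot express, takes only a few lines with the functional API. A sketch assuming `tensorflow.keras`; the layer sizes are arbitrary:

```python
from tensorflow.keras.layers import Input, Dense, concatenate
from tensorflow.keras.models import Model

# two separate inputs merged into a single prediction
input_a = Input(shape=(8,))
input_b = Input(shape=(4,))
hidden_a = Dense(16, activation='relu')(input_a)
hidden_b = Dense(16, activation='relu')(input_b)
merged = concatenate([hidden_a, hidden_b])
output = Dense(1, activation='sigmoid')(merged)

model = Model(inputs=[input_a, input_b], outputs=output)
model.summary()
```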

Deep Convolutional Neural Network for Sentiment Analysis (Text Classification)

Last Updated on September 3, 2020. Develop a Deep Learning Model to Automatically Classify Movie Reviews as Positive or Negative in Python with Keras, Step-by-Step. Word embeddings are a technique for representing text where different words with similar meaning have a similar real-valued vector representation. They are a key breakthrough that has led to great performance of neural network models on a suite of challenging natural language processing problems. In this tutorial, you will discover how to develop word embedding models […]

Read more
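
A minimal sketch of the kind of model the tutorial builds: a learned word embedding feeding a one-dimensional convolution. It assumes `tensorflow.keras`, uses global max pooling so no fixed input length is needed, and the vocabulary size is a placeholder, not a value from the post.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Conv1D, GlobalMaxPooling1D, Dense

vocab_size = 10000  # placeholder vocabulary size

model = Sequential([
    Embedding(vocab_size, 100),                            # learned 100-d word vectors
    Conv1D(filters=32, kernel_size=8, activation='relu'),  # detect local n-gram features
    GlobalMaxPooling1D(),                                  # keep the strongest feature per filter
    Dense(10, activation='relu'),
    Dense(1, activation='sigmoid'),                        # positive vs. negative
])
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
```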

Gentle Introduction to Global Attention for Encoder-Decoder Recurrent Neural Networks

Last Updated on August 14, 2019. The encoder-decoder model provides a pattern for using recurrent neural networks to address challenging sequence-to-sequence prediction problems such as machine translation. Attention is an extension to the encoder-decoder model that improves the performance of the approach on longer sequences. Global attention is a simplification of attention that may be easier to implement in declarative deep learning libraries like Keras and may achieve better results than the classic attention mechanism. In this post, you will […]

Read more
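
The simplification is concrete: Luong-style global attention can score every encoder hidden state against the current decoder state with a plain dot product, with no extra learned alignment layers. A NumPy sketch with illustrative shapes:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

T, d = 4, 3
rng = np.random.default_rng(1)
h = rng.normal(size=(T, d))  # all encoder hidden states ("global" attends to every step)
s = rng.normal(size=(d,))    # current decoder hidden state

scores = h @ s               # dot scoring: no extra parameters, hence the simplification
weights = softmax(scores)    # attention distribution over encoder steps
context = weights @ h        # context vector combined with s to make the prediction
```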

Gentle Introduction to Statistical Language Modeling and Neural Language Models

Last Updated on August 7, 2019. Language modeling is central to many important natural language processing tasks. Recently, neural-network-based language models have demonstrated better performance than classical methods both standalone and as part of more challenging natural language processing tasks. In this post, you will discover language modeling for natural language processing. After reading this post, you will know: Why language modeling is critical to addressing tasks in natural language processing. What a language model is and some examples of […]

Read more
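
At its simplest, a statistical language model is just counts over a corpus. A toy bigram sketch (the corpus and probabilities are made up for illustration):

```python
from collections import Counter

# a tiny corpus; a language model assigns probabilities to word sequences
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def p_next(word, nxt):
    # maximum-likelihood bigram estimate of P(nxt | word)
    return bigrams[(word, nxt)] / unigrams[word]

print(p_next("the", "cat"))  # 0.25: "the" occurs 4 times, "the cat" once
```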