Python for NLP: Word Embeddings for Deep Learning in Keras


This is the 16th article in my series of articles on Python for NLP. In my previous article I explained how the N-Grams technique can be used to develop a simple automatic text filler in Python. The N-Gram model is essentially a way to convert text data into numeric form so that it can be used by statistical algorithms.
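As a quick refresher, here is a minimal sketch of the N-Gram idea: splitting text into overlapping word pairs (bigrams) that can later be counted and fed to a statistical model. The sample sentence is made up for illustration.

```python
text = "deep learning makes text processing easier"
tokens = text.split()

# Build bigrams: each item pairs a word with the word that follows it
bigrams = list(zip(tokens, tokens[1:]))
print(bigrams)
# [('deep', 'learning'), ('learning', 'makes'), ('makes', 'text'), ...]
```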

Before N-Grams, I explained the bag of words and TF-IDF approaches, which can also be used to generate numeric feature vectors from text data. Until now we have been using machine learning approaches to perform different NLP tasks such as text classification, topic modeling, sentiment analysis, and text summarization. In this article we will begin our discussion of deep learning techniques for NLP.
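To recall how those two representations look in practice, here is a short sketch using scikit-learn; the two example sentences are invented for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

corpus = ["the cat sat on the mat", "the dog sat on the log"]

# Bag of words: raw term counts per document
bow = CountVectorizer().fit_transform(corpus)
print(bow.toarray())

# TF-IDF: counts reweighted by how rare each term is across documents
tfidf = TfidfVectorizer().fit_transform(corpus)
print(tfidf.toarray().round(2))
```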

Deep learning approaches consist of different types of densely connected neural networks. These approaches have proven effective at solving several complex problems such as autonomous driving, image generation, and image segmentation. Deep learning approaches have also proven quite effective for NLP tasks.

In this article, we will study word embeddings for NLP tasks that involve deep learning. We will see how word embeddings can be used to perform a simple classification task.
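As a preview of the kind of model the article builds toward, the sketch below trains word embeddings as part of a tiny binary sentiment classifier in Keras. The corpus, labels, and all hyperparameters (`vocab_size`, `max_len`, `embedding_dim`) are illustrative assumptions, not values from the article.

```python
import numpy as np
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, GlobalAveragePooling1D, Dense

# Toy corpus and labels, invented for illustration (1 = positive, 0 = negative)
sentences = ["great movie", "loved it", "terrible film", "hated it"]
labels = np.array([1, 1, 0, 0])

vocab_size, max_len, embedding_dim = 50, 4, 8  # assumed hyperparameters

# Convert each sentence into a padded sequence of integer word indices
tokenizer = Tokenizer(num_words=vocab_size)
tokenizer.fit_on_texts(sentences)
padded = pad_sequences(tokenizer.texts_to_sequences(sentences), maxlen=max_len)

# The Embedding layer maps each word index to a trainable dense vector;
# pooling averages the word vectors into one fixed-size document vector
model = Sequential([
    Embedding(vocab_size, embedding_dim),
    GlobalAveragePooling1D(),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(padded, labels, epochs=10, verbose=0)
```

The key point is that the embedding vectors are ordinary trainable weights: the network learns them jointly with the classifier instead of relying on hand-crafted counts like bag of words or TF-IDF.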
