Part 6: Step by Step Guide to Master NLP – Word2Vec

This article was published as a part of the Data Science Blogathon

Introduction

This article is part of an ongoing blog series on Natural Language Processing (NLP). In the previous article of this series, we covered the statistical or frequency-based word embedding techniques, which belong to the pre-word-embedding era. In this article, we will discuss techniques from the modern word embedding era.

NOTE: The modern word embedding era includes many techniques, but in this article, we will discuss only Word2Vec, the most widely used and popular of them all.

This is part-6 of the blog series on the Step by Step Guide to Natural Language Processing.


Table of Contents

1. Pre-requisites to follow this article

2. Recap of Word Embedding

3. What is Prediction-based Embedding?

4. Different Model Architectures for Word Representation

  • FeedForward Neural Net Language Model (NNLM)
  • Recurrent Neural Net Language Model (RNNLM)

5. What is Word2Vec Model?

6. Different algorithms included in Word2Vec