Word2Vec: A Comparison Between CBOW, SkipGram & SkipGramSI

Word2Vec is a widely used word representation technique that uses neural networks under the hood. The resulting word representations, or embeddings, can be used to infer semantic similarity between words and phrases, expand queries, surface related concepts, and more. The sky is the limit when it comes to how you can use these embeddings for different NLP tasks.
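As a quick illustration of the semantic-similarity use case, here is a minimal sketch using the gensim library (version 4 or later). The toy corpus and all parameter values are placeholders for illustration, not taken from this article.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (placeholder data).
sentences = [
    ["word2vec", "learns", "word", "embeddings"],
    ["embeddings", "capture", "semantic", "similarity"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Train a small model; vector_size, window, and min_count are
# illustrative values only.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1)

# Query the words closest to "embeddings" in the learned vector space.
print(model.wv.most_similar("embeddings", topn=3))
```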

In this article, we will look at how the different neural network architectures for training a Word2Vec model (CBOW, SkipGram, and SkipGramSI) behave in practice. The goal is to help you make an informed decision about which architecture to use for the problem you are trying to solve.
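For context, one plausible way to instantiate each of the three architectures is sketched below, again assuming gensim 4 or later. In gensim, the `sg` flag switches between CBOW and SkipGram, and SkipGramSI (skip-gram with subword information) is approximated here with gensim's FastText class; the corpus and parameter values are illustrative assumptions, not settings from the article.

```python
from gensim.models import Word2Vec, FastText

# Placeholder corpus of tokenized sentences.
sentences = [
    ["the", "quick", "brown", "fox"],
    ["jumps", "over", "the", "lazy", "dog"],
]

# CBOW: predict a word from its surrounding context (sg=0).
cbow = Word2Vec(sentences, sg=0, vector_size=50, window=3, min_count=1)

# SkipGram: predict the surrounding context from a word (sg=1).
skipgram = Word2Vec(sentences, sg=1, vector_size=50, window=3, min_count=1)

# SkipGramSI: skip-gram enriched with character n-gram (subword)
# information, via gensim's FastText implementation; min_n and max_n
# bound the character n-gram lengths.
skipgram_si = FastText(sentences, sg=1, vector_size=50, window=3,
                       min_count=1, min_n=3, max_n=6)
```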

