What Are Word Embeddings for Text?

Last Updated on August 7, 2019

Word embeddings are a type of word representation that allows words with similar meaning to have a similar representation.

They are a distributed representation for text that is perhaps one of the key breakthroughs for the impressive performance of deep learning methods on challenging natural language processing problems.
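As a rough sketch of this idea, the example below stores a few hypothetical 3-dimensional vectors (the values are made up for illustration; real embeddings are learned from data and typically have 50 to 300 dimensions) and uses cosine similarity to show that words with similar meaning sit close together in the vector space, unlike sparse one-hot representations where every pair of words is equally distant:

```python
import math

# Hypothetical embeddings: each word is a dense vector of real numbers.
# With one-hot vectors, "king" and "queen" would be as far apart as
# "king" and "apple"; with embeddings, related words end up nearby.
embeddings = {
    "king":  [0.50, 0.70, 0.10],
    "queen": [0.52, 0.68, 0.12],
    "apple": [0.10, 0.05, 0.90],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.999, very similar
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.22, dissimilar
```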

In this post, you will discover the word embedding approach for representing text data.

After completing this post, you will know:

  • What the word embedding approach for representing text is and how it differs from other feature extraction methods.
  • That there are 3 main algorithms for learning a word embedding from text data.
  • That you can either train a new embedding or use a pre-trained embedding on your natural language processing task (a sketch of both follows this list).
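As a preview of those last two points, here is a minimal sketch assuming the gensim library (version 4 or later); gensim is one common choice for this, not necessarily the tooling used later in the post. It trains a small word2vec embedding on a toy corpus and shows, commented out, how a pre-trained embedding could be loaded instead:

```python
from gensim.models import Word2Vec

# A tiny toy corpus; in practice you would train on thousands of real sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["an", "apple", "is", "a", "fruit"],
]

# vector_size sets the embedding dimensionality; min_count=1 keeps rare words,
# which is only sensible for a corpus this small.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, epochs=50)

print(model.wv["king"][:5])           # first 5 components of the learned vector
print(model.wv.most_similar("king"))  # nearest words in the embedding space

# Alternative: load a pre-trained embedding instead of training your own.
# "glove-wiki-gigaword-100" is one of the models gensim's downloader offers;
# the vectors are downloaded on first use.
# import gensim.downloader as api
# glove = api.load("glove-wiki-gigaword-100")
# print(glove.most_similar("king"))
```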

Kick-start your project with my new book Deep Learning for Natural Language Processing, including step-by-step tutorials and the Python source code files for all examples.

Let’s get started.

What Are Word Embeddings for Text?
Photo by Heather, some rights reserved.

Overview

This post is divided into three parts; they are:

1. What Are Word Embeddings?
2. Word Embedding Algorithms
3. Using Word Embeddings