Linear Discriminant Analysis for Dimensionality Reduction in Python

Last Updated on August 18, 2020

Reducing the number of input variables for a predictive model is referred to as dimensionality reduction.

Fewer input variables can result in a simpler predictive model that may have better performance when making predictions on new data.

Linear Discriminant Analysis, or LDA for short, is a predictive modeling algorithm for multi-class classification. It can also be used as a dimensionality reduction technique, providing a projection of a training dataset that best separates the examples by their assigned class.

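To make the idea concrete, here is a minimal sketch of using LDA as a transform to project a dataset onto a lower-dimensional space. The synthetic dataset, its size, and the choice of five components are illustrative assumptions, not details taken from this tutorial; it simply shows the scikit-learn API for fitting the projection.

# minimal sketch: LDA as a dimensionality reduction transform (illustrative data)
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# synthetic multi-class dataset: 1,000 rows, 20 input features, 10 classes (assumed values)
X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                           n_redundant=10, n_classes=10, random_state=7)

# LDA can project onto at most (n_classes - 1) components; here we keep 5
lda = LinearDiscriminantAnalysis(n_components=5)
X_reduced = lda.fit_transform(X, y)
print(X.shape, X_reduced.shape)

Running the sketch would show the 20 input columns reduced to 5 projected components, while the class labels are used to guide the projection.
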
The fact that Linear Discriminant Analysis can be used for dimensionality reduction often surprises practitioners who know it only as a classification algorithm.

In this tutorial, you will discover how to use LDA for dimensionality reduction when developing predictive models.

After completing this tutorial, you will know:

  • Dimensionality reduction involves reducing the number of input variables or columns in modeling data.
  • LDA is a technique for multi-class classification that can be used to automatically perform dimensionality reduction.
  • How to evaluate predictive models that use an LDA projection as input and make predictions with new raw data (see the sketch after this list).

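The sketch below illustrates one way to do this, assuming scikit-learn throughout. The synthetic dataset, the choice of naive Bayes as the final model, the number of components, and the cross-validation settings are all assumptions made for illustration rather than the tutorial's own worked example.

# hedged sketch: evaluate a model on an LDA projection, then predict new raw data
from numpy import mean
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score, RepeatedStratifiedKFold

# illustrative multi-class dataset (assumed, not from the tutorial)
X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                           n_redundant=10, n_classes=10, random_state=7)

# chain the LDA projection and the classifier so the projection is
# re-learned inside each cross-validation fold (avoids data leakage)
steps = [('lda', LinearDiscriminantAnalysis(n_components=5)), ('model', GaussianNB())]
model = Pipeline(steps=steps)

# evaluate with repeated stratified k-fold cross-validation
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=3, random_state=1)
scores = cross_val_score(model, X, y, scoring='accuracy', cv=cv, n_jobs=-1)
print('Mean Accuracy: %.3f' % mean(scores))

# fit on all data, then predict a raw 20-feature row; the pipeline
# applies the LDA projection automatically before the classifier
model.fit(X, y)
new_row = X[:1]  # stands in for a genuinely new observation
print('Predicted class: %d' % model.predict(new_row)[0])

Wrapping the projection and the model in a single pipeline is the key design choice here: new raw rows can be passed straight to predict(), and the same transform learned during training is applied for you.
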
Kick-start your project with my new book Data Preparation for Machine Learning, including step-by-step tutorials and the Python source code files for all examples.

Let’s get started.