One-vs-Rest and One-vs-One for Multi-Class Classification

Last Updated on September 7, 2020

Not all classification predictive models support multi-class classification.

Algorithms such as the Perceptron, Logistic Regression, and Support Vector Machines were designed for binary classification and do not natively support classification tasks with more than two classes.

One approach for using binary classification algorithms for multi-class classification problems is to split the multi-class dataset into multiple binary classification datasets and fit a binary classification model on each. Two different examples of this approach are the One-vs-Rest and One-vs-One strategies.
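As a quick illustration of how this splitting is typically applied in practice, the sketch below uses scikit-learn's OneVsRestClassifier and OneVsOneClassifier wrappers around a binary SVM. The synthetic dataset and its parameters are illustrative assumptions, not the exact setup used later in the tutorial.

```python
# Minimal sketch: applying One-vs-Rest and One-vs-One meta-strategies
# to a binary classifier (SVC) with scikit-learn wrappers.
from sklearn.datasets import make_classification
from sklearn.multiclass import OneVsRestClassifier, OneVsOneClassifier
from sklearn.svm import SVC

# synthetic 3-class dataset (illustrative values)
X, y = make_classification(n_samples=1000, n_features=10, n_informative=5,
                           n_redundant=5, n_classes=3, random_state=1)

# One-vs-Rest: fits one binary SVC per class
ovr = OneVsRestClassifier(SVC())
ovr.fit(X, y)

# One-vs-One: fits one binary SVC per pair of classes
ovo = OneVsOneClassifier(SVC())
ovo.fit(X, y)

# both wrappers expose the usual predict() interface
print(ovr.predict(X[:5]))
print(ovo.predict(X[:5]))
```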

In this tutorial, you will discover One-vs-Rest and One-vs-One strategies for multi-class classification.

After completing this tutorial, you will know:

  • Binary classification models like logistic regression and SVM do not support multi-class classification natively and require meta-strategies.
  • The One-vs-Rest strategy splits a multi-class classification into one binary classification problem per class.
  • The One-vs-One strategy splits a multi-class classification into one binary classification problem for each pair of classes (see the sketch after this list).

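To make the difference in scale between the two strategies concrete, a quick back-of-the-envelope check (plain Python, no library assumptions) counts the binary models each strategy would fit for a given number of classes:

```python
# One-vs-Rest fits k models for k classes;
# One-vs-One fits one model per pair of classes: k * (k - 1) / 2.
for k in [3, 4, 10]:
    ovr_models = k
    ovo_models = k * (k - 1) // 2
    print(f"{k} classes -> OvR: {ovr_models} models, OvO: {ovo_models} models")
```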
Let’s get started.
