LOOCV for Evaluating Machine Learning Algorithms

Last Updated on August 26, 2020

The Leave-One-Out Cross-Validation, or LOOCV, procedure is used to estimate the performance of machine learning algorithms when they are used to make predictions on data not used to train the model.

It is a computationally expensive procedure to perform, although it results in a reliable, nearly unbiased estimate of model performance. Although it is simple to use and requires no configuration, there are times when the procedure should not be used, such as when you have a very large dataset or a computationally expensive model to evaluate.
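The procedure itself can be sketched in a few lines: hold out a single example, fit the model on all remaining examples, predict the held-out example, and repeat for every row. A minimal sketch using scikit-learn's `LeaveOneOut` splitter is below; the synthetic dataset and the choice of logistic regression are illustrative assumptions, not part of the tutorial.

```python
# Minimal sketch of the LOOCV procedure (illustrative dataset and model).
from sklearn.datasets import make_classification
from sklearn.model_selection import LeaveOneOut
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# small synthetic classification dataset (assumption for the sketch)
X, y = make_classification(n_samples=30, n_features=5, random_state=1)

loo = LeaveOneOut()
predictions, actuals = [], []
# one iteration per row: train on n-1 examples, predict the single held-out row
for train_idx, test_idx in loo.split(X):
    model = LogisticRegression()
    model.fit(X[train_idx], y[train_idx])
    predictions.append(model.predict(X[test_idx])[0])
    actuals.append(y[test_idx][0])

# the estimate is the score across all n held-out predictions
print('LOOCV accuracy: %.3f' % accuracy_score(actuals, predictions))
```

Note that the model is refit n times, once per example, which is why the cost grows quickly with dataset size.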

In this tutorial, you will discover how to evaluate machine learning models using leave-one-out cross-validation.

After completing this tutorial, you will know:

  • The leave-one-out cross-validation procedure is appropriate when you have a small dataset or when an accurate estimate of model performance is more important than the computational cost of the method.
  • How to use the scikit-learn machine learning library to perform the leave-one-out cross-validation procedure.
  • How to evaluate machine learning algorithms for classification and regression using leave-one-out cross-validation.
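As a preview of the last two points, the sketch below passes a `LeaveOneOut` splitter to `cross_val_score` to evaluate a classifier and a regressor; the synthetic datasets and random forest models are assumptions chosen for illustration.

```python
# Sketch: LOOCV for classification and regression via cross_val_score
# (synthetic data and model choices are illustrative assumptions).
from numpy import mean, absolute
from sklearn.datasets import make_classification, make_regression
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

cv = LeaveOneOut()

# classification: accuracy averaged over the one-sample test folds
X, y = make_classification(n_samples=50, n_features=10, random_state=1)
scores = cross_val_score(RandomForestClassifier(random_state=1), X, y,
                         scoring='accuracy', cv=cv, n_jobs=-1)
print('Accuracy: %.3f' % mean(scores))

# regression: mean absolute error (scikit-learn reports it negated)
X, y = make_regression(n_samples=50, n_features=10, noise=0.1, random_state=1)
scores = cross_val_score(RandomForestRegressor(random_state=1), X, y,
                         scoring='neg_mean_absolute_error', cv=cv, n_jobs=-1)
print('MAE: %.3f' % mean(absolute(scores)))
```

Because `cv=LeaveOneOut()` produces one fold per row, `scores` contains one score per example in the dataset.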

Kick-start your project with my new book Machine Learning Mastery With Python, including step-by-step tutorials and the Python source code files for all examples.

Let’s get started.