A Gentle Introduction to Degrees of Freedom in Machine Learning

Last Updated on August 19, 2020

Degrees of freedom is an important concept from statistics and engineering.

It is often employed to summarize the number of values used in the calculation of a statistic, such as a sample statistic or a statistic used in a statistical hypothesis test.
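
To make the statistical idea concrete, the snippet below is a minimal sketch (assuming NumPy; the data values are purely illustrative) of the most familiar example: the sample variance is computed with n - 1 degrees of freedom, because one degree of freedom is spent estimating the sample mean from the same data.

```python
# Minimal sketch (assumes NumPy) of degrees of freedom in a familiar statistic:
# the sample variance divides by (n - 1) because one degree of freedom is used
# up estimating the sample mean from the same data.
import numpy as np

data = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
n = len(data)

mean = data.sum() / n
sum_sq = ((data - mean) ** 2).sum()

population_variance = sum_sq / n        # divides by n (no parameters estimated)
sample_variance = sum_sq / (n - 1)      # divides by n - 1 degrees of freedom

print(sample_variance)                  # 4.571...
print(np.var(data, ddof=1))             # same result; ddof = "delta degrees of freedom"
```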

In machine learning, the degrees of freedom may refer to the number of parameters in the model, such as the number of coefficients in a linear regression model or the number of weights in a deep learning neural network.
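
As a rough sketch of what this looks like in practice (assuming scikit-learn; the dataset and model here are purely illustrative), the model degrees of freedom of a fitted linear regression can be counted as one coefficient per input feature plus the intercept.

```python
# Sketch (assumes scikit-learn): count the "model degrees of freedom" of a
# linear regression as one coefficient per input feature plus the intercept.
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression

# synthetic dataset with 100 rows and 10 input features
X, y = make_regression(n_samples=100, n_features=10, noise=0.1, random_state=1)

model = LinearRegression()
model.fit(X, y)

# one learned coefficient per feature, plus the intercept term
model_dof = model.coef_.size + 1
print(model_dof)  # 11
```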

The concern, carried over from statistics, is that the more degrees of freedom (model parameters) a machine learning model has, the more it is expected to overfit the training dataset. This expectation can be overcome through the use of regularization techniques, such as regularized linear regression and the suite of regularization methods available for deep learning neural network models.
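
The sketch below illustrates this point under the same assumptions as above (scikit-learn and a synthetic dataset): with more coefficients than training rows, an unpenalized linear regression is prone to overfitting, while a ridge-regularized fit of the same data constrains the coefficients and typically holds up better on held-out data.

```python
# Sketch (assumes scikit-learn): ridge regression penalizes large coefficients,
# so a model with more parameters (degrees of freedom) than training rows can
# still generalize rather than simply memorizing the training data.
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import train_test_split

# 50 input features, but only 30 rows end up in the training split
X, y = make_regression(n_samples=60, n_features=50, noise=5.0, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=1)

plain = LinearRegression().fit(X_train, y_train)   # unpenalized, prone to overfit here
ridge = Ridge(alpha=1.0).fit(X_train, y_train)     # coefficients shrunk by the penalty

# compare performance on the held-out test split
print("linear regression test R^2:", plain.score(X_test, y_test))
print("ridge regression test R^2: ", ridge.score(X_test, y_test))
```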

In this post, you will discover degrees of freedom in statistics and machine learning.

After reading this post, you will know:

  • Degrees of freedom generally represents the number of points of control of a system.
  • In statistics, degrees of freedom is the number of independent values used to calculate a statistic, often the number of observations minus the number of parameters estimated along the way.
  • In machine learning, the degrees of freedom often refers to the number of parameters in the model, such as the coefficients in a linear regression model or the weights in a neural network.