# A Gentle Introduction to Jensen’s Inequality

Last Updated on July 31, 2020

It is common in statistics and machine learning to create a linear transform or mapping of a variable.

An example is a linear scaling of a feature variable. We have the natural intuition that the mean of the scaled values is the same as the scaled value of the mean of the raw values. This makes sense.
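This intuition can be checked numerically. The sketch below uses a small made-up sample and an arbitrary linear transform y = a*x + b; the values are illustrative, not from the tutorial.

```python
import numpy as np

# Hypothetical sample of raw feature values
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

# Arbitrary linear scaling: y = a*x + b
a, b = 2.0, 10.0

# For a linear transform, the mean of the transformed values
# equals the transform of the mean.
mean_of_scaled = np.mean(a * x + b)
scaled_mean = a * np.mean(x) + b

print(mean_of_scaled)  # 16.0
print(scaled_mean)     # 16.0
```

The two quantities agree exactly, which is what the linearity of the mean guarantees.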

Unfortunately, we bring this intuition with us when using nonlinear transformations of variables, where this relationship no longer holds. Correcting this intuition leads to Jensen’s Inequality, a standard mathematical tool used in function analysis, probability, and statistics.
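A quick way to see the intuition break down is with a convex nonlinear function such as squaring. Using the same style of made-up sample as above (the numbers are illustrative assumptions):

```python
import numpy as np

# Hypothetical sample of raw variable values
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

# Mean of the transformed values: E[f(x)] with f(x) = x^2
mean_of_squares = np.mean(x ** 2)   # (1 + 4 + 9 + 16 + 25) / 5 = 11.0

# Transform of the mean: f(E[x])
square_of_mean = np.mean(x) ** 2    # 3.0^2 = 9.0

print(mean_of_squares, square_of_mean)  # 11.0 9.0
```

The two quantities differ, and the mean of the convex function is the larger of the two, foreshadowing the inequality stated below.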

In this tutorial, you will discover Jensen’s Inequality.

After completing this tutorial, you will know:

• The intuition of linear mappings does not hold for nonlinear functions.
• The mean of a convex function of a variable is always greater than or equal to the function of the mean of the variable, a result known as Jensen’s Inequality.
• A common application of the inequality is in the comparison of arithmetic and geometric means when averaging the financial returns for a time interval.
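The arithmetic-vs-geometric-mean application from the last bullet can be sketched with hypothetical period-over-period growth factors (the return values below are invented for illustration):

```python
import numpy as np

# Hypothetical growth factors for three periods
# (e.g. 1.10 means a +10% return, 0.90 means a -10% return)
returns = np.array([1.10, 0.90, 1.05])

# Arithmetic mean of the growth factors
arithmetic = np.mean(returns)

# Geometric mean: n-th root of the product of the growth factors
geometric = np.prod(returns) ** (1.0 / len(returns))

print(arithmetic, geometric)
```

The arithmetic mean is never smaller than the geometric mean; this AM-GM relationship is itself an instance of Jensen’s Inequality, applied via the concavity of the logarithm.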

Kick-start your project with my new book Probability for Machine Learning, including step-by-step tutorials and the Python source code files for all examples.