# Jensen's inequality

In mathematics, Jensen's inequality, named after the Danish mathematician Johan Jensen, relates the value of a convex function of an integral to the integral of the convex function.


## General form

The inequality can be stated quite generally using measure theory. It can also be stated equally generally in the language of probability theory. The two statements say exactly the same thing.

### In the language of measure theory

Let μ be a positive measure on a set Ω, such that μ(Ω) = 1. If g is a real-valued function that is integrable with respect to μ, and if φ is a convex function on the range of g, then

$$\varphi\left(\int_{\Omega} g\, d\mu\right) \le \int_\Omega \varphi \circ g\, d\mu.$$

### In the language of probability theory

In the terminology of probability theory, μ is a probability measure. The function g is replaced by a real-valued random variable X (just another name for the same thing, as long as the context remains one of "pure" mathematics). The integral of any function over the space Ω with respect to the probability measure μ becomes an expected value. The inequality then says that if φ is any convex function, then

$$\varphi\left(E(X)\right) \leq E(\varphi(X)).$$
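The probabilistic form is easy to check numerically. The following sketch (with hypothetical values and probabilities, not part of the original article) verifies φ(E(X)) ≤ E(φ(X)) for the convex function φ(x) = x²:

```python
# Numerical check of Jensen's inequality phi(E[X]) <= E[phi(X)]
# for the convex function phi(x) = x**2 and a simple discrete
# random variable X.

values = [1.0, 2.0, 5.0]   # possible values of X (hypothetical)
probs = [0.2, 0.5, 0.3]    # their probabilities; these sum to 1

phi = lambda x: x ** 2     # a convex function

e_x = sum(p * x for p, x in zip(probs, values))            # E[X]
e_phi_x = sum(p * phi(x) for p, x in zip(probs, values))   # E[phi(X)]

# phi(E[X]) = 7.29, E[phi(X)] = 9.7, so the inequality holds
assert phi(e_x) <= e_phi_x
```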

## Special cases

### Form involving a probability density function

Suppose Ω is a measurable subset of the real line and f(x) is a non-negative function such that

$$\int_{-\infty}^\infty f(x)\,dx = 1.$$

In probabilistic language, f is a probability density function.

Then Jensen's inequality becomes the following statement about convex integrals:

If g is any real-valued measurable function and φ is convex over the range of g, then

$$\varphi\left(\int_{-\infty}^\infty g(x)f(x)\, dx\right) \le \int_{-\infty}^\infty \varphi(g(x)) f(x)\, dx.$$

If g(x) = x, then this form of the inequality reduces to a commonly used special case:

$$\varphi\left(\int_{-\infty}^\infty x\, f(x)\, dx\right) \le \int_{-\infty}^\infty \varphi(x)\,f(x)\, dx.$$
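The density form can also be verified numerically. The sketch below (an illustration, not part of the original article) uses the exponential density f(x) = e⁻ˣ on [0, ∞), truncated at x = 50, and the convex function φ(x) = x²; the integrals are approximated with a midpoint rule:

```python
import math

# Numerical check of the density form
#   phi(integral of x f(x) dx) <= integral of phi(x) f(x) dx
# for the exponential density f(x) = exp(-x) on [0, inf),
# truncated at 50, and the convex function phi(x) = x**2.

f = lambda x: math.exp(-x)   # probability density
phi = lambda x: x ** 2       # convex function

dx = 0.001
xs = [i * dx + dx / 2 for i in range(int(50 / dx))]  # midpoint grid

mean = sum(x * f(x) * dx for x in xs)            # integral x f(x) dx, about 1
mean_phi = sum(phi(x) * f(x) * dx for x in xs)   # integral x^2 f(x) dx, about 2

assert phi(mean) <= mean_phi   # roughly 1 <= 2: Jensen's inequality holds
```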

### Finite form

If $\Omega$ is some finite set $\{x_1,x_2,\ldots,x_n\}$, and if $\mu$ is a probability measure on $\Omega$ assigning weight $\lambda_i$ to each point $x_i$, then the general form reduces to a statement about sums:

$$\varphi\left(\sum_{i=1}^{n} g(x_i)\lambda_i \right) \le \sum_{i=1}^{n} \varphi(g(x_i))\lambda_i,$$

provided that $\lambda_1 + \lambda_2 + \cdots + \lambda_n = 1$ and $\lambda_i \ge 0$.
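The finite form is directly computable. The following sketch (with hypothetical points, weights, and functions) checks it for the convex function φ(x) = eˣ:

```python
import math

# Numerical check of the finite form of Jensen's inequality:
#   phi(sum_i g(x_i) * lam_i) <= sum_i phi(g(x_i)) * lam_i
# for hypothetical points, nonnegative weights summing to 1,
# and the convex function phi(x) = exp(x).

xs = [0.0, 1.0, 2.0, 4.0]    # sample points (hypothetical)
lam = [0.1, 0.4, 0.3, 0.2]   # nonnegative weights, summing to 1
g = lambda x: x + 1.0        # an arbitrary real-valued function
phi = math.exp               # convex function

lhs = phi(sum(g(x) * l for x, l in zip(xs, lam)))   # phi of the weighted mean
rhs = sum(phi(g(x)) * l for x, l in zip(xs, lam))   # weighted mean of phi

assert lhs <= rhs   # Jensen's inequality for the finite form
```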

If φ is concave, each of the inequalities is simply reversed.

Suppose $x_1, x_2, \ldots, x_n$ are real and positive, $g(x)=x$, $\lambda_i = 1/n$ and $\varphi(x) = \log(x)$. The above sum becomes

$$\log\left(\sum_{i=1}^{n} \frac{x_i}{n} \right) \ge \sum_{i=1}^{n} \frac{\log(x_i)}{n}.$$

Exponentiating both sides gives the familiar AM-GM inequality:

$$\frac{x_1 + x_2 + \cdots + x_n}{n} \ge \sqrt[n]{x_1 x_2 \cdots x_n}.$$
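As a quick sanity check of the AM-GM inequality (a sketch with hypothetical values, not part of the original article):

```python
import math

# AM-GM check: the arithmetic mean of positive numbers
# is at least their geometric mean.

xs = [1.0, 4.0, 9.0, 16.0]   # positive numbers (hypothetical)

am = sum(xs) / len(xs)                 # arithmetic mean: 7.5
gm = math.prod(xs) ** (1 / len(xs))    # geometric mean: 576**0.25, about 4.9

assert am >= gm   # AM-GM inequality holds
```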

There is also an infinite discrete form.

### Statistical physics

Jensen's inequality is of particular importance in statistical physics when the convex function is an exponential:

$$e^{\langle X \rangle} \leq \left\langle e^X \right\rangle,$$

where angle brackets denote expected values with respect to some probability distribution in the random variable X. The proof in this case is very simple (cf. Chandler, Sec. 5.5); using the inequality

$$e^X \geq 1+X$$

in the last exponential of the identity

$$\left\langle e^X \right\rangle = e^{\langle X \rangle} \left\langle e^{X - \langle X \rangle} \right\rangle,$$

the desired inequality follows directly: since $\langle X - \langle X \rangle \rangle = 0$, the last factor satisfies $\langle e^{X - \langle X \rangle} \rangle \geq \langle 1 + X - \langle X \rangle \rangle = 1$.
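This exponential form can be illustrated with a short Monte Carlo sketch (hypothetical setup, not part of the original article), taking X to be a standard Gaussian, for which $\langle e^X \rangle = e^{1/2}$ while $e^{\langle X \rangle} \approx 1$:

```python
import math
import random

# Monte Carlo check of exp(<X>) <= <exp(X)> for a standard
# Gaussian random variable X, where <.> denotes an expected value.

random.seed(0)  # fixed seed for reproducibility
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]

mean = sum(samples) / len(samples)                            # <X>, about 0
mean_exp = sum(math.exp(x) for x in samples) / len(samples)   # <e^X>, about e**0.5

assert math.exp(mean) <= mean_exp   # Jensen's inequality holds
```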

## University logo

Jensen's inequality serves as the logo of the mathematics department of the University of Copenhagen.
