Prove Var[X] ≥ 0 via Jensen's Inequality: E[X^2] ≥ (E[X])^2

  • Thread starter jetoso
In summary, Jensen's Inequality is a fundamental theorem stating that for a convex function, the function of the average of a set of values is less than or equal to the average of the function of those values. It has applications in various fields and helps in proving other theorems. Var[X] and E[X] refer to the variance and expected value of a random variable. Applying the inequality to the convex function g(t) = t^2 gives E[X^2] ≥ (E[X])^2, which is exactly Var[X] ≥ 0. While the inequality always holds for convex functions, it is reversed for concave functions.
  • #1
jetoso
The variance can be written as Var[X]=E[X^2]-(E[X])^2. Use this form to prove that the Var[X] is always non-negative, i.e., show that E[X^2]>=(E[X])^2.
Use Jensen's Inequality.

Any suggestions? So far I have tried to show that if a function g(t) is continuous and twice differentiable with g''(t) > 0, then it must be convex.
Then, I am stuck with the proof.
 
  • #2
I am not sure how to use Jensen's inequality. However, by using the relationship

E((X-E(X))^2) = E(X^2) - (E(X))^2,

the result is immediate: the left-hand side is the expectation of a non-negative quantity, so E(X^2) - (E(X))^2 ≥ 0.
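As a quick numeric sanity check (not part of the proof), the two sides of this identity can be compared on simulated data; the sample size and distribution below are arbitrary choices for illustration:

```python
import random

# Compare the two sides of the identity E((X - E(X))^2) = E(X^2) - (E(X))^2
# on the empirical distribution of a random sample.
random.seed(0)
xs = [random.gauss(2.0, 3.0) for _ in range(10_000)]

mean = sum(xs) / len(xs)
lhs = sum((x - mean) ** 2 for x in xs) / len(xs)    # E((X - E(X))^2)
rhs = sum(x * x for x in xs) / len(xs) - mean ** 2  # E(X^2) - (E(X))^2

assert abs(lhs - rhs) < 1e-6  # equal up to floating-point rounding
assert lhs >= 0               # a mean of squares is non-negative
```

Since the left-hand side is an average of squares, it can never be negative, which is the whole argument in miniature.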
 
  • #3


Sure, I can help with the proof. Let g(t) = t^2. This function is continuous and twice differentiable, with

g''(t) = 2 > 0 for all t,

so g is convex (a twice-differentiable function with non-negative second derivative is convex).

Now apply Jensen's Inequality, which states that for any convex function g and any random variable X with finite mean, E[g(X)] ≥ g(E[X]). With g(t) = t^2 this gives:

E[X^2] ≥ (E[X])^2

Substituting into the given form of the variance:

Var[X] = E[X^2] - (E[X])^2 ≥ 0

To summarize, we used the convexity of g(t) = t^2 together with Jensen's Inequality to obtain E[X^2] ≥ (E[X])^2, which is exactly the statement that Var[X] is non-negative.
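The Jensen step above, E[g(X)] ≥ g(E[X]) for g(t) = t^2, is easy to illustrate numerically; the distributions and sample sizes below are arbitrary choices:

```python
import random

# Numeric illustration (not a proof) of Jensen's inequality for g(t) = t^2:
# the empirical mean of squares always dominates the square of the empirical mean.
random.seed(1)
for _ in range(5):
    xs = [random.uniform(-10, 10) for _ in range(1000)]
    e_x = sum(xs) / len(xs)                  # E[X]
    e_x2 = sum(x * x for x in xs) / len(xs)  # E[X^2] = E[g(X)]
    assert e_x2 >= e_x ** 2                  # Jensen: E[g(X)] >= g(E[X])
```

The gap e_x2 - e_x ** 2 is the (biased) sample variance, so the assertion holding with margin is another view of Var[X] ≥ 0.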
 

FAQ: Prove Var[X] ≥ 0 via Jensen's Inequality: E[X^2] ≥ (E[X])^2

What is Jensen's Inequality?

Jensen's Inequality is a fundamental theorem in mathematics that relates the expectation of a convex function to the convex function of an expectation. In other words, it states that the convex function of the average of a set of values is always less than or equal to the average of the convex function of those values.

What is the significance of Jensen's Inequality?

Jensen's Inequality has many applications in various fields such as economics, statistics, and physics. It helps in proving other important theorems and has practical implications in decision-making processes.

What is Var[X] and E[X] in the context of Jensen's Inequality?

Var[X] refers to the variance of a random variable X, which measures the spread of its values around the mean. E[X] is the expected value or mean of the random variable X, which represents the average value of the variable over a large number of trials.

How is Jensen's Inequality proven?

Jensen's Inequality can be proven using the concept of convexity, which states that the line connecting any two points on a convex function always lies above the graph of the function. By applying this concept to the expectation of a convex function, we can prove the inequality.
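The chord condition described above can be checked directly for the convex function g(t) = t^2 used in this thread; the sample points and weights below are arbitrary:

```python
# Check the chord (convexity) condition for g(t) = t^2:
# for any a, b and any weight w in [0, 1],
#     g(w*a + (1-w)*b) <= w*g(a) + (1-w)*g(b),
# i.e. the graph lies below the chord joining (a, g(a)) and (b, g(b)).
def g(t):
    return t * t

for a, b in [(-3.0, 5.0), (0.0, 1.0), (-2.5, -0.5)]:
    for i in range(11):
        w = i / 10
        point_on_graph = g(w * a + (1 - w) * b)
        point_on_chord = w * g(a) + (1 - w) * g(b)
        assert point_on_graph <= point_on_chord + 1e-12  # tolerance for rounding
```

Taking the weights to be probabilities and passing to expectations is precisely the step that turns this finite chord condition into E[g(X)] ≥ g(E[X]).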

Is Jensen's Inequality always true?

Yes, Jensen's Inequality holds for any convex function and any random variable with finite expectation. For concave functions the inequality is reversed: the concave function of the average is always greater than or equal to the average of the concave function.
