Expected value of a function of a random variable

In summary, using indicator random variables allows for a quick solution to the problem of proving Markov's inequality.
  • #1
AllRelative

Homework Statement


Let X be a random variable. It is not specified whether it is continuous or discrete. Let g(x) be always positive and strictly increasing. Deduce this inequality:
$$P(X\geqslant x) \leqslant \frac{Eg(X)}{g(x)} \: $$
where x is real.

Homework Equations


I know that if X is discrete,
$$E(g(X)) = \sum_{i=1}^{\infty} g(x_i)P(X = x_i)$$

And if X is continuous,
$$E(g(X)) = \int_{-\infty}^{\infty} g(x)f(x)\, dx$$
where f is the density of X.

The Attempt at a Solution


Is there a way to answer the question without proving the two cases (continuous and discrete) separately? Thanks!
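As a numerical sanity check on the two expectation formulas above, here is a short sketch. The choices of ##g(x) = e^x##, a fair die for the discrete case, and a standard normal for the continuous case are all illustrative assumptions, not part of the problem.

```python
# Sanity check of the two expectation formulas.
# Assumptions (illustrative): g(x) = e^x; discrete X uniform on {1,...,6};
# continuous X ~ N(0, 1), truncated to [-8, 8] for the numerical integral.
import math

def g(x):
    return math.exp(x)  # positive and strictly increasing

# Discrete case: E[g(X)] = sum_i g(x_i) P(X = x_i)
discrete_E = sum(g(x) * (1 / 6) for x in range(1, 7))

# Continuous case: E[g(X)] = integral of g(x) f(x) dx, with f the N(0,1)
# density, approximated by a Riemann sum on [-8, 8]
def f(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

n = 200_000
dx = 16 / n
continuous_E = sum(g(-8 + k * dx) * f(-8 + k * dx) for k in range(n)) * dx

print(discrete_E)                   # (e + e^2 + ... + e^6) / 6 ≈ 106.105
print(continuous_E, math.exp(0.5))  # both ≈ 1.6487, since E[e^X] = e^{1/2} for N(0,1)
```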
 
  • #2
AllRelative said:

The Attempt at a Solution


Is there a way to answer the question without proving the two cases (continuous and discrete) separately? Thanks!

Yes -- use indicator random variables and recognize that your problem is actually asking for a standard proof of Markov's Inequality which takes just one or two lines.

Note: if you want something more concrete, let ##g## be the exponential function.
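To make that hint concrete: with ##g(x) = e^x## the inequality reads ##P(X \geq x) \leq E[e^X]/e^x##, a Chernoff-style bound. A quick numerical sketch, with the Binomial(20, 1/2) distribution for X being an illustrative assumption:

```python
# Markov's inequality with g(x) = e^x: P(X >= x) <= E[e^X] / e^x.
# Assumption (illustrative): X ~ Binomial(20, 1/2), computed exactly from its pmf.
import math

n, p = 20, 0.5
pmf = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

E_exp = sum(math.exp(k) * pmf[k] for k in range(n + 1))  # E[e^X]

x = 15
tail = sum(pmf[x:])          # exact P(X >= 15)
bound = E_exp / math.exp(x)  # the g = exp Markov bound

print(tail, bound)           # tail ≈ 0.0207, bound ≈ 0.0744: valid, though loose
```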
 
  • #3
StoneTemplePython said:
Yes -- use indicator random variables and recognize that your problem is actually asking for a standard proof of Markov's Inequality which takes just one or two lines.

Note: if you want something more concrete, let ##g## be the exponential function.
I just read up on Markov's inequality. I see that it is the same problem except for the function. I'm just unsure what to do with the function g.

$$g(x) P(X)\geq E(g(I_{X\geq x}))$$

How does an indicator function behave inside another function? That's what confuses me. If g weren't there I could finish the proof...
 
  • #4
AllRelative said:
I just read up on Markov's inequality. I see that it is the same problem except for the function. I'm just unsure what to do with the function g.

$$g(x) P(X)\geq E(g(I_{X\geq x}))$$

How does an indicator function behave inside another function? That's what confuses me. If g weren't there I could finish the proof...

Some of the things look backwards here?
- - - -
Suggestion: break it into two parts.

(part 1) Let the random variable ##Y := g(X)##. Now prove Markov's inequality for ##Y##.

(part 2) After you have done the above, reason through how you can relate ##Y## and ##X##. I.e., if I say an experiment occurs in the sample space and I know the outcome is ##Y(\omega) \geq g(c)##, that immediately tells you something of interest about ##X(\omega)##. Again, de-abstracting this and having ##g## be the exponential function may be useful for now...
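The two parts above can be sketched numerically. The choices ##g = \exp## and X ~ N(0, 1), and the Monte Carlo setup, are illustrative assumptions:

```python
# Two-part sketch: (1) Markov for Y := g(X); (2) since g is strictly
# increasing, {X >= c} and {g(X) >= g(c)} are the same event.
# Assumptions (illustrative): g = exp, X ~ N(0, 1), Monte Carlo with a fixed seed.
import math
import random

random.seed(0)
g = math.exp
samples = [random.gauss(0, 1) for _ in range(200_000)]

c = 1.5
E_Y = sum(g(s) for s in samples) / len(samples)          # estimates E[Y] = E[g(X)]

p_X = sum(s >= c for s in samples) / len(samples)        # P(X >= c)
p_Y = sum(g(s) >= g(c) for s in samples) / len(samples)  # P(g(X) >= g(c))

assert p_X == p_Y       # identical events, identical empirical frequencies
print(p_X, E_Y / g(c))  # tail estimate vs. the Markov bound E[g(X)]/g(c)
```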
 
  • #5
Oh right... Thanks man!
 
  • #6
AllRelative said:

Homework Statement


Let X be a random variable. It is not specified whether it is continuous or discrete. Let g(x) be always positive and strictly increasing. Deduce this inequality:
$$P(X\geqslant x) \leqslant \frac{Eg(X)}{g(x)} \: $$
where x is real.

Homework Equations


I know that if X is discrete,
$$E(g(X)) = \sum_{i=1}^{\infty} g(x_i)P(X = x_i)$$

And if X is continuous,
$$E(g(X)) = \int_{-\infty}^{\infty} g(x)f(x)\, dx$$
where f is the density of X.

The Attempt at a Solution


Is there a way to answer the question without proving the two cases (continuous and discrete) separately? Thanks!

Now that you have done the question, I can show you my favorite quick way of doing it. For ##a > 0## let ##v_a(x) = g(x)/g(a)## and let
$$u_a(x) = 1\{ x \geq a \} = \begin{cases} 0 & \text{if} \; x < a \\
1 & \text{if} \; x \geq a
\end{cases}$$
For all real ##x## we have ##0 \leq u_a(x) \leq v_a(x)##, with ##u_a(a) = v_a(a) = 1.## Thus
$$P(X \geq a) = E u_a(X) \leq E v_a(X) = E g(X)/g(a).$$

This just uses elementary properties of expectation, and works the same way whether ##X## is discrete, continuous, or mixed discrete-continuous.
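The pointwise dominance ##0 \leq u_a(x) \leq v_a(x)## can be checked numerically on a grid; the choices ##g = \exp## and ##a = 1## below are illustrative assumptions:

```python
# Check the pointwise dominance 0 <= u_a(x) <= v_a(x) on a grid.
# Assumptions (illustrative): g = exp and a = 1.
import math

g = math.exp
a = 1.0

def u(x):
    return 1.0 if x >= a else 0.0   # indicator 1{x >= a}

def v(x):
    return g(x) / g(a)              # g(x)/g(a): positive, and >= 1 once x >= a

grid = [-5 + 10 * k / 1000 for k in range(1001)]
assert all(0 <= u(x) <= v(x) for x in grid)
assert u(a) == v(a) == 1.0          # the two curves touch at x = a
```

Taking expectations then preserves the inequality, which is the whole proof.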

Here is a drawing that shows the situation.
 

Attachments

  • Markov_inequal.pdf

FAQ: Expected value of a function of a random variable

What is the definition of expected value of a function of a random variable?

The expected value of a function of a random variable is the average value of the function over all possible outcomes of the random variable, weighted by their respective probabilities. It represents the long-term average value that would be obtained if the experiment were repeated many times.

How is the expected value of a function of a random variable calculated?

The expected value of a function of a random variable is calculated by multiplying each possible outcome of the random variable by its respective probability, and then summing these products together. This can be written as E[f(X)] = ∑ f(x)*P(X=x), where f(x) is the function and P(X=x) is the probability of the random variable taking on the value x.
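A worked instance of this formula, using a fair six-sided die and f(x) = x² as illustrative choices:

```python
# E[f(X)] = sum over x of f(x) * P(X = x), for a fair six-sided die
# with f(x) = x**2 (illustrative choices).
pmf = {x: 1 / 6 for x in range(1, 7)}   # P(X = x)

def f(x):
    return x ** 2

E_fX = sum(f(x) * p for x, p in pmf.items())
print(E_fX)  # (1 + 4 + 9 + 16 + 25 + 36) / 6 = 91/6 ≈ 15.1667
```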

What is the significance of the expected value of a function of a random variable?

The expected value of a function of a random variable has several important applications in statistics and decision making. It is used to measure the central tendency of a distribution, to compare the performance of different strategies or decisions, and to make predictions about future outcomes.

Can the expected value of a function of a random variable be negative?

Yes, the expected value of a function of a random variable can be negative. This can occur if the function itself takes on negative values, or if the probabilities of the outcomes are weighted in a way that results in a negative expected value.

How does the expected value of a function of a random variable change with different probability distributions?

The expected value of a function of a random variable is affected by the shape and parameters of the probability distribution of the random variable. For example, if the distribution is skewed, the expected value may be different from the median or mode. Additionally, changing the parameters of the distribution, such as the mean or variance, can also impact the expected value of the function.
