Proving Bounds on Moments of a Random Variable

In summary: for any random variable X, P{X ≥ 0} ≤ inf{ phi(t) : t ≥ 0 } ≤ 1, where phi(t) = E[exp(tX)].
  • #1
silentone

Homework Statement


For any random variable X, prove that
P{X ≥ 0} ≤ inf[ E[ phi(t) : t ≥ 0 ] ≤ 1

where phi(t) = E[exp(tX)], 0 < phi(t) ≤ ∞


Homework Equations





The Attempt at a Solution


I am not sure how to begin this. Any hints to get started would be greatly appreciated.
 
  • #2
silentone said:

Homework Statement


For any random variable X, prove that
P{X ≥ 0} ≤ inf[ E[ phi(t) : t ≥ 0 ] ≤ 1
E[ phi(t) ] doesn't mean anything: E[·] requires a random variable, whereas phi(t) is just an ordinary function. So I guess you mean
P{X ≥ 0} ≤ inf[ phi(t) : t ≥ 0 ] ≤ 1

Write out phi(t) = E[exp(tX)] as an integral and consider the positive and negative ranges of X separately.
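For intuition (not a proof), here is a quick numerical sketch of the inequality. The choice X ~ Normal(-1, 1) and the grid search over t are illustrative assumptions only; for a normal variable phi(t) has the closed form exp(mu·t + sigma²t²/2), and P{X ≥ 0} can be written with the error function.

```python
import math

# Illustrative assumption: X ~ Normal(mu, sigma^2) with mu = -1, sigma = 1,
# so phi(t) = E[exp(tX)] = exp(mu*t + sigma^2 * t^2 / 2) in closed form.
mu, sigma = -1.0, 1.0

def phi(t):
    return math.exp(mu * t + 0.5 * sigma**2 * t**2)

# P{X >= 0} for a normal variable, via the error function
p_nonneg = 0.5 * (1 - math.erf((0 - mu) / (sigma * math.sqrt(2))))

# Crude grid search for inf{phi(t) : t >= 0}; since phi(0) = 1,
# the infimum is automatically <= 1.
ts = [i * 0.001 for i in range(0, 5001)]
inf_phi = min(phi(t) for t in ts)

print(p_nonneg, inf_phi)  # the claim: p_nonneg <= inf_phi <= 1
assert p_nonneg <= inf_phi <= 1.0
```

Here the minimizer is t = 1, giving inf phi = e^(−1/2) ≈ 0.607, while P{X ≥ 0} = 1 − Φ(1) ≈ 0.159, so the bound holds with room to spare.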
 
  • #3
silentone said:

Homework Statement


For any random variable X, prove that
P{X ≥ 0} ≤ inf[ E[ phi(t) : t ≥ 0 ] ≤ 1

where phi(t) = E[exp(tX)], 0 < phi(t) ≤ ∞


Homework Equations





The Attempt at a Solution


I am not sure how to begin this. Any hints to get started would be greatly appreciated.

Broad hint: P{X ≥ 0} = E[H(X)], where H(x) = 0 for x < 0 and H(x) = 1 for x ≥ 0. So, you are really comparing expectations of two different functions of X.
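The key pointwise fact behind this hint is that exp(tx) ≥ H(x) for every x when t ≥ 0, so taking expectations gives P{X ≥ 0} ≤ phi(t). A Monte Carlo sketch can make this concrete; the distribution Normal(-1, 1) and the fixed t = 1 below are arbitrary illustrative choices, not part of the problem.

```python
import math
import random

random.seed(0)

def H(x):
    """Indicator of {x >= 0}."""
    return 1.0 if x >= 0 else 0.0

# Illustrative assumption: sample X ~ Normal(-1, 1)
xs = [random.gauss(-1.0, 1.0) for _ in range(100_000)]
t = 1.0  # any fixed t >= 0 works

# Pointwise domination: exp(t*x) >= 1 = H(x) when x >= 0,
# and exp(t*x) > 0 = H(x) when x < 0.
assert all(math.exp(t * x) >= H(x) for x in xs)

mean_H = sum(H(x) for x in xs) / len(xs)               # ~ P{X >= 0}
mean_exp = sum(math.exp(t * x) for x in xs) / len(xs)  # ~ phi(t)
print(mean_H, mean_exp)  # mean_H should be the smaller of the two
```

Since the domination holds sample-by-sample, the averages inherit the same ordering, which is exactly the comparison of expectations the hint describes.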

RGV
 

FAQ: Proving Bounds on Moments of a Random Variable

What is a random variable?

A random variable is a variable whose possible values are outcomes of a random phenomenon. It can take on different values based on the outcome of a particular event or experiment.

What are moments of a random variable?

Moments of a random variable are numerical measures that describe the shape of its probability distribution. The first moment is the mean; the second central moment is the variance; and the standardized third and fourth moments give the skewness and kurtosis.
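As a small sketch of these definitions, the four quantities can be computed directly from a sample with only the standard library; the data below is an arbitrary illustrative sample.

```python
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # illustrative sample

mean = statistics.fmean(data)                 # first moment
var = statistics.pvariance(data, mu=mean)     # second central moment
sd = var ** 0.5

# Standardized third and fourth moments (population skewness and kurtosis)
skew = sum(((x - mean) / sd) ** 3 for x in data) / len(data)
kurt = sum(((x - mean) / sd) ** 4 for x in data) / len(data)

print(mean, var, skew, kurt)
```

For this sample the mean is 5, the variance 4, and the positive skewness reflects the longer right tail.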

Why is proving bounds on moments important?

Proving bounds on moments of a random variable is important because it allows us to make inferences about the behavior and characteristics of the variable. By determining the upper and lower limits of the moments, we can better understand the underlying probability distribution and make predictions about future outcomes.

What methods are used to prove bounds on moments?

There are various methods used to prove bounds on moments, including Chebyshev's inequality, Markov's inequality, and Jensen's inequality. These methods use different mathematical approaches to determine the upper and lower limits of the moments of a random variable.
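For example, Markov's inequality states that P{X ≥ a} ≤ E[X]/a for any nonnegative X and a > 0. A Monte Carlo check is easy to sketch; the Exponential(1) distribution and the threshold a = 3 below are illustrative assumptions.

```python
import random

random.seed(1)

# Illustrative assumption: X ~ Exponential(rate = 1), so E[X] = 1
# and P{X >= a} = exp(-a).
xs = [random.expovariate(1.0) for _ in range(200_000)]

mean = sum(xs) / len(xs)                     # ~ E[X] = 1
a = 3.0
p_tail = sum(x >= a for x in xs) / len(xs)   # ~ exp(-3), about 0.05

print(p_tail, mean / a)  # Markov: p_tail <= mean / a
assert p_tail <= mean / a
```

Here the true tail probability e^(−3) ≈ 0.05 sits well below the Markov bound of about 1/3, which is typical: these inequalities trade sharpness for complete generality.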

How can proving bounds on moments be applied in real-world scenarios?

Proving bounds on moments of a random variable can be applied in various fields such as finance, economics, and engineering. It can help in risk assessment, portfolio management, and decision-making processes by providing insights into the potential range of outcomes for a given variable. It can also aid in evaluating the performance of models and predicting future outcomes.
