WMDhamnekar
- TL;DR Summary
- Let ##m(t) = E[X^t]##. The moment bound states that for ##a > 0##, ##P\{ X \geq a \} \leq m(t)\,a^{-t}## for all ##t > 0##. How would you prove this result using the importance sampling identity?
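As a quick numerical sanity check of the bound above (a sketch only; the choice ##X \sim \text{Exp}(1)##, for which ##m(t) = E[X^t] = \Gamma(t+1)## and ##P\{X \geq a\} = e^{-a}##, is just an illustrative assumption):

```python
import math

# Check P{X >= a} <= m(t) * a**(-t) for X ~ Exp(1), where
# m(t) = E[X^t] = Gamma(t + 1) and P{X >= a} = exp(-a).
a = 3.0
tail = math.exp(-a)  # exact tail probability for Exp(1)

for t in (0.5, 1.0, 2.0, 3.0, 5.0):
    bound = math.gamma(t + 1.0) * a ** (-t)  # m(t) * a^{-t}
    print(f"t = {t}: bound = {bound:.4f}, tail = {tail:.4f}, holds = {tail <= bound}")
```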
Let ##X## be a non-negative random variable and let ##a > 0##. We want to bound the probability ##P\{X \geq a\}## in terms of the moments of ##X##.
- Define a function ##h(x) = \mathbb{1}\{x \geq a\}##, where ##\mathbb{1}\{\cdot\}## is the indicator function that returns 1 if its argument is true and 0 otherwise. Then we have ##P\{X \geq a\} = E_{f_X}[h(X)]##, where ##E_{f_X}## denotes the expected value with respect to the pdf ##f_X## of ##X##.
- Choose another random variable Y with probability density function (pdf) ##f_Y(y)## such that ##f_Y(y) > 0## whenever ##f_X(y) > 0##, where ##f_X(x)## is the pdf of X. This is called the importance distribution. Define the importance weight as ##w(x) = f_X(x)/f_Y(x)##.
- Apply the importance sampling identity to write ##E_{f_X}[h(X)] = E_{f_Y}\left[h(Y)\,w(Y)\right]##, where the expectation on the right-hand side is taken with respect to the pdf ##f_Y## of ##Y##. (A small numerical illustration of this identity is sketched just below.)
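To make the three steps above concrete, here is a minimal Monte Carlo sketch of the identity (the particular choices ##X \sim \text{Exp}(1)## and an ##\text{Exp}(1/a)## importance density are assumptions made only for illustration):

```python
import math
import random

random.seed(0)

a = 3.0          # threshold in P{X >= a}
n = 200_000      # number of importance samples

# Target density: X ~ Exp(1), so f_X(x) = exp(-x) and P{X >= a} = exp(-a).
# Importance density: Y ~ Exp(1/a), i.e. f_Y(y) = (1/a) * exp(-y/a),
# which places far more mass on the event {y >= a} than f_X does.
rate_y = 1.0 / a

def f_X(x):
    return math.exp(-x)

def f_Y(y):
    return rate_y * math.exp(-rate_y * y)

# Importance sampling identity: E_{f_X}[h(X)] = E_{f_Y}[h(Y) * w(Y)]
# with h(y) = 1{y >= a} and w(y) = f_X(y) / f_Y(y).
total = 0.0
for _ in range(n):
    y = random.expovariate(rate_y)   # draw Y from the importance density
    if y >= a:                       # h(Y) = 1{Y >= a}
        total += f_X(y) / f_Y(y)     # importance weight w(Y)

print(f"importance sampling estimate: {total / n:.5f}")
print(f"exact P(X >= a) = exp(-a):    {math.exp(-a):.5f}")
```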
Now, how do I proceed further? Can we use Jensen's inequality here?
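For what it's worth, here is one possible way the argument could be finished (a sketch only, and not necessarily the intended route): instead of Jensen's inequality, take the tilted density ##f_Y(x) = x^t f_X(x)/m(t)## as the importance density (this requires ##m(t) < \infty##, and it satisfies ##f_Y(x) > 0## wherever ##f_X(x) > 0## and ##x > 0##, so the identity applies on ##\{x \geq a\}##). Then ##w(x) = f_X(x)/f_Y(x) = m(t)/x^t##, and

$$P\{X \geq a\} = E_{f_Y}\!\left[\mathbb{1}\{Y \geq a\}\,w(Y)\right] = m(t)\,E_{f_Y}\!\left[\frac{\mathbb{1}\{Y \geq a\}}{Y^t}\right] \leq m(t)\,a^{-t}\,E_{f_Y}\!\left[\mathbb{1}\{Y \geq a\}\right] \leq m(t)\,a^{-t},$$

where the first inequality uses ##Y^{-t} \leq a^{-t}## on the event ##\{Y \geq a\}## (since ##t > 0##) and the second uses ##E_{f_Y}[\mathbb{1}\{Y \geq a\}] \leq 1##.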