Applying Chernoff bound on normal distribution

In summary, the thread discusses finding a bound on the deviation of a normally distributed variable from its mean. A Chernoff-type (exponential Markov) bound is applied to the mean of n iid normally distributed variables, and the question is raised of how to calculate the expectation E[e^{s(m_n-\mu)^2}]. It is noted that s can be treated as a constant, and the possibility of using bounds on the tail probability of a normal distribution is also mentioned.
  • #1
phonic
Dear all,

I am trying to find a good bound on the deviation of a normally distributed variable from its mean.

The normally distributed variables [tex]X_t \sim N(\mu, \sigma^2), t = 1, 2, \ldots, n[/tex] are iid. Applying a Chernoff-type bound (Markov's inequality applied to the exponentiated squared deviation) to the mean of these n iid variables:

[tex]m_n = \frac{1}{n} \sum_{t=1}^n X_t[/tex]

[tex]P(m_n - \mu \geq \epsilon ) = \frac{1}{2} P(e^{s(m_n - \mu)^2} \geq e^{s\epsilon^2} ) \leq \frac{1}{2}e^{-s \epsilon^2} E[e^{s(m_n-\mu)^2}] [/tex]

(The factor 1/2 uses the symmetry of [itex]m_n - \mu[/itex] about zero, and the last step is Markov's inequality, valid for any s > 0.)

The question is how to calculate this expectation
[tex] E[e^{s(m_n-\mu)^2}] [/tex]

Can anybody give some hints? Thanks a lot!

Since [tex]m_n \sim N(\mu, \frac{\sigma^2}{n} ) [/tex],
[tex] E[(m_n-\mu)^2] = \frac{\sigma^2}{n} [/tex]. But
[tex] E[e^{s(m_n-\mu)^2}] [/tex] does not seem easy to compute.
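
For reference, a standard Gaussian-integral computation (the MGF of a scaled chi-square variable) gives this expectation in closed form, provided [itex]s < \frac{n}{2\sigma^2}[/itex]:

[tex] E\left[e^{s(m_n-\mu)^2}\right] = \left(1 - \frac{2 s \sigma^2}{n}\right)^{-1/2}, \qquad 0 < s < \frac{n}{2\sigma^2}. [/tex]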

Phonic
 
  • #2
Are you treating s as a constant? Can you? Isn't it a random variable, e.g. [itex]s = s_n[/itex]?
 
  • #3
s can be treated as a constant, since the Markov inequality holds for any fixed s > 0.

Are there bounds on the tail probability of a normal distribution?
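
For reference, two standard bounds on the tail of [itex]Z \sim N(0,1)[/itex], for x > 0, are

[tex] P(Z \geq x) \leq \frac{1}{x\sqrt{2\pi}}\, e^{-x^2/2} \qquad \text{and} \qquad P(Z \geq x) \leq e^{-x^2/2}, [/tex]

the second being exactly the optimized Chernoff bound. Applied to [itex]m_n \sim N(\mu, \sigma^2/n)[/itex], the second gives [itex]P(m_n - \mu \geq \epsilon) \leq e^{-n\epsilon^2/(2\sigma^2)}[/itex].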
 
  • #4
But in [itex]P(m_n - \mu \geq \epsilon ) = \frac{1}{2} P(e^{s(m_n - \mu)^2} \geq e^{s\epsilon^2} ) \leq \frac{1}{2}e^{-s \epsilon^2} E[e^{s(m_n-\mu)^2}] [/itex], you have moved s out of the E[ ] in [itex]e^{-s \epsilon^2} [/itex] but left it inside the E[ ] in [itex]E[e^{s(m_n-\mu)^2}][/itex]. Is that legitimate? More to the point, can you also take it out of the latter, and if you can, would that make the job easier?
 

FAQ: Applying Chernoff bound on normal distribution

What is the Chernoff bound on normal distribution?

The Chernoff bound for the normal distribution is a technique used to bound the probability that the sum (or mean) of a number of independent random variables deviates significantly from its expected value. It is a tail inequality: it provides an upper bound on the probability of a large deviation.
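
In its general form, for a random variable X whose moment-generating function [itex]E[e^{sX}][/itex] is finite, the bound reads

[tex] P(X \geq a) \leq \inf_{s > 0} e^{-sa}\, E\left[e^{sX}\right], [/tex]

which follows from Markov's inequality applied to the nonnegative variable [itex]e^{sX}[/itex].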

How is the Chernoff bound calculated for normal distribution?

The Chernoff bound is calculated using the moment-generating function (MGF) of the random variable, which (when finite near zero) characterizes its probability distribution. Markov's inequality applied to [itex]e^{sX}[/itex] gives an upper bound on the tail probability for every s > 0, and the tightest version of the bound is obtained by minimizing this expression over s, typically by setting its derivative with respect to s to zero.
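
As a concrete worked example, for [itex]X \sim N(\mu, \sigma^2)[/itex] the MGF of [itex]X - \mu[/itex] is [itex]e^{s^2\sigma^2/2}[/itex], and minimizing over s gives

[tex] P(X - \mu \geq \epsilon) \leq \min_{s > 0} e^{-s\epsilon + s^2\sigma^2/2} = e^{-\epsilon^2/(2\sigma^2)}, [/tex]

with the minimum attained at [itex]s = \epsilon/\sigma^2[/itex].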

What is the significance of applying Chernoff bound on normal distribution?

Applying the Chernoff bound on normal distribution has several practical applications in statistics and machine learning. It allows for the quantification of the probability of large deviations, which is useful in decision making and risk management. It is also used in the analysis of algorithms and in the design of efficient data structures.

Can the Chernoff bound be applied to any type of distribution?

The Chernoff bound is not limited to the normal distribution: it applies to any random variable whose moment-generating function is finite for some s > 0. Related techniques, such as Hoeffding's inequality and Bernstein's inequality, give similar exponential tail bounds under different assumptions (for example, bounded random variables), while Chebyshev's inequality requires only a finite variance.

How does the sample size affect the accuracy of the Chernoff bound?

The sample size enters the bound directly. For the mean of n iid [itex]N(\mu, \sigma^2)[/itex] variables, the optimized bound is [itex]P(m_n - \mu \geq \epsilon) \leq e^{-n\epsilon^2/(2\sigma^2)}[/itex], which decays exponentially in n. Larger samples therefore give exponentially smaller, tighter bounds on the probability of a large deviation, as the sketch below illustrates.
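
A minimal numerical sketch, assuming for illustration a unit variance ([itex]\sigma = 1[/itex]) and a fixed deviation [itex]\epsilon = 0.5[/itex] (values not taken from the thread), compares the bound [itex]e^{-n\epsilon^2/(2\sigma^2)}[/itex] with the exact tail probability of the sample mean:

[code]
import math

sigma = 1.0   # assumed standard deviation (illustrative)
eps = 0.5     # assumed deviation threshold (illustrative)

for n in (1, 10, 50, 100):
    # Chernoff bound for the sample mean of n iid N(mu, sigma^2) variables:
    # P(m_n - mu >= eps) <= exp(-n * eps^2 / (2 * sigma^2))
    chernoff = math.exp(-n * eps**2 / (2 * sigma**2))
    # Exact tail probability, using m_n - mu ~ N(0, sigma^2 / n):
    exact = 0.5 * math.erfc(eps * math.sqrt(n) / (sigma * math.sqrt(2)))
    print(f"n={n:4d}  Chernoff bound={chernoff:.3e}  exact tail={exact:.3e}")
[/code]

Both quantities shrink rapidly with n, and the bound stays above the exact tail probability, as it must.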
