Taylor approximation (probability)

  • #1
Pere Callahan
I have the following problem: Assume g is a (smooth enough) function, X a random variable, and [itex]\varepsilon^h[/itex] a family of random variables whose moments converge to 0 as h goes to zero.

I would then like to prove that
[tex]
\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|
[/tex]

also converges to zero as h goes to 0, if possible at the same rate as the first moment of [itex]\varepsilon^h[/itex].

I tried using Taylor's theorem which states that
[tex]
g(X+\varepsilon^h)-g(X) = g'(X)\varepsilon^h + R(X,\varepsilon^h),
[/tex]

where the absolute value of the remainder satisfies [itex]\left|R(X,\varepsilon^h)\right|\leq C(X)|\varepsilon^h|^2[/itex]. Using this and the Cauchy-Schwarz inequality I could show that
[tex]
\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|\leq\sqrt{\mathbb{E}\left[g'(X)^2\right]}\sqrt{\mathbb{E}\left[(\varepsilon^h)^2\right]} + \sqrt{\mathbb{E}\left[C(X)^2\right]}\sqrt{\mathbb{E}\left[(\varepsilon^h)^4\right]}.
[/tex]
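(Here I applied Cauchy-Schwarz to each term separately, e.g. for the first term

[tex]
\mathbb{E}\left|g'(X)\varepsilon^h\right| \leq \sqrt{\mathbb{E}\left[g'(X)^2\right]}\,\sqrt{\mathbb{E}\left[(\varepsilon^h)^2\right]},
[/tex]

and likewise for the remainder term with [itex]C(X)[/itex] and [itex]|\varepsilon^h|^2[/itex].)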

For some reason, however, I'm not quite sure that's correct. :smile: In particular, I don't know how to argue that C(X) should have a finite second moment. Maybe I must just assume that.

Anyway, I'd appreciate any input on how to bound the expected value of the increment in terms of the moments of [itex]\varepsilon^h[/itex]. Thanks,

Pere

EDIT:

I think what I did is actually not correct, because C(X) might also depend on the value of [itex]\varepsilon^h[/itex]... Intuitively, however, I find it quite plausible that [itex]\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|[/itex] should not go to zero more slowly than [itex]\mathbb{E}|\varepsilon^h|[/itex]. Thanks again for any input!
 
  • #2
Dear Pere,

Thank you for sharing your problem on the forum. Your approach using Taylor's theorem is a good start, but as you noted in your edit, the constant in the remainder may depend on [itex]\varepsilon^h[/itex]. To prove that [itex]\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|[/itex] converges to zero, we need to bound the expected increment by the moments of [itex]\varepsilon^h[/itex].
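In fact, if one is willing to assume that [itex]g''[/itex] is bounded, that dependence disappears: the Lagrange form of the Taylor remainder gives

[tex]
R(X,\varepsilon^h) = \tfrac{1}{2}\,g''(\xi)\,(\varepsilon^h)^2 \quad\text{for some } \xi \text{ between } X \text{ and } X+\varepsilon^h,
[/tex]

so [itex]C = \tfrac{1}{2}\sup_x |g''(x)|[/itex] can be taken as a deterministic constant, independent of both X and [itex]\varepsilon^h[/itex], and your original estimate goes through.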

Alternatively, an approach that avoids second-derivative bounds is the dominated convergence theorem. In the form we need, it states: if a sequence of random variables converges in probability (or almost surely) to a limit, and the absolute values of all its terms are bounded by a single integrable random variable, then the expectations converge to the expectation of the limit.
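Symbolically, with Y denoting the dominating random variable:

[tex]
X_n \xrightarrow{\ \mathbb{P}\ } X, \qquad |X_n| \leq Y \ \text{ for all } n, \qquad \mathbb{E}[Y] < \infty \quad\Longrightarrow\quad \mathbb{E}[X_n] \to \mathbb{E}[X].
[/tex]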

In your case, the sequence [itex]g(X+\varepsilon^h)-g(X)[/itex] converges in probability to zero: since the first moment of [itex]\varepsilon^h[/itex] tends to zero, Markov's inequality shows that [itex]\varepsilon^h[/itex] tends to zero in probability, and the continuity of g then carries this over to the increment.
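Concretely, for any fixed [itex]\delta > 0[/itex],

[tex]
\mathbb{P}\left(\left|\varepsilon^h\right| > \delta\right) \leq \frac{\mathbb{E}\left|\varepsilon^h\right|}{\delta} \longrightarrow 0 \quad\text{as } h \to 0.
[/tex]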

To apply the dominated convergence theorem, we need a dominating function for [itex]\left|g(X+\varepsilon^h)-g(X)\right|[/itex]. One convenient (and admittedly stronger) assumption is that g is Lipschitz continuous: there exists a constant L such that [itex]|g(x)-g(y)|\leq L|x-y|[/itex] for all x and y. The increment is then dominated directly:

[tex]
\left|g(X+\varepsilon^h)-g(X)\right| \leq L\left|\varepsilon^h\right|.
[/tex]

In fact, under this assumption we do not even need Cauchy-Schwarz (which would only give the weaker bound [itex]L\sqrt{\mathbb{E}\left[(\varepsilon^h)^2\right]}[/itex]): since L is a deterministic constant, taking expectations on both sides yields

[tex]
\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right| \leq L\,\mathbb{E}\left|\varepsilon^h\right|.
[/tex]

This means that as h goes to zero, the expected value of the increment goes to zero at least as fast as the first moment of [itex]\varepsilon^h[/itex], which is exactly the rate you were hoping for.
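As a sanity check, here is a minimal Monte Carlo sketch (my own illustration, not part of the argument above) with the concrete choices [itex]g=\sin[/itex] (Lipschitz with L = 1), [itex]X\sim N(0,1)[/itex], and [itex]\varepsilon^h\sim N(0,h^2)[/itex], so that [itex]\mathbb{E}|\varepsilon^h| = h\sqrt{2/\pi}[/itex]:

[code]
import numpy as np

# Monte Carlo check of E|g(X+eps) - g(X)| <= L * E|eps| for g = sin (L = 1).
# Assumed setup (my choice, not from the thread): X ~ N(0,1), eps^h ~ N(0, h^2),
# so that E|eps^h| = h * sqrt(2/pi) goes to zero linearly in h.
rng = np.random.default_rng(0)
n = 1_000_000
X = rng.standard_normal(n)

for h in (1.0, 0.1, 0.01, 0.001):
    eps = h * rng.standard_normal(n)                      # eps^h ~ N(0, h^2)
    lhs = np.mean(np.abs(np.sin(X + eps) - np.sin(X)))    # E|g(X+eps)-g(X)|
    rhs = np.mean(np.abs(eps))                            # E|eps^h|
    print(f"h={h:<7} E|increment|={lhs:.2e}  E|eps|={rhs:.2e}  ratio={lhs/rhs:.3f}")
[/code]

The ratio stays below 1, consistent with the Lipschitz bound, and for small h it approaches [itex]\mathbb{E}|\cos X|[/itex], since [itex]\sin(X+\varepsilon)-\sin(X)\approx\cos(X)\,\varepsilon[/itex] and [itex]\varepsilon[/itex] is independent of X.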
 

Related to Taylor approximation (probability)

1. What is Taylor approximation in probability?

Taylor approximation in probability is a method for approximating a function of a random variable, and hence quantities such as its moments or distribution, by a polynomial. It is based on the Taylor series expansion, which represents a smooth function as a sum of polynomial terms around a chosen point.

2. Why is Taylor approximation used in probability?

Taylor approximation is used in probability because it allows us to approximate complex probability distributions with simpler, polynomial functions. This makes it easier to calculate probabilities and perform statistical analyses.

3. How does Taylor approximation work in probability?

Taylor approximation works by replacing the function of interest with a truncated Taylor polynomial, expanded around a convenient point such as the mean of the random variable. The number of terms kept in the polynomial can be increased to improve the accuracy of the approximation.
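For example, expanding g to second order around the mean [itex]\mu=\mathbb{E}[X][/itex] and taking expectations gives the standard moment approximation

[tex]
\mathbb{E}[g(X)] \approx g(\mu) + \tfrac{1}{2}\,g''(\mu)\,\mathrm{Var}(X),
[/tex]

since the first-order term [itex]g'(\mu)\,\mathbb{E}[X-\mu][/itex] vanishes.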

4. What are the advantages of using Taylor approximation in probability?

One advantage of using Taylor approximation in probability is that it can simplify complex probability distributions, making them easier to analyze and work with. It can also provide a good estimate of the true probability distribution, especially when more terms in the Taylor series are used.

5. Are there any limitations to using Taylor approximation in probability?

Yes, there are some limitations to using Taylor approximation in probability. It may not work well for highly skewed or asymmetric distributions, and the accuracy of the approximation depends on the number of terms used in the polynomial. Additionally, it may not provide an exact representation of the true probability distribution, but rather an approximation.
