Convergence in Distribution for Random Vectors

In summary, the question asks to show that a sequence X_n of p-dimensional random vectors converges in distribution to N_p(mu, Sigma) if and only if a'X_n converges in distribution to N_1(a' mu, a' Sigma a) for every fixed vector a. The attempted solution computes the limit of E(e^{t a'X_n}), concluding that a'X_n converges in distribution to N(a' mu, a' Sigma a). The proof needs to include the details and state the key equality correctly, with the limit made explicit.
  • #1
cse63146

Homework Statement



Let X_n be a sequence of p-dimensional random vectors. Show that

X_n converges in distribution to [tex]N_p(\mu,\Sigma)[/tex] iff [tex]a'X_n[/tex] converges in distribution to [tex]N_1(a' \mu, a' \Sigma a)[/tex] for every fixed vector [tex]a \in \mathbb{R}^p.[/tex]

Homework Equations





The Attempt at a Solution



[tex]E(e^{(a'X_n)t}) = E(e^{(a't)X_n}) = e^{a't \mu + 0.5t^2(a' \Sigma a)}[/tex]

Hence, {a'X_n} converges in distribution to [tex]N(a' \mu, a' \Sigma a).[/tex]

Is that it?
 
  • #2
Hey, I can't seem to get this question.
What is the sum of [tex]\sum_{i=1}^{n} i(i+1)(i+2)[/tex]? Is the answer just infinity, or is it some kind of weird expression that I have to find?
 
  • #3
pstar - you need to start your own thread - intruding into another's isn't appropriate.

To the OP:
You have the outline, but the rough edges need to be smoothed. For example, stating this equality

[tex]
E(e^{(a't)X_n}) = e^{a't\mu + 0.5t^2 (a' \Sigma a)}
[/tex]

isn't correct - there is a limit involved, correct?

The proof isn't long, and you've got the basic idea, but the details need to be included.
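For reference, a sketch of how the corrected line would read, assuming the moment generating functions exist (with characteristic functions, replace t by it throughout):

[tex]
\lim_{n \to \infty} E\left(e^{t(a'X_n)}\right) = \lim_{n \to \infty} E\left(e^{(ta)'X_n}\right) = e^{t a'\mu + 0.5 t^2 (a'\Sigma a)}
[/tex]

The conclusion then follows from the continuity theorem for moment generating (or characteristic) functions.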
 

FAQ: Convergence in Distribution for Random Vectors

What is convergence in distribution?

Convergence in distribution is a concept in probability and statistics that describes the limiting behavior of a sequence of random variables (or random vectors). A sequence X_n converges in distribution to X if the cumulative distribution function of X_n converges to that of X at every point where the limiting CDF is continuous.
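A small simulation makes this concrete. The sketch below (stdlib Python only; the choice of example and sample sizes is illustrative) uses the classical fact that n times the minimum of n Uniform(0,1) draws converges in distribution to an Exponential(1) variable, and compares the empirical CDF with the limiting CDF:

```python
import bisect
import math
import random

def scaled_minimum(n, rng):
    """n times the minimum of n Uniform(0,1) draws.

    P(n * min > x) = (1 - x/n)^n -> e^{-x}, so this converges
    in distribution to an Exponential(1) random variable.
    """
    return n * min(rng.random() for _ in range(n))

rng = random.Random(0)
n, reps = 100, 20000
samples = sorted(scaled_minimum(n, rng) for _ in range(reps))

# Compare the empirical CDF with the Exp(1) CDF 1 - e^{-x}
for x in (0.5, 1.0, 2.0):
    ecdf = bisect.bisect_right(samples, x) / reps
    print(f"x={x}  empirical={ecdf:.3f}  limit={1 - math.exp(-x):.3f}")
```

The empirical and limiting CDF values agree to within sampling error, which is exactly what convergence in distribution predicts at continuity points.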

What is the difference between convergence in distribution and convergence in probability?

Convergence in distribution and convergence in probability are both modes of convergence in statistics, but they differ in strength. Convergence in distribution only requires the probability distributions of the sequence to approach a limiting distribution, while convergence in probability requires the random variables themselves to be close to the limit with high probability, i.e. P(|X_n - X| > eps) -> 0 for every eps > 0. Convergence in probability implies convergence in distribution, but not conversely.

What is the importance of convergence in distribution in statistical analysis?

Convergence in distribution is important because it allows us to make inferences about the behavior of a large number of random variables without having to observe every single one. It also allows us to use simpler, more manageable distributions to approximate the behavior of more complex ones.

What is the role of the central limit theorem in convergence in distribution?

The central limit theorem is a fundamental result in probability theory which states that the standardized sum of a large number of independent and identically distributed random variables with finite variance tends toward a normal distribution. This is important in convergence in distribution because it provides a framework for understanding how the probability distributions of a sequence of random variables behave as the number of variables increases.
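The CLT can be checked numerically in the same spirit. This is a minimal stdlib-Python sketch (the uniform summands, sample sizes, and seed are illustrative choices): standardized sums of Uniform(-1/2, 1/2) draws are compared against the standard normal CDF, computed via the error function.

```python
import bisect
import math
import random

def clt_sample(n, rng):
    """Standardized sum of n Uniform(-1/2, 1/2) draws.

    Each summand has mean 0 and variance 1/12, so by the CLT
    the sum divided by sqrt(n/12) is approximately N(0, 1).
    """
    total = sum(rng.random() - 0.5 for _ in range(n))
    return total / math.sqrt(n / 12)

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

rng = random.Random(1)
n, reps = 400, 20000
samples = sorted(clt_sample(n, rng) for _ in range(reps))

# Empirical CDF at a few points vs. the N(0,1) CDF
for x in (-1.0, 0.0, 1.0):
    ecdf = bisect.bisect_right(samples, x) / reps
    print(f"x={x:+.1f}  empirical={ecdf:.3f}  Phi(x)={normal_cdf(x):.3f}")
```

Even at n = 400 the empirical CDF of the standardized sums is already very close to the normal CDF.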

How is convergence in distribution tested or measured?

Convergence in distribution can be tested or measured through various statistical methods, such as the Kolmogorov-Smirnov test, the Cramer-von Mises test, or the Anderson-Darling test. These tests compare the observed data to the limiting distribution and provide a measure of how closely they match. Other measures, such as the Kullback-Leibler divergence, can also be used to quantify the distance between two probability distributions.
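As a rough illustration of the Kolmogorov-Smirnov idea, the one-sample KS statistic is simply the largest gap between the empirical CDF and a reference CDF. The sketch below implements that statistic directly in stdlib Python (in practice one would use a library routine such as SciPy's; the sample sizes and the 0.5 shift are illustrative choices):

```python
import math
import random

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ks_statistic(samples, cdf):
    """One-sample Kolmogorov-Smirnov statistic: the largest gap
    between the empirical CDF of `samples` and the reference `cdf`."""
    xs = sorted(samples)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        # The empirical CDF jumps from i/n to (i+1)/n at x; check both sides.
        d = max(d, abs(f - i / n), abs(f - (i + 1) / n))
    return d

rng = random.Random(2)
good = [rng.gauss(0, 1) for _ in range(2000)]   # truly N(0,1)
bad = [rng.gauss(0.5, 1) for _ in range(2000)]  # shifted mean

print(f"KS vs N(0,1), correct samples: {ks_statistic(good, normal_cdf):.3f}")
print(f"KS vs N(0,1), shifted samples: {ks_statistic(bad, normal_cdf):.3f}")
```

The statistic stays small when the samples actually come from the reference distribution and is markedly larger for the shifted samples, which is what the formal test quantifies.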
