Proof that covariance matrix is positive semidefinite

In summary, the proof that a covariance matrix is "positive semidefinite" involves using three basic facts about vectors and matrices to show that $v^{\mathsf{T}}Cv \geqslant0$.
  • #1
Machupicchu
Hello,

I am having a hard time understanding the proof that a covariance matrix is "positive semidefinite".

I found a number of different proofs on the web, but they are all far too complicated and/or not detailed enough for me.

View attachment 3290

Such as in the last answer of this link:
probability - What is the proof that covariance matrices are always semi-definite? - Mathematics Stack Exchange

(In the last answer in particular, I don't understand how they can pass from the expression
$E\{u^{\mathsf{T}}(\mathbf{x}-\bar{\mathbf{x}})(\mathbf{x}-\bar{\mathbf{x}})^{\mathsf{T}}u\}$
to
$E\{s^2\}$.

... where does this $s^2$ "magically" appear from?

;)
 

Attachments

  • covmatrix.jpg
  • #2
Machupicchu said:
Hello,

I am having a hard time understanding the proof that a covariance matrix is "positive semidefinite".

I found a number of different proofs on the web, but they are all far too complicated and/or not detailed enough for me.

View attachment 3290

Such as in the last answer of this link:
probability - What is the proof that covariance matrices are always semi-definite? - Mathematics Stack Exchange

(In the last answer in particular, I don't understand how they can pass from the expression
$E\{u^{\mathsf{T}}(\mathbf{x}-\bar{\mathbf{x}})(\mathbf{x}-\bar{\mathbf{x}})^{\mathsf{T}}u\}$
to
$E\{s^2\}$.

... where does this $s^2$ "magically" appear from?

;)
Hi Machupicchu, and welcome to MHB!

Three basic facts about vectors and matrices: (1) if $w$ is a column vector then $w^{\mathsf{T}}w \geqslant0$; (2) for matrices $A,B$ with product $AB$, the transpose of the product is the product of the transposes in reverse order, in other words $(AB)^{\mathsf{T}} = B^{\mathsf{T}}A^{\mathsf{T}}$; (3) taking the transpose twice gets you back where you started from, $(A^{\mathsf{T}})^{\mathsf{T}} = A$.

You want to show that $v^{\mathsf{T}}Cv\geqslant0$, where \(\displaystyle C = \frac1{n-1}\sum_{i=1}^n(\mathbf{x}_i - \boldsymbol{\mu}) (\mathbf{x}_i - \boldsymbol{\mu})^{\mathsf{T}}\). Since \(\displaystyle v^{\mathsf{T}}Cv = \frac1{n-1}\sum_{i=1}^nv^{\mathsf{T}}(\mathbf{x}_i - \boldsymbol{\mu}) (\mathbf{x}_i - \boldsymbol{\mu})^{\mathsf{T}}v\), it will be enough to show that $v^{\mathsf{T}}(\mathbf{x}_i - \boldsymbol{\mu}) (\mathbf{x}_i - \boldsymbol{\mu})^{\mathsf{T}}v \geqslant 0$ for each $i$. But by those basic facts above, $v^{\mathsf{T}}(\mathbf{x}_i - \boldsymbol{\mu}) (\mathbf{x}_i - \boldsymbol{\mu})^{\mathsf{T}}v = \bigl((\mathbf{x}_i - \boldsymbol{\mu})^{\mathsf{T}}v\bigr)^{\mathsf{T}} \bigl((\mathbf{x}_i - \boldsymbol{\mu})^{\mathsf{T}}v\bigr) \geqslant0$. In other words, if you write $s = (\mathbf{x}_i - \boldsymbol{\mu})^{\mathsf{T}}v$, which is a scalar, the quadratic form is just $s^{\mathsf{T}}s = s^2 \geqslant 0$: that is where the $s^2$ in the Stack Exchange answer comes from.
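The argument above can be checked numerically. Here is a minimal sketch with NumPy (the variable names and the random data are my own, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

# n = 50 observations of a 3-dimensional random vector
X = rng.normal(size=(50, 3))

# Sample covariance matrix: C = (1/(n-1)) * sum_i (x_i - mu)(x_i - mu)^T
mu = X.mean(axis=0)
D = X - mu                      # rows are (x_i - mu)^T
C = D.T @ D / (len(X) - 1)      # same result as np.cov(X, rowvar=False)

# The quadratic form v^T C v is non-negative for any v, since
# v^T C v = (1/(n-1)) * sum_i s_i^2, where s_i = (x_i - mu)^T v
for _ in range(1000):
    v = rng.normal(size=3)
    assert v @ C @ v >= 0
```

The loop does not prove anything, of course; it just illustrates that the quadratic form is a sum of squares, so no choice of $v$ can make it negative.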
 

FAQ: Proof that covariance matrix is positive semidefinite

1. What is a covariance matrix?

A covariance matrix is a square matrix that contains the variances and covariances of a set of variables. It is used to measure the relationship between two or more variables and is commonly used in statistics and data analysis.
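As a small illustration, NumPy's `np.cov` builds this matrix directly (the data here is made up):

```python
import numpy as np

# Two variables, five observations each
x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([1.0, 3.0, 2.0, 5.0, 4.0])

C = np.cov(x, y)  # 2x2 covariance matrix

# Diagonal entries are the variances, off-diagonals the covariance
assert np.isclose(C[0, 0], x.var(ddof=1))
assert np.isclose(C[1, 1], y.var(ddof=1))
assert np.isclose(C[0, 1], C[1, 0])
```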

2. How do you determine if a covariance matrix is positive semidefinite?

A covariance matrix is positive semidefinite if all of its eigenvalues are greater than or equal to zero. This can be determined by calculating the eigenvalues of the matrix or by checking if the matrix is symmetric and all of its leading principal minors are non-negative.
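The eigenvalue check can be sketched with NumPy (`eigvalsh` is the eigenvalue routine for symmetric matrices; the helper function and tolerance are my own choices):

```python
import numpy as np

def is_positive_semidefinite(C, tol=1e-10):
    """Check symmetry and that all eigenvalues are >= 0 (up to a tolerance)."""
    C = np.asarray(C)
    if not np.allclose(C, C.T):
        return False
    return bool(np.all(np.linalg.eigvalsh(C) >= -tol))

assert is_positive_semidefinite([[2.0, -1.0], [-1.0, 2.0]])    # eigenvalues 1 and 3
assert not is_positive_semidefinite([[1.0, 2.0], [2.0, 1.0]])  # eigenvalues -1 and 3
```

The small tolerance matters in practice: sample covariance matrices of nearly collinear data have eigenvalues that are zero in exact arithmetic but may come out slightly negative in floating point.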

3. Why is it important for a covariance matrix to be positive semidefinite?

A positive semidefinite covariance matrix guarantees that the variance of every linear combination of the variables, $v^{\mathsf{T}}Cv$, is non-negative, which any valid set of variances must satisfy. It also permits the use of mathematical techniques and algorithms that require positive semidefinite inputs, such as Cholesky-type factorizations.

4. What are the implications if a covariance matrix is not positive semidefinite?

If a matrix that is supposed to be a covariance matrix is not positive semidefinite, at least one of its eigenvalues is negative, so some linear combination of the variables would have negative variance, which is impossible. In practice this usually means the matrix was not computed from a single consistent dataset, for example because of estimation error, missing data handled pairwise, or manual entry. Such a matrix can cause errors in statistical analysis and breaks techniques that require positive semidefinite inputs.

5. Can a covariance matrix ever be negative semidefinite?

A covariance matrix can be negative semidefinite only in the degenerate case where it is the zero matrix (all variables constant), since a matrix that is both positive and negative semidefinite has all eigenvalues equal to zero. Note that individual covariances can be negative; it is the quadratic form $v^{\mathsf{T}}Cv$, and in particular every variance on the diagonal, that must be non-negative.
