Trace of a matrix and an expansion of eigenvectors

In summary, the conversation discusses deriving the Kullback-Leibler divergence between two multivariate Gaussian distributions and proving a trace identity involving orthonormal eigenvectors and eigenvalues. The solution uses the orthonormal eigenvectors as a basis, in which the trace is easy to evaluate, then transforms back to a general basis to obtain the desired result.
  • #1
CuppoJava
Hi,
I'm trying to derive the Kullback-Leibler divergence between two multivariate Gaussian distributions, and I need the following property. Is there a simple way to understand this?

Prove that:
Given that E has orthonormal eigenvectors [tex]u_{i}[/tex] and eigenvalues [tex]\lambda_{i}[/tex]

Then:
[tex]\operatorname{tr}(AE) = \sum_{i}\lambda_{i}\,u^{T}_{i}Au_{i}[/tex]

I'm not quite sure how to start. I suspected that it can be proven by looking at block matrices but I didn't get anywhere with that. Thanks a lot for your help.
-Patrick
 
  • #2
Hi Patrick! :smile:

(have a sigma: ∑ and a lambda: λ and try using the X2 tag just above the Reply box :wink:)

If the uᵢ are orthonormal, then they can be used as a basis, and in that basis E is diagonal, so tr(AE) = ∑ᵢ λᵢAᵢᵢ

Then transform back to a general basis, and you get the result given.

(there's probably a more direct way of doing it, also! :wink:)
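As a quick sanity check, the identity can also be verified numerically. The sketch below uses NumPy and draws a random symmetric E, which guarantees real eigenvalues and orthonormal eigenvectors (the matrix sizes and seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random symmetric E: real eigenvalues, orthonormal eigenvectors
E = rng.standard_normal((n, n))
E = (E + E.T) / 2
A = rng.standard_normal((n, n))

# Columns of U are the orthonormal eigenvectors u_i of E
lam, U = np.linalg.eigh(E)

lhs = np.trace(A @ E)
rhs = sum(lam[i] * U[:, i] @ A @ U[:, i] for i in range(n))

assert np.isclose(lhs, rhs)
```

The left-hand side never forms the eigendecomposition; the right-hand side uses only the eigenpairs, which is exactly the form needed in the Gaussian KL-divergence derivation.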
 
  • #3
Thanks Tiny_Tim. It took me a while to figure out the details but I finally got it. Your advice worked perfectly!

PS: And thanks for the generous greek letters. =)
 

FAQ: Trace of a matrix and an expansion of eigenvectors

What is the trace of a matrix?

The trace of a matrix is the sum of its diagonal elements, denoted tr(A). It is defined for square matrices, is a scalar value, and is calculated by adding the entries on the main diagonal.

How is the trace of a matrix related to its eigenvalues?

The trace of a square matrix is equal to the sum of its eigenvalues, counted with multiplicity. This means that if we know all the eigenvalues of a matrix, we can calculate its trace by simply adding them together. Conversely, for a 2×2 matrix, knowing the trace and one eigenvalue gives the other eigenvalue by subtraction.
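This fact is easy to illustrate numerically (a minimal sketch with NumPy; the matrix here is an arbitrary example):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eigenvalues = np.linalg.eigvals(M)

# trace = 2 + 3 = 5, and the eigenvalues also sum to 5
assert np.isclose(np.trace(M), eigenvalues.sum())
```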

What is an expansion of eigenvectors?

An expansion of eigenvectors is a way to express a given vector as a linear combination of the eigenvectors of a matrix. This writes the vector in terms of the directions defined by the eigenvectors, which is especially useful in matrix operations because the matrix acts on each of those directions by simple scaling.

How do you find the expansion of eigenvectors?

To find the expansion, first compute the eigenvalues and eigenvectors of the given matrix. If the eigenvectors uᵢ form an orthonormal basis (as they do for a symmetric matrix), the coefficient of each eigenvector in the expansion of a vector x is the projection cᵢ = uᵢᵀx, so that x = ∑ᵢ cᵢuᵢ.
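The projection step can be sketched in a few lines of NumPy (assumes a symmetric matrix so the eigenvectors are orthonormal; the sizes and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
E = rng.standard_normal((3, 3))
E = (E + E.T) / 2           # symmetric => orthonormal eigenvectors
lam, U = np.linalg.eigh(E)  # columns of U are the eigenvectors u_i

x = rng.standard_normal(3)
c = U.T @ x                 # c_i = u_i^T x, the expansion coefficients

# Reconstruct x from its eigenvector expansion: x = sum_i c_i u_i
x_reconstructed = U @ c
assert np.allclose(x, x_reconstructed)
```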

Is the expansion of eigenvectors unique?

Not entirely, because the eigenvectors themselves are not unique: any nonzero scalar multiple of an eigenvector is again an eigenvector, and a repeated eigenvalue admits infinitely many choices of basis for its eigenspace. The eigenvalues of a matrix are unique, however, and once a particular eigenvector basis is fixed, the expansion of a given vector in that basis is also unique.
