CuppoJava
Hi,
I'm trying to derive the Kullback-Leibler divergence between two multivariate Gaussian distributions, and I need the following property. Is there a simple way to understand this?
Prove that:
Given that E has orthonormal eigenvectors [tex]u_{i}[/tex] with corresponding eigenvalues [tex]\lambda_{i}[/tex],
then
[tex]\operatorname{trace}(AE) = \sum_{i} \lambda_{i}\, u_{i}^{T} A\, u_{i}.[/tex]
I'm not quite sure how to start. I suspected it could be proven by looking at block matrices, but I didn't get anywhere with that. Thanks a lot for your help.
-Patrick
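
For reference, a minimal sketch of why the identity holds, assuming E is symmetric (as a covariance matrix in the KL-divergence setting would be), so the spectral theorem gives [tex]E = \sum_{i} \lambda_{i}\, u_{i} u_{i}^{T}[/tex]:

[tex]\operatorname{trace}(AE) = \operatorname{trace}\Big(A \sum_{i} \lambda_{i}\, u_{i} u_{i}^{T}\Big) = \sum_{i} \lambda_{i}\, \operatorname{trace}\big(A\, u_{i} u_{i}^{T}\big) = \sum_{i} \lambda_{i}\, u_{i}^{T} A\, u_{i},[/tex]

using linearity of the trace and the cyclic property [tex]\operatorname{trace}(AB) = \operatorname{trace}(BA)[/tex], which turns [tex]\operatorname{trace}(A\, u_{i} u_{i}^{T})[/tex] into the scalar [tex]u_{i}^{T} A\, u_{i}[/tex]. A quick numerical sanity check of the identity (a NumPy sketch; the symmetry assumption on E is mine):

[code]
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))   # arbitrary square matrix
B = rng.standard_normal((n, n))
E = B + B.T                       # symmetric, so it has orthonormal eigenvectors

lam, U = np.linalg.eigh(E)        # columns of U are the eigenvectors u_i

lhs = np.trace(A @ E)
rhs = sum(lam[i] * U[:, i] @ A @ U[:, i] for i in range(n))
print(lhs, rhs)                   # the two values agree up to floating-point error
[/code]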