emob2p
Hi,
I'm looking for a proof of the following theorem:
If A is a Hermitian matrix with eigenvalues a_1, a_2, ..., a_n, then the secular equation holds:
(A - a_1 I)(A - a_2 I)...(A - a_n I) = 0.
The proof escapes me right now, but I think it has to do with diagonalizing the Hermitian matrix. I'm just struggling to put together the details. Assume non-degeneracy. Thanks.
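Here's the rough outline I have in mind, in case it helps pin down where I'm stuck (just a sketch, assuming the spectral theorem gives A = U D U^† with U unitary and D = diag(a_1, ..., a_n)):

% Sketch only (uses amsmath); spectral theorem assumed:
% A Hermitian  =>  A = U D U^\dagger, U unitary, D = diag(a_1, ..., a_n).
\begin{align*}
A - a_i I
  &= U D U^\dagger - a_i\, U U^\dagger
   = U \,(D - a_i I)\, U^\dagger, \\
\prod_{i=1}^{n} (A - a_i I)
  &= U (D - a_1 I) U^\dagger \, U (D - a_2 I) U^\dagger \cdots U (D - a_n I) U^\dagger \\
  &= U \Big[ \prod_{i=1}^{n} (D - a_i I) \Big] U^\dagger .
\end{align*}
% Each D - a_i I is diagonal with k-th entry (a_k - a_i), so the k-th
% diagonal entry of the full product contains the factor (a_k - a_k) = 0.
% Every diagonal entry of the product therefore vanishes, so the product
% of the (D - a_i I) is the zero matrix, and hence so is U [ ... ] U^\dagger.

Is that the right idea, or am I missing something in the step where the U^† U factors cancel?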