Shankar - Simultaneous Diagonalisation of Hermitian Matrices

In summary, the conversation discusses finding the eigenvalues and eigenvectors common to two Hermitian matrices and simultaneously diagonalising them with a single unitary transformation. It also covers why the non-degenerate matrix should dictate the choice of basis, and the need for an orthonormal set of eigenvectors.
  • #1
bugatti79
Asked to determine the eigenvalues and eigenvectors common to both of these matrices:

[tex]\Omega=\begin{bmatrix}1 &0 &1 \\ 0& 0 &0 \\ 1& 0 & 1\end{bmatrix}[/tex] and [tex]\Lambda=\begin{bmatrix}2 &1 &1 \\ 1& 0 &-1 \\ 1& -1 & 2\end{bmatrix}[/tex]

and then to verify under a unitary transformation that both can be simultaneously diagonalised. Since Omega is degenerate and Lambda is not, you must be prudent in deciding which matrix dictates the choice of basis.

1) What does he mean by being prudent in the choice of matrix?

2) There is only one common eigenvalue, [tex]\lambda=2[/tex]. Should I expect the same eigenvector for both matrices for this value of [tex]\lambda=2[/tex]? Wolfram Alpha shows different eigenvectors for the same [tex]\lambda[/tex] value.
eigenvector {{1,0,1},{0,0,0},{1,0,1}} - Wolfram|Alpha

eigenvector {{2,1,1},{1,0,-1},{1,-1,2}} - Wolfram|Alpha

3) To show simultaneous diagonalisation I applied the unitary transformations

[tex]U^{\dagger} \Omega U[/tex] and [tex]U^{\dagger} \Lambda U[/tex], which should diagonalise the matrices with the eigenvalues as the diagonal entries, where the columns of [tex]U[/tex] are the corresponding eigenvectors.

However, wolfram shows

{{1,0,1},{-1,0,1},{0,1,0}} * {{1,0,1},{0,0,0},{1,0,1}} * {{1,-1,0},{0,0,1},{1,1,0}} - Wolfram|Alpha

{{1,0,1},{-1,-1,1},{-1,2,1}} * {{2,1,1},{1,0,-1},{1,-1,2}} * {{1,-1,-1},{0,-1,2},{1,1,1}} - Wolfram|Alpha
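For reference, the products above can be checked numerically. In this NumPy sketch (eigenvector columns taken from the Wolfram results for [tex]\Lambda[/tex]), [tex]V^T \Lambda V[/tex] comes out diagonal only up to the squared column norms; normalising the columns is what makes [tex]V^T = V^{-1}[/tex] and puts the eigenvalues on the diagonal:

```python
import numpy as np

Lam = np.array([[2, 1, 1],
                [1, 0, -1],
                [1, -1, 2]], dtype=float)

# unnormalised eigenvectors of Lambda as columns (for lambda = 3, 2, -1)
V = np.array([[1, -1, -1],
              [0, -1,  2],
              [1,  1,  1]], dtype=float)

# V^T Lam V is diagonal (the columns are mutually orthogonal),
# but its entries are |v_i|^2 * lambda_i, not the eigenvalues
print(np.round(V.T @ Lam @ V, 10))    # diag(6, 6, -6)

# normalising each column makes V orthogonal (unitary), and the
# true eigenvalues appear on the diagonal
U = V / np.linalg.norm(V, axis=0)
print(np.round(U.T @ Lam @ U, 10))    # diag(3, 2, -1)
```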

Any ideas?
 
  • #2
Couple of comments:

1. Pick the non-degenerate matrix to get your eigenvectors. You're not guaranteed that the degenerate matrix's eigenvectors will span the space and be a basis.

2. You need the transformation to be unitary, which means the eigenvectors need to be orthonormal. Use Gram-Schmidt to orthonormalize the eigenbasis.
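The Gram-Schmidt step can be sketched as follows (a minimal real-valued version; the input vectors here are an arbitrary illustrative pair, not taken from the problem):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormalise linearly independent vectors."""
    basis = []
    for v in vectors:
        # subtract the projection of v onto each vector already in the basis
        w = v - sum(np.dot(u, v) * u for u in basis)
        basis.append(w / np.linalg.norm(w))  # normalise to unit length
    return np.column_stack(basis)            # orthonormal columns

# e.g. orthonormalise two non-orthogonal vectors in R^3
Q = gram_schmidt([np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])])
print(np.round(Q.T @ Q, 10))  # 2x2 identity
```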
 
  • #3
Ackbach said:
Couple of comments:

1. Pick the non-degenerate matrix to get your eigenvectors. You're not guaranteed that the degenerate matrix's eigenvectors will span the space and be a basis.

2. You need the transformation to be unitary, which means the eigenvectors need to be orthonormal. Use Gram-Schmidt to orthonormalize the eigenbasis.

2) I thought that Hermitian matrices had orthogonal eigenvectors, as per the 4th property listed on the wiki https://en.wikipedia.org/wiki/Hermitian_matrix
That's why I didn't orthogonalise them...

Eigenvectors of Hermitian Matrix
eigenvectors {{2,1,1},{1,0,-1},{1,-1,2}} - Wolfram|Alpha

Check whether the eigenvectors are orthogonal (I put the eigenvectors into a matrix):
{{1,0,1},{-1,-1,1},{-1,2,1}} orthogonal - Wolfram|Alpha

Is the wiki wrong?
 
  • #4
bugatti79 said:
2) I thought that Hermitian matrices had orthogonal eigenvectors, as per the 4th property listed on the wiki https://en.wikipedia.org/wiki/Hermitian_matrix
That's why I didn't orthogonalise them...

Eigenvectors of Hermitian Matrix
eigenvectors {{2,1,1},{1,0,-1},{1,-1,2}} - Wolfram|Alpha

Check whether the eigenvectors are orthogonal (I put the eigenvectors into a matrix):
{{1,0,1},{-1,-1,1},{-1,2,1}} orthogonal - Wolfram|Alpha

Is the wiki wrong?

Ok, fair point about orthogonality. However, orthogonal does not imply orthonormal: an orthonormal basis is orthogonal AND every vector has length 1. You need an orthonormal set of eigenvectors to form an orthogonal matrix. That's not a typo: an orthogonal matrix has orthonormal columns.

So your process is simpler if your original matrices are Hermitian: once you get the eigenvectors, just normalize them.
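The whole procedure can be sketched in NumPy with the two matrices from this thread. `np.linalg.eigh` already returns orthonormal eigenvectors (eigenvalues in ascending order), which is exactly the normalisation step described above; taking U from the non-degenerate matrix [tex]\Lambda[/tex], the same U diagonalises both:

```python
import numpy as np

Omega = np.array([[1, 0, 1], [0, 0, 0], [1, 0, 1]], dtype=float)
Lam   = np.array([[2, 1, 1], [1, 0, -1], [1, -1, 2]], dtype=float)

# eigh is for Hermitian matrices and returns orthonormal eigenvectors,
# so U is already unitary; take U from the non-degenerate matrix Lambda
vals, U = np.linalg.eigh(Lam)

# the same U diagonalises both matrices
D_Lam   = U.T @ Lam   @ U
D_Omega = U.T @ Omega @ U
print(np.round(D_Lam, 10))    # diag(-1, 2, 3)
print(np.round(D_Omega, 10))  # diag(0, 0, 2)
```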
 

FAQ: Shankar - Simultaneous Diagonalisation of Hermitian Matrices

What is "Shankar - Simultaneous Diagonalisation of Hermitian Matrices"?

"Shankar - Simultaneous Diagonalisation of Hermitian Matrices" refers to a topic treated in Shankar's Principles of Quantum Mechanics: finding a single transformation matrix that diagonalises two Hermitian matrices at once, i.e. turns both into diagonal matrices under the same change of basis.

Why is simultaneous diagonalisation of Hermitian matrices important?

Simultaneous diagonalisation of Hermitian matrices is important because it allows for simplification and easier computation of certain mathematical problems, particularly in quantum mechanics. It also has applications in other fields such as signal processing and statistics.

What are Hermitian matrices?

Hermitian matrices are square matrices with complex entries that are equal to their own conjugate transpose. In other words, the element in the i-th row and j-th column is equal to the complex conjugate of the element in the j-th row and i-th column.
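This defining property is easy to check numerically (the matrices here are illustrative examples, the second being [tex]\Lambda[/tex] from the thread):

```python
import numpy as np

# a Hermitian matrix is equal to its own conjugate transpose
A = np.array([[2, 1 + 1j],
              [1 - 1j, 0]])
print(np.allclose(A, A.conj().T))  # True

# real symmetric matrices are the real special case of Hermitian
B = np.array([[2, 1, 1], [1, 0, -1], [1, -1, 2]])
print(np.allclose(B, B.T))  # True
```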

Can non-Hermitian matrices be simultaneously diagonalized?

Not by a single unitary transformation, in general. The decisive condition is that the matrices commute and are each diagonalisable: commuting Hermitian (or, more generally, normal) matrices can always be diagonalised by one unitary transformation. Non-Hermitian matrices that commute and are individually diagonalisable can still be simultaneously diagonalised, but the required transformation need not be unitary.
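For the two matrices in this thread, the commutation condition that underlies simultaneous diagonalisation can be verified directly:

```python
import numpy as np

Omega = np.array([[1, 0, 1], [0, 0, 0], [1, 0, 1]])
Lam   = np.array([[2, 1, 1], [1, 0, -1], [1, -1, 2]])

# the pair commutes ([Omega, Lambda] = 0), which is why a common
# eigenbasis, and hence simultaneous diagonalisation, exists
print(np.allclose(Omega @ Lam, Lam @ Omega))  # True
```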

What are the applications of simultaneous diagonalisation of Hermitian matrices?

Aside from its applications in quantum mechanics, simultaneous diagonalisation of Hermitian matrices also has uses in solving systems of linear equations, finding the eigenvalues and eigenvectors of matrices, and in machine learning algorithms.
