Confusion Regarding a Spectral Decomposition

In summary: the matrix in the screenshot lists the matrix elements ##A_{jk}=\langle j|\hat{A}|k \rangle## of the operator with respect to the basis ##(|0\rangle, |1\rangle)##; the operator itself is recovered from those elements via the completeness relation, ##\hat{A}=\hat{1}\hat{A}\hat{1}=\sum_{jk} A_{jk} |j\rangle\langle k|##.
  • #1
ARoyC
Hi. I am not able to understand how we get the following spectral decomposition. It would be great if someone could explain it to me. Thank you in advance.
Screenshot 2023-07-07 145114.jpg
 
  • #2
It's simply a nonsensical equation. On one side you write down a matrix, depicting the matrix elements of an operator, and on the other side the operator itself. Correct is
$$\hat{A}=v_3 |0 \rangle \langle 0| + (v_1-\mathrm{i} v_2) |0 \rangle \langle 1| + (v_1+\mathrm{i} v_2) |1 \rangle \langle 0| - v_3 |1 \rangle \langle 1|.$$
The matrix elements in your matrix are then taken with respect to the basis ##(|0 \rangle,|1 \rangle)##.
$$(A_{jk})=\langle j|\hat{A}|k \rangle, \quad j,k \in \{0,1 \}.$$
To see this, simply use ##\langle j|k \rangle=\delta_{jk}##. Then you get, e.g.,
$$A_{01}=\langle 0|\hat{A}|1 \rangle=v_1-\mathrm{i} v_2.$$
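As a quick numerical sanity check of this (not part of the original posts), one can represent ##|0\rangle## and ##|1\rangle## as the standard basis vectors of ##\mathbb{C}^2## and pick some made-up values for ##v_1, v_2, v_3##:

```python
import numpy as np

# Hypothetical values for v1, v2, v3 (chosen only for illustration).
v1, v2, v3 = 0.5, 1.0, 2.0

# The matrix of A in the (|0>, |1>) basis, as in the screenshot.
A = np.array([[v3,            v1 - 1j * v2],
              [v1 + 1j * v2, -v3          ]])

# Standard basis kets |0> and |1> as column vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# A_{01} = <0|A|1> should reproduce v1 - i v2.
A01 = ket0.conj() @ A @ ket1
print(A01)  # (0.5-1j)
```

The bra ##\langle 0|## is the conjugate-transposed ket, which is why `ket0.conj()` appears on the left.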
 
  • #3
vanhees71 said:
It's simply a nonsensical equation. On one side you write down a matrix, depicting the matrix elements of an operator, and on the other side the operator itself. Correct is
$$\hat{A}=v_3 |0 \rangle \langle 0| + (v_1-\mathrm{i} v_2) |0 \rangle \langle 1| + (v_1+\mathrm{i} v_2) |1 \rangle \langle 0| - v_3 |1 \rangle \langle 1|.$$
The matrix elements in your matrix are then taken with respect to the basis ##(|0 \rangle,|1 \rangle)##.
$$(A_{jk})=\langle j|\hat{A}|k \rangle, \quad j,k \in \{0,1 \}.$$
To see this, simply use ##\langle j|k \rangle=\delta_{jk}##. Then you get, e.g.,
$$A_{01}=\langle 0|\hat{A}|1 \rangle=v_1-\mathrm{i} v_2.$$
Oh! So we can go from the RHS of the equation to the LHS. Can't we do the reverse?
 
  • #4
Sure:
$$\hat{A}=\sum_{j,k} |j \rangle \langle j|\hat{A}|k \rangle \langle k| = \sum_{jk} A_{jk} |j \rangle \langle k|.$$
The mapping from operators to matrix elements with respect to a complete orthonormal system is one-to-one. As with very many formal manipulations in QT, it's just a use of the completeness relation,
$$\sum_j |j \rangle \langle j|=\hat{1}.$$
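The completeness relation is easy to verify numerically for the two-dimensional case (a side check, not part of the original post): summing the outer products ##|j\rangle\langle j|## over an orthonormal basis gives the identity.

```python
import numpy as np

# Orthonormal basis kets |0>, |1> in C^2.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# sum_j |j><j| built from outer products of ket and bra.
completeness = np.outer(ket0, ket0.conj()) + np.outer(ket1, ket1.conj())
print(completeness)  # the 2x2 identity matrix
```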
 
  • #5
vanhees71 said:
Sure:
$$\hat{A}=\sum_{j,k} |j \rangle \langle j|\hat{A}|k \rangle \langle k| = \sum_{jk} A_{jk} |j \rangle \langle k|.$$
The mapping from operators to matrix elements with respect to a complete orthonormal system is one-to-one. As with very many formal manipulations in QT, it's just a use of the completeness relation,
$$\sum_j |j \rangle \langle j|=\hat{1}.$$
How are we getting the very first equality, that is, ##\hat{A}=\sum_{j,k} |j \rangle \langle j|\hat{A}|k \rangle \langle k|##?
 
  • #6
$$\hat{A}=\hat{1}\hat{A}\hat{1}=\left(\sum_{j} |j \rangle \langle j|\right)\hat{A}\left(\sum_{k} |k \rangle \langle k|\right)=\sum_{j,k} |j \rangle \langle j|\hat{A}|k \rangle \langle k| = \sum_{jk} A_{jk} |j \rangle \langle k|.$$
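This whole chain can be checked numerically for a 2×2 example (an illustration with made-up entries, not from the thread): rebuild ##\hat{A}## from its matrix elements ##\langle j|\hat{A}|k\rangle## and the outer products ##|j\rangle\langle k|##.

```python
import numpy as np

# An arbitrary Hermitian 2x2 operator (hypothetical values).
A = np.array([[2.0,        0.5 - 1j],
              [0.5 + 1j,  -2.0     ]])

# Orthonormal basis |0>, |1>.
basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

# A = sum_{jk} <j|A|k> |j><k|
A_rebuilt = sum(
    (bj.conj() @ A @ bk) * np.outer(bj, bk.conj())
    for bj in basis for bk in basis
)
```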
 
  • #7
Haborix said:
$$\hat{A}=\hat{1}\hat{A}\hat{1}=\left(\sum_{j} |j \rangle \langle j|\right)\hat{A}\left(\sum_{k} |k \rangle \langle k|\right)=\sum_{j,k} |j \rangle \langle j|\hat{A}|k \rangle \langle k| = \sum_{jk} A_{jk} |j \rangle \langle k|.$$
Oh, okay, thanks a lot!
 

FAQ: Confusion Regarding a Spectral Decomposition

What is spectral decomposition in the context of linear algebra?

Spectral decomposition, also known as eigendecomposition, is a method where a matrix is broken down into its constituent elements based on its eigenvalues and eigenvectors. For a square matrix \(A\), it can be expressed as \(A = V \Lambda V^{-1}\), where \(V\) is a matrix whose columns are the eigenvectors of \(A\), and \(\Lambda\) is a diagonal matrix containing the eigenvalues of \(A\).
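A minimal NumPy sketch of this factorization (the matrix entries are arbitrary, chosen so the decomposition certainly exists):

```python
import numpy as np

# A small symmetric matrix, so a full eigendecomposition exists.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eigvals, V = np.linalg.eig(A)   # columns of V are eigenvectors of A
Lam = np.diag(eigvals)          # diagonal matrix of eigenvalues

# Reconstruct A = V @ Lam @ V^{-1}
A_rebuilt = V @ Lam @ np.linalg.inv(V)
```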

How is spectral decomposition used in practical applications?

Spectral decomposition is used in various fields such as quantum mechanics, vibration analysis, and facial recognition. It helps in simplifying complex matrix operations, solving differential equations, and in principal component analysis (PCA) where it is used to reduce the dimensionality of data while preserving its variance.
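The PCA use case can be sketched in a few lines: eigendecompose the covariance matrix of centered data, then project onto the leading eigenvector. The data here is synthetic, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: 200 correlated 2-D samples (synthetic).
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])
X -= X.mean(axis=0)  # PCA requires centered data

# Eigendecomposition of the sample covariance matrix.
cov = X.T @ X / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh: covariance is symmetric
order = np.argsort(eigvals)[::-1]        # sort eigenvalues descending
components = eigvecs[:, order]

# Project onto the leading component to reduce 2-D -> 1-D.
X_reduced = X @ components[:, :1]
```

The variance of the projected data equals the largest eigenvalue, which is exactly the "preserving variance" property mentioned above.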

What are the requirements for a matrix to be spectrally decomposed?

For a matrix to be spectrally decomposed, it must be a square matrix. Additionally, the matrix should have a complete set of linearly independent eigenvectors. This is always possible for normal matrices (matrices that commute with their conjugate transpose), including symmetric matrices, Hermitian matrices, and unitary matrices.
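The normality condition ##AA^* = A^*A## is straightforward to test numerically; here is a small check contrasting a Hermitian matrix with a non-normal one (both matrices are made up for illustration):

```python
import numpy as np

# A Hermitian matrix (equal to its conjugate transpose) -- always normal.
H = np.array([[1.0,       2.0 - 1j],
              [2.0 + 1j,  3.0     ]])
is_normal = np.allclose(H @ H.conj().T, H.conj().T @ H)

# A non-normal counterexample: an upper-triangular shear matrix.
N = np.array([[1.0, 1.0],
              [0.0, 1.0]])
not_normal = not np.allclose(N @ N.conj().T, N.conj().T @ N)
```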

What is the difference between spectral decomposition and singular value decomposition (SVD)?

While both spectral decomposition and singular value decomposition (SVD) are techniques for matrix factorization, they are used in different contexts. Spectral decomposition is applicable only to square matrices and involves eigenvalues and eigenvectors. In contrast, SVD can be applied to any m x n matrix and decomposes it into \(U \Sigma V^*\), where \(U\) and \(V\) are orthogonal matrices and \(\Sigma\) is a diagonal matrix containing the singular values.
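The contrast is easy to see on a rectangular matrix, where eigendecomposition is not even defined but SVD works (example values are arbitrary):

```python
import numpy as np

# A rectangular 3x2 matrix -- no eigendecomposition exists for it.
M = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Thin SVD: U is 3x2, s holds the two singular values, Vh is 2x2.
U, s, Vh = np.linalg.svd(M, full_matrices=False)

# Reconstruct M = U @ Sigma @ V^*
M_rebuilt = U @ np.diag(s) @ Vh
```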

Can non-diagonalizable matrices undergo spectral decomposition?

Non-diagonalizable matrices cannot undergo spectral decomposition in the strict sense because they do not have a full set of linearly independent eigenvectors. However, they can be decomposed using the Jordan canonical form, which generalizes the concept of eigendecomposition by including Jordan blocks corresponding to each eigenvalue.
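The failure mode can be observed directly on the classic defective example, a 2×2 Jordan block: both eigenvalues coincide, but the computed eigenvectors are (numerically) parallel, so the eigenvector matrix is singular and ##A = V \Lambda V^{-1}## breaks down.

```python
import numpy as np

# A 2x2 Jordan block: non-diagonalizable (defective).
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigvals, V = np.linalg.eig(J)

# Both eigenvalues are 1, but det(V) is ~0: there is no full set of
# linearly independent eigenvectors, so V cannot be inverted.
det_V = np.linalg.det(V)
```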
