Using Multiple Eigenvectors for the Same Eigenvalue

In summary, algebraic multiplicity is the number of times an eigenvalue appears as a root of the characteristic polynomial of a matrix. It differs from geometric multiplicity, which is the number of linearly independent eigenvectors corresponding to that eigenvalue. This concept is important in determining the diagonalizability of a matrix and finding the Jordan canonical form. It is also related to the trace and determinant of a matrix. The algebraic multiplicity can be greater than the geometric multiplicity when the matrix is not diagonalizable.
  • #1
Fermat1
I am trying to prove the spectral decomposition theorem for normal compact operators. My book says the space H is the closure of the direct sum of the $F_{t}$, where the $F_{t}$ are eigenspaces and we sum over all eigenvalues $t$.
My question concerns what happens when there are 2 linearly independent eigenvectors associated to one eigenvalue. The above says we can write any $x$ in H as

$x=a_{1}e_{1}+a_{2}e_{2}+...$, where the $a_{i}$ are scalars and the $e_{i}$ are eigenvectors, one from each eigenspace. So this seems to tell me that 2 eigenvectors associated to the same eigenvalue cannot be used in the same linear combination. But I know that's not right.
 
  • #2

Thank you for your question regarding the spectral decomposition theorem for normal compact operators. I can understand the confusion: the way the expansion is written makes it look as though two eigenvectors associated to the same eigenvalue cannot appear in the same linear combination.

First, let me clarify what the theorem says. A normal compact operator $T$ on a Hilbert space $H$ can be written as a (possibly infinite) sum $T=\sum_{t} t\,P_{t}$, where $P_{t}$ is the orthogonal projection onto the eigenspace $F_{t}$, and $H$ is the closure of the direct sum of the $F_{t}$. In particular, every vector in $H$ is a convergent sum of its components in the eigenspaces, and each component can in turn be expanded in an orthonormal basis of eigenvectors of that eigenspace.
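Written out with an orthonormal basis $\{e_{t,1},e_{t,2},\dots\}$ of each eigenspace $F_{t}$ (this indexing is just my own notation, not your book's), the expansion looks like

$$P_{t}x=\sum_{i}\langle x,e_{t,i}\rangle\,e_{t,i},\qquad x=\sum_{t}P_{t}x=\sum_{t}\sum_{i}\langle x,e_{t,i}\rangle\,e_{t,i},\qquad Tx=\sum_{t}t\,P_{t}x.$$

The inner sum runs over a basis of $F_{t}$, so an eigenvalue whose eigenspace has dimension greater than one contributes more than one eigenvector to the expansion of $x$.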

Now, going back to your question about using two eigenvectors associated to the same eigenvalue in the same linear combination - this is perfectly allowed. The key is that the decomposition is over eigenspaces, not over individual eigenvectors, so the phrase "one from each eigenspace" should not be read as a restriction. If the eigenspace $F_{t}$ has dimension greater than one, the component of $x$ in $F_{t}$ will in general need several linearly independent eigenvectors (an orthonormal basis of $F_{t}$) to express it.

To see this, let's take a simple example. Consider the operator $\lambda I$ on a two-dimensional space, so $\lambda$ is the only eigenvalue and its eigenspace is the whole space, spanned by two orthonormal eigenvectors $e_1$ and $e_2$. Any vector $x$ can be written as $x = ae_1 + be_2$, where $a$ and $b$ are scalars, and both $e_1$ and $e_2$ are eigenvectors for the same eigenvalue $\lambda$. So two eigenvectors for the same eigenvalue certainly can appear in the same linear combination; the sum in your book simply groups them together into the single eigenspace term $P_{\lambda}x$.
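If a numerical check helps, here is a small sketch of the same idea (an illustration of my own, not something from your book), using a real symmetric matrix with a repeated eigenvalue as a finite-dimensional stand-in for a normal compact operator; all names here are my own choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a symmetric (hence normal) 3x3 matrix with eigenvalue 2 of
# multiplicity 2 and eigenvalue 5 of multiplicity 1: A = Q D Q^T.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthogonal matrix
A = Q @ np.diag([2.0, 2.0, 5.0]) @ Q.T

# eigh returns orthonormal eigenvectors with eigenvalues in ascending order;
# columns 0 and 1 both belong to the repeated eigenvalue 2.
eigvals, eigvecs = np.linalg.eigh(A)
e1, e2, e3 = eigvecs[:, 0], eigvecs[:, 1], eigvecs[:, 2]

# Expand an arbitrary vector x in this eigenvector basis.
x = np.array([1.0, -3.0, 4.0])
coeffs = [np.dot(x, e) for e in (e1, e2, e3)]

# Two eigenvectors of the *same* eigenvalue (e1 and e2) appear in the same
# linear combination, and the expansion reproduces x exactly.
x_reconstructed = sum(c * e for c, e in zip(coeffs, (e1, e2, e3)))
print(np.allclose(x, x_reconstructed))  # True
```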

In summary, the statement in your book is correct; it just has to be read at the level of eigenspaces. The expansion of $x$ has one component per eigenspace, and whenever an eigenspace has dimension greater than one, that component is itself a combination of several linearly independent eigenvectors for the same eigenvalue, so such eigenvectors can certainly be used in the same linear combination. I hope this helps to clarify your understanding of the spectral decomposition theorem. Good luck with your research!
 

FAQ: Using Multiple Eigenvectors for the Same Eigenvalue

What is algebraic multiplicity?

Algebraic multiplicity refers to the number of times a particular eigenvalue appears as a root of the characteristic polynomial of a given matrix.

How is algebraic multiplicity different from geometric multiplicity?

The algebraic multiplicity of an eigenvalue is the number of times it appears as a root of the characteristic polynomial, while the geometric multiplicity is the number of linearly independent eigenvectors corresponding to that eigenvalue.

Why is algebraic multiplicity important?

Algebraic multiplicity plays a crucial role in determining the diagonalizability of a matrix and finding the Jordan canonical form.

How is algebraic multiplicity related to the trace and determinant of a matrix?

The algebraic multiplicity of an eigenvalue is the exponent with which it appears in the characteristic polynomial, which is also the number of times it appears on the diagonal of the Jordan canonical form. The trace of the matrix equals the sum of its eigenvalues counted with algebraic multiplicity, and the determinant equals their product, again counted with algebraic multiplicity.
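Concretely, if an $n\times n$ matrix $A$ has distinct eigenvalues $\lambda_{1},\dots,\lambda_{k}$ with algebraic multiplicities $m_{1},\dots,m_{k}$ (counted over $\mathbb{C}$), then factoring the characteristic polynomial gives

$$\operatorname{tr}(A)=\sum_{i=1}^{k} m_{i}\lambda_{i},\qquad \det(A)=\prod_{i=1}^{k}\lambda_{i}^{\,m_{i}},\qquad \sum_{i=1}^{k} m_{i}=n.$$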

Can the algebraic multiplicity of an eigenvalue be greater than its geometric multiplicity?

Yes, it is possible for the algebraic multiplicity to be greater than the geometric multiplicity. This occurs when the matrix is not diagonalizable and there are fewer linearly independent eigenvectors than the algebraic multiplicity of the corresponding eigenvalue.
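As a quick sketch of that situation (an illustrative example of my own, not tied to any particular textbook), the $2\times 2$ Jordan block below has eigenvalue 3 with algebraic multiplicity 2 but only a one-dimensional eigenspace:

```python
import numpy as np

# A 2x2 Jordan block: characteristic polynomial (lambda - 3)^2,
# so eigenvalue 3 has algebraic multiplicity 2.
A = np.array([[3.0, 1.0],
              [0.0, 3.0]])

# Geometric multiplicity = dim ker(A - 3I) = 2 - rank(A - 3I).
rank = np.linalg.matrix_rank(A - 3.0 * np.eye(2))
geometric_multiplicity = 2 - rank
print(geometric_multiplicity)  # 1: only one independent eigenvector,
                               # so A is not diagonalizable
```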
