Show that V has a basis of eigenvectors

In summary, a linear operator T on V has a diagonal matrix representation [T]_B with respect to a basis B if and only if B consists of eigenvectors of T; the diagonal entries of [T]_B are then the corresponding eigenvalues.
  • #1
evilpostingmong
339
0

Homework Statement


Let T: V → V be a linear operator, where dim V = n. Show that V
has a basis of eigenvectors if and only if V has a basis B such that
[T]_B is diagonal.

Homework Equations


The Attempt at a Solution


Let
T = [a_{1,1} ... a_{n,1}]
    [a_{1,n} ... a_{n,n}], with a_{i,j} ≠ 0.
Let
[T]_B = [a_{1,1}v_1 ...     0    ]
        [    0     ... a_{n,n}v_n]
Since this is diagonal, and a_{i,j} ≠ 0, then we have a basis of eigenvectors (these are meant to be column vectors) <[v_1, ..., 0], ..., [0, ..., v_n]> that, after being multiplied by T, formed the matrix [T]_B.
To show that they are eigenvectors, take [v_1, ..., 0]: multiplying by T gives a_{1,1}[v_1, ..., 0] + ... + a_{1,n}[v_1, ..., 0] = (a_{1,1} + ... + a_{1,n})[v_1, ..., 0]. Since [v_1, ..., 0] is an eigenvector, and the others follow the same logic, multiplying the basis of eigenvectors by T should produce a diagonal matrix, as shown.
 
  • #2

Suppose V has a basis [tex]B=\{b_1,\ldots,b_n\}[/tex] consisting of eigenvectors of T. Then for each j, [tex]T b_j = \lambda_j b_j[/tex]. If I express this equation in terms of the basis B, then
[tex][T]_B [b_j]_B = \lambda_j [b_j]_B[/tex]
But [tex][b_j]_B[/tex] is simply a column vector containing all zeros except for a 1 in the j'th position. Carrying out the matrix multiplication then shows that
[tex][T]_B = \Lambda[/tex]
where [tex]\Lambda[/tex] is the diagonal matrix whose diagonal entries are [tex]\{\lambda_1,\ldots,\lambda_n\}[/tex]. This gives you one direction of the proof. For the other direction, you can essentially reverse the steps above to verify that the [tex]\lambda_j[/tex]'s must be eigenvalues and the [tex]b_j[/tex]'s must be eigenvectors.
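The argument above can be sanity-checked numerically. A minimal sketch with NumPy (the matrix below is my own example, not from the thread): if the columns of P form a basis of eigenvectors of T, then [T]_B = P⁻¹TP comes out diagonal, with the eigenvalues on the diagonal.

```python
import numpy as np

T = np.array([[2.0, 1.0],
              [1.0, 2.0]])      # symmetric, so an eigenvector basis exists

eigvals, P = np.linalg.eig(T)   # columns of P are eigenvectors of T
T_B = np.linalg.inv(P) @ T @ P  # matrix of T in the eigenvector basis

print(np.round(T_B, 10))        # diagonal matrix diag(eigvals)
```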
 
  • #3
Yeah, I did the other direction by assuming that [T]_B is diagonal and then showing that
the basis consists of eigenvectors and that the diagonal entries of [T]_B are the eigenvalues, though I
think I messed up when I said a_{1,1} + ... + a_{1,n} is an eigenvalue, since I multiplied wrong.
It should be a_{1,1} for the eigenvalue of [v_1, ..., 0]. Made a mistake with the multiplication.
Oh, and when I said "then we have a basis of eigenvectors," I shouldn't have called
it a basis of eigenvectors, since that was the very thing I was proving,
and I should have called the scalar an eigenvalue. It's these small mistakes that make a big
difference.
 

FAQ: Show that V has a basis of eigenvectors

How do you define a basis of eigenvectors?

A basis of eigenvectors is a set of linearly independent eigenvectors of a linear operator that spans the vector space, so every vector in the space can be written as a linear combination of them. Each basis vector is associated with an eigenvalue, a scalar giving the factor by which the operator stretches or shrinks that eigenvector.
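As a concrete illustration of the defining property Av = λv (the matrix and vector here are my own example, not from the thread):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

v = np.array([1.0, 0.0])   # an eigenvector of A
lam = 2.0                  # its eigenvalue: A stretches v by a factor of 2

# The defining property: applying A to v only rescales v by lam.
print(np.allclose(A @ v, lam * v))  # True
```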

Why is it important to show that V has a basis of eigenvectors?

Showing that V has a basis of eigenvectors is important because it means the linear transformation is diagonalizable: with respect to that basis its matrix is diagonal, which makes it much easier to understand and analyze the transformation's behavior, since it acts on each coordinate independently.
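One numerical sketch of the payoff (the matrix is my own example): once A = PΛP⁻¹ with Λ diagonal, a power of A reduces to powers of the scalar diagonal entries.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])            # diagonalizable: eigenvalues 5 and 2

eigvals, P = np.linalg.eig(A)

# A^5 the direct way, versus via A = P diag(eigvals) P^{-1}: only the
# scalar eigenvalues need to be raised to the 5th power.
direct = np.linalg.matrix_power(A, 5)
via_diag = P @ np.diag(eigvals ** 5) @ np.linalg.inv(P)

print(np.allclose(direct, via_diag))  # True
```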

How do you prove that a vector space V has a basis of eigenvectors?

To prove that a vector space V has a basis of eigenvectors of a linear operator T: V → V, find the eigenvalues of T and their corresponding eigenvectors, then show that enough linearly independent eigenvectors exist to span V; that is, the eigenspaces together must have total dimension equal to dim V.

What are some properties of a basis of eigenvectors?

A basis of eigenvectors has a few important properties. Firstly, the eigenvectors are linearly independent, meaning that none of them can be expressed as a linear combination of the others. Secondly, the eigenvectors span the vector space, meaning that any vector in the space can be written as a linear combination of the eigenvectors. The associated eigenvalues, however, need not be distinct: a repeated eigenvalue is fine as long as its eigenspace has large enough dimension (for the identity operator, every basis is a basis of eigenvectors for the single eigenvalue 1).

Can any vector space have a basis of eigenvectors?

No, not every linear operator admits a basis of eigenvectors. For a finite-dimensional space V and a linear operator T: V → V, such a basis exists exactly when the eigenvectors of T span all of V. Having dim V distinct eigenvalues is sufficient for this, but not necessary (the identity has one repeated eigenvalue yet is diagonal in every basis); conversely, an operator such as a shear has too few linearly independent eigenvectors and is not diagonalizable.
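A small numerical illustration (both matrices are my own examples): the identity has a repeated eigenvalue yet its eigenvectors span the space, while a shear has the same repeated eigenvalue with only a one-dimensional eigenspace, so no eigenvector basis exists.

```python
import numpy as np

def has_eigenvector_basis(A, tol=1e-10):
    """Heuristic check: do the eigenvectors returned by eig span the space?
    They do exactly when the matrix P of eigenvectors is invertible."""
    _, P = np.linalg.eig(A)
    return abs(np.linalg.det(P)) > tol

identity = np.eye(2)              # eigenvalue 1 repeated, still diagonalizable
shear = np.array([[1.0, 1.0],
                  [0.0, 1.0]])    # eigenvalue 1 repeated, NOT diagonalizable

print(has_eigenvector_basis(identity))  # True
print(has_eigenvector_basis(shear))     # False
```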
