Prove Eigenvectors Linearly Independent: v & w

In summary: hit the equation c v + d w = 0 with (A - aI). The v term vanishes because Av = av, leaving d(b - a)w = 0; since b is not equal to a and w is nonzero, d must be zero. Hitting the equation with (A - bI) instead kills the w term and forces c = 0. So c = d = 0, and v and w are linearly independent.
  • #1
evilpostingmong

Homework Statement


If v and w are eigenvectors with different (nonzero) eigenvalues, prove that they are
linearly independent.

Homework Equations


The Attempt at a Solution


Define an operator A such that A is an nxn matrix, and Av=cIv with
c an eigenvalue and v an eigenvector. Define a basis
<v1...vn> in which v=vi and w=vk, 1<=k<=n and 1<=i<=n,
and let ci,i be an element of A. I is the identity matrix.

Consider I*v, an nx1 column matrix with its lone nonzero entry (1) at position i,1.
Let the value at ci,i = c. Multiplying I*v by A gives c*Iv. If I*w (another nx1 column
matrix) had its 1 at position i,1, it would
correspond with ci,i in A and we would get c*Iw. But we assume w has a different
eigenvalue. Therefore I*w must have its 1 at a different position, to correspond with
a different value in A (call it k). Since I*v must have its 1 at a row different from I*w's,
let c*I*a1*v + k*I*a2*w = 0; since the 1s are at different rows, and c, I, and k are nonzero,
a1 and a2 must be 0, so we have c*I*0*v + k*I*0*w = 0*v + 0*w = 0. Thus
a1 and a2 are trivial, so v and w are linearly independent.

I kind of have a gut feeling that this may be too wordy.
 
  • #2
What you have looks correct, but - as you said - perhaps you are overdoing it a little :)

I would just start with stating that v and w are eigenvectors:
(1a) A v = c v
(1b) A w = d w
for some numbers c and d. We know that c is not equal to d and neither is equal to 0.

Now, linear independence of v and w means that there does not exist a number k such that w = k v (since both vectors are nonzero, dependence would mean exactly that one is a scalar multiple of the other). A negative statement like that is not pleasant to work with, so a proof by contradiction suggests itself. Suppose that there does exist a k such that
(2) w = k v.

Now can you derive a contradiction?
(Note: the rest of the proof is rather straightforward, because all you have to work with are equations (1a), (1b) and (2)).
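The contradiction hinted at above can be sketched symbolically (this uses sympy and is my own check, not part of the original post). Substituting w = k v into A w = d w and using A v = c v gives k c v = d k v, i.e. k(c - d)v = 0; since v is nonzero and c differs from d, the only possibility is k = 0, which would make w the zero vector and contradict w being an eigenvector:

```python
# Symbolic sketch (sympy) of the contradiction: substitute w = k*v into
# A w = d w. Using A v = c v, the left side becomes k*c*v and the right
# side d*k*v, so the coefficient of v is k*c - d*k = k*(c - d).
import sympy as sp

c, d, k = sp.symbols('c d k')

coeff = k * c - d * k              # coefficient of v after substitution
factored = sp.factor(coeff)        # k*(c - d)

# Since c != d, the only root in k is 0: then w = 0*v = 0, contradicting
# that eigenvectors are nonzero by definition.
roots = sp.solve(sp.Eq(coeff, 0), k)
print(factored, roots)
```

So k = 0 is forced, w would be the zero vector, and the assumption w = k v collapses.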
 
  • #3
This is a special case of the last problem you posted. You don't need a basis and you don't need a matrix. If Av=av and Au=bu (a not equal to b), you want to show that if cv+du=0 then both c and d are zero. Hit that equation with (A-aI).
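Numerically, "hitting" the combination with (A - aI) looks like this (numpy; the matrix and vectors here are my own example, not from the thread). Applying (A - aI) kills the v term, since (A - aI)v = Av - av = 0, and leaves d(b - a)w:

```python
# Illustrate Dick's hint: apply (A - a*I) to c*v + d*w. The v term
# vanishes because A v = a v, leaving d*(b - a)*w. If c*v + d*w = 0,
# then d*(b - a)*w = 0, and since b != a and w != 0, d must be 0.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 5.0]])          # upper triangular: eigenvalues a=2, b=5
a, b = 2.0, 5.0
v = np.array([1.0, 0.0])            # A v = 2 v
w = np.array([1.0, 3.0])            # A w = 5 w
c, d = 4.0, -7.0                    # arbitrary coefficients

x = c * v + d * w
hit = (A - a * np.eye(2)) @ x       # should equal d*(b - a)*w exactly
assert np.allclose(hit, d * (b - a) * w)
```

The symmetric step with (A - bI) kills the w term instead and forces c = 0.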
 

FAQ: Prove Eigenvectors Linearly Independent: v & w

What are eigenvectors?

Eigenvectors are special vectors that are associated with a linear transformation. When multiplied by a specific matrix, eigenvectors remain in the same direction but may be scaled by a constant factor.
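This defining property, A v = λ v, is easy to check numerically (numpy; the example matrix here is mine, chosen only for illustration):

```python
# Verify the eigenvector property A v = lambda * v for every eigenpair
# returned by numpy. The matrix [[4,1],[2,3]] has eigenvalues 5 and 2.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are eigenvectors

for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)   # same direction, scaled by lam
```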

What does it mean for eigenvectors to be linearly independent?

When eigenvectors are linearly independent, it means that none of the eigenvectors can be expressed as a linear combination of the others. This is important because a full set of linearly independent eigenvectors forms an eigenbasis, which is what makes it possible to diagonalize the matrix.
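One practical way to test linear independence (numpy; the vectors here are my own example) is to stack the vectors as columns and compare the matrix rank to the number of vectors:

```python
# Linear independence test via matrix rank: n vectors are independent
# exactly when the matrix with those vectors as columns has rank n.
import numpy as np

v = np.array([1.0, 0.0])
w = np.array([1.0, 3.0])
M = np.column_stack([v, w])
print(np.linalg.matrix_rank(M) == 2)   # independent pair

u = 2 * v                              # a scalar multiple of v
N = np.column_stack([v, u])
print(np.linalg.matrix_rank(N) == 1)   # dependent pair: rank drops
```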

How do we prove that eigenvectors are linearly independent?

To prove that eigenvectors with distinct eigenvalues are linearly independent, we use the definition of linear independence: a set of vectors is linearly independent if the only linear combination equal to zero is the one with all coefficients zero. Suppose Av = av and Aw = bw with a not equal to b, and suppose cv + dw = 0. Applying (A - aI) to this equation kills the v term and leaves d(b - a)w = 0; since b is not a and w is nonzero, d = 0. Applying (A - bI) instead gives c = 0. So the only combination equal to zero is the trivial one, and v and w are linearly independent.
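As a randomized sanity check of this statement (numpy; my own sketch, not part of the FAQ), one can generate symmetric matrices, take eigenvectors belonging to distinct eigenvalues, and confirm the pair always has full rank:

```python
# Randomized check: eigenvectors for two distinct eigenvalues of a
# symmetric matrix are always linearly independent (rank 2 when stacked).
import numpy as np

rng = np.random.default_rng(0)
for _ in range(100):
    A = rng.normal(size=(3, 3))
    A = A + A.T                      # symmetric => real eigenvalues
    eigvals, eigvecs = np.linalg.eigh(A)
    v, w = eigvecs[:, 0], eigvecs[:, 1]
    if abs(eigvals[0] - eigvals[1]) > 1e-9:      # distinct eigenvalues
        M = np.column_stack([v, w])
        assert np.linalg.matrix_rank(M) == 2     # linearly independent
```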

Why is it important to prove that eigenvectors are linearly independent?

Proving that eigenvectors are linearly independent is important because it guarantees that eigenvectors belonging to distinct eigenvalues can serve as (part of) a basis, which underlies diagonalization and spectral methods. This is essential in many areas of science and engineering, including quantum mechanics, computer graphics, and data analysis.

Can eigenvectors be linearly dependent?

Yes, eigenvectors can be linearly dependent, but only when they share an eigenvalue: for example, v and 2v are both eigenvectors for the same eigenvalue and are clearly dependent. Eigenvectors belonging to distinct eigenvalues, by contrast, are always linearly independent, which is exactly what this thread proves. It is important to check for linear independence when working with eigenvectors to ensure accurate and meaningful results.
