Eigenvalues of an Invertible Matrix

In summary, the determinant of a square matrix is nonzero if and only if no eigenvalue is zero. If a matrix has an inverse, its determinant is nonzero. A basis for a vector space is a set of linearly independent vectors that spans the space. If a matrix is singular, it does not have an inverse.
  • #1
cookiesyum

Homework Statement



Prove that a square matrix is invertible if and only if no eigenvalue is zero.


Homework Equations





The Attempt at a Solution



If a matrix has an inverse then its determinant is not equal to 0. Eigenvalues form pivots in the matrix. If any of the pivots are zero, then the determinant will be 0?...Is this correct logic? If so, how do I write it as a formal proof?
 
  • #2
How do we find eigenvalues? t is an eigenvalue if det(A - tI) = 0, right?

Does this help?
 
  • #3
Feldoh said:
How do we find eigenvalues? t is an eigenvalue if det(A - tI) = 0, right?

Does this help?

I'm still not really sure, sorry. Is it because we set up the problem so that there are non-trivial solutions, but if one of the eigenvalues is 0 there is a trivial solution?
 
  • #4
What is det(A-tI) if t=0?
 
  • #5
gabbagabbahey said:
What is det(A-tI) if t=0?

Oh, if t=0 is an eigenvalue, then det(A - tI) = det(A) = 0, which means A is not invertible.
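The observation in this post can be sketched numerically; the 2x2 matrix below is a hypothetical example chosen so that 0 is an eigenvalue:

```python
# If 0 is an eigenvalue of A, then det(A - 0*I) = det(A) = 0, so A is singular.

def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    return a * d - b * c

A = [[2, 4], [1, 2]]  # rows are dependent, so 0 is an eigenvalue

t = 0
A_minus_tI = [[A[0][0] - t, A[0][1]],
              [A[1][0], A[1][1] - t]]
print(det2(A_minus_tI))  # 0 -> det(A) = 0, so A has no inverse
```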
 
  • #6
If 0 is an eigenvalue with eigenvector v then Av=0. Right? And A(2v)=2A(v)=0. Forget determinants. Just think about what 'invertible' means.
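Dick's argument can be sketched in a few lines; the matrix A and eigenvector v below are hypothetical examples for illustration:

```python
# If Av = 0 for a nonzero v, then A also sends 2v to 0, so A maps two
# distinct inputs to the same output and cannot be invertible.

def matvec(A, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

A = [[1, -1], [1, -1]]  # singular: eigenvalue 0 with eigenvector v = (1, 1)
v = [1, 1]

print(matvec(A, v))                   # [0, 0]
print(matvec(A, [2 * x for x in v]))  # also [0, 0] -> A is not one-to-one
```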
 
  • #7
A function is invertible if it is 1-1 and onto. Here is a sketch of a possible proof (you will have to fill in the details)

Let M be a n x n matrix with no zero eigenvalues. (M:Rn->Rn)

(1-1) Suppose for the sake of contradiction that M is not 1-1. Then there are distinct vectors x and y such that Mx = My.

... (fill in this part) ...

This is a contradiction. Therefore M is 1-1.

(onto) Let {b1, b2, ..., bn} be a basis for Rn. By the definition of a basis, the bi's are linearly independent, so a1 b1 + a2 b2 + ... + an bn =/= 0 for any nontrivial choice of scalars ai (ai's can be anything except for all a1=a2=...=an=0).

... (fill in this part) ...

Therefore {Mb1, Mb2, ..., Mbn} are a basis for Rn, so M is onto.
 
  • #8
That is WAY too complicated. Even my example was too complicated. If A(0)=0 and A(v)=0, then A is not invertible. Period.
 
  • #9
Dick said:
That is WAY too complicated. Even my example was too complicated. If A(0)=0 and A(v)=0, then A is not invertible. Period.

you also need to show that it is onto. For example, if the matrix was not square then it could be 1-1 and still noninvertible. (consider [1 0; 0 1; 0 0])
 
  • #10
And if det(A)=0 there's no eigenvalues right? Because the matrix isn't invertible. Does this mean that you can't write a characteristic polynomial of the matrix?
 
  • #11
maze said:
you also need to show that it is onto. For example, if the matrix was not square then it could be 1-1 and still noninvertible. (consider [1 0; 0 1; 0 0])

The problem said that the matrix was square.
 
  • #12
schlynn said:
And if det(A)=0 there's no eigenvalues right? Because the matrix isn't invertible. Does this mean that you can't write a characteristic polynomial of the matrix?

If det(A)=0 then there is a zero eigenvalue. That's all. The matrix is NOT invertible, and yes, it does have a characteristic polynomial. They ALL do. Why would you think not?
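To back up this reply: every square matrix has a characteristic polynomial, singular or not; for a 2x2 matrix it is t^2 - trace(A)t + det(A). A minimal sketch with a hypothetical singular matrix:

```python
# A singular matrix still has a characteristic polynomial; 0 is simply
# one of its roots.

def char_poly_2x2(A):
    """Coefficients (1, -trace, det) of det(A - t*I) for a 2x2 matrix A."""
    (a, b), (c, d) = A
    return (1, -(a + d), a * d - b * c)

A = [[1, 2], [2, 4]]  # det(A) = 0, so A is singular
coeffs = char_poly_2x2(A)
print(coeffs)  # (1, -5, 0): the polynomial t^2 - 5t + 0
# Its roots are t = 0 and t = 5: a zero eigenvalue, but the polynomial exists.
```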
 
  • #13
Dick said:
The problem said that the matrix was square.

Right exactly, but you still need to show that a 1-1 square matrix is also onto.
 
  • #14
Right you are as well. I was sort of forgetting the 'if and only if' part.
 
  • #15
While thinking of a matrix inverse as the inverse of a function, there are many things to be aware of regarding algebraic structure (matrix multiplication not being commutative, for one). Admissible, but not recommended; there are tools that can make this proof much easier. I applaud the one side where you observed that det(A) would be zero if lambda = 0. That proves that if you have a zero eigenvalue then your matrix is singular and hence does not have an inverse.

Now, for the other direction, assume (hint: think of the first proof backwards) you have a singular matrix (non-invertible, which means what about the determinant?). I think you have it from this point.
 
  • #16
kdmckale said:
While thinking of a matrix inverse as the inverse of a function, there are many things to be aware of regarding algebraic structure (matrix multiplication not being commutative, for one). Admissible, but not recommended; there are tools that can make this proof much easier. I applaud the one side where you observed that det(A) would be zero if lambda = 0. That proves that if you have a zero eigenvalue then your matrix is singular and hence does not have an inverse.

Now, for the other direction, assume (hint: think of the first proof backwards) you have a singular matrix (non-invertible, which means what about the determinant?). I think you have it from this point.
In fact, if you're clever enough you can write the whole proof as an if and only if proof instead of a two-sided proof.
 
  • #17
cookiesyum said:
I'm still not really sure, sorry. Is it because we set up the problem so that there are non-trivial solutions, but if one of the eigenvalues is 0 there is a trivial solution?
This appears to be your basic difficulty. All of the responses here, as well as the problem itself, assume you know what an "eigenvalue" is! Do you?
 
  • #18
The determinant of a square matrix equals the product of the eigenvalues of the matrix. That alone is enough to prove A is invertible if and only if there are no eigenvalues equal to zero.
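statdad's fact can be checked directly for a 2x2 matrix, where the characteristic polynomial is t^2 - trace(A)t + det(A) and its two roots multiply to det(A). The matrix below is a hypothetical example with real eigenvalues:

```python
# The product of the eigenvalues equals det(A), so det(A) != 0 exactly
# when no eigenvalue is 0.
import math

A = [[4, 1], [2, 3]]
trace = A[0][0] + A[1][1]                     # 7
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # 10

# Quadratic formula on t^2 - trace*t + det = 0 (real roots in this example).
disc = math.sqrt(trace * trace - 4 * det)
t1, t2 = (trace + disc) / 2, (trace - disc) / 2

print(t1, t2)        # 5.0 2.0 -- the eigenvalues
print(t1 * t2, det)  # 10.0 10 -- product of eigenvalues equals det(A)
```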
 
  • #19
statdad said:
The determinant of a square matrix equals the product of the eigenvalues of the matrix. That alone is enough to prove A is invertible if and only if there are no eigenvalues equal to zero.

Ahhh...an even slicker way...I should have thought of that last night when I was using that exact fact on my ODE homework...huge kudos :)
 
  • #20
HallsofIvy said:
This appears to be your basic difficulty. All of the responses here, as well as the problem itself, assume you know what an "eigenvalue" is! Do you?

I would propose the answer is "no" after reading most of the posts :) This really is a very simple problem (as statdad has shown)...there are just so many ways to show it.

I suspect the issue is lack of enrollment in a linear algebra course. As important as it is, so few people take the course. *sigh*
 

FAQ: Eigenvalues of an Invertible Matrix

What are eigenvalues of an invertible matrix?

An eigenvalue of a matrix A is a scalar λ such that Av = λv for some nonzero vector v, called a corresponding eigenvector. In other words, the matrix acts on the eigenvector purely by scaling it by λ. For an invertible matrix, every eigenvalue is nonzero.

How do I calculate eigenvalues of an invertible matrix?

Eigenvalues are calculated by finding the roots of the characteristic polynomial det(A − λI), which is obtained by subtracting λ times the identity matrix from the given matrix and taking the determinant. This process uses standard linear algebra techniques such as determinant calculation and polynomial root-finding.
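A minimal sketch of this procedure for a 2x2 matrix, where the characteristic polynomial is t^2 - trace(A)t + det(A); the helper name `eigenvalues_2x2` and the example matrix are assumptions for illustration:

```python
# Form det(A - t*I) for a 2x2 matrix and find its roots with the
# quadratic formula (cmath handles complex roots uniformly).
import cmath

def eigenvalues_2x2(A):
    """Roots of det(A - t*I) = t^2 - trace*t + det for a 2x2 matrix A."""
    (a, b), (c, d) = A
    trace, det = a + d, a * d - b * c
    disc = cmath.sqrt(trace * trace - 4 * det)
    return (trace + disc) / 2, (trace - disc) / 2

t1, t2 = eigenvalues_2x2([[2, 1], [1, 2]])
print(t1, t2)  # (3+0j) (1+0j)
```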

What is the significance of eigenvalues in an invertible matrix?

The eigenvalues of an invertible matrix are important because they provide valuable information about the matrix, such as its determinant, trace, and diagonalizability. They also play a crucial role in solving systems of linear equations and analyzing the behavior of dynamical systems.

Can an invertible matrix have complex eigenvalues?

Yes, an invertible matrix can have complex eigenvalues, because the characteristic polynomial can have complex roots. This happens even for matrices with all real entries; for example, a rotation matrix in the plane has complex eigenvalues.
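As a minimal sketch: the 90-degree rotation matrix has all real entries, yet its characteristic polynomial t^2 + 1 has the complex roots ±i:

```python
# A real, invertible matrix with complex eigenvalues: the 90-degree
# rotation [[0, -1], [1, 0]] has characteristic polynomial t^2 + 1.
import cmath

A = [[0, -1], [1, 0]]
trace = A[0][0] + A[1][1]                     # 0
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # 1 (nonzero, so A is invertible)

disc = cmath.sqrt(trace * trace - 4 * det)    # sqrt(-4) = 2j
t1, t2 = (trace + disc) / 2, (trace - disc) / 2

print(t1, t2)  # the eigenvalues are +i and -i
```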

What is the relationship between eigenvalues and eigenvectors in an invertible matrix?

The eigenvalues and eigenvectors of a matrix are closely related. Each eigenvalue has at least one corresponding eigenvector, and the set of all eigenvectors for a given eigenvalue, together with the zero vector, forms a subspace of the original vector space. Eigenvectors associated with distinct eigenvalues are linearly independent, and the number of distinct eigenvalues is at most the dimension of the matrix.
