Must every linear operator have eigenvalues? If so, why?

In summary, the thread discusses the existence of eigenvalues for linear operators on finite-dimensional vector spaces over the complex numbers. A pigeonhole-style argument is proposed to show that at least one vector in the space must be parallel to its image, which would give at least one eigenvalue. Complex eigenvalues of real operators and infinite-dimensional vector spaces are briefly mentioned. The discussion then turns to the relationship between eigenvalues and eigenvectors, and to using the Fundamental Theorem of Algebra to prove that eigenvalues exist.
  • #1
ygolo
It seems to me that the Schur decomposition (http://en.wikipedia.org/wiki/Schur_decomposition) relies on the fact that every linear operator must have at least one eigenvalue... but how do we know this is true?

I have yet to find a linear operator without eigenvalues, so I believe every linear operator does have at least one eigenvalue.

Still how does one prove it?

Since we are looking for solutions to (A - aI)|V> = |0>, wouldn't it be possible that A - aI is always nonsingular, so that the equation has only trivial solutions?
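
As a quick numerical sanity check (an illustration only, not a proof, and assuming NumPy is available): ask for the eigenvalues of a random complex matrix and verify that A - aI really is singular at each of them.

[code]
import numpy as np

rng = np.random.default_rng(0)
n = 4
# A random complex matrix; any example would do here
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

for a in np.linalg.eigvals(A):
    # det(A - aI) should vanish up to floating-point error,
    # i.e. A - aI is singular and (A - aI)x = 0 has nontrivial solutions
    print(a, abs(np.linalg.det(A - a * np.eye(n))))
[/code]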
 
  • #2
On a finite-dimensional vector space V over the complex numbers, it should be obvious that any linear operator must have eigenvalues, although some or all of those eigenvalues might be zero. Since the operator is a map from V to itself (or a subset), one can use the pigeonhole principle to show that at least one vector in V must be parallel to its image.

Over the real numbers, some operators do not have eigenvalues (e.g. rotation matrices in R^2), because the eigenvalues happen to be complex. But I don't know if this really counts.
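
Concretely (an illustrative check, assuming NumPy):

[code]
import numpy as np

# Rotation by 90 degrees in R^2: no real eigenvalues
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# NumPy computes eigenvalues over the complex numbers and finds +i and -i
print(np.linalg.eigvals(R))  # approximately [1j, -1j]
[/code]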

On an infinite-dimensional vector space, there may be other subtleties that allow an exception. For example, in quantum mechanics, the momentum operator is linear, but its eigenstates are not normalizable, and hence not technically part of the Hilbert space. That is, the eigenvectors are not actually members of the vector space, and so it might be reasonable to say that the corresponding eigenvalues do not actually exist.
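
For instance, in the position representation (my gloss on the example above), the would-be eigenfunctions of the momentum operator are plane waves, which are not square-integrable:

[tex]\hat{p} = -i\hbar \frac{d}{dx}, \qquad \hat{p}\, e^{ipx/\hbar} = p\, e^{ipx/\hbar}, \qquad \int_{-\infty}^{\infty} \left| e^{ipx/\hbar} \right|^2 dx = \int_{-\infty}^{\infty} 1\, dx = \infty.[/tex]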
 
  • #3
Ben Niehoff said:
On a finite-dimensional vector space V over the complex numbers, it should be obvious that any linear operator must have eigenvalues, although some or all of those eigenvalues might be zero. Since the operator is a map from V to itself (or a subset), one can use the pigeonhole principle to show that at least one vector in V must be parallel to its image.

Point for Clarification: How does one use the pigeonhole principle in this case? It seems like you could be mapping an uncountably infinite number of vectors (though from a space of finite dimension).

What are the pigeonholes? (the vector directions? I don't really trust intuitions based on the arrow representations of vectors)

...and how do you assign the images of the operation to them?

---------

I just thought of something. If we can prove that a complex polynomial must have complex roots, then we can apply it to the characteristic polynomial.

...and this is, I believe (for non-constant polynomials), the Fundamental Theorem of Algebra.

http://en.wikipedia.org/wiki/Fundamental_theorem_of_algebra

Because of the diagonal entries of A - aI, I believe the determinant |A - aI| will have to be a non-constant polynomial in a.
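
For example (spelling this out in the 2x2 case), the determinant is visibly a degree-2 polynomial in a:

[tex]\det(A - aI) = \begin{vmatrix} A_{11} - a & A_{12} \\ A_{21} & A_{22} - a \end{vmatrix} = a^2 - (A_{11} + A_{22})\,a + \det A.[/tex]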
 
  • #4
So then the next question becomes:

For every Eigenvalue must there be a non-trivial eigenvector? If so, why?
 
  • #5
ygolo said:
So then the next question becomes:

For every Eigenvalue must there be a non-trivial eigenvector? If so, why?


I think you cannot have one without the other. It is in the definition.
 
  • #6
I wonder why the science advisers and PF mentors have so far not replied to this thread. Too basic?

I am actually not sure I answered my first question fully, since I did not rigorously prove that the characteristic polynomial is non-constant.

Reb said:
I think you cannot have one without the other. It is in the definition.

Is it? I think it's slightly more subtle. In Ax = ax, x could be trivial, that is |0>, even if there are non-trivial a such that |A - aI| = 0.

That is, the only solution to (A - aI)x = |0> could be x = |0>, no?
 
  • #7
No, part of the definition is that Ax = ax holds for some nontrivial x.

The characteristic polynomial cannot be constant. This is easy to prove by expanding the determinant |A-aI| by minors. Specifically, if A is an NxN matrix, then what is the coefficient of a^N?
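
(Spelling out that hint: the only degree-N term in the expansion comes from the product of the diagonal entries, so

[tex]\det(A - aI) = (-1)^N a^N + (\text{terms of lower degree in } a),[/tex]

which is non-constant for every N >= 1.)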

ygolo said:
I wonder why the science advisers and PF mentors have so far not replied to this thread. Too basic?

What are we, chopped liver? And as far as I know, there is nobody being paid to sit at their desk and read everything that happens on PF so that they can help everybody who asks.
 
  • #8
ygolo said:
Point for Clarification: How does one use the pigeonhole principle in this case? It seems like you could be mapping an uncountably infinite number of vectors (though from a space of finite dimension).

What are the pigeonholes? (the vector directions? I don't really trust intuitions based on the arrow representations of vectors)

http://golem.ph.utexas.edu/category/2007/05/linear_algebra_done_right.html

The pigeonholes are the dimensions of the space. We've only got n dimensions. If we apply a transformation T over and over again to a point, we'll produce a linearly dependent set of vectors. I only have a rough idea of how the rest of the argument works, but I got that much of it. The article explains the rest.
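
A compressed version of the argument in that link (a sketch only): for any nonzero vector v, the n+1 vectors v, Tv, T^2v, ..., T^nv in an n-dimensional space must be linearly dependent, so p(T)v = 0 for some non-constant polynomial p. Factoring p over [itex]\mathbb{C}[/itex] gives

[tex]0 = p(T)\,v = c\,(T - \lambda_1 I)(T - \lambda_2 I)\cdots(T - \lambda_m I)\,v,[/tex]

so at least one factor [itex]T - \lambda_j I[/itex] must send a nonzero vector to zero, which makes [itex]\lambda_j[/itex] an eigenvalue. No determinants needed.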
 
  • #9
Ben Niehoff said:
No, part of the definition is that Ax = ax holds for some nontrivial x.

Well, then the proof by the fundamental theorem of algebra falls short of proving the existence of an eigenvalue, doesn't it?

Ben Niehoff said:
The characteristic polynomial cannot be constant. This is easy to prove by expanding the determinant |A-aI| by minors. Specifically, if A is an NxN matrix, then what is the coefficient of a^N?

Good point!

Ben Niehoff said:
What are we, chopped liver? And as far as I know, there is nobody being paid to sit at their desk and read everything that happens on PF so that they can help everybody who asks.

I was just curious, but I appreciate yours and the others' help.

Tac-Tics said:
http://golem.ph.utexas.edu/category/2007/05/linear_algebra_done_right.html

The pigeonholes are the dimensions of the space. We've only got n dimensions. If we apply a transformation T over and over again to a point, we'll produce a linearly dependent set of vectors. I only have a rough idea of how the rest of the argument works, but I got that much of it. The article explains the rest.

I'll try and decipher this after work.
 
  • #10
ygolo said:
Well, then the proof by the fundamental theorem of algebra falls short of proving the existence of an eigenvalue, doesn't it?

Are we talking in general, or just about numerical matrices over the reals?
 
  • #11
John Creighto said:
Are we talking in general, or just about numerical matrices over the reals?

General complex linear operators.

---
Silly me. If |A - aI| = 0 then A - aI is singular, and therefore not of full rank, and therefore has a non-trivial null space, which means A has an eigenvector.

Forgive me, it has been 13 years since I took linear algebra, and 12 years since Complex Analysis.
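
That observation can be checked numerically (an illustration, assuming NumPy; the matrix is a hypothetical example): an eigenvector is exactly a null vector of A - aI, which can be read off from the SVD.

[code]
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])  # example matrix

a = np.linalg.eigvals(A)[0]   # an eigenvalue of A
M = A - a * np.eye(2)         # singular, so it has a nontrivial null space

# The null space of M is spanned by the right-singular vectors
# whose singular values are (numerically) zero.
_, s, Vh = np.linalg.svd(M)
x = Vh[-1]                    # null vector of M, i.e. an eigenvector of A
print(np.allclose(A @ x, a * x))  # -> True
[/code]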
 
  • #12
By the way, the question of whether every linear operator (on a finite-dimensional F-vector space) has an eigenvalue is actually equivalent to the question of whether every polynomial with coefficients in F has a root in F.

I.e. if you can prove what the OP is asking for (over [itex]\mathbb{C}[/itex]) without using the fundamental theorem of algebra, then you can actually use your argument to prove the FTA!
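
The standard construction behind that equivalence (sketched here): given a monic polynomial [itex]p(x) = x^n + c_{n-1}x^{n-1} + \cdots + c_0[/itex] over F, its companion matrix

[tex]C_p = \begin{pmatrix} 0 & 0 & \cdots & 0 & -c_0 \\ 1 & 0 & \cdots & 0 & -c_1 \\ 0 & 1 & \cdots & 0 & -c_2 \\ \vdots & & \ddots & & \vdots \\ 0 & 0 & \cdots & 1 & -c_{n-1} \end{pmatrix}[/tex]

satisfies [itex]\det(xI - C_p) = p(x)[/itex], so an eigenvalue of [itex]C_p[/itex] in F is precisely a root of p in F.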
 
  • #13
ygolo said:
It seems to me that the Schur decomposition (http://en.wikipedia.org/wiki/Schur_decomposition) relies on the fact that every linear operator must have at least one eigenvalue... but how do we know this is true?

I have yet to find a linear operator without eigenvalues, so I believe every linear operator does have at least one eigenvalue.

Still how does one prove it?

Since we are looking for solutions to (A - aI)|V> = |0>, wouldn't it be possible that A - aI is always nonsingular, so that the equation has only trivial solutions?

For a finite-dimensional vector space there is a classic theorem. It shows that your question boils down to whether a certain polynomial has roots in the underlying scalar field of the vector space. If the field is algebraically closed (e.g. the complex numbers), then there is always a root.

If the field is not algebraically closed, then there may not be eigenvalues. For instance, rotation of the plane by 90 degrees has no real eigenvalues.
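
Concretely, for that rotation the characteristic polynomial is

[tex]\det(R - aI) = \begin{vmatrix} -a & -1 \\ 1 & -a \end{vmatrix} = a^2 + 1,[/tex]

which has no real roots (its roots are [itex]\pm i[/itex]).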
 
  • #14
I started to think about this problem with the following quasi-informal statement

"Every linear operator can be represented as a matrix..."

which is, for my taste, a very beautiful theorem. By the way, I originally misstated it; the fields and the spaces are left out of the statement for a good reason, to focus on just the necessary part.
 
  • #15
Every linear operator on a finite-dimensional vector space V can be represented by a matrix, given a specific basis for that vector space. (The matrix depends upon the choice of basis.) Given that, you can write the "characteristic equation" for the matrix as a polynomial equation and then use the fundamental theorem of algebra: every such equation over the complex numbers has a solution.
 
  • #16
ygolo said:
It seems to me that the Schur decomposition (http://en.wikipedia.org/wiki/Schur_decomposition) relies on the fact that every linear operator must have at least one eigenvalue... but how do we know this is true?

I have yet to find a linear operator without eigenvalues, so I believe every linear operator does have at least one eigenvalue.

Still how does one prove it?

Since we are looking for solutions to (A - aI)|V> = |0>, wouldn't it be possible that A - aI is always nonsingular, so that the equation has only trivial solutions?

Rotation by 90 degrees in the plane has no real eigenvalues, only complex ones.
Why do you think that every linear operator has at least one eigenvalue? Can you show me your argument?
 
  • #17
In my opinion, the most revealing way to see this theorem for finite-dimensional vector spaces is the following classical argument.

If L: V -> V is a linear map on the n-dimensional vector space V, then the n+1 vectors L^n(v), L^(n-1)(v), ..., L(v), v are linearly dependent, so some nontrivial linear combination of them is zero. This linear combination says that some polynomial in L sends v to zero. Consider the ideal generated by all such polynomials.

Since the ring of polynomials over a field is a principal ideal domain, there is a polynomial that divides all of the others in this ideal. This polynomial must be zero on the entire vector space, since the above construction, done with the vector v, can be performed on any vector.

If this polynomial has a root a over F, then it factors as (L - aI)P(L). The linear factor L - aI must be zero on some non-zero vector, so a is an eigenvalue and that vector is an eigenvector.
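
In symbols (my gloss on that last step): writing the polynomial as

[tex]p(x) = (x - a)\,P(x) \;\Longrightarrow\; (L - aI)\bigl(P(L)\,w\bigr) = p(L)\,w = 0 \quad \text{for every } w \in V,[/tex]

and since P has lower degree than p (a polynomial of minimal degree killing all of V), the operator P(L) is not identically zero, so P(L)w is a non-zero vector annihilated by L - aI for some w.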
 

FAQ: Must every linear operator have eigenvalues? If so, why?

1. Does every linear operator have eigenvalues?

Yes, on a finite-dimensional vector space over the complex numbers (or any algebraically closed field), every linear operator has at least one eigenvalue. Over the real numbers, or on infinite-dimensional spaces, this can fail, as discussed above.

2. Why is it important for a linear operator to have eigenvalues?

Eigenvalues provide important information about the behavior and properties of a linear operator. They can help determine the stability, invertibility, and diagonalizability of the operator.

3. What is the significance of eigenvalues in linear algebra?

Eigenvalues play a crucial role in linear algebra as they are used to solve systems of linear equations, understand the dynamics of linear transformations, and determine the fundamental properties of linear operators.

4. Can a linear operator have more than one eigenvalue?

Yes, a linear operator can have multiple eigenvalues. An NxN matrix has at most N distinct eigenvalues, and most operators have several; some, such as scalar multiples of the identity or a single Jordan block, have only one.

5. Are eigenvalues unique to each linear operator?

Eigenvalues are determined by the operator itself, not by any particular matrix representation: similar matrices (representations of the same operator in different bases) have the same eigenvalues. Different operators can, however, share eigenvalues.
