How Does a Polynomial Transformation Affect Eigenvectors and Eigenvalues?

  • Thread starter ptolema
  • #1
ptolema

Homework Statement



Let T be a linear operator (T: V-->V) on a vector space V over the field F, and let g(t) be a polynomial with coefficients from F. Prove that if x is an eigenvector of T with corresponding eigenvalue λ, then g(T)(x) = g(λ)x. That is, x is an eigenvector of g(T) with corresponding eigenvalue g(λ).

Homework Equations



T(x)=λx

The Attempt at a Solution


I tried substituting λx directly into g, but that gave me an answer that I couldn't easily factor x out of to get g(T)(x) = g(λ)x. I don't know if there is a theorem regarding this, but I've hit a wall. Any suggestions?
 
  • #2
I don't see why the direct substitution you describe wouldn't work.

Write g(t) as [itex]g(t)= a_nt^n+ a_{n-1}t^{n-1}+ \cdots+ a_1t+ a_0[/itex]. Then, for the vector x, [itex]g(T)x= a_nT^nx+ a_{n-1}T^{n-1}x+ \cdots+ a_1Tx+ a_0x[/itex].

The crucial point is that, because x is an eigenvector of T with eigenvalue [itex]\lambda[/itex], [itex]Tx= \lambda x[/itex], [itex]T^2x= T(Tx)= T(\lambda x)= \lambda T(x)= \lambda(\lambda x)= \lambda^2 x[/itex], and so on up to [itex]T^nx= \lambda^n x[/itex]. To prove that rigorously, use induction on n.
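The induction above can be checked numerically. Below is a minimal sketch with an illustrative 2x2 matrix standing in for T (the matrix, eigenpair, and helper name are my choices, not from the thread), verifying T^n x = λ^n x for several n:

```python
# Sketch: verify T^n x = lambda^n x for a concrete 2x2 matrix.
# The matrix T, eigenvalue lam, and eigenvector x below are illustrative.

def mat_vec(A, v):
    """Multiply a 2x2 matrix A by a 2-vector v."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

T = [[2.0, 1.0],
     [0.0, 3.0]]        # upper triangular: eigenvalues 2 and 3
lam = 2.0
x = [1.0, 0.0]          # T x = [2, 0] = 2 x, so x is an eigenvector for lam = 2

v = x
for n in range(1, 6):
    v = mat_vec(T, v)                       # v is now T^n x
    expected = [lam**n * xi for xi in x]    # lambda^n x
    assert v == expected, (n, v, expected)
print("T^n x = lam^n x holds for n = 1..5")
```

Powers of 2 are exact in floating point, so the equality comparison here is safe; with a less convenient eigenvalue one would compare within a tolerance instead.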
 
  • #3
No, the previous answer was completely incorrect. First, (T(x))^2 is the quantity T(x) squared, which is not, in general, equal to T(T(x)), which is the quantity T(x) evaluated in T. Unless T is the squaring function, (T(x))^2 doesn't equal T^2(x).

Unfortunately, as you have it written, there is not a proof for such an equality, as it does not hold. consider g=x^2+1. Then g(T(x))=(T(x))^2+1= a^2x^2+1, which is obviously not equal to (a^2+1)x. (Where a is the eigenvalue to x). Even if we consider the notion given in the last post, the +1 term throws off the calculation, because g(a)x=xa^2+x, and g(T(x))=xa^2+1.
 
  • #4
Nah, HallsofIvy has it right. Let A be a matrix representation of T with respect to some basis. Then

g(A) = a_n A^n + ... + a_1 A + a_0 I.

It's the same sort of operator/matrix/polynomial idea as in the Cayley-Hamilton theorem. (EDIT: I am NOT saying that theorem is relevant here, it's just the same sort of "put matrix into a polynomial" idea.) Then if x and c are an eigenvector and corresponding eigenvalue of A, we are to prove

g(A)x = g(c) x.
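In matrix form this is easy to check directly. Here is a hedged sketch for g(t) = t^2 + t + 1 with an illustrative 2x2 matrix A (the polynomial, matrix, and eigenpair are my choices for the example): g(A) is built from matrix products and sums, and applying it to the eigenvector x gives g(c)x.

```python
# Sketch: check g(A) x = g(c) x for g(t) = t^2 + t + 1 and a concrete
# 2x2 matrix A with eigenpair (c, x). All names here are illustrative.

def mat_mul(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_vec(A, v):
    """Product of a 2x2 matrix and a 2-vector."""
    return [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]

I = [[1.0, 0.0], [0.0, 1.0]]
A = [[4.0, 1.0], [0.0, 2.0]]
c = 4.0
x = [1.0, 0.0]          # A x = [4, 0] = 4 x

# g(A) = A^2 + A + I: the matrix polynomial, with a_0 multiplying I
A2 = mat_mul(A, A)
gA = [[A2[i][j] + A[i][j] + I[i][j] for j in range(2)] for i in range(2)]

lhs = mat_vec(gA, x)                      # g(A) x
rhs = [(c**2 + c + 1) * xi for xi in x]   # g(c) x = 21 x
assert lhs == rhs
print(lhs)  # [21.0, 0.0]
```

Note the constant term a_0 becomes a_0·I in g(A), which is exactly why the "+1 term" objection in post #3 fails: applied to x it contributes a_0·x, not the scalar a_0.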
 
  • #5
halez12 said:
No, the previous answer was completely incorrect. First, (T(x))^2 is the quantity T(x) squared
No, it's not. We are working in a general vector space and multiplication (and so squaring) is not even defined. The only thing "[itex]T^2(x)[/itex]" (I never wrote "[itex](T(x))^2[/itex]") could mean is T(T(x)).

 

FAQ: How Does a Polynomial Transformation Affect Eigenvectors and Eigenvalues?

What is an eigenvector?

An eigenvector is a vector that, when multiplied by a matrix, results in a scalar multiple of itself. In other words, the direction of the eigenvector does not change, but its magnitude may be scaled.

What is an eigenvalue?

An eigenvalue is the scalar factor by which an eigenvector is scaled when multiplied by a matrix. It represents the amount of stretching or compression that occurs in the direction of the eigenvector when the matrix is applied.

Why is the concept of eigenvectors and eigenvalues important?

Eigenvectors and eigenvalues are important in many fields of science and mathematics, including physics, engineering, and data analysis. They allow us to analyze linear transformations and identify important patterns and structures in data.

How do you prove that a vector is an eigenvector of a given matrix?

To prove that a vector is an eigenvector of a matrix, we need to show that when the vector is multiplied by the matrix, the result is a scalar multiple of the original vector. This can be done by solving the equation Av = λv, where A is the matrix, v is the eigenvector, and λ is the eigenvalue.
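As a minimal sketch of that check (the matrix and eigenpairs below are illustrative; for a 2x2 matrix the eigenvalues come from the characteristic equation det(A − λI) = 0):

```python
# Verify eigenpairs by the definition A v = lam v for a concrete 2x2 matrix.
A = [[2.0, 1.0],
     [1.0, 2.0]]

# det(A - lam*I) = (2 - lam)^2 - 1 = 0  ->  lam = 1 or lam = 3,
# with eigenvectors [1, -1] and [1, 1] respectively.
for lam, v in [(1.0, [1.0, -1.0]), (3.0, [1.0, 1.0])]:
    Av = [A[0][0] * v[0] + A[0][1] * v[1],
          A[1][0] * v[0] + A[1][1] * v[1]]
    assert Av == [lam * vi for vi in v]
print("both eigenpairs satisfy A v = lam v")
```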

What is the significance of the eigenvector/eigenvalue proof?

The eigenvector/eigenvalue proof is significant because it allows us to confidently identify and use eigenvectors and eigenvalues in various applications. It provides a rigorous mathematical foundation for understanding these concepts and their properties.
