Are the eigenvectors of A and A^T related?

In summary, the eigenvalues of A and A^T are always the same, because the two matrices have identical characteristic polynomials. Knowing the largest eigenvalue of A and its associated (right) eigenvector, however, tells you nothing in general about the eigenvector of A^T for that same eigenvalue, i.e. the left eigenvector of A, as the example in the thread shows.
  • #1
Leo321
I have an (unknown) matrix A with real non-negative entries. I know its largest eigenvalue [tex]\lambda[/tex] and the associated eigenvector, v. (I know nothing about the other eigenvectors.) Does this give me any information about the eigenvector of A^T associated with [tex]\lambda[/tex], or is it completely independent of v?
 
  • #2
The eigenvalues of A and A^T are always the same. The characteristic polynomial of A is det(A - λI), and since a matrix and its transpose have the same determinant, det(A^T - λI) = det((A - λI)^T) = det(A - λI). So the characteristic polynomials of A and A^T are identical and therefore have the same set of roots.

However, you can't say anything about how the eigenvectors of A relate to those of A^T (the left eigenvectors of A). For example, let A =

2 1
3 0

Eigenvalues are 3 and -1
Corresponding eigenvectors are (1 1) and (1 -3)

A^T =
2 3
1 0

Eigenvalues are again 3 and -1
Corresponding eigenvectors are (3 1) and (1 -1)
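
For anyone who wants to verify this numerically, here is a minimal sketch using NumPy (not part of the original post; it just reproduces the 2x2 example above):

[code]
# Sketch assuming NumPy is available; reproduces the 2x2 example from post #2.
import numpy as np

A = np.array([[2.0, 1.0],
              [3.0, 0.0]])

# Right eigenpairs of A: each column V[:, i] satisfies A @ V[:, i] = w[i] * V[:, i]
w, V = np.linalg.eig(A)

# Eigenpairs of A^T (its right eigenvectors are the left eigenvectors of A)
wT, W = np.linalg.eig(A.T)

print("eigenvalues of A  :", np.sort(w))    # [-1.  3.]  -- same for both
print("eigenvalues of A^T:", np.sort(wT))   # [-1.  3.]

i, j = np.argmax(w), np.argmax(wT)          # index of the eigenvalue 3 in each case
print("eigenvector of A   for 3:", V[:, i])   # proportional to (1, 1)
print("eigenvector of A^T for 3:", W[:, j])   # proportional to (3, 1)
[/code]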
 

FAQ: Are the eigenvectors of A and A^T related?

What are right and left eigenvectors?

A right eigenvector of a square matrix A is a non-zero vector v satisfying Av = λv for some scalar λ (the eigenvalue). A left eigenvector is a non-zero vector w satisfying w^T A = λ w^T, which is the same as saying that w is a right eigenvector of A^T for the same eigenvalue λ.

How are right and left eigenvectors different?

The difference is the side on which the matrix acts: a right eigenvector is multiplied on the right (Av = λv), while a left eigenvector is multiplied on the left (w^T A = λ w^T). A and A^T share the same eigenvalues, but for a given eigenvalue the left and right eigenvectors are in general different vectors; they coincide (up to scaling) when A is symmetric.

Why are right and left eigenvectors important?

Right and left eigenvectors are important because they describe how a matrix acts on vectors. They are used, for example, in diagonalizing matrices, in solving systems of linear differential equations, and in analysing the long-term behaviour of linear dynamical systems.

How do you calculate right and left eigenvectors?

First find the eigenvalues λ of the matrix, e.g. as the roots of the characteristic polynomial det(A - λI) = 0. For each eigenvalue, a right eigenvector is a non-zero solution of (A - λI)x = 0, and a left eigenvector is a non-zero solution of (A^T - λI)y = 0 (equivalently, y^T A = λ y^T). Note that A - λI is singular by construction, so these are null-space computations, not matrix inversions.
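
As an illustration (not part of the original FAQ), both kinds of eigenvectors can be obtained numerically in one call; the sketch below assumes SciPy is installed and reuses the matrix from the thread:

[code]
# Sketch assuming SciPy; scipy.linalg.eig can return left and right eigenvectors together.
import numpy as np
from scipy.linalg import eig

A = np.array([[2.0, 1.0],
              [3.0, 0.0]])

w, vl, vr = eig(A, left=True, right=True)

for i, lam in enumerate(w):
    x = vr[:, i]          # right eigenvector:  A x = lam x
    y = vl[:, i]          # left eigenvector:   y^H A = lam y^H
    print("lambda =", np.round(lam.real, 6))
    print("  residual of A x - lam x    :", np.round(A @ x - lam * x, 10))
    print("  residual of y^H A - lam y^H:", np.round(y.conj() @ A - lam * y.conj(), 10))
[/code]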

Can a matrix have different sets of right and left eigenvectors?

Yes. As the example above shows, the left and right eigenvectors belonging to the same eigenvalue are in general different vectors; they agree (up to non-zero scaling) when the matrix is symmetric, since then A = A^T. The eigenvalues themselves are always the same for A and A^T. A small numerical check is sketched below.
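
Here is that sketch (assuming NumPy; the symmetric matrix S is just an illustrative choice, N is the non-symmetric example from post #2):

[code]
# Sketch assuming NumPy: compare left and right eigenvectors for a symmetric
# matrix S (they coincide) and for the non-symmetric example N from the thread.
import numpy as np

def eigvecs_by_eigenvalue(M):
    """Eigenvectors of M as columns, ordered by descending eigenvalue."""
    w, V = np.linalg.eig(M)
    order = np.argsort(w)[::-1]
    return V[:, order]

S = np.array([[2.0, 1.0],
              [1.0, 0.0]])   # symmetric: S == S.T
N = np.array([[2.0, 1.0],
              [3.0, 0.0]])   # non-symmetric example from post #2

for name, M in [("symmetric S", S), ("non-symmetric N", N)]:
    right = eigvecs_by_eigenvalue(M)     # right eigenvectors of M
    left = eigvecs_by_eigenvalue(M.T)    # right eigenvectors of M^T = left eigenvectors of M
    print(name)
    print("  right eigenvectors (columns):\n", np.round(right, 3))
    print("  left  eigenvectors (columns):\n", np.round(left, 3))
[/code]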
