Finding Eigenvectors for a Matrix A

In summary, eigenvectors are vectors that keep their direction and change only in magnitude when a linear transformation is applied to them. They are important for analyzing systems and making predictions, and can be found by solving the eigenvalue equation using methods such as Gaussian elimination or power iteration. A single eigenvalue can have more than one linearly independent eigenvector, and a candidate eigenvector can be checked by plugging it back into the eigenvalue equation Av = λv.
  • #1
bluewhistled

Homework Statement


a matrix A:
[1 3 0
3 1 0
0 0 -2]

Find Q and D where

Q^T A Q = D

The Attempt at a Solution



I found the eigenvalues of -4,2,2

When I plug them back in and rref the matrix, I only get the trivial solution, meaning the columns are linearly independent. How do I get the eigenvectors if that's the case?
 
  • #2


Check your eigenvalues.
 
  • #3


det(λI - A) =

| λ-1   -3    0  |
| -3   λ-1    0  |
|  0     0   λ+2 |

= (λ+2)((λ-1)^2 - 9)
= (λ+2)(λ^2 - 2λ + 1 - 9)
= (λ+2)(λ^2 - 2λ - 8)
= λ^3 - 2λ^2 - 8λ + 2λ^2 - 4λ - 16
= λ^3 - 12λ - 16
...
meh, got it nevermind. Thanks for the suggestion.
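
For reference, a quick numerical check of the eigenvalues and of Q^T A Q = D (a minimal NumPy sketch, not part of the original thread; the eigh call and variable names are my own choice):

Code:
import numpy as np

# The matrix from the problem statement.
A = np.array([[1.0, 3.0, 0.0],
              [3.0, 1.0, 0.0],
              [0.0, 0.0, -2.0]])

# A is symmetric, so eigh returns real eigenvalues (in ascending order)
# and an orthonormal set of eigenvectors as the columns of Q.
eigenvalues, Q = np.linalg.eigh(A)
print(eigenvalues)                    # [-2. -2.  4.]

# D is the diagonal matrix of eigenvalues; Q^T A Q reproduces it.
D = np.diag(eigenvalues)
print(np.allclose(Q.T @ A @ Q, D))    # True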
 

FAQ: Finding Eigenvectors for a Matrix A

What is an eigenvector?

An eigenvector is a vector that does not change direction when a linear transformation is applied to it; it only changes in magnitude (or length) by a scalar factor, known as the eigenvalue. For example, for the matrix A in this thread, A(1, 1, 0)^T = (4, 4, 0)^T = 4(1, 1, 0)^T, so (1, 1, 0)^T is an eigenvector with eigenvalue 4.

Why are eigenvectors important?

Eigenvectors are important because they reveal the underlying structure and behavior of a system. They are widely used in many fields, including physics, engineering, and data analysis, to identify patterns and make predictions.

How do I find the eigenvectors?

To find the eigenvectors of a matrix A, first find the eigenvalues λ by solving the characteristic equation det(A - λI) = 0, and then, for each eigenvalue, solve (A - λI)v = 0 for the nonzero vectors v. This can be done by hand with Gaussian elimination, or numerically with methods such as power iteration; a sketch of the latter follows.
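
As an illustration of power iteration (a minimal Python/NumPy sketch; the function name and step count are my own, and it is applied here to the matrix from this thread), the method repeatedly multiplies a vector by A and normalizes, converging to an eigenvector of the eigenvalue with the largest magnitude:

Code:
import numpy as np

def power_iteration(A, num_steps=100):
    # Approximates an eigenvector for the eigenvalue of largest magnitude.
    v = np.random.default_rng(0).standard_normal(A.shape[0])
    for _ in range(num_steps):
        v = A @ v
        v = v / np.linalg.norm(v)   # renormalize so the iterate stays bounded
    eigenvalue = v @ (A @ v)        # Rayleigh quotient estimate of lambda
    return eigenvalue, v

A = np.array([[1.0, 3.0, 0.0],
              [3.0, 1.0, 0.0],
              [0.0, 0.0, -2.0]])
lam, v = power_iteration(A)
print(lam)   # approximately 4
print(v)     # approximately +/-[0.707, 0.707, 0]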

Can there be multiple eigenvectors for one eigenvalue?

Yes. Any nonzero scalar multiple of an eigenvector is again an eigenvector for the same eigenvalue, so every eigenvalue has infinitely many eigenvectors. An eigenvalue can also have several linearly independent eigenvectors: its eigenspace can have dimension greater than one, as with the repeated eigenvalue in this thread; see the check below.
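
For example, in the matrix from this thread the repeated eigenvalue -2 has a two-dimensional eigenspace. A quick check (a sketch that assumes SciPy is available):

Code:
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 3.0, 0.0],
              [3.0, 1.0, 0.0],
              [0.0, 0.0, -2.0]])

# Eigenvectors for eigenvalue -2 span the null space of A - (-2)I.
basis = null_space(A - (-2.0) * np.eye(3))
print(basis.shape[1])   # 2: the eigenspace of -2 is two-dimensional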

How do I know if my eigenvectors are correct?

You can check whether your eigenvectors are correct by plugging them back into the eigenvalue equation: if Av = λv holds (with v nonzero), the pair is correct. For a symmetric matrix such as the one in this thread, you can additionally check that eigenvectors belonging to distinct eigenvalues are orthogonal, i.e. their dot product is zero.
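
A minimal numerical check along these lines, using the matrix from this thread and its λ = 4 eigenpair (a sketch; any other candidate pair could be substituted):

Code:
import numpy as np

A = np.array([[1.0, 3.0, 0.0],
              [3.0, 1.0, 0.0],
              [0.0, 0.0, -2.0]])

v = np.array([1.0, 1.0, 0.0])   # candidate eigenvector
lam = 4.0                       # candidate eigenvalue

# (lam, v) is an eigenpair exactly when A v equals lam * v.
print(np.allclose(A @ v, lam * v))   # True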
