Eigenvalue and vector quick question

In summary, the thread discusses calculating the eigenvalues and eigenvectors of a given 2×2 matrix A. The eigenvalues are λ = 8 and λ = 0, with corresponding eigenvectors (-1, 3) and (3, -1). The original poster worries that the signs should be the other way round, but the reply points out that the answers are still correct, since any nonzero constant multiple of an eigenvector is also an eigenvector, and suggests checking the work by multiplying A with each eigenvector to verify that the result is the eigenvalue times that eigenvector.
  • #1
Tzabcan
So, I have the matrix
$$A = \begin{pmatrix} -1 & -3 \\ 3 & 9 \end{pmatrix}$$
The eigenvalues I calculated to be ##\lambda = 8## and ##\lambda = 0##.

Now when I calculate the eigenvector for ##\lambda = 8##, I get ##\begin{pmatrix} -1 \\ 3 \end{pmatrix}##, and when I solve for the eigenvector for ##\lambda = 0## I get ##\begin{pmatrix} 3 \\ -1 \end{pmatrix}##.

Both are apparently incorrect; they're supposed to have the signs the opposite way round. I don't understand how I am getting it wrong.

This is how I am attempting to solve it:

IMG_1940.jpg


Any pointers would be appreciated.
 
  • #2
Tzabcan said:
So, I have the matrix
$$A = \begin{pmatrix} -1 & -3 \\ 3 & 9 \end{pmatrix}$$
The eigenvalues I calculated to be ##\lambda = 8## and ##\lambda = 0##.

Now when I calculate the eigenvector for ##\lambda = 8##, I get ##\begin{pmatrix} -1 \\ 3 \end{pmatrix}##, and when I solve for the eigenvector for ##\lambda = 0## I get ##\begin{pmatrix} 3 \\ -1 \end{pmatrix}##.

Both are apparently incorrect; they're supposed to have the signs the opposite way round. I don't understand how I am getting it wrong.

This is how I am attempting to solve it:

IMG_1940.jpg


Any pointers would be appreciated.
I didn't check your work, but from what you said, your eigenvectors are fine. Your answers are just the ##-1## multiples of the ones given in the answers. If you have an eigenvector, any nonzero constant multiple of it is also an eigenvector.

It's easy to check whether your work is correct: calculate ##Ax##, where ##A## is your matrix and ##x## is an eigenvector. If you get ##\lambda x##, where ##\lambda## is the eigenvalue associated with that eigenvector, your work is fine. Note that if ##x## is an eigenvector, ##-x## will be one as well.
 
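The check described above is easy to automate. Here is a minimal sketch in Python using NumPy (not part of the thread; the variable names are my own), verifying both eigenpairs from the original post:

```python
import numpy as np

# The matrix from the thread
A = np.array([[-1.0, -3.0],
              [3.0, 9.0]])

# Candidate (eigenvalue, eigenvector) pairs from the original post
pairs = [(8.0, np.array([-1.0, 3.0])),
         (0.0, np.array([3.0, -1.0]))]

for lam, x in pairs:
    # An eigenpair must satisfy A x = lambda x
    assert np.allclose(A @ x, lam * x)
    # Any nonzero scalar multiple, including -x, is also an eigenvector
    assert np.allclose(A @ (-x), lam * (-x))

print("both eigenpairs check out")
```

Both assertions pass, confirming that the sign flip relative to the book's answers does not matter.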
  • #3
Mark44 said:
I didn't check your work, but from what you said, your eigenvectors are fine. Your answers are just the ##-1## multiples of the ones given in the answers. If you have an eigenvector, any nonzero constant multiple of it is also an eigenvector.

It's easy to check whether your work is correct: calculate ##Ax##, where ##A## is your matrix and ##x## is an eigenvector. If you get ##\lambda x##, where ##\lambda## is the eigenvalue associated with that eigenvector, your work is fine. Note that if ##x## is an eigenvector, ##-x## will be one as well.
Oh ok, thank you :)
 

FAQ: Eigenvalue and vector quick question

What is an eigenvalue and eigenvector?

An eigenvector of a linear transformation is a non-zero vector whose direction is unchanged (or exactly reversed) by the transformation. The eigenvalue is the scalar factor by which the transformation stretches that vector; together they satisfy ##Ax = \lambda x##.

How is an eigenvalue and eigenvector calculated?

Eigenvalues are found by solving the characteristic equation det(A - λI) = 0, where A is the matrix representing the linear transformation, λ is the eigenvalue, and I is the identity matrix. Once the eigenvalues λ are known, the corresponding eigenvectors are the non-zero solutions x of (A - λI)x = 0.
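As an illustration, the characteristic equation for the matrix from this thread can be worked out symbolically. A sketch using SymPy (not part of the thread; the variable names are my own):

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[-1, -3], [3, 9]])

# Characteristic polynomial: det(A - lambda*I)
p = (A - lam * sp.eye(2)).det()
# p expands to lambda**2 - 8*lambda = lambda*(lambda - 8)

# Solving the characteristic equation gives the eigenvalues
eigvals = sp.solve(sp.Eq(p, 0), lam)
print(sorted(eigvals))  # -> [0, 8]
```

This reproduces the eigenvalues λ = 0 and λ = 8 from the thread.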

What is the significance of eigenvalues and eigenvectors in linear algebra?

Eigenvalues and eigenvectors are important in linear algebra because they provide a way to understand how a linear transformation affects a vector. They can also be used to find the diagonalization of a matrix, which simplifies many calculations.
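The diagonalization mentioned above can be sketched numerically. Assuming NumPy (the variable names are my own), a matrix with a full set of independent eigenvectors factors as ##A = PDP^{-1}##, where the columns of P are eigenvectors and D is diagonal:

```python
import numpy as np

A = np.array([[-1.0, -3.0],
              [3.0, 9.0]])

# np.linalg.eig returns eigenvalues and a matrix whose columns are eigenvectors
vals, P = np.linalg.eig(A)
D = np.diag(vals)

# Diagonalization A = P D P^{-1}; valid here since A has 2 distinct eigenvalues
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# One payoff: matrix powers become cheap, A^5 = P D^5 P^{-1}
assert np.allclose(np.linalg.matrix_power(A, 5),
                   P @ np.diag(vals**5) @ np.linalg.inv(P))
```

Because D is diagonal, computing D^k only requires raising the eigenvalues to the k-th power, which is the simplification alluded to in the answer above.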

Can a matrix have multiple eigenvalues and eigenvectors?

Yes. An n×n matrix has n eigenvalues when counted with multiplicity, so most matrices have more than one eigenvalue and eigenvector. Some eigenvalues may be repeated, and a repeated eigenvalue can have fewer independent eigenvectors than its multiplicity.
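A small example of the repeated-eigenvalue case (a sketch using NumPy; the matrix is my own choice, not from the thread) is the 2×2 shear matrix, which has eigenvalue 1 with multiplicity 2 but only one independent eigenvector direction:

```python
import numpy as np

# Shear matrix: characteristic polynomial (1 - lambda)^2, so eigenvalue 1 twice
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

vals, vecs = np.linalg.eig(S)
assert np.allclose(vals, [1.0, 1.0])  # repeated eigenvalue

# Every eigenvector is a multiple of (1, 0): S (1,0)^T = (1,0)^T
assert np.allclose(S @ np.array([1.0, 0.0]), np.array([1.0, 0.0]))
```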

How are eigenvalues and eigenvectors used in real-world applications?

Eigenvalues and eigenvectors are used in many real-world applications, such as image processing, data compression, and quantum mechanics. They are also used in machine learning algorithms, such as principal component analysis, to reduce the dimensionality of data and identify important features.
