Which statement about eigenvalues and eigenvectors is not true?

In summary, the correct answer is that a matrix that can be diagonalized is not necessarily non-invertible. The identity matrix, for instance, is diagonal (hence trivially diagonalizable) and invertible, and the 2x2 matrix [0 -1; 1 0] is an invertible square matrix with no real eigenvalues. The statement "A matrix that can be diagonalized is non-invertible" is therefore false.
  • #1
Yankel
One more question please...

which one of these statements is NOT true (only one can be false):

a. a square matrix nXn with n different eigenvalues can become diagonal.

b. A matrix that can be diagonal is irreversible.

c. Eigenvectors that correspond to different eigenvalues are linearly independent.

d. There are square matrices with no real eigenvalue.

I think that b is correct...

thanks
 
  • #2
Yankel said:
One more question please...

which one of these statements is NOT true (only one can be false):

a. a square matrix nXn with n different eigenvalues can become diagonal.

b. A matrix that can be diagonal is irreversible.

c. Eigenvectors that correspond to different eigenvalues are linearly independent.

d. There are square matrices with no real eigenvalue.

I think that b is correct...

thanks

What does "irreversible" mean in this context? Do you mean invertible?
 
  • #3
yes, sorry, I mean not invertible, meaning, has no inverse.
 
  • #4
Consider this: the identity matrix I is diagonal, yet it is invertible, so...
 
  • #5
I have eliminated most of the answers, so I am left with 2...

the first is the invertible thing, and the second is that there are no square matrices with no real eigenvalue.

According to the definition: $D = P^{-1}AP$

So does A need to be invertible? Why?

Is it possible that a characteristic polynomial will have no real roots for a square matrix? I think so...
 
  • #6
Yankel said:
I have eliminated most of the answers, so I am left with 2...

the first is the invertible thing, and the second is that there are no square matrices with no real eigenvalue.

According to the definition: $D = P^{-1}AP$

So does A need to be invertible? Why?

Is it possible that a characteristic polynomial will have no real roots for a square matrix? I think so...

How about
$$\begin{bmatrix}0 &-1\\ 1 &0\end{bmatrix}?$$
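Ackbach's matrix can be checked numerically. The sketch below (a NumPy illustration, not part of the original exchange) confirms that its eigenvalues are purely imaginary, so it has no real eigenvalue, yet its determinant is nonzero, so it is invertible:

```python
import numpy as np

# 90-degree rotation matrix: its characteristic polynomial is
# lambda^2 + 1, which has no real roots.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)        # purely imaginary: +1j and -1j
print(np.linalg.det(A))   # determinant 1 (up to rounding), so A is invertible
```

So this single matrix settles option d (square matrices with no real eigenvalue do exist) while still being invertible.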
 
  • #7
Ackbach said:
How about
$$\begin{bmatrix}0 &-1\\ 1 &0\end{bmatrix}?$$
And just to be clear about the second option, you mean to write that

b. A matrix that can be diagonalized is non-invertible (or singular).

Is that the correct b. option?
 
  • #8
I don't know which one is correct, but I am quite convinced now it's b; your example with the 2x2 matrix was good. I think it's done, thanks!
 
  • #9
Yankel said:
I don't know which one is correct, but I am quite convinced now it's b; your example with the 2x2 matrix was good. I think it's done, thanks!

You're welcome!
 
  • #10
Yankel said:
I don't know which one is correct, but I am quite convinced now it's b; your example with the 2x2 matrix was good. I think it's done, thanks!

As I pointed out before, I is a diagonal matrix (thus it is "diagonalizable"; you don't even have to do anything!) that is invertible (I is its own inverse), so b. must be false.
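The identity-matrix argument can be written out in a couple of lines (a NumPy sketch, added here purely to illustrate the point):

```python
import numpy as np

# The identity matrix is already diagonal, hence trivially diagonalizable,
# and it is its own inverse -- a one-line counterexample to statement b.
I2 = np.eye(2)
assert np.allclose(I2 @ I2, np.eye(2))   # I * I = I, so I is invertible
print(np.linalg.det(I2))                 # nonzero determinant (1.0)
```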
 

FAQ: Which statement about eigenvalues and eigenvectors is not true?

What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are mathematical concepts used to understand and analyze linear transformations. An eigenvector is a nonzero vector whose direction is unchanged when the linear transformation is applied; the corresponding eigenvalue is the scalar by which that vector is stretched or shrunk.
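The defining relation A v = lambda v can be seen in a tiny example (a NumPy sketch with an arbitrarily chosen diagonal matrix):

```python
import numpy as np

# For a diagonal matrix, the standard basis vectors are eigenvectors:
# applying A only rescales them; it never changes their direction.
A = np.array([[3.0, 0.0],
              [0.0, 2.0]])
v = np.array([1.0, 0.0])   # eigenvector for eigenvalue 3

print(A @ v)               # equals 3 * v: same direction, new magnitude
```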

What is the importance of eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are important in many areas of mathematics and science, including physics, engineering, and computer science. They allow us to understand the behavior of linear transformations and solve complex systems of equations. They also have applications in data analysis, such as in principal component analysis.

How do you calculate eigenvalues and eigenvectors?

The process of finding eigenvalues and eigenvectors involves solving a system of equations known as the characteristic equation. This involves finding the values of lambda that satisfy the equation det(A - lambda*I) = 0, where A is the matrix representing the linear transformation and I is the identity matrix. The corresponding eigenvectors can then be found by solving the system of equations (A - lambda*I)x = 0.
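In practice this computation is usually delegated to a linear-algebra library. The sketch below (NumPy, with an arbitrarily chosen symmetric matrix) returns the roots of det(A - lambda*I) = 0 together with the corresponding eigenvectors:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eig returns the eigenvalues (roots of the characteristic equation)
# and a matrix whose columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)   # 3 and 1 for this matrix (order not guaranteed)

# Each pair satisfies (A - lambda*I) x = 0, i.e. A x = lambda x.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)
```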

What is the relationship between eigenvalues and determinant?

The determinant of a matrix is equal to the product of its eigenvalues. This means that the determinant can be used to determine if a matrix has any eigenvalues that are equal to 0, which is important in understanding the invertibility of a matrix.
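This relationship is easy to verify numerically (a NumPy sketch with an arbitrarily chosen matrix):

```python
import numpy as np

# The determinant equals the product of the eigenvalues, so a zero
# eigenvalue is exactly what makes a matrix singular (non-invertible).
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

eigenvalues = np.linalg.eigvals(A)   # 5 and 2 for this matrix
print(np.prod(eigenvalues))          # product of eigenvalues: 10
print(np.linalg.det(A))              # det(A) = 4*3 - 2*1 = 10
```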

What are some real-world applications of eigenvalues and eigenvectors?

Eigenvalues and eigenvectors have many real-world applications, such as in image processing, signal processing, and quantum mechanics. In image processing, they can be used to analyze and compress images. In signal processing, they can be used to filter out noise from signals. In quantum mechanics, they are used to describe the behavior of quantum systems.
