Showing a matrix A is diagonalisable

In summary, the conversation discusses proving that a set of eigenvectors of a ##d\times d## matrix with distinct eigenvalues is linearly independent. The original poster shows that a single eigenvector ##\mathbf{v}_1## forms a linearly independent set by considering the equation ##c_1\mathbf{v}_1 = \mathbf{0}##; the reply points out that the goal is to show that ##c_1 = 0## is the only possible solution.
  • #1
squenshl
Homework Statement
Here we show that if a ##d\times d## matrix ##A## has distinct eigenvalues ##\lambda_1,\ldots,\lambda_d## with eigenvectors ##\mathbf{v}_1,\ldots,\mathbf{v}_d##, then it is diagonalisable, i.e., ##\mathbb{R}^d## has a basis of eigenvectors.

My question is suppose that ##\{\mathbf{v}_1,\ldots,\mathbf{v}_n\}##, ##n < d## is linearly independent, and show that ##\{\mathbf{v}_1,\ldots,\mathbf{v}_{n+1}\}## is linearly independent.
Relevant Equations
None
Show that ##\{\mathbf{v}_1\}## is linearly independent. Simple enough: let's consider
$$c_1\mathbf{v}_1 = \mathbf{0}.$$
Our goal is to show that ##c_1 = 0##. By the definition of eigenvalues and eigenvectors we have ##A\mathbf{v}_1= \lambda_1\mathbf{v}_1##. Let's multiply the above equation by ##A## to get
$$\mathbf{0} = A\times \mathbf{0} = A(c_1\mathbf{v}_1) = c_1A\mathbf{v}_1 = c_1\lambda_1\mathbf{v}_1.$$
An eigenvector is by definition a nonzero vector, and hence ##\mathbf{v}_1\neq \mathbf{0}##. Thus, we must have ##c_1\lambda_1 = 0##. In fact, the original equation ##c_1\mathbf{v}_1 = \mathbf{0}## together with ##\mathbf{v}_1 \neq \mathbf{0}## already forces ##c_1 = 0##, whether or not ##\lambda_1## happens to be zero. Hence, ##\{\mathbf{v}_1\}## is linearly independent as required.
 
  • #2
squenshl said:
Homework Statement: Here we show that if a ##d\times d## matrix ##A## has distinct eigenvalues ##\lambda_1,\ldots,\lambda_d## with eigenvectors ##\mathbf{v}_1,\ldots,\mathbf{v}_d##, then it is diagonalisable, i.e., ##\mathbb{R}^d## has a basis of eigenvectors.

My question is suppose that ##\{\mathbf{v}_1,\ldots,\mathbf{v}_n\}##, ##n < d## is linearly independent, and show that ##\{\mathbf{v}_1,\ldots,\mathbf{v}_{n+1}\}## is linearly independent.
Homework Equations: None

Show that ##\{\mathbf{v}_1\}## is linearly independent.
This doesn't make sense to me. Generally when you're talking about linear dependence/independence, you're considering a set of two or more vectors, not just a single vector. It's almost trivial to prove that a single nonzero vector is a linearly independent set, just using the definition of lin. independence, but what's the point?
squenshl said:
Simple enough: let's consider
$$c_1\mathbf{v}_1 = \mathbf{0}.$$
Our goal is to show that ##c_1 = 0##.
No it is not. The goal is to show that ##c_1 = 0## is the only possible solution. For example, if ##\mathbf{v}_1 = \langle 1, 2\rangle## and ##\mathbf{v}_2 = \langle 2, 4\rangle##, then the equation ##c_1\mathbf{v}_1 + c_2\mathbf{v}_2 = \mathbf{0}## is a true statement if ##c_1 = c_2 = 0##, but it also holds for ##c_1 = 2##, ##c_2 = -1##, so the two vectors are linearly dependent.
squenshl said:
By the definition of eigenvalues and eigenvectors we have ##A\mathbf{v}_1= \lambda_1\mathbf{v}_1##. Let's multiply the above equation by ##A## to get
$$\mathbf{0} = A\times \mathbf{0} = A(c_1\mathbf{v}_1) = c_1A\mathbf{v}_1 = c_1\lambda_1\mathbf{v}_1.$$
An eigenvector is by definition a nonzero vector, and hence ##\mathbf{v}_1\neq \mathbf{0}##. Thus, we must have ##c_1\lambda_1 = 0##. In fact, the original equation ##c_1\mathbf{v}_1 = \mathbf{0}## together with ##\mathbf{v}_1 \neq \mathbf{0}## already forces ##c_1 = 0##, whether or not ##\lambda_1## happens to be zero. Hence, ##\{\mathbf{v}_1\}## is linearly independent as required.
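
For reference, here is a sketch of the inductive step the question asks for (the standard textbook argument, assuming the eigenvalues are pairwise distinct; it is not worked out in the thread). Suppose ##\{\mathbf{v}_1,\ldots,\mathbf{v}_n\}## is linearly independent and
$$c_1\mathbf{v}_1 + \cdots + c_{n+1}\mathbf{v}_{n+1} = \mathbf{0}.$$
Applying ##A## and using ##A\mathbf{v}_i = \lambda_i\mathbf{v}_i## gives
$$c_1\lambda_1\mathbf{v}_1 + \cdots + c_{n+1}\lambda_{n+1}\mathbf{v}_{n+1} = \mathbf{0}.$$
Subtracting ##\lambda_{n+1}## times the first equation eliminates ##\mathbf{v}_{n+1}##:
$$c_1(\lambda_1 - \lambda_{n+1})\mathbf{v}_1 + \cdots + c_n(\lambda_n - \lambda_{n+1})\mathbf{v}_n = \mathbf{0}.$$
By the inductive hypothesis every coefficient ##c_i(\lambda_i - \lambda_{n+1})## vanishes, and since the eigenvalues are distinct, ##\lambda_i \neq \lambda_{n+1}##, so ##c_1 = \cdots = c_n = 0##. The original relation then reduces to ##c_{n+1}\mathbf{v}_{n+1} = \mathbf{0}##, and ##\mathbf{v}_{n+1} \neq \mathbf{0}## forces ##c_{n+1} = 0##. Hence ##\{\mathbf{v}_1,\ldots,\mathbf{v}_{n+1}\}## is linearly independent.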
 

FAQ: Showing a matrix A is diagonalisable

What does it mean for a matrix to be diagonalisable?

When a matrix A is diagonalisable, it means that there exists a diagonal matrix D and an invertible matrix P such that ##A = PDP^{-1}##. In other words, the matrix A can be transformed into a diagonal matrix by a change of basis.
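As a concrete illustration, here is a minimal NumPy sketch (the matrix A below is a made-up example) that computes such a P and D and verifies the factorisation:

```python
import numpy as np

# A made-up 2x2 example with distinct eigenvalues (5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# Verify the diagonalisation A = P D P^{-1}.
assert np.allclose(A, P @ D @ np.linalg.inv(P))
print(eigenvalues)  # [5. 2.] (order may vary)
```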

How do I determine if a matrix is diagonalisable?

A matrix A is diagonalisable if and only if it has n linearly independent eigenvectors, where n is the size of the matrix. Having n distinct eigenvalues is a sufficient (but not necessary) condition for this. More generally, the matrix is diagonalisable exactly when the algebraic multiplicity of each eigenvalue is equal to its geometric multiplicity.
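For example, the shear matrix below has eigenvalue 1 with algebraic multiplicity 2 but only a one-dimensional eigenspace, so it is not diagonalisable; a quick NumPy check (illustrative only):

```python
import numpy as np

# Shear matrix: eigenvalue 1 with algebraic multiplicity 2.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

print(np.linalg.eigvals(S))  # [1. 1.] -- a repeated eigenvalue

# Geometric multiplicity = dimension of the eigenspace,
# i.e. the nullity of S - 1*I. Here rank(S - I) = 1, so the
# eigenspace is 1-dimensional: S is not diagonalisable.
print(2 - np.linalg.matrix_rank(S - np.eye(2)))  # 1
```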

What is the importance of diagonalisability in linear algebra?

Diagonalisability is important because it simplifies calculations involving the matrix A. Writing ##A = PDP^{-1}## reduces powers of A to powers of the diagonal matrix D, since ##A^k = PD^kP^{-1}##, so operations such as exponentiation become much easier. Additionally, when A is invertible, its inverse is simply ##PD^{-1}P^{-1}##.
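For instance, a minimal sketch of computing a matrix power through the diagonalisation (reusing the purely illustrative example matrix from above):

```python
import numpy as np

# Same illustrative matrix as above.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigenvalues, P = np.linalg.eig(A)

# A^k = P D^k P^{-1}; D^k only needs elementwise powers
# of the eigenvalues on the diagonal.
k = 10
Ak = P @ np.diag(eigenvalues ** k) @ np.linalg.inv(P)

# Agrees with repeated matrix multiplication.
assert np.allclose(Ak, np.linalg.matrix_power(A, k))
```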

Can a non-square matrix be diagonalisable?

No, a non-square matrix cannot be diagonalisable. The definition of diagonalisability involves eigenvalues and the similarity ##A = PDP^{-1}##, both of which only make sense for a matrix with the same number of rows and columns.

How do I show that a matrix is diagonalisable?

To show that a matrix A is diagonalisable, you can follow these steps:

  • Find the eigenvalues of the matrix A.
  • For each eigenvalue, find its corresponding eigenvector.
  • If the matrix has n linearly independent eigenvectors (which is guaranteed when it has n distinct eigenvalues), then it is diagonalisable.
  • Equivalently, check that the algebraic multiplicity of each eigenvalue is equal to its geometric multiplicity.
  • Finally, use the eigenvectors to construct the invertible matrix P and the diagonal matrix D such that ##A = PDP^{-1}## (a code sketch of these steps follows this list).
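
Here is a minimal sketch of these steps using SymPy (the matrix A is a made-up example; diagonalize() is SymPy's built-in routine for constructing P and D):

```python
from sympy import Matrix

# Made-up example; SymPy uses exact arithmetic, which avoids
# floating-point issues in the final check.
A = Matrix([[4, 1],
            [2, 3]])

# Steps 1-2: eigenvals() maps each eigenvalue to its algebraic
# multiplicity; here there are two distinct eigenvalues.
print(A.eigenvals())  # {2: 1, 5: 1}

# Steps 3-5: diagonalize() builds P from the eigenvectors and D
# from the eigenvalues, raising an error if A is not diagonalisable.
P, D = A.diagonalize()
assert A == P * D * P.inv()
```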
