Proving A⁻¹ = 3I - A for a Square Matrix A Satisfying A² - 3A + I = 0

In summary: A(3I - A) = 3A - A² = 3A - (3A - I) = I, and likewise (3I - A)A = 3A - A² = I. Since both A(3I - A) and (3I - A)A equal I, and A is a square matrix, this shows that A is invertible and that its inverse is 3I - A.
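As a quick numerical sanity check of the summary (not part of the original thread), one can take a hypothetical 2×2 matrix whose characteristic polynomial is x² - 3x + 1; by the Cayley-Hamilton theorem it satisfies A² - 3A + I = 0, and NumPy confirms that 3I - A is its inverse:

```python
import numpy as np

# Companion matrix of x^2 - 3x + 1; by Cayley-Hamilton it satisfies A^2 - 3A + I = 0.
A = np.array([[0.0, -1.0],
              [1.0,  3.0]])
I = np.eye(2)

print(np.allclose(A @ A - 3 * A + I, 0))          # True: A satisfies the given relation
print(np.allclose(A @ (3 * I - A), I))            # True: A(3I - A) = I
print(np.allclose((3 * I - A) @ A, I))            # True: (3I - A)A = I
print(np.allclose(np.linalg.inv(A), 3 * I - A))   # True: the inverse really is 3I - A
```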
  • #1
iamsmooth
1. Show that if a square matrix A satisfies A² - 3A + I = 0, then A⁻¹ = 3I - A



2. A⁻¹A = I and AA⁻¹ = I and more that I can't think of



3. 3A = A² + I

A = (A² + I)/3

?


This question is weird :eek:
Anyone know how to do it?
 
  • #2
I'm so stupid I got the template wrong. I think I figured it out:

Show that if a square matrix A satisfies A² - 3A + I = 0, then A⁻¹ = 3I - A

A² - 3A + I = 0
A² - 3A = -I
A(A - 3I) = -I
A - 3I = -I/A

I/A = A⁻¹, therefore:

A⁻¹ = 3I - A
 
  • #3
I would take out the divide step; it's not really defined for matrices, there is only multiplication by the inverse.

So before that step, just multiply through by -1 and you're finished.
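Spelling out that suggestion as a worked step (my own elaboration, not part of the original reply): multiplying A(A - 3I) = -I through by -1 and pulling the sign inside the parentheses gives

```latex
\[
A(A - 3I) = -I
\quad\Longrightarrow\quad
A\bigl(-(A - 3I)\bigr) = I
\quad\Longrightarrow\quad
A(3I - A) = I.
\]
```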
 
  • #4
Oh you're right! Matrix multiplication doesn't actually exist >.>

Thanks for the input!
 
  • #5
iamsmooth said:
Oh you're right! Matrix multiplication doesn't actually exist >.>
Matrix multiplication actually DOES exist. It's matrix division that doesn't exist.
 
  • #6
iamsmooth said:
I'm so stupid I got the template wrong. I think I figured it out:

Show that if a square matrix A satisfies A² - 3A + I = 0, then A⁻¹ = 3I - A

A² - 3A + I = 0
A² - 3A = -I
A(A - 3I) = -I
A - 3I = -I/A

I/A = A⁻¹, therefore:

A⁻¹ = 3I - A

I buy it up to the step where you have A(A - 3I) = -I. After that, I'm not buying. First off, and as already noted, there is no matrix division, so -I/A doesn't make sense.

Second, you say that I/A = A-1. What makes you think that A has an inverse? All you are given is that A is a square matrix. There is nothing said about A being invertible (i.e., having an inverse).

So how can you justify your last statement, that A⁻¹ = 3I - A?

Hint: A(A - 3I) = -I, or equivalently, A(3I - A) = I.
 
  • #7
Mark44 said:
I buy it up to the step where you have A(A - 3I) = -I. After that, I'm not buying. First off, and as already noted, there is no matrix division, so -I/A doesn't make sense.

Second, you say that I/A = A-1. What makes you think that A has an inverse? All you are given is that A is a square matrix. There is nothing said about A being invertible (i.e., having an inverse).

So how can you justify your last statement, that A⁻¹ = 3I - A?

Hint: A(A - 3I) = -I, or equivalently, A(3I - A) = I.

Isn't it said that a matrix is invertible if the product is an identity matrix? Here (3I - A) is the inverse of A, so you know it's invertible. Thus instead of dividing, you multiply both sides by the inverse of A, which would make 3I - A = A⁻¹.

Or wait, is it AB = BA = I, so it has to commute? I'm not too sure how to prove that A is invertible in this case, if this isn't right. :(
 
  • #8
What are A(3I - A) and (3I - A)A?
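Carrying out the two products that question points at (this computation is my own filling-in, using A² = 3A - I from the given relation):

```latex
\[
A(3I - A) = 3A - A^{2} = 3A - (3A - I) = I,
\qquad
(3I - A)A = 3A - A^{2} = 3A - (3A - I) = I.
\]
```

Since both products equal I, A is invertible and A⁻¹ = 3I - A.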
 

FAQ: Proving A⁻¹ = 3I - A for a Square Matrix A Satisfying A² - 3A + I = 0

What is matrix algebra and why is it important?

Matrix algebra is a branch of mathematics that deals with the manipulation and study of matrices, which are rectangular arrays of numbers. It is important because it provides a powerful tool for solving problems in various fields such as physics, engineering, economics, and computer science.

What are the basic operations in matrix algebra?

The basic operations in matrix algebra include addition, subtraction, and multiplication. Addition and subtraction are performed by adding or subtracting corresponding elements of two matrices. Multiplication involves multiplying elements of one matrix by elements of another matrix and summing the products. Division is not defined for matrices, but matrix inversion is a related operation.
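A minimal NumPy sketch of these operations (an illustrative addition, not from the original FAQ):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

print(A + B)             # element-wise addition
print(A - B)             # element-wise subtraction
print(A @ B)             # matrix multiplication: rows of A dotted with columns of B
print(np.linalg.inv(A))  # no matrix division, but inversion plays the analogous role
```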

What is the difference between a row vector and a column vector?

A row vector is a matrix with only one row, while a column vector is a matrix with only one column. In other words, a row vector is a horizontal array of numbers, while a column vector is a vertical array of numbers.
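For example (a small illustrative snippet, not from the original FAQ), the same three numbers stored as a row vector and as a column vector:

```python
import numpy as np

row = np.array([[1, 2, 3]])       # shape (1, 3): one row, three columns
col = np.array([[1], [2], [3]])   # shape (3, 1): three rows, one column

print(row.shape, col.shape)       # (1, 3) (3, 1)
print(row @ col)                  # 1x1 matrix [[14]]: the dot product
print(col @ row)                  # 3x3 outer product
```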

How is matrix algebra used in data analysis?

Matrix algebra is used in data analysis to perform operations on large datasets, such as finding the mean, variance, and correlation between variables. It is also used in machine learning and statistical modeling to solve complex problems and make predictions based on data.
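A short hypothetical example of this use (my own illustration; the dataset and variable layout are made up): computing per-variable means, variances, and the correlation matrix for a small data matrix with one column per variable.

```python
import numpy as np

# Hypothetical dataset: 5 observations (rows) of 3 variables (columns).
data = np.array([[1.0, 2.0, 0.5],
                 [2.0, 4.1, 0.4],
                 [3.0, 6.2, 0.7],
                 [4.0, 7.9, 0.2],
                 [5.0, 9.8, 0.6]])

print(data.mean(axis=0))                 # mean of each variable
print(data.var(axis=0))                  # variance of each variable
print(np.corrcoef(data, rowvar=False))   # 3x3 correlation matrix between variables
```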

What are eigenvalues and eigenvectors in matrix algebra?

Eigenvalues and eigenvectors are important concepts in matrix algebra. Eigenvalues are the scaling factors applied to the corresponding eigenvectors when the matrix is multiplied by them. Eigenvectors are nonzero vectors whose direction is preserved (up to scaling, possibly by a negative factor) when multiplied by the matrix; they are simply scaled by the corresponding eigenvalue. They have many applications in physics, engineering, and computer science.
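A brief NumPy illustration (an added example, not from the original FAQ), using a symmetric 2×2 matrix whose eigenvalues are 3 and 1:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eig(A)   # eigenvalues and eigenvectors (one per column)
print(vals)                     # e.g. [3. 1.]

v = vecs[:, 0]                  # eigenvector paired with vals[0]
print(A @ v)                    # multiplying by A only scales the eigenvector...
print(vals[0] * v)              # ...by its eigenvalue: the two outputs match
```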
