Verifying Invertible Matrices Formulas

In summary, an invertible matrix is a square matrix A that has an inverse: a matrix A^-1 satisfying A*A^-1 = A^-1*A = I. Equivalently, the system Ax = b has a unique solution for every right-hand side b. Invertibility can be verified by checking that the determinant is non-zero. The formula for the inverse is A^-1 = (1/det(A)) * adj(A); only square matrices with a non-zero determinant can be inverted. In linear algebra, invertible matrices are significant because they allow systems of linear equations to be solved and play a crucial role in optimization problems and in understanding linear transformations.
  • #1
Melawrghk

Homework Statement


Determine which of the formulas hold for all invertible nxn matrices A and B:
(a) (A + A^-1)^8 = A^8 + A^-8
(b) A + I_n is invertible
(c) AB = BA
(d) (A + B)(A - B) = A^2 - B^2
(e) (ABA^-1)^3 = AB^3A^-1
(f) A^6B^4 is invertible

The Attempt at a Solution


(a) I don't know this one; I don't think it's true either. Because if you just square the thing, you'll get:
A^2 + 2I + A^-2, and if you continue this way, you'll just get more and more I's in there.
(b) A + I_n is invertible. I'm pretty sure about that one. Because I_n is a square matrix the same size as A (both are nxn), and if you add them you will still have an nxn matrix, meaning it will be invertible.
(c) I don't think AB = BA. I tried it out with two sample matrices (just random ones) and it doesn't work.
(d) Since AB = BA isn't true, this should also be false. Because when you expand you will have -AB + BA, and unless those equal each other, the terms don't cancel out.
(e) I think this is true, because A*A^-1 is I, so we essentially have (IB)^3, which equals IB^3, and if we were to put A*A^-1 back in, we'd see that the two sides do indeed equal.
(f) Yep. A and B are the same size and square. So no matter how many times we raise each of them to whatever power, they will still remain the same size and will be invertible.

However, at least one of my "logical" answers is wrong. Can you point out which assumption I'm making is wrong? Thanks!
 
  • #2
For b), -I_n is invertible, isn't it?
 
  • #3
When you think one of these is false (like the first one), try a specific example: compute both the left and right sides, and see whether they are equal.
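For instance, formula (a) can be checked numerically; the matrix below is just an arbitrary illustrative choice, not from the thread:

```python
import numpy as np

# Arbitrary invertible 2x2 matrix (illustrative choice; det = -2)
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
A_inv = np.linalg.inv(A)

# Left and right sides of (a): (A + A^-1)^8 vs. A^8 + A^-8
lhs = np.linalg.matrix_power(A + A_inv, 8)
rhs = np.linalg.matrix_power(A, 8) + np.linalg.matrix_power(A_inv, 8)

# The binomial cross terms do not vanish, so the two sides differ
print(np.allclose(lhs, rhs))  # False
```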

Be wary of believing that the fact that a matrix is square means it is invertible. This is especially true for item b.

Your comments for c and d seem good - if you have a counter-example for c, try the same matrices in d for its counter-example.
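One pair of non-commuting invertible matrices (chosen here purely for illustration) serves as a counterexample for both c and d:

```python
import numpy as np

# Two elementary shear matrices; both invertible (det = 1), illustrative choice
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.array([[1.0, 0.0],
              [1.0, 1.0]])

# (c): AB and BA differ
print(np.allclose(A @ B, B @ A))  # False

# (d): the same pair breaks (A + B)(A - B) = A^2 - B^2
lhs = (A + B) @ (A - B)
rhs = A @ A - B @ B
print(np.allclose(lhs, rhs))  # False
```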

Your hunch for e is correct, but your reasoning isn't. Simply expand

[tex]
\left(A B A^{-1}\right)^3
[/tex]

(write out the product) and group terms together.
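Written out, the product telescopes because each interior A^{-1}A collapses to the identity:

```latex
\left(ABA^{-1}\right)^3
  = \left(ABA^{-1}\right)\left(ABA^{-1}\right)\left(ABA^{-1}\right)
  = AB\left(A^{-1}A\right)B\left(A^{-1}A\right)BA^{-1}
  = AB^3A^{-1}
```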

For the final one: again, simply because the powers of matrices are square, that is no guarantee they are invertible. But if a matrix is invertible, shouldn't any power of that matrix also be invertible? You need to propose what the inverse of [tex] A^6 B^4 [/tex] might look like (in terms of powers of the inverses of [tex] A, B [/tex]) and then multiply to show that your conjecture is correct.
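That conjecture can also be sanity-checked numerically; the matrices below are arbitrary illustrative choices:

```python
import numpy as np

# Arbitrary invertible matrices (illustrative; both have det = 1)
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])

M = np.linalg.matrix_power(A, 6) @ np.linalg.matrix_power(B, 4)

# Conjectured inverse: B^-4 A^-6 (note the reversed order)
M_inv = np.linalg.matrix_power(np.linalg.inv(B), 4) @ np.linalg.matrix_power(np.linalg.inv(A), 6)

print(np.allclose(M @ M_inv, np.eye(2)))  # True
```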
 
  • #4
Hm? -I_n? Isn't it invertible, and pretty much just equal to itself?
 
  • #5
Yes, [tex] -I_n [/tex] is invertible, but that, together with [tex] A [/tex] being invertible, will not ensure that [tex] A + I_n [/tex] is also invertible.

Think of it this way: it doesn't even work for numbers, where the multiplicative identity is [tex] 1 [/tex] - consider:

[tex]
\begin{align*}
-1 & \text{ is invertible}\\
1 & \text{ is invertible}\\
1 + (-1) = 0 & \text{ is not invertible}
\end{align*}
[/tex]
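The same failure appears with matrices, taking A = -I_n (a quick sketch; the size n = 3 is chosen arbitrarily):

```python
import numpy as np

n = 3                # arbitrary size for illustration
A = -np.eye(n)       # A = -I_n is invertible (it is its own inverse)
S = A + np.eye(n)    # ... but A + I_n is the zero matrix

print(np.linalg.det(S))  # 0.0 -- not invertible
```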
 
  • #6
statdad, okay.
So for (b), it might not be invertible if, somehow, after summing the two, the determinant equals zero. Which I guess could happen, although I'm having trouble visualizing that exact case.

(e)... If I expand it, will it equal A^3B^3A^-3, or do matrices and powers work some other way? And afterwards, am I allowed to group the two A's at all? I mean, they're not beside each other, so won't I alter the multiplication order if I group them together?

(f) So I guess from (b) this might also not be true if, after multiplying the two, I end up with a determinant equal to zero.

So the only answer that is right out of all the choices is (e)?
 
  • #7
The point for b) is that -I_n + I_n = 0, which is very much NOT invertible. f) is fine; think of the properties of determinants to show this.
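Since the determinant is multiplicative, det(A^6 B^4) = det(A)^6 det(B)^4, which is non-zero whenever det(A) and det(B) are. A numerical check with arbitrary illustrative matrices:

```python
import numpy as np

# Arbitrary invertible matrices (illustrative; both have det = 1)
A = np.array([[3.0, 1.0],
              [2.0, 1.0]])
B = np.array([[1.0, 1.0],
              [1.0, 2.0]])

# Multiplicativity of det: det(A^6 B^4) = det(A)^6 * det(B)^4
lhs = np.linalg.det(np.linalg.matrix_power(A, 6) @ np.linalg.matrix_power(B, 4))
rhs = np.linalg.det(A) ** 6 * np.linalg.det(B) ** 4

print(np.isclose(lhs, rhs), rhs != 0)  # True True
```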
 
  • #8
Again, regarding 'f'

If [tex] \Sigma [/tex] is an invertible matrix, then [tex] \Sigma^k [/tex] is also invertible for any integer power [tex] k [/tex] - that is really what the problem is about. Once more, think of numbers: the inverse of [tex] 5^{10} [/tex] is [tex] 5^{-10} [/tex], and the way to prove this is

[tex]
5^{10} \cdot 5^{-10} = 5^{10 + (-10)} = 5^0 = 1
[/tex]

A similar method will work for matrices.
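In matrix form the same cancellation looks like this, conjecturing B^{-4}A^{-6} as the inverse:

```latex
\left(A^6 B^4\right)\left(B^{-4} A^{-6}\right)
  = A^6 \left(B^4 B^{-4}\right) A^{-6}
  = A^6 \, I \, A^{-6}
  = I
```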
 

FAQ: Verifying Invertible Matrices Formulas

1. What is an invertible matrix?

An invertible matrix is a square matrix that has an inverse: another matrix that, multiplied with it in either order, produces the identity matrix. Equivalently, the system Ax = b has a unique solution for every right-hand side b.

2. How do you verify if a matrix is invertible?

To verify that a matrix is invertible, you can use the determinant: if the determinant is non-zero, the matrix is invertible. Another way is to check that the corresponding system Ax = b has a unique solution for every b.
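A determinant-based check can be sketched in code (the helper name and tolerance below are illustrative choices, not a standard API):

```python
import numpy as np

def is_invertible(M, tol=1e-12):
    """Invertibility check via the determinant (illustrative helper)."""
    return abs(np.linalg.det(M)) > tol

print(is_invertible(np.array([[1.0, 2.0], [3.0, 4.0]])))  # True  (det = -2)
print(is_invertible(np.array([[1.0, 2.0], [2.0, 4.0]])))  # False (det = 0)
```

Note that comparing the determinant against a fixed tolerance is unreliable for large or badly conditioned matrices; a condition-number or rank check is more robust in practice.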

3. What is the formula for finding the inverse of a matrix?

The formula for finding the inverse of a matrix is:
A^-1 = (1/det(A)) * adj(A), where A is the original matrix, det(A) is its determinant, and adj(A) is its adjugate matrix.
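For the 2x2 case the adjugate formula can be written out directly (an illustrative sketch, checked against numpy's built-in inverse):

```python
import numpy as np

def inverse_2x2(M):
    """A^-1 = (1/det(A)) * adj(A) for a 2x2 matrix (illustrative)."""
    a, b = M[0]
    c, d = M[1]
    det = a * d - b * c          # must be non-zero for the inverse to exist
    adj = np.array([[d, -b],
                    [-c, a]])    # adjugate: swap the diagonal, negate the off-diagonal
    return adj / det

M = np.array([[4.0, 7.0],
              [2.0, 6.0]])
print(np.allclose(inverse_2x2(M), np.linalg.inv(M)))  # True
```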

4. Can all matrices be inverted?

No, not all matrices can be inverted. Only square matrices (same number of rows and columns) can be inverted, and only if their determinant is non-zero.

5. What is the significance of invertible matrices in linear algebra?

Invertible matrices are important in linear algebra because they allow for solving systems of linear equations, which have a wide range of applications in fields such as engineering, physics, and economics. Invertible matrices also play a crucial role in finding solutions to optimization problems and in understanding the geometry of linear transformations.
