Prove B is invertible if AB = I

  • #1
songoku
Homework Statement
If AB = I for square matrices A and B, show that B is invertible. (Do not assume A is invertible)
Relevant Equations
##AA^{-1}=I##
Definition of inverse: Matrix ##P## is invertible if there is matrix ##Q## such that ##PQ = I## and ##QP = I##

So since ##AB = I## is given, do I first need to show ##BA = I## to be able to prove that ##B## is invertible? If yes, how do I show ##BA = I##?

I thought the answer to this question was straightforward: since ##AB = I##, it means that ##A## is the inverse of ##B##, so ##B## has an inverse and is therefore invertible?

Thanks
 
  • Like
Likes Delta2
  • #2
songoku said:
Homework Statement:: If AB = I for square matrices A and B, show that B is invertible. (Do not assume A is invertible)
Relevant Equations:: ##AA^{-1}=I##

Definition of inverse: Matrix ##P## is invertible if there is matrix ##Q## such that ##PQ = I## and ##QP = I##

So since ##AB = I## is given, do I first need to show ##BA = I## to be able to prove that ##B## is invertible? If yes, how do I show ##BA = I##?

I thought the answer to this question was straightforward: since ##AB = I##, it means that ##A## is the inverse of ##B##, so ##B## has an inverse and is therefore invertible?

Thanks
The answer is straightforward if we have a group, but this isn't on our list of assumptions.

I would determine the kernel of ##B## and reason with the finite dimensions.
 
  • Like
Likes songoku
  • #3
songoku said:
I thought the answer to this question was straightforward: since ##AB = I##, it means that ##A## is the inverse of ##B##, so ##B## has an inverse and is therefore invertible?
If you simply treat matrices as elements of a ring, then it can't be proved, as there are rings with elements that have only one-sided inverses.

That means you need to use something specific about matrices (over and above the ring axioms).
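A standard illustration of a one-sided inverse (only as a sketch; these maps act on an infinite-dimensional space, so they say nothing directly about square matrices): on the space of sequences, define the shift maps
$$L(x_1,x_2,x_3,\dots)=(x_2,x_3,\dots), \qquad R(x_1,x_2,\dots)=(0,x_1,x_2,\dots).$$
Then ##LR=I##, but ##RL(x_1,x_2,\dots)=(0,x_2,x_3,\dots)\neq (x_1,x_2,\dots)## in general, so ##RL\neq I##.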
 
  • Like
Likes songoku and Delta2
  • #4
fresh_42 said:
The answer is straightforward if we have a group, but this isn't on our list of assumptions.

I would determine the kernel of ##B## and reason with the finite dimensions.

PeroK said:
If you simply treat matrices as elements of a ring, then it can't be proved, as there are rings with elements that have only one-sided inverses.

That means you need to use something specific about matrices (over and above the ring axioms).
Sorry, my lessons have not covered all those things. What I have learned so far are linear transformations, subspaces, inverses, dimension, and rank.

Is it possible to answer the question based only on what I have learned?

Thanks
 
  • #5
songoku said:
Sorry, my lessons have not covered all those things. What I have learned so far are linear transformations, subspaces, inverses, dimension, and rank.

Is it possible to answer the question based only on what I have learned?

Thanks
You need the rank, or injectivity, and the dimension formula, or an embedding argument.
 
  • Like
Likes songoku
  • #6
I think the main point here is that you would like to just do matrix operations starting from ##BA## and end up with ##I##. But you cannot: it turns out there are other objects with addition and multiplication where being an inverse when multiplying on the left does not imply being an inverse when multiplying on the right. So you have to use something special about matrices being linear transformations. The best you can do with just matrix algebra is to observe things like ##(BA)B=B(AB)=BI=B##.

So ##BA## is a matrix which, when you multiply it on the right by ##B##, still gives you ##B## (and ##B## isn't arbitrary here, it's your specific matrix). That doesn't feel like a lot, but if you think carefully about what it means for a square matrix to be full rank, you can prove that ##BA## must be the identity map (I suggest a proof by contradiction).
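As a sketch of where that observation can lead (granting the full-rank fact, which the later posts establish): ##(BA)B=B## means ##(BA-I)B=0##, so ##(BA-I)(Bv)=0## for every vector ##v##. If the columns of the square matrix ##B## span the whole space, then every vector ##w## can be written as ##w=Bv##, hence ##(BA-I)w=0## for all ##w##, i.e. ##BA=I##.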
 
  • Like
Likes songoku, Delta2 and PeroK
  • #7
$$AB = I$$
$$ABB^{-1} = IB^{-1}$$
$$A = IB^{-1}$$
$$BA = BIB^{-1}$$
$$BA = BB^{-1}$$
$$BA = I$$

EDIT: Apparently you're not allowed to invert either A or B, so this isn't a solution. Are you sure you read the question right?
 
Last edited:
  • Like
Likes Delta2
  • #8
Addez123 said:
$$AB = I$$
$$ABB^{-1} = IB^{-1}$$
$$A = IB^{-1}$$
$$BA = BIB^{-1}$$
$$BA = BB^{-1}$$
$$BA = I$$

You assumed B is invertible! This is not a solution.
 
  • Like
Likes PeroK
  • #9
Office_Shredder said:
You assumed B is invertible! This is not a solution.
Ohh haha, I thought A was the only non-invertible one.

Yeah no, then there are no solutions I can think of.
 
  • Like
Likes Delta2
  • #10
Office_Shredder said:
You assumed B is invertible! This is not a solution.
@Addez123 ... which, by the way, holds for your equation ##AA^{-1}=I## in your first post, too. You must not assume that ##A## is invertible.

Again: Is ##B## injective? What do you know about linear (injective) transformations?
 
  • #11
Addez is not the person who started the thread :)
 
  • #12
Oops!
 
  • Haha
Likes Delta2 and Addez123
  • #13
fresh_42 said:
Oops!
I was slightly confused at the response :p
 
  • #14
Addez123 said:
I was slightly confused at the response :p
Yeah, I'm sorry. Things happen :wink:
 
  • #15
fresh_42 said:
You need the rank
Office_Shredder said:
I think the main point here is that you would like to just do matrix operations starting from ##BA## and end up with ##I##. But you cannot: it turns out there are other objects with addition and multiplication where being an inverse when multiplying on the left does not imply being an inverse when multiplying on the right. So you have to use something special about matrices being linear transformations. The best you can do with just matrix algebra is to observe things like ##(BA)B=B(AB)=BI=B##.

So ##BA## is a matrix which, when you multiply it on the right by ##B##, still gives you ##B## (and ##B## isn't arbitrary here, it's your specific matrix). That doesn't feel like a lot, but if you think carefully about what it means for a square matrix to be full rank, you can prove that ##BA## must be the identity map (I suggest a proof by contradiction).
What it means for a square matrix to be full rank is that the columns of that matrix form a linearly independent set, but I don't know how to use that fact to proceed.

Proof by contradiction means that I start from the assumption that B is not invertible?

fresh_42 said:
or injectivity
fresh_42 said:
Again: Is ##B## injective? What do you know about linear (injective) transformations?
I don't know whether B is injective or not.

What I know about a linear transformation being injective (one-to-one) is that ##B(x)=0## has only the trivial solution.

fresh_42 said:
and the dimension formula, or an embedding argument.
By dimension formula, do you mean dim (col B) + dim (nul B) = n (assume B has n columns)?

Addez123 said:
EDIT: Apparently you're not allowed to invert either A or B, so this isn't a solution. Are you sure you read the question right?
The only thing I am sure about regarding this question is that I am 100% sure I posted it exactly as I got it, and I am also sure I don't know how to solve it.
 
  • #16
songoku said:
What it means for a square matrix to be full rank is that the columns of that matrix form a linearly independent set, but I don't know how to use that fact to proceed.
The column vectors of the matrix probably won't help very much here. Of course, you can use them, but I think it is easier without them.
songoku said:
Proof by contradiction means that I start from the assumption that B is not invertible?
I don't know whether B is injective or not.

What I know about a linear transformation being injective (one-to-one) is that ##B(x)=0## has only the trivial solution.
Yes. An injective function satisfies ##f(x)=f(y) \Longrightarrow x=y##, and if ##f## is linear, as in our case, this means ##f(x-y)=0 \Longrightarrow x-y=0##, or equivalently ##f(v)=0 \Longrightarrow v=0.##

So assume ##B(v)=0.## What does that tell you, given that ##AB=I\;?##
songoku said:
By dimension formula, do you mean dim (col B) + dim (nul B) = n (assume B has n columns)?
Yes, although it is better known as the rank-nullity theorem:
##\operatorname{rank} B + \operatorname{null} B = \dim\operatorname{im} B + \dim\operatorname{ker} B = \dim(\operatorname{col} B) + \dim(\operatorname{nul} B) = n = \dim V = \dim\operatorname{domain} B.##

Since you have (hopefully) shown that ##\dim\operatorname{ker} B=0##, you will get the rank of ##B## from that formula.
 
  • Like
Likes songoku
  • #17
songoku said:
The only thing I am sure about regarding this question is that I am 100% sure I posted it exactly as I got it, and I am also sure I don't know how to solve it.
The question is right.
 
  • #18
fresh_42 said:
Yes. An injective function satisfies ##f(x)=f(y) \Longrightarrow x=y##, and if ##f## is linear, as in our case, this means ##f(x-y)=0 \Longrightarrow x-y=0##, or equivalently ##f(v)=0 \Longrightarrow v=0.##

So assume ##B(v)=0.## What does that tell you, given that ##AB=I\;?##
Assume ##B## is an ##m\times m## matrix and ##B(v)=0##.

$$AB=I$$
$$AB(v)=I(v)$$
$$A(0)=v$$
$$v=0$$

fresh_42 said:
Yes, although it is better known as rank-nullity theorem:
##\operatorname{rank} B + \operatorname{null} B = \dim\operatorname{im} B + \dim\operatorname{ker} B = \dim(\operatorname{col} B) + \dim(\operatorname{nul} B) = n = \dim V = \dim\operatorname{domain} B.##

Since you have (hopefully) shown that ##\dim\operatorname{ker}B=0## you will get the rank of ##B## from that formula.
I have not learned about "ker" yet, but I searched a bit about it.

Since ##\dim(\operatorname{nul} B) = 0##, we get ##\operatorname{rank} B = m##, i.e. ##B## has full rank, so ##B## has ##m## pivot columns and is row equivalent to ##I##. This means that ##B## is invertible.

Is this correct? Thanks
 
  • Like
Likes fresh_42
  • #19
Are we allowed to use properties of determinants, specifically for ##\det(AB)##?
Or rank-nullity and the rank of a product?
 
  • #20
WWGD said:
Are we allowed to use properties of determinants, specifically for ##\det(AB)##?
Or rank-nullity and the rank of a product?
My lessons have not covered those things. For rank, the teacher's notes only state the "rank theorem", not rank-nullity, and I have only learned about the rank of a matrix so far, not yet about the rank of a product of matrices.

Thanks
 
  • #21
songoku said:
My lessons have not covered those things. For rank, the teacher's notes only state the "rank theorem", not rank-nullity, and I have only learned about the rank of a matrix so far, not yet about the rank of a product of matrices.

Thanks
It may be reasonably feasible to provide a broad, i.e. not the sharpest possible, bound for the rank of the product. If not, maybe we can drop this option.
 
  • #22
songoku said:
Assume ##B## is an ##m\times m## matrix and ##B(v)=0##.

$$AB=I$$
$$AB(v)=I(v)$$
$$A(0)=v$$
$$v=0$$
This is a good start. You've shown that ##B## is one-to-one - although critically you haven't actually said that. Is that what you set out to show? That ##B## is one-to-one.

In any case, if you can show that ##B## is also onto, then you have shown that ##B## is invertible.
songoku said:
I have not learned about "ker" but I searched a bit about it.

Since ##\dim(\operatorname{nul} B) = 0##, we get ##\operatorname{rank} B = m##, i.e. ##B## has full rank, so ##B## has ##m## pivot columns and is row equivalent to ##I##. This means that ##B## is invertible.

Is this correct? Thanks
There's no need to talk about full rank, pivot columns or row equivalence. You should have been aiming to show that ##B## is onto (or surjective, if that's the terminology you are familiar with).
 
  • Like
Likes songoku
  • #23
songoku said:
Is this correct? Thanks
It is.
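For the record, the last step can be spelled out as follows: row equivalence to ##I## means there are elementary matrices ##E_1,\dots,E_k## with ##E_k\cdots E_1B=I##; each ##E_i## is invertible, so ##B=(E_k\cdots E_1)^{-1}## is invertible, with ##B^{-1}=E_k\cdots E_1##.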
 
  • #24
PeroK said:
Is that what you set out to show? That ##B## is one-to-one.
Yes

PeroK said:
There's no need to talk about full rank, pivot columns or row equivalence. You should have been aiming to show that ##B## is onto (or surjective, if that's the terminology you are familiar with).
I think I can show ##B## is onto, but by using pivots. Since ##B## has a pivot position in each row, ##B## is onto.

How do I show it without using pivots?

Thanks
 
  • #25
songoku said:
My lessons have not covered those things. For rank, the teacher's notes only state the "rank theorem", not rank-nullity, and I have only learned about the rank of a matrix so far, not yet about the rank of a product of matrices.

Thanks

What is the rank theorem?
 
  • #26
Office_Shredder said:
What is the rank theorem?
If B has n columns, rank B + dim nul B = n
 
  • #27
songoku said:
If B has n columns, rank B + dim nul B = n

This is also known as the rank-nullity theorem. So what is dim nul B, given what you have figured out so far?
 
  • Like
Likes songoku
  • #28
Office_Shredder said:
This is also known as the rank nullity theorem. So what is dim nul B, given what you have figured out so far?
dim nul B = 0
 
  • #29
songoku said:
dim nul B = 0

This is correct, and it will be more educational if you think about what other conclusions you can now make.
 
  • Like
Likes songoku
  • #30
Office_Shredder said:
This is correct, and it will be more educational if you think about what other conclusions you can now make.
I have tried to answer the question in post #18
songoku said:
Since ##\dim(\operatorname{nul} B) = 0##, we get ##\operatorname{rank} B = m##, i.e. ##B## has full rank, so ##B## has ##m## pivot columns and is row equivalent to ##I##. This means that ##B## is invertible.

Or do you refer to post #24?
songoku said:
I think I can show ##B## is onto, but by using pivots. Since ##B## has a pivot position in each row, ##B## is onto.

How do I show it without using pivots?

Thanks
 
  • #31
songoku said:
I think I can show ##B## is onto, but by using pivots. Since ##B## has a pivot position in each row, ##B## is onto.

How do I show it without using pivots?

Thanks
I must admit I've never heard of "pivot point" in linear algebra. You can show that ##B## is onto if it is one-to-one by using a basis for the vector space, ##V##, where ##B: V \to V## is a linear mapping.

Let ##\{ e_1, e_2, \dots, e_n\}## be a basis for ##V## and consider ##\{ Be_1, Be_2, \dots, Be_n\}##. By the linearity of ##B## we have:
$$a_1Be_1 + a_2Be_2 + \dots + a_nBe_n = 0 \ \Rightarrow \ B(a_1e_1 + \dots + a_ne_n) = 0$$(And, as ##B## is one-to-one and the ##e_i## are linearly independent):
$$\Rightarrow \ a_1e_1 + \dots + a_ne_n = 0 \ \Rightarrow \ a_1 = a_2 = \dots = a_n = 0$$This shows that the vectors ##Be_i## are linearly independent and hence a basis for ##V##.

Finally, if ##v \in V##, then for some scalars ##v_i##:
$$v = v_1Be_1 + \dots + v_nBe_n = B(v_1e_1 + \dots + v_ne_n)$$And, as ##v## was arbitrary, we see that ##B## is onto.
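And once ##B## is known to be invertible, the relation to the original ##AB=I## also falls out: multiplying ##AB=I## on the right by ##B^{-1}## gives ##A=(AB)B^{-1}=B^{-1}##, so ##BA=BB^{-1}=I## as well.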
 
  • Like
Likes songoku
  • #32
PeroK said:
This shows that the vectors ##Be_i## are linearly independent and hence a basis for ##V##.

Finally, if ##v \in V##, then for some scalars ##v_i##:
$$v = v_1Be_1 + \dots + v_nBe_n = B(v_1e_1 + \dots + v_ne_n)$$And, as ##v## was arbitrary, we see that ##B## is onto.
And how does this differ from the rank-nullity theorem the OP already correctly used? You use surjectivity to prove surjectivity. But I don't want to confuse the OP more than he already is. His solution might not have been perfectly phrased; nevertheless, it was correct (post #18).

The idea with the pivots is OK. In the end, we do not have enough information about what he may use according to his book, and what lies ahead.
 
  • #33
fresh_42 said:
And how does this differ from the rank-nullity theorem the OP already correctly used?
The OP asked how to prove it without using pivots, so I showed him.

fresh_42 said:
You use surjectivity to prove surjectivity.
I used injectivity to prove surjectivity. Which seems like an important concept in linear algebra and a neat solution.
 
  • #34
PeroK said:
The OP asked how to prove it without using pivots, so I showed him. I used injectivity to prove surjectivity. Which seems like an important concept in linear algebra and a neat solution.
Yes, and it is called the rank-nullity theorem.
 
  • #35
PeroK said:
I must admit I've never heard of "pivot point" in linear algebra.
Maybe the term that you are familiar with is "pivot" or "pivot element" or "pivot position".

Thank you very much for all the help and explanation, fresh_42, PeroK, Office_Shredder, Addez123, WWGD.
 
  • Like
Likes Delta2, WWGD and fresh_42
