Proving statements regarding invertible matrices

In summary: the columns of $A$ span $\mathbb R^n$ if and only if the linear transformation $\mathbf x \mapsto A \mathbf x$ maps $\mathbb R^n$ onto $\mathbb R^n$. If the columns span, then every $\mathbf v \in \mathbb R^n$ can be written as $\mathbf v = \lambda_1 \mathbf a_1 + \lambda_2 \mathbf a_2 + ... + \lambda_n \mathbf a_n$, which is exactly $A \mathbf x = \mathbf v$ with $\mathbf x = (\lambda_1, ..., \lambda_n)^T$; conversely, surjectivity of the map produces such coefficients for every $\mathbf v$.
  • #1
Bruce Wayne1
I've been watching the OCW Linear Algebra course and I have a textbook, and I came across something that I think is fascinating: the Invertible Matrix Theorem. I've seen some proofs, and I find that the statements are often linked in different orders, and sometimes the author will cite one tiny little theorem and justify 5 statements at once. I've been able to prove a few on my own, but I'm looking at some and I don't understand exactly why they work. OK. I want to prove that the columns of matrix A spanning Rn is equivalent to the linear transformation x->Ax mapping Rn onto Rn, and I would like to prove the converse is also true.

So my thinking is:

Assume the columns of matrix A span Rn. By the definition of Ax , for each b in Rn , the equation Ax=b has a solution. This implies that T(x)=b has at least one solution.

That's when I get myself totally confused. I think I'm missing a few intermediary steps in the proof. How do I do this proof?
 
  • #2
Hi there. How many other parts of the invertible matrix theorem may we use while doing this proof? If we look at this list of components of the theorem, you are trying to prove that (h) implies (i) and the converse. Can we use (a)-(g)? If not, you'll have to define and prove some other things first, I believe.
 
  • #3
Jameson said:
Hi there. How many other parts of the invertible matrix theorem may we use while doing this proof? If we look at this list of components of the theorem, you are trying to prove that (h) implies (i) and the converse. Can we use (a)-(g)? If not, you'll have to define and prove some other things first, I believe.

Correct, I am trying to prove (h) <-> (i) from that list. I'm trying to do it directly between those two, without using the others. However, I do understand that (g) can be used, as I tried, but how do I justify it? In other words, I need each step justified. I'm not looking to just say "oh, statement blah is equivalent, done," because that's how the book does it, and I'm trying to justify each step.
 
  • #4
Bruce Wayne said:
I've been watching the OCW Linear Algebra course and I have a textbook, and I came across something that I think is fascinating: the Invertible Matrix Theorem. I've seen some proofs, and I find that the statements are often linked in different orders, and sometimes the author will cite one tiny little theorem and justify 5 statements at once. I've been able to prove a few on my own, but I'm looking at some and I don't understand exactly why they work. OK. I want to prove that the columns of matrix A spanning Rn is equivalent to the linear transformation x->Ax mapping Rn onto Rn, and I would like to prove the converse is also true.

So my thinking is:

Assume the columns of matrix A span Rn. By the definition of Ax , for each b in Rn , the equation Ax=b has a solution. This implies that T(x)=b has at least one solution.

That's when I get myself totally confused. I think I'm missing a few intermediary steps in the proof. How do I do this proof?

Hail, mighty B! (Bow)

In any proof you should always start from the definitions.

Definition from wiki:
the span of a set S may be defined as the set of all finite linear combinations of elements of S.
In your case that means that any vector $\mathbf v \in \mathbb R^n$ can be written as
$\qquad \mathbf v = \lambda_1 \mathbf a_1 + \lambda_2 \mathbf a_2 + ... + \lambda_n \mathbf a_n$
where the $\mathbf a_i$ are the column vectors of the matrix $A$, and the $\lambda_i$ are the coefficients of the linear combination with respect to these column vectors.

The definition from wiki of a function that maps onto (i.e. is surjective):
a function f with domain X and codomain Y is surjective if for every y in Y there exists at least one x in X with f(x)=y.
In your case that means that for any vector $\mathbf v \in \mathbb R^n$ there is at least one vector $\mathbf x \in \mathbb R^n$ such that $A \mathbf x = \mathbf v$.
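(Just as a numerical aside, not part of the proof: for a concrete matrix you can test whether the columns span $\mathbb R^n$ by computing the rank. The matrix below is only an illustrative example, assuming NumPy is available.)

```python
import numpy as np

# Illustrative 3x3 example; a_1, a_2, a_3 are its columns.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])

n = A.shape[0]
# The columns span R^n exactly when the column space has dimension n,
# i.e. when rank(A) = n.
print(np.linalg.matrix_rank(A) == n)  # True here, so the columns span R^3
```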
So let's write your first half of the proof a bit neater.

Lemma 1
If the columns of matrix $A$ span $\mathbb R^n$, then the linear transformation $\mathbf x \mapsto A \mathbf x$ maps $\mathbb R^n$ onto $\mathbb R^n$.

Proof
Assume the columns of matrix $A$ span $\mathbb R^n$.

Then any vector $\mathbf v \in \mathbb R^n$ can be written as
$\qquad \mathbf v = \lambda_1 \mathbf a_1 + \lambda_2 \mathbf a_2 + ... + \lambda_n \mathbf a_n$
where the $\mathbf a_i$ are the column vectors of the matrix $A$, and the $\lambda_i$ are the coefficients of the linear combination with respect to these column vectors.

Using the definition of matrix multiplication, we can rewrite this as:
$$\mathbf v = A \begin{bmatrix}\lambda_1 \\ \lambda_2 \\ \vdots \\ \lambda_n\end{bmatrix}$$
In other words, for any vector $\mathbf v \in \mathbb R^n$ there is at least one vector $\mathbf x \in \mathbb R^n$ such that $A \mathbf x = \mathbf v$.

Therefore the linear transformation $\mathbf x \mapsto A \mathbf x$ maps $\mathbb R^n$ onto $\mathbb R^n$. $\qquad \blacksquare$
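If you want to see the lemma in action on a concrete matrix, here is a small numerical sanity check (a sketch assuming NumPy; the matrix is made up for illustration). The coefficients $\lambda_i$ are just the entries of the solution of $A \mathbf x = \mathbf v$.

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])   # rank 3, so its columns span R^3

v = np.array([3.0, -1.0, 4.0])    # an arbitrary target vector

lam = np.linalg.solve(A, v)       # the coefficients lambda_1, ..., lambda_n
# v is recovered as the linear combination lambda_1 a_1 + ... + lambda_n a_n:
recombined = sum(lam[i] * A[:, i] for i in range(A.shape[1]))
print(np.allclose(recombined, v))  # True
```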
Perhaps you can do the second (converse) half of the proof?
 
  • #5
I like Serena said:
Hail, mighty B! (Bow)

In any proof you should always start from the definitions.

Definition from wiki:
the span of a set S may be defined as the set of all finite linear combinations of elements of S.
In your case that means that any vector $\mathbf v \in \mathbb R^n$ can be written as
$\qquad \mathbf v = \lambda_1 \mathbf a_1 + \lambda_2 \mathbf a_2 + ... + \lambda_n \mathbf a_n$
where the $\mathbf a_i$ are the column vectors of the matrix $A$, and the $\lambda_i$ are the coefficients of the linear combination with respect to these column vectors.

The definition from wiki of a function that maps onto (i.e. is surjective):
a function f with domain X and codomain Y is surjective if for every y in Y there exists at least one x in X with f(x)=y.
In your case that means that for any vector $\mathbf v \in \mathbb R^n$ there is at least one vector $\mathbf x \in \mathbb R^n$ such that $A \mathbf x = \mathbf v$.
So let's write your first half of the proof a bit neater.

Lemma 1
If the columns of matrix $A$ span $\mathbb R^n$, then the linear transformation $\mathbf x \mapsto A \mathbf x$ maps $\mathbb R^n$ onto $\mathbb R^n$.

Proof
Assume the columns of matrix $A$ span $\mathbb R^n$.

Then any vector $\mathbf v \in \mathbb R^n$ can be written as
$\qquad \mathbf v = \lambda_1 \mathbf a_1 + \lambda_2 \mathbf a_2 + ... + \lambda_n \mathbf a_n$
where the $\mathbf a_i$ are the column vectors of the matrix $A$, and the $\lambda_i$ are the coefficients of the linear combination with respect to these column vectors.

Using the definition of matrix multiplication, we can rewrite this as:
$$\mathbf v = A \begin{bmatrix}\lambda_1 \\ \lambda_2 \\ \vdots \\ \lambda_n\end{bmatrix}$$
In other words, for any vector $\mathbf v \in \mathbb R^n$ there is at least one vector $\mathbf x \in \mathbb R^n$ such that $A \mathbf x = \mathbf v$.

Therefore the linear transformation $\mathbf x \mapsto A \mathbf x$ maps $\mathbb R^n$ onto $\mathbb R^n$. $\qquad \blacksquare$
Perhaps you can do the second (converse) half of the proof?

Thanks. Here's my attempt:

Assume the linear transformation $\mathbf x \mapsto A \mathbf x$ maps $\mathbb R^n$ onto $\mathbb R^n$.

For all $\mathbf v \in \mathbb R^n$ there is at least one vector $\mathbf x \in \mathbb R^n$ such that $A \mathbf x = \mathbf v$.

Writing the components of $\mathbf x$ as $\lambda_1, \lambda_2, ..., \lambda_n$, the equation $A \mathbf x = \mathbf v$ can be rewritten as $$\mathbf v = A \begin{bmatrix}\lambda_1 \\ \lambda_2 \\ \vdots \\ \lambda_n\end{bmatrix}$$

This can further be rewritten as $\qquad \mathbf v = \lambda_1 \mathbf a_1 + \lambda_2 \mathbf a_2 + ... + \lambda_n \mathbf a_n$

Therefore the columns of matrix $A$ span $\mathbb R^n$. $\qquad \blacksquare$
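A tiny worked instance of this direction, with an illustrative $2 \times 2$ matrix rather than anything from the thread: take $A = \begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix}$ and $\mathbf v = \begin{bmatrix}3 \\ 2\end{bmatrix}$. Since the map is onto, there is an $\mathbf x = \begin{bmatrix}1 \\ 2\end{bmatrix}$ with $A \mathbf x = \mathbf v$, and reading off its components gives $\mathbf v = 1 \cdot \mathbf a_1 + 2 \cdot \mathbf a_2$, so $\mathbf v$ lies in the span of the columns.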
 
  • #6
Bruce Wayne said:
Thanks. Here's my attempt:

Assume the linear transformation $\mathbf x \mapsto A \mathbf x$ maps $\mathbb R^n$ onto $\mathbb R^n$.

For all $\mathbf v \in \mathbb R^n$ there is at least one vector $\mathbf x \in \mathbb R^n$ such that $A \mathbf x = \mathbf v$.

Writing the components of $\mathbf x$ as $\lambda_1, \lambda_2, ..., \lambda_n$, the equation $A \mathbf x = \mathbf v$ can be rewritten as $$\mathbf v = A \begin{bmatrix}\lambda_1 \\ \lambda_2 \\ \vdots \\ \lambda_n\end{bmatrix}$$

This can further be rewritten as $\qquad \mathbf v = \lambda_1 \mathbf a_1 + \lambda_2 \mathbf a_2 + ... + \lambda_n \mathbf a_n$

Therefore the columns of matrix $A$ span $\mathbb R^n$. $\qquad \blacksquare$

Looks just fine! ;)
 
  • #7
Thanks! It was a lot simpler than I had anticipated. (Smile)
 

FAQ: Proving statements regarding invertible matrices

What is an invertible matrix?

An invertible matrix is a square matrix $A$ that has an inverse: a matrix $A^{-1}$ with $AA^{-1} = A^{-1}A = I$. Equivalently, the system of linear equations $A\mathbf x = \mathbf b$ has a unique solution for every right-hand side $\mathbf b$, so the matrix can be "reversed" or "inverted" to recover the original values of the variables.
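For instance, $A = \begin{bmatrix}2 & 1 \\ 1 & 1\end{bmatrix}$ is invertible with $A^{-1} = \begin{bmatrix}1 & -1 \\ -1 & 2\end{bmatrix}$, since $AA^{-1} = A^{-1}A = I$.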

How can I prove that a matrix is invertible?

One way to prove that a matrix is invertible is by showing that its determinant is not equal to zero. Another method is by demonstrating that the matrix can be reduced to the identity matrix through elementary row operations.
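As an illustrative sketch of both checks (assuming NumPy is available; the matrix is made up for the example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# Check 1: the determinant is nonzero (up to floating-point tolerance).
print(not np.isclose(np.linalg.det(A), 0.0))     # True => invertible

# Check 2: full rank, which is what "row reduces to the identity" amounts to.
print(np.linalg.matrix_rank(A) == A.shape[0])    # True => invertible
```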

Can all square matrices be inverted?

No, not all square matrices are invertible. A square matrix must have a non-zero determinant in order to be inverted. If the determinant is equal to zero, then the matrix is not invertible and is known as a singular matrix.
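For example (a sketch assuming NumPy), the matrix below is singular because its second column is twice its first:

```python
import numpy as np

S = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # columns are linearly dependent

print(np.isclose(np.linalg.det(S), 0.0))   # True: determinant is zero

try:
    np.linalg.inv(S)
except np.linalg.LinAlgError as err:
    print("not invertible:", err)          # singular matrices cannot be inverted
```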

What is the significance of invertible matrices?

Invertible matrices are important in many areas of mathematics and science, as they allow us to solve systems of linear equations and perform other operations such as finding the inverse of a matrix. They also have applications in fields such as engineering, physics, and economics.

Are there any properties of invertible matrices?

Yes, there are several properties of invertible matrices. They include: an invertible matrix multiplied by its inverse results in the identity matrix, the inverse of an invertible matrix is also invertible, and the transpose of an invertible matrix is also invertible.
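These properties can be spot-checked numerically on a concrete example (a sketch assuming NumPy; the matrix is illustrative only):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

print(np.allclose(A @ A_inv, np.eye(2)))          # A A^{-1} = I
print(np.allclose(np.linalg.inv(A_inv), A))       # (A^{-1})^{-1} = A
print(np.allclose(np.linalg.inv(A.T), A_inv.T))   # (A^T)^{-1} = (A^{-1})^T
```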
