Proving the Independence and Span of Matrix Columns | Theorem Help

In summary: if A is an mxn matrix with rank n and columns C_1, ..., C_n, then {A^T C_1, ..., A^T C_n} is a basis of R^n. The key observation is that the matrix whose columns are the A^T C_i is exactly A^T A, which is nonsingular whenever rank A = n.
  • #1
stunner5000pt
Let A be an mxn matrix with columns C_1, ..., C_n. If rank A = n, show that
[itex] \{A^T C_{1}, \ldots, A^T C_{n}\} [/itex] is a basis of R^n


since rank A = n, the columns are linearly independent

so does that automatically mean that any multiple, like A transpose for example, will keep the independence of the columns?

A theorem also tells us that if rank A = n, then the columns span R^n. So the columns span R^n in this case

is this adequate for a proof?
 
  • #2
No, it is not. Multiplication by A^T is not just multiplication by a scalar. It could also rotate the vectors.

You need to show that the set {A^T C_1, ..., A^T C_n} is linearly independent, which implies it is a basis of R^n since it has n elements. Assume otherwise. Then you have a_1 A^T C_1 + ... + a_n A^T C_n = 0 for a_1, ..., a_n not all zero. Can you manipulate this to get a contradiction?
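To make the setup concrete, here is a quick numerical sketch (numpy; the 3x2 matrix is an arbitrary illustrative choice, not part of the problem):

[code]
import numpy as np

# Arbitrary 3x2 example (m=3, n=2) with rank A = n = 2
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
assert np.linalg.matrix_rank(A) == 2

# Form the vectors A^T C_1, A^T C_2 (each lives in R^2)
B = np.column_stack([A.T @ A[:, j] for j in range(A.shape[1])])

# They form a basis of R^2 iff this 2x2 matrix has full rank
print(np.linalg.matrix_rank(B))  # 2
[/code]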
 
  • #3
The answer is below. Highlight the first white line if you need a hint. Every time you need a hint, highlight the next line. For your own good, highlight as few lines as possible, i.e. try to do most of it yourself.

{A^T C_1, ..., A^T C_n} is a basis of R^n

iff {A^T C_1, ..., A^T C_n} is linearly independent

iff the columns of the matrix (A^T C_1 ... A^T C_n) are linearly independent

iff det[(A^T C_1 ... A^T C_n)] is non-zero

iff det[A^T (C_1 ... C_n)] is non-zero

iff det[A^T A] is non-zero

iff det[A^T] det[A] is non-zero

iff det[A] det[A] is non-zero

iff det[A] is non-zero

iff rank[A] = n
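As a numerical sanity check of this chain (numpy; an arbitrary 3x3 example, so A is square and every determinant below is defined):

[code]
import numpy as np

# Arbitrary square example (m = n = 3)
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
M = A.T @ A  # equals the matrix (A^T C_1 ... A^T C_n)

print(np.linalg.matrix_rank(A) == 3)  # True: rank A = n
print(np.isclose(np.linalg.det(M),
                 np.linalg.det(A.T) * np.linalg.det(A)))  # True
print(abs(np.linalg.det(A)) > 0)  # True, as the chain predicts
[/code]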
 
  • #4
AKG said:
The answer is below. ...

{A^T C_1, ..., A^T C_n} is a basis of R^n
...
iff det[A^T A] is non-zero
iff det[A^T] det[A] is non-zero
...
iff rank[A] = n
This would be true if A were an nxn matrix, but the question specifies that A is an mxn matrix.
 
  • #5
0rthodontist said:
You need to show that the set {A^T C_1, ..., A^T C_n} is linearly independent... Then you have a_1 A^T C_1 + ... + a_n A^T C_n = 0 for a_1, ..., a_n not all zero. Can you manipulate this to get a contradiction?

ok, suppose that
[tex] a_{1}A^T C_{1} + ... + a_{n}A^T C_{n} = 0 [/tex]
where not all a_i are zero

we know that A is not a zero matrix because rank A = n <= m,
now A transpose is not zero,
then the columns must all be zero.
But again A is not zero. Thus by contradiction we get that all the a_i must be zero.

Is that good?
 
  • #6
nocturnal said:
This would be true if A were a nxn matrix, but the question specifies that A is a mxn matrix.
Oops, I overlooked that.
 
  • #7
stunner5000pt said:
ok, suppose that
[tex] a_{1}A^T C_{1} + ... + a_{n}A^T C_{n} = 0 [/tex]
where not all a_i are zero
...
now A transpose is not zero,
then the columns must all be zero.
...
Is that good?
No. Why is "then the columns must all be zero" true? Try this:

{A^T C_1, ..., A^T C_n} is a basis
iff {A^T C_1, ..., A^T C_n} is linearly independent
iff c_1 A^T C_1 + ... + c_n A^T C_n = 0 implies c_1 = ... = c_n = 0
iff A^T(c_1 C_1 + ... + c_n C_n) = 0 implies c_1 = ... = c_n = 0
iff A^T(c_1 C_1 + ... + c_n C_n) = 0 implies c_1 C_1 + ... + c_n C_n = 0 [since the C_i are linearly independent because rank[A] = n, so [itex]\sum c_iC_i = 0 \Leftrightarrow \forall i,\, c_i=0[/itex]]
iff A^T X = 0 implies X = 0

But rank[A^T] = rank[A] = n. Is there some theorem which says that, given this, the last line above must be true?
 
  • #8
A^T has all independent columns since it has rank n. Therefore, A^T X = 0 can only be true if X is 0. Otherwise you would have a linear combination of the columns of A^T, with coefficients not all zero, yielding 0, which we know cannot happen.
 
  • #9
A^T has all independent columns since it has rank n
You mean rows.


Anyways, I think everyone's overlooking something important!

[tex] a_{1}A^T C_{1} + ... + a_{n}A^T C_{n} = 0 [/tex]

This means that the vector [itex]a_1 C_1 + \cdots + a_n C_n[/itex] is an element of the null space of [itex]A^T[/itex]. People seem to be arguing that the null space is trivial, and therefore the vector must be zero.

But that's wrong!

[itex]A^T[/itex] is a rank n nxm matrix. Since it's operating on an m-dimensional space, it must have a null space of dimension (m-n).


You have to use the fact that these m-long column vectors are special -- each of the [itex]C_i[/itex] is the transpose of one of the rows of [itex]A^T[/itex].


In fact, if we place the [itex]A^T C_i[/itex] into a matrix, we get:

[tex]
[ A^T C_1 \mid A^T C_2 \mid \cdots \mid A^T C_n ] = A^T A
[/tex]

There's actually a 1-line proof that this (square) matrix is nonsingular, but I think the approach you guys are using ought to work, as long as you start using the fact that the [itex]C_i[/itex] are special m-long vectors, and not arbitrary m-long vectors.
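Both of these points are easy to check numerically. A numpy sketch with an arbitrary 4x2 rank-2 matrix (hypothetical values, purely for illustration):

[code]
import numpy as np

# Arbitrary 4x2 example (m=4, n=2) with rank A = n
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [2.0, 1.0]])
m, n = A.shape
assert np.linalg.matrix_rank(A) == n

# Rank-nullity: A^T is nxm with rank n, so dim null(A^T) = m - n = 2;
# the null space really is NOT trivial here
print(m - np.linalg.matrix_rank(A.T))  # 2

# Yet [A^T C_1 | ... | A^T C_n] = A^T A is nonsingular, because the C_i
# are the columns of A itself, not arbitrary m-long vectors
G = np.column_stack([A.T @ A[:, j] for j in range(n)])
print(np.allclose(G, A.T @ A))    # True
print(abs(np.linalg.det(G)) > 0)  # True
[/code]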
 
  • #10
Hurkyl said:
You mean rows.

Oh yeah... I was picturing it wrong.
 
  • #11
Hurkyl said:
There's actually a 1-line proof that this (square) matrix is nonsingular
When m=n, this is easy, and that's what I did in my first post because I assumed we were dealing with square matrices. If m is not n, what is the 1-line proof?
 
  • #12
[tex]
A^TA \vec{v} = 0 \implies \vec{v}^T A^T A \vec{v} = 0
\implies (A\vec{v})^T (A \vec{v}) = 0 \implies
A \vec{v} = 0 \implies \vec{v} = 0[/tex]
where the last implication follows because A has rank n and is operating on R^n.
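Spelling out the middle step: [tex](A\vec{v})^T (A\vec{v}) = \|A\vec{v}\|^2,[/tex] and only the zero vector has zero norm, so [itex]A\vec{v} = 0[/itex]; the final implication then uses that the n columns of A are linearly independent (rank n), so [itex]A\vec{v} = 0[/itex] forces [itex]\vec{v} = 0[/itex].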
 
  • #13
Very nice, thanks Hurkyl.
 

FAQ: Proving the Independence and Span of Matrix Columns | Theorem Help

What is the theorem for proving the independence and span of matrix columns?

The relevant fact is that the dimension of the span of a linearly independent set equals the number of vectors in the set. In particular, n linearly independent vectors in R^n automatically span R^n, and therefore form a basis.

How do you prove the independence of matrix columns?

To prove the independence of matrix columns, set a general linear combination of the columns equal to zero; this gives a homogeneous system of equations, and you must show that its only solution is the trivial one (all coefficients equal to 0).
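In code, that check can be sketched as follows (numpy; the 3x2 matrix is a hypothetical example):

[code]
import numpy as np

# The columns of M are independent iff M x = 0 has only the trivial
# solution, which for a numerical matrix can be read off from the rank
M = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
print(np.linalg.matrix_rank(M) == M.shape[1])  # True: columns independent
[/code]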

What is the significance of proving the independence of matrix columns?

Proving the independence of matrix columns is important in linear algebra because it allows us to determine the rank of a matrix and solve systems of equations. It also helps us identify the basis of a vector space.

Can you use the same method to prove the independence of rows in a matrix?

Yes, the same method works for the rows of a matrix: transpose the matrix so that its rows become columns, then apply the same argument to the columns of the transpose.

How does the span of matrix columns relate to the concept of linear combinations?

The span of the matrix columns is the set of all possible linear combinations of the columns, so any vector in the span can be expressed as a linear combination of the matrix columns. Proving the independence of the columns then tells us how many linearly independent vectors the set contains, and thus the dimension of the span.
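Concretely, membership in the span can be tested by solving for the coefficients of such a combination (a numpy sketch with hypothetical values):

[code]
import numpy as np

# Is b in the span of the columns of M? Solve M x = b in the least-squares
# sense and check the residual.
M = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([2.0, 3.0, 5.0])  # equals 2*C1 + 3*C2, so it lies in the span
x, *_ = np.linalg.lstsq(M, b, rcond=None)
print(np.allclose(M @ x, b))   # True: b is a linear combination of the columns
[/code]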
