Proving the Independence and Span of Matrix Columns | Theorem Help

  • Thread starter: stunner5000pt
  • Tags: Matrices, Theorem
stunner5000pt
Let A be an m x n matrix with columns C_1, ..., C_n. If rank A = n, show that {A^T C_1, ..., A^T C_n} is a basis of R^n.


since rank A = n, the columns are linearly independent

so does that automatically mean that multiplying them, by A^T for example, will keep the columns independent?

A theorem also tells us that if rank A = n, then the columns span R^n. So the columns span R^n in this case.

is this adequate for a proof?
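
A quick numerical illustration of the claim (a numpy sketch, not a proof): for a random m x n matrix A of rank n, the vectors A^T C_1, ..., A^T C_n come out linearly independent, and n independent vectors in R^n form a basis.

import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3
A = rng.standard_normal((m, n))      # a generic m x n matrix has rank n
assert np.linalg.matrix_rank(A) == n

# Stack A^T C_1, ..., A^T C_n as the columns of an n x n matrix.
B = np.column_stack([A.T @ A[:, i] for i in range(n)])
print(np.linalg.matrix_rank(B))      # prints 3, so the A^T C_i span R^3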
 
No, it is not. Multiplication by A^T is not just multiplication by a scalar. It can also rotate the vectors.

You need to show that the set {A^T C_1, ..., A^T C_n} is linearly independent, which implies it is a basis of R^n since it has n elements. Assume otherwise: then a_1 A^T C_1 + ... + a_n A^T C_n = 0 for a_1, ..., a_n not all zero. Can you manipulate this to get a contradiction?
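
One way to begin the suggested manipulation (just the factoring step, written out; the rest of the thread works out why this forces the a_i to vanish):

a_1 A^T C_1 + \cdots + a_n A^T C_n = A^T (a_1 C_1 + \cdots + a_n C_n) = A^T A \, \vec{a}, \qquad \vec{a} = (a_1, \dots, a_n)^T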
 
The answer is below. Highlight the first white line if you need a hint. Every time you need a hint, highlight the next line. For your own good, highlight as few lines as possible, i.e. try to do most of it yourself.

{A^T C_1, ..., A^T C_n} is a basis of R^n

iff {A^T C_1, ..., A^T C_n} is linearly independent

iff the columns of the matrix (A^T C_1 ... A^T C_n) are linearly independent

iff det[(A^T C_1 ... A^T C_n)] is non-zero

iff det[A^T (C_1 ... C_n)] is non-zero

iff det[A^T A] is non-zero

iff det[A^T] det[A] is non-zero

iff det[A] det[A] is non-zero

iff det[A] is non-zero

iff rank[A] = n
 
AKG said:
The answer is below. Highlight the first white line if you need a hint. Every time you need a hint, highlight the next line. For your own good, highlight as few lines as possible, i.e. try to do most of it yourself.

{A^T C_1, ..., A^T C_n} is a basis of R^n

iff {A^T C_1, ..., A^T C_n} is linearly independent

iff the columns of the matrix (A^T C_1 ... A^T C_n) are linearly independent

iff det[(A^T C_1 ... A^T C_n)] is non-zero

iff det[A^T (C_1 ... C_n)] is non-zero

iff det[A^T A] is non-zero

iff det[A^T] det[A] is non-zero

iff det[A] det[A] is non-zero

iff det[A] is non-zero

iff rank[A] = n

This would be true if A were an n x n matrix, but the question specifies that A is an m x n matrix.
 
0rthodontist said:
No, it is not. Multiplication by A^T is not just multiplication by a scalar. It can also rotate the vectors.

You need to show that the set {A^T C_1, ..., A^T C_n} is linearly independent, which implies it is a basis of R^n since it has n elements. Assume otherwise: then a_1 A^T C_1 + ... + a_n A^T C_n = 0 for a_1, ..., a_n not all zero. Can you manipulate this to get a contradiction?

ok suppose that
a_1 A^T C_1 + ... + a_n A^T C_n = 0
where not all a_i are zero

we know that A is not a zero matrix because rank A = n <= m
so A^T is not zero either
then the columns must all be zero
but again A is not zero, so by contradiction we get that all the a_i must be zero

Is that good?
 
nocturnal said:
This would be true if A were an n x n matrix, but the question specifies that A is an m x n matrix.
Oops, I overlooked that.
 
stunner5000pt said:
ok suppose that
a_1 A^T C_1 + ... + a_n A^T C_n = 0
where not all a_i are zero

we know that A is not a zero matrix because rank A = n <= m
so A^T is not zero either
then the columns must all be zero
but again A is not zero, so by contradiction we get that all the a_i must be zero

Is that good?
No. Why is "then the columns must all be zero" true? Try this:

{A^T C_1, ..., A^T C_n} is a basis
iff {A^T C_1, ..., A^T C_n} is linearly independent
iff c_1 A^T C_1 + ... + c_n A^T C_n = 0 implies c_1 = ... = c_n = 0
iff A^T (c_1 C_1 + ... + c_n C_n) = 0 implies c_1 = ... = c_n = 0
iff A^T (c_1 C_1 + ... + c_n C_n) = 0 implies c_1 C_1 + ... + c_n C_n = 0 [since the C_i are linearly independent because rank[A] = n, so \sum c_i C_i = 0 \Leftrightarrow \forall i,\, c_i = 0]
iff A^T X = 0 implies X = 0

But rank[A^T] = rank[A] = n. Is there some theorem which says that, given this, the last line above must be true?
 
A^T has all independent columns since it has rank n. Therefore, A^T X = 0 can only be true if X is 0. Otherwise you would have a linear combination of the columns of A^T, with coefficients not all zero, yielding 0, which we know cannot happen.
 
0rthodontist said:
A^T has all independent columns since it has rank n
You mean rows.


Anyways, I think everyone's overlooking something important!

a_{1}A^T C_{1} + ... + a_{n}A^T C_{n} = 0

This means that the vector a_1 C_1 + \cdots + a_n C_n is an element of the null space of A^T. People seem to be arguing that the null space is trivial, and therefore the vector must be zero.

But that's wrong!

A^T is a rank-n, n x m matrix. Since it operates on an m-dimensional space, rank-nullity gives it a null space of dimension (m - n).


You have to use the fact that these m-long column vectors are special: each C_i is the transpose of one of the rows of A^T.


In fact, if we place the A^T C_i into a matrix, we get:

[ A^T C_1 \mid A^T C_2 \mid \cdots \mid A^T C_n ] = A^T A

There's actually a 1-line proof that this (square) matrix is nonsingular, but I think the approach you guys are using ought to work, as long as you start using the fact that the C_i are special m-long vectors, and not arbitrary m-long vectors.
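
A short numpy sketch of this identity (an illustration under the same rank-n assumption, not part of the proof): stacking the A^T C_i as columns reproduces A^T A, and that square matrix comes out nonsingular.

import numpy as np

rng = np.random.default_rng(1)
m, n = 6, 4
A = rng.standard_normal((m, n))
assert np.linalg.matrix_rank(A) == n

cols = np.column_stack([A.T @ A[:, i] for i in range(n)])   # [ A^T C_1 | ... | A^T C_n ]
assert np.allclose(cols, A.T @ A)                           # the identity above
print(np.linalg.det(A.T @ A))                               # nonzero, so A^T A is nonsingular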
 
Hurkyl said:
You mean rows.

Oh yeah... I was picturing it wrong.
 
Hurkyl said:
There's actually a 1-line proof that this (square) matrix is nonsingular
When m = n, this is easy, and that's what I did in my first post because I assumed we were dealing with square matrices. If m ≠ n, what is the 1-line proof?
 
A^T A \vec{v} = 0 \implies \vec{v}^T A^T A \vec{v} = 0 \implies (A\vec{v})^T (A\vec{v}) = 0 \implies A\vec{v} = 0 \implies \vec{v} = 0

where the third implication holds because (A\vec{v})^T (A\vec{v}) = \|A\vec{v}\|^2, and the last one because A has rank n and is operating on R^n.
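
A numerical companion to this argument (a numpy sketch under the same assumptions): when m > n, A^T by itself has a null space of dimension m - n, yet A^T A still has only the trivial null space, exactly as the chain of implications predicts.

import numpy as np

rng = np.random.default_rng(2)
m, n = 6, 4
A = rng.standard_normal((m, n))
assert np.linalg.matrix_rank(A) == n

# Null space dimensions via rank-nullity.
print(m - np.linalg.matrix_rank(A.T))        # prints 2: null(A^T) is nontrivial
print(n - np.linalg.matrix_rank(A.T @ A))    # prints 0: null(A^T A) is trivial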
 
Very nice, thanks Hurkyl.
 