Subdot
Homework Statement
"Prove that a set S of three vectors in V3 is a basis for V3 if and only if its linear span L(S) contains the three unit coordinate vectors i, j, k."
Homework Equations
I have the definitions of bases, linear independence, and linear spans. I have the theorems which state that a set of n linearly independent vectors in Vn is a basis, that every basis in Vn contains n vectors, that a set of linearly independent vectors is a subset of some basis, and that any orthogonal set of nonzero vectors is linearly independent.
I also have the theorem that states if a set of k vectors in Vn is linearly independent, then any set of k + 1 vectors in its linear span is linearly dependent. And of course, I have the theorem which states that a set spans every vector in its linear span uniquely if and only if the set spans the zero vector uniquely, and I know that a set is linearly independent if it spans the zero vector uniquely.
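As a side note (not part of the proof), the "spans the zero vector uniquely" criterion is easy to check numerically for three vectors in V3: the only combination giving 0 is the trivial one exactly when the determinant of the matrix with those vectors as columns is nonzero. A minimal sketch, where the example vectors are my own choice:

```python
def det3(M):
    """Determinant of a 3x3 matrix given as a list of rows."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def columns(v1, v2, v3):
    """Matrix whose columns are the vectors v1, v2, v3."""
    return [[v1[k], v2[k], v3[k]] for k in range(3)]

# Hypothetical example vectors, just to illustrate the check.
A1, A2, A3 = (1, 1, 0), (0, 1, 1), (1, 0, 1)
print(det3(columns(A1, A2, A3)) != 0)  # nonzero det: independent, hence a basis
```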
The Attempt at a Solution
I can prove the first part: "A set S of three vectors in V3 contains the three unit coordinate vectors i, j, k in L(S) if S is a basis for V3." I'm having trouble with the second part. I can prove it, but I don't like my proof because it only requires two of the unit coordinate vectors to be in the set's linear span, it isn't *too* easy to generalize to a set of n vectors (which I have to do after this proof), and I think there may be some unnecessary parts to it.
So here is what I've got so far. My plan was to prove S is linearly independent by contradiction. Since S contains three vectors in V3, it would then be a basis for V3 by one of the theorems I listed. To do this, I would first need to prove that none of the vectors in S is the zero vector (which I'll denote as 0).
(1) If i, j, and k are elements of L(S), then d1A1 + d2A2 + d3A3 = i, e1A1 + e2A2 + e3A3 = j, and f1A1 + f2A2 + f3A3 = k for some scalar constants dn, en, and fn where n = 1, 2, 3 and S = {A1, A2, A3}.
Assume S is dependent. Then one or more of the vectors in S might equal 0. First assume all three are zero. This cannot be true because d1A1 + d2A2 + d3A3 = i =/= 0. So not all the vectors in S are 0. Next assume exactly two of the vectors in S are 0. Renumbering the vectors in S if necessary, let all but A1 equal 0. Then d1A1 = i, so A1 is a scalar multiple of i. But e1A1 = j--a contradiction, since j is not a scalar multiple of i. So no two vectors in S can be 0.
Assume exactly one of the vectors in S is 0. Renumbering the vectors in S if necessary, let A3 = 0. Then d1A1 + d2A2 = i = (1, 0, 0).
For this to be true, the last two components of A1 must be scalar multiples of the corresponding components of A2, and the first component of A1 must not be. But e1A1 + e2A2 = j = (0, 1, 0), so the first component of A1 must also be a scalar multiple of the first component of A2--a contradiction.
So no vector in S is 0. Now consider c1A1 + c2A2 + c3A3 = 0, where c1, c2, and c3 are scalar constants. Since by assumption S is dependent, and since no vector in S equals 0, this equation must have a solution in which not all of the constants are 0.
(2) Assume that c3 =/= 0. Then c1A1 + c2A2 = -c3A3, so -(c1/c3)A1 - (c2/c3)A2 = A3, which means the equation d1A1 + d2A2 + d3A3 = i can be rewritten as d1A1 + d2A2 - ((d3c1)/c3)A1 - ((d3c2)/c3)A2 = (d1 - (d3c1)/c3)A1 + (d2 - (d3c2)/c3)A2 = i.
That is, k1A1 + k2A2 = i for scalar constants k1 and k2, and similarly p1A1 + p2A2 = j for scalar constants p1 and p2. But this is the same situation as the case A3 = 0, and so by a similar line of reasoning we reach a contradiction: A3 cannot equal -(c1/c3)A1 - (c2/c3)A2.
But this contradicts the assumption that S is linearly dependent, so the only solution is c1 = c2 = c3 = 0, and S is linearly independent. Since S consists of three vectors in V3, by one of the theorems I mentioned above, S is a basis for V3.
So as you can see, this proof only depends on two unit vectors being in L(S), which I find odd since the hypothesis has all of i, j, and k being members of L(S). In addition, I'm not sure all of it is necessary. Was I right to prove that none of the vectors in S is 0? Could I have skipped straight to the part labeled (2), including there the proof that no two vectors in S can span i, j, and k?
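Not a substitute for the proof, but the direction being proved can be sanity-checked numerically: if each unit coordinate vector is a combination of A1, A2, A3, the coefficients can be found by Cramer's rule, which only produces a (unique) solution when the determinant of the matrix with A1, A2, A3 as columns is nonzero, i.e. when S is independent. A rough sketch, with example vectors of my own choosing:

```python
def det3(M):
    """Determinant of a 3x3 matrix given as a list of rows."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def cramer_solve(M, b):
    """Solve M x = b by Cramer's rule; None if no unique solution."""
    d = det3(M)
    if d == 0:
        return None  # no unique combination of the columns gives b
    x = []
    for j in range(3):
        Mj = [row[:] for row in M]
        for i in range(3):
            Mj[i][j] = b[i]  # replace column j with b
        x.append(det3(Mj) / d)
    return x

# Hypothetical example: columns of M are A1, A2, A3.
M = [[1, 0, 1], [1, 1, 0], [0, 1, 1]]
for k in range(3):
    e = [1 if n == k else 0 for n in range(3)]  # unit vectors i, j, k
    print(cramer_solve(M, e))  # coefficients expressing each in terms of S
```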
Also, I have an alternative proof, which is simpler and quicker, but I'm not sure it's "legal" (and it may still have some unnecessary parts). I also haven't thought about it as much as the one above. Here it is:
If i, j, and k are elements of L(S), then d1A1 + d2A2 + d3A3 = i, e1A1 + e2A2 + e3A3 = j, and f1A1 + f2A2 + f3A3 = k for some scalar constants dn, en, and fn where n = 1, 2, 3 and S = {A1, A2, A3}.
Multiply the first equation by a nonzero scalar constant b1, the second by a nonzero scalar constant b2, and the third by a nonzero scalar constant b3. Then, add all the equations together to get (b1d1A1 + b1d2A2 + b1d3A3) + (b2e1A1 + b2e2A2 + b2e3A3) + (b3f1A1 + b3f2A2 + b3f3A3) = (b1d1 + b2e1 + b3f1)A1 + (b1d2 + b2e2 + b3f2)A2 + (b1d3 + b2e3 + b3f3)A3 = b1i + b2j + b3k.
In other words, q1A1 + q2A2 + q3A3 = b1i + b2j + b3k for some scalar constants q1, q2, and q3. A linear combination of S produces a linear combination of D = {i, j, k}. Since D is a basis for V3, linear combinations of i, j, k span every vector in V3 uniquely. Since a linear combination of A1, A2, and A3 can produce any linear combination of i, j, k, it can produce a unique linear combination for every vector in V3 too. Thus, S is a basis too.
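For what it's worth, the alternative argument has a clean matrix formulation (my own reformulation, with hypothetical example numbers): the three relations say M C = I, where M has A1, A2, A3 as columns and C has the coefficient triples (d1, d2, d3), (e1, e2, e3), (f1, f2, f3) as columns. Then det(M) * det(C) = det(I) = 1, so det(M) cannot be 0, and S is linearly independent:

```python
def det3(M):
    """Determinant of a 3x3 matrix given as a list of rows."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def matmul3(A, B):
    """Product of two 3x3 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Hypothetical example: columns of M are A1, A2, A3;
# columns of C are the coefficient triples for i, j, k.
M = [[1, 0, 1], [1, 1, 0], [0, 1, 1]]
C = [[0.5, 0.5, -0.5], [-0.5, 0.5, 0.5], [0.5, -0.5, 0.5]]
print(matmul3(M, C))      # the 3x3 identity matrix: M C = I
print(det3(M) * det3(C))  # equals 1, so det(M) cannot be 0
```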