Proving vectors are in the column space

In summary: the column space of a matrix is the set of all linear combinations of its column vectors, often denoted ##\operatorname{Span}(S)## where ##S## is the set of columns. It is a subspace, and in particular is closed under addition of vectors; this can be proven by showing that the span is the smallest subspace of a vector space containing the given set of vectors, making it the unique subspace spanned by those vectors.
  • #1
Taylorw369
How would you prove that adding two vectors in the column space would result in another vector in the column space?

I know this is maybe the most basic property of vectors and subspaces, and that the very definition of the column space says it's spanned by vectors in the column space. Is there any way to prove this, though?
 
  • #2
Taylorw369 said:
How would you prove that adding two vectors in the column space would result in another vector in the column space?

I know this is maybe the most basic property of vectors and subspaces, and that the very definition of the column space says it's spanned by vectors in the column space. Is there any way to prove this, though?
What, exactly, are you asking? You say that you know the "very definition of the column space says it's spanned by vectors in the column space." You understand that you don't "prove" definitions, don't you?
 
  • #3
The column space is a subspace, so it is closed under addition of vectors.

EDIT: Sorry, I guess you were trying to show that the column space is a subspace. Let me think it through.

EDIT: By definition (see, e.g., http://en.wikipedia.org/wiki/Column_space), the column space is the set of all linear combinations of the column vectors. Take a combination of column vectors and add another combination to it to show the sum is itself a combination.

Does that help?

EDIT 2: By definition, a linear transformation takes vector spaces (and subspaces) to subspaces. The column space is the image of a vector space under a linear map, and linear maps take vector spaces to vector spaces.
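
To spell that out in symbols (a sketch, with ##A## any real ##m \times n## matrix): the column space is the image of the linear map ##x \mapsto Ax##, i.e. ##\operatorname{Col}(A) = \{Ax : x \in \mathbb{R}^n\}##, and closure under addition follows directly from linearity, since ##Au + Av = A(u + v) \in \operatorname{Col}(A)## for any ##u, v \in \mathbb{R}^n##.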
 
  • #4
If X is a vector space over ℝ and S is a subset of X, the set of linear combinations of elements of S is a subspace of X. The proof is trivial: Let's denote the set of linear combinations of elements of S by W. We obviously have ##0\in W##. Let ##a,b\in\mathbb R## and ##x,y\in W## be arbitrary. Since x and y are linear combinations of elements of S, so is ##ax+by##, so ##ax+by\in W##. (That's the entire proof.)

The subspace W can be equivalently defined as the intersection of all subspaces of X that contain S. If you show that these definitions are equivalent, you can easily prove that W is the "smallest" subspace of X that contains S, in the sense that if V is another such subspace, we have ##W\subseteq V##. You can also easily prove that there's only one "smallest" subspace of X that contains S.

So W is the unique smallest subspace of X that contains S. This space is called the subspace generated by S, or the subspace spanned by S.
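
For completeness, a sketch of those two steps: if ##V## is any subspace of ##X## with ##S\subseteq V##, then ##V## is closed under sums and scalar multiples, so every linear combination ##a_1x_1+\dots+a_kx_k## with ##x_i\in S## lies in ##V##; hence ##W\subseteq V##. And if ##W_1## and ##W_2## are both "smallest" in this sense, then ##W_1\subseteq W_2## and ##W_2\subseteq W_1##, so ##W_1=W_2##.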
 
  • #5


Yes, there is a way to prove that adding two vectors in the column space will result in another vector in the column space. This can be done by using the definition of the column space and the properties of vector addition.

First, let's define the column space as the set of all linear combinations of the columns of a matrix. This means that any vector in the column space can be written as a linear combination of the columns of the matrix.

Now, let's say we have two vectors, v and w, in the column space. This means that there exist scalars ##a_1, a_2, \dots, a_n## and ##b_1, b_2, \dots, b_n## such that ##v = a_1 c_1 + a_2 c_2 + \dots + a_n c_n## and ##w = b_1 c_1 + b_2 c_2 + \dots + b_n c_n##, where ##c_1, c_2, \dots, c_n## are the columns of the matrix.

To prove that adding v and w will result in another vector in the column space, we need to show that there exist scalars ##d_1, d_2, \dots, d_n## such that ##v + w = d_1 c_1 + d_2 c_2 + \dots + d_n c_n##.

To do this, we can use the properties of vector addition. We know that ##v + w = (a_1 + b_1) c_1 + (a_2 + b_2) c_2 + \dots + (a_n + b_n) c_n##. Since ##a_1 + b_1, a_2 + b_2, \dots, a_n + b_n## are all scalars, we can let ##d_1 = a_1 + b_1,\ d_2 = a_2 + b_2, \dots, d_n = a_n + b_n##. Therefore, ##v + w = d_1 c_1 + d_2 c_2 + \dots + d_n c_n##, which is a linear combination of the columns of the matrix and thus is in the column space.

Therefore, we have shown that adding two vectors in the column space results in another vector in the column space, i.e. the column space is closed under vector addition.
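
As a quick numerical sanity check of the computation above, here is a small sketch in Python; the matrix and the coefficients are made-up illustrative values, not anything from the original question:

Python:
import numpy as np

# A made-up 3x2 matrix; its columns span the column space.
A = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])

# Two vectors in the column space, built as explicit combinations of the columns.
a = np.array([2.0, -1.0])   # coefficients a_1, a_2 for v
b = np.array([0.5,  4.0])   # coefficients b_1, b_2 for w
v = A @ a
w = A @ b

# Their sum is the combination with coefficients a_i + b_i, i.e. A @ (a + b),
# so it lies in the column space again.
assert np.allclose(v + w, A @ (a + b))
print("v + w equals A @ (a + b), so the sum stays in the column space.")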
 

Related to Proving vectors are in the column space

1. What is the column space of a matrix?

The column space of a matrix is the set of all possible linear combinations of its column vectors. In other words, it is the span of the column vectors in the matrix.

2. How do you prove that a vector is in the column space of a matrix?

To prove that a vector v is in the column space of a matrix A, you can find the coefficients of a linear combination of the columns that produces it. Equivalently, check whether the linear system ##Ax = v## has a solution; if it does, the entries of the solution x are exactly those coefficients.
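
As a concrete illustration of this test (a sketch; the helper name in_column_space and the matrices are made up for the example), one common computational check compares ranks: v is in the column space of A exactly when ##\operatorname{rank}([A \mid v]) = \operatorname{rank}(A)##.

Python:
import numpy as np

def in_column_space(A, v, tol=1e-10):
    """Return True if v is in Col(A): rank([A | v]) equals rank(A)."""
    augmented = np.column_stack([A, v])
    return np.linalg.matrix_rank(augmented, tol) == np.linalg.matrix_rank(A, tol)

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 3.0]])
print(in_column_space(A, A @ np.array([4.0, -1.0])))   # True: built from the columns
print(in_column_space(A, np.array([0.0, 0.0, 1.0])))   # False: not a combination of the columns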

3. Can a vector be in the column space of more than one matrix?

Yes, a vector can be in the column space of more than one matrix. This is because the column space is defined as the span of the column vectors, so as long as the vector can be written as a linear combination of the column vectors in a matrix, it is in that matrix's column space.

4. How can we determine the dimension of the column space of a matrix?

The dimension of the column space of a matrix is equal to the number of linearly independent column vectors in the matrix. This can be determined by performing row operations on the matrix and counting the number of pivot columns.
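
As a small worked sketch (the matrix below is made up for illustration), row-reducing and counting pivot columns gives the dimension, and it agrees with the rank:

Python:
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]])

# rref() returns the reduced row echelon form together with the indices of the
# pivot columns; the number of pivot columns is the dimension of the column space.
_, pivot_columns = A.rref()
print(len(pivot_columns))   # 2, so dim Col(A) = 2
print(A.rank())             # the rank agrees: 2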

5. Why is it important to prove that a vector is in the column space of a matrix?

Proving that a vector is in the column space of a matrix is important because it allows us to determine whether the vector can be obtained by a linear combination of the column vectors in the matrix. This can be useful in solving systems of linear equations and in understanding the properties of the matrix.
