Help with linear algebra: vector spaces and subspaces

In summary, the poster is struggling to understand vector spaces and subspaces, specifically in relation to a matrix A and its column space, as well as finding bases for a set of matrices. They know that a subspace is a vector space consisting of a subset of vectors from a larger vector space and satisfying three properties, and that a basis of a subspace is a set of linearly independent vectors that spans the subspace. However, they are unsure of the relation between a subspace and col(A) and are seeking clarification on whether a given statement is true or false.
  • #1
appletree23
Homework Statement
So I have two problems I'm really stuck on:

1. Determine if this statement is true or false:
Let ##H## be a subspace of ##R^n##. Then there must be an ##n\times n## matrix ##A## such that ##H=col(A)##.

2. Let ##M_{n\times n}## be the set of all ##n\times n## matrices. Is this set a vector space? Find two bases for ##M_{2\times 2}##.
Relevant Equations
##dim (H)\leqslant dim(V)##
The reason I'm struggling with both problems is that I find vector spaces and subspaces hard to understand. I have read a lot, but I'm still confused about these tasks.

1. For problem 1, I can first tell you what I know about subspaces. I understand that a subspace is a vector space consisting of a subset of vectors from a larger vector space (##R^n## in this case, I guess) and that there are three properties a subspace must have. I also know that a basis of the subspace is a set of linearly independent vectors that spans H. The dimension of the subspace H must be less than or equal to the dimension of the vector space it is a subspace of. But what is the relation between the subspace and col(A)? The only thing my book says about col(A) is that the pivot columns of a matrix A form a basis of col(A). I don't understand this problem at all, and I don't know if the statement is true or false.

2. I have seen several proofs for this type of problem, but with ##n\times m## matrices and using the 8 axioms. Is there a difference if the matrices are ##n\times n##? And how does one find the two different bases for ##M_{2\times 2}##? I have tried to find something in my book about the last question, but there is not much theory about vector spaces defined by matrices.
 
  • #2
appletree23 said:
Homework Statement:: So I have two problems I'm really stuck on:

1. Determine if this statement is true or false:
Let ##H## be a subspace of ##R^n##. Then there must be an ##n\times n## matrix ##A## such that ##H=col(A)##.

2. Let ##M_{n\times n}## be the set of all ##n\times n## matrices. Is this set a vector space? Find two bases for ##M_{2\times 2}##.
Relevant Equations:: ##dim (H)\leqslant dim(V)##

The reason I'm struggling with both problems is that I find vector spaces and subspaces hard to understand. I have read a lot, but I'm still confused about these tasks.
As I read your question, I imagined a plane in 3-d space, like a wooden board, just a bit bigger ;-)
1. For problem 1, I can first tell you what I know about subspaces. I understand that a subspace is a vector space consisting of a subset of vectors from a larger vector space (##R^n## in this case, I guess) and that there are three properties a subspace must have. I also know that a basis of the subspace is a set of linearly independent vectors that spans H. The dimension of the subspace H must be less than or equal to the dimension of the vector space it is a subspace of. But what is the relation between the subspace and col(A)? The only thing my book says about col(A) is that the pivot columns of a matrix A form a basis of col(A). I don't understand this problem at all, and I don't know if the statement is true or false.
Can you define the column space of a matrix? What is ##col(A)##? You could e.g. take a basis of ##H## and arrange its vectors as columns, filling up the rest with zeros. Is that what is meant?
2. I have seen several proofs for this type of problem, but with ##n\times m## matrices and using the 8 axioms. Is there a difference if the matrices are ##n\times n##? And how does one find the two different bases for ##M_{2\times 2}##? I have tried to find something in my book about the last question, but there is not much theory about vector spaces defined by matrices.
We have four arbitrary entries for ##M(2)=\left\{\begin{bmatrix}
a&b\\c&d
\end{bmatrix}\right\}##. This can equally be written as ##M(2)=\{(a,b,c,d)\}##. What would here be a basis?
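As a concrete sketch of the identification above (the numpy code and example values are my own, not from the thread), flattening a ##2\times 2## matrix into a length-4 vector shows that ##M(2\times 2)## behaves exactly like ##\mathbb{R}^4##:

```python
import numpy as np

# Flattening a 2x2 matrix into (a, b, c, d) identifies M(2x2) with R^4.
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v = M.flatten()  # array([1., 2., 3., 4.])

# The four matrices E_k with a single 1 entry play the role of e_1..e_4.
E = [np.zeros((2, 2)) for _ in range(4)]
for k in range(4):
    E[k].flat[k] = 1.0

# Every 2x2 matrix is a unique linear combination of the E_k.
recombined = sum(v[k] * E[k] for k in range(4))
assert np.allclose(recombined, M)
```

The four matrices `E[k]` are one possible basis; any four matrices whose flattened vectors are linearly independent in ##\mathbb{R}^4## would do as well.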
 
  • #3
appletree23 said:
1. For problem 1, I can first tell you what I know about subspaces. I understand that a subspace is a vector space consisting of a subset of vectors from a larger vector space (##R^n## in this case, I guess) and that there are three properties a subspace must have.
It is simpler to show that a subset of a vector space is or is not a subspace of the containing vector space, since you don't have to verify all the axioms of a vector space.
The three properties, which you don't show, are:
  1. The 0 vector is in the subset.
  2. Vector addition is closed in the subset. I.e., if u and v are in the subset, then u + v is also in the subset.
  3. Scalar multiplication is closed in the subset. I.e., if u is a vector in the subset, and c is a scalar in the field of the vector space, then cu is also in the subset.
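The three conditions can be spot-checked numerically; here is a small sketch (my own example, not from the thread) for the plane ##H = \{(x,y,z) \in \mathbb{R}^3 : x+y+z=0\}##. Numerical spot checks don't prove closure in general, of course; the defining linear equation does that.

```python
import numpy as np

# H = {(x, y, z) in R^3 : x + y + z = 0}, a subspace of R^3.
def in_H(v):
    return np.isclose(v.sum(), 0.0)

zero = np.zeros(3)
u = np.array([1.0, -1.0, 0.0])   # in H
w = np.array([2.0, 3.0, -5.0])   # in H
c = 7.0

assert in_H(zero)    # 1. the zero vector is in H
assert in_H(u + w)   # 2. closed under addition
assert in_H(c * u)   # 3. closed under scalar multiplication
```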
appletree23 said:
I also know that the basis of the subspace is the set of linearly independent vectors that spans H. The dimension of the subspace H must be equal to or less than the dimension of the vectorspace it is the subspace of.
Yes.
appletree23 said:
But what is the relation between the subspace and col(A)?
Presumably the notation col(A) means the columns of the matrix A. The columns of the matrix span the subspace, but might not be a basis for that subspace if they are linearly dependent.
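To illustrate that last point (the matrix and numpy code are my own, not from the thread): a matrix's columns always span its column space, but when they are dependent the rank is smaller than the number of columns, so they are not a basis.

```python
import numpy as np

# The three columns of A span col(A), but the third is the sum of the
# first two, so the columns are dependent and not a basis of col(A).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])

assert np.linalg.matrix_rank(A) == 2  # dim col(A) = 2, not 3
# The two pivot columns (here the first two) form a basis of col(A).
```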
 
  • #4
fresh_42 said:
Can you define the column space of a matrix? What is col(A)? You could e.g. take a basis of H and arrange its vectors as columns, filling up the rest with zeros. Is that what is meant?

Hmm, I'm not sure what you mean. How can one take a basis for H if I don't know which space H is a subspace of? Whether it is a subspace of ##R^3## or ##R^4##, for example? And how can I choose a basis for H if I don't have any specific set of vectors that span H?

fresh_42 said:
We have four arbitrary entries for ##M(2)=\left\{\begin{bmatrix}a&b\\c&d\end{bmatrix}\right\}##. This can equally be written as ##M(2)=\{(a,b,c,d)\}##. What would here be a basis?

I'm sorry, I do not understand properly. Do you mean that every matrix in ##M_{2\times2}## has the form ##\begin{bmatrix}a&b\\c&d\end{bmatrix}##? I also find it hard to understand what you mean by ##M(2)=\{(a,b,c,d)\}## and finding its basis. I'm sorry, by the way, for all the questions. I really want to understand this, but a lot about vector spaces and subspaces confuses me. I have really tried to read and understand this, but I get so stuck.
 
  • #5
Mark44 said:
It is simpler to show that a subset of a vector space is or is not a subspace of the containing vector space, since you don't have to verify all the axioms of a vector space.
The three properties, which you don't show, are:
  1. The 0 vector is in the subset.
  2. Vector addition is closed in the subset. I.e., if u and v are in the subset, then u + v is also in the subset.
  3. Scalar multiplication is closed in the subset. I.e., if u is a vector in the subset, and c is a scalar in the field of the vector space, then cu is also in the subset.
Yes.
Presumably the notation col(A) means the columns of the matrix A. The columns of the matrix span the subspace, but might not be a basis for that subspace if they are linearly dependent.

Hmm, so you mean that I should use the three properties to determine if this statement is true or not? Thanks for your definition of col(A), by the way :)
 
  • #6
appletree23 said:
Hmm, I'm not sure what you mean. How can one take a basis for H if I don't know which space H is a subspace of? Whether it is a subspace of ##R^3## or ##R^4##, for example? And how can I choose a basis for H if I don't have any specific set of vectors that span H?
You used ##H## and you used ##col(A)##. So what are they? How is ##H## given? What does ##col## stand for? A matrix itself is simply an array of numbers sorted in a square. So how are these numbers related to a vector space?
I'm sorry, I do not understand properly. Do you mean that every matrix in ##M_{2\times2}## has the form ##\begin{bmatrix}a&b\\c&d\end{bmatrix}##? I also find it hard to understand what you mean by ##M(2)=\{(a,b,c,d)\}## and finding its basis. I'm sorry, by the way, for all the questions. I really want to understand this, but a lot about vector spaces and subspaces confuses me. I have really tried to read and understand this, but I get so stuck.
The vector space of all two by two matrices is the set ##M(2\times 2)=\left\{\left. \begin{bmatrix}a&b \\ c&d\end{bmatrix}\,\right|\,a,b,c,d \in \mathbb{R}\right\}##. But the ordering in a square is not important here; it is only a way to write down the matrices. We get exactly the same set if we write the elements in an array: ##M(2\times 2)=\left\{\left. (a,b , c,d)\,\right|\,a,b,c,d \in \mathbb{R}\right\}##, or as a column ##M(2\times 2)=\left\{\left. \begin{bmatrix}a\\b \\ c\\d\end{bmatrix}\,\right|\,a,b,c,d \in \mathbb{R}\right\}##. I only thought the question might be easier for you if you saw those matrices written like a vector.
 
  • #7
appletree23 said:
Hmm, so you mean that I should use the three properties to determine if this statement is true or not?
No. A subset H of a vector space V is a subspace if H itself is a vector space. In other words, H satisfies all of the axioms that a vector space does. To prove that H is a vector space, however, you only need to show it satisfies the three conditions @Mark44 noted. If you can do that, you're assured H also satisfies all of the axioms of a vector space.

In this problem, you're given that H is a subspace; you don't need to prove H is a subspace. Since H is a vector space in its own right, you know that there exists a set of vectors that form a basis for H. You don't have to know anything more specific about H or the set of vectors. You just need to know that you can indeed find a basis for H. Given that such a set of vectors exist, can you see how to construct a matrix A where the span of the columns of A is the same as the span of the basis?
 
  • #8
vela said:
No. A subset H of a vector space V is a subspace if H itself is a vector space. In other words, H satisfies all of the axioms that a vector space does. To prove that H is a vector space, however, you only need to show it satisfies the three conditions @Mark44 noted. If you can do that, you're assured H also satisfies all of the axioms of a vector space.

In this problem, you're given that H is a subspace; you don't need to prove H is a subspace. Since H is a vector space in its own right, you know that there exists a set of vectors that form a basis for H. You don't have to know anything more specific about H or the set of vectors. You just need to know that you can indeed find a basis for H. Given that such a set of vectors exist, can you see how to construct a matrix A where the span of the columns of A is the same as the span of the basis?

So I have been thinking about what you wrote here, and maybe I understand it a little bit better (not sure, by the way). So let us say that we have a subspace ##H## of ##R^6## (##n=6##).
If ##H## for example is defined by the set of vectors ##({x_1,x_2,x_3,x_4,x_5,x_6})##, the basis of the subspace is the vectors that span ##H## and are linearly independent. Let us say that the basis for example is then the vectors ##({x_1, x_2, x_3, x_4})##. The dimension of ##H## is equal to the number of vectors in the basis.

There are exactly n vectors in every basis for ##R^n##. And we have that ##dim (H)\leqslant dim(V)##. So we can make a ##6\times 6## matrix A whose columns span H by setting up a matrix that contains the basis vectors from H, because the number of basis vectors in the example above must be less than or equal to 6. If there are fewer than 6 basis vectors, we can add zero columns to make it a ##6\times 6## matrix; in this example we add two. So from the example above we can make the matrix A consisting of the columns ##[x_1, x_2 , x_3, x_4, 0, 0]##. This can be shown for any value of ##n## we choose. So the statement must be true.
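The construction above can be checked numerically; here is a sketch (the example vectors and numpy code are my own, not from the thread): take 4 independent vectors in ##R^6## as a basis of H, pad with two zero columns to get a ##6\times 6## matrix A, and verify that col(A) still has dimension 4.

```python
import numpy as np

# Four generic random vectors in R^6 are (almost surely) independent,
# so their columns form a basis of a 4-dimensional subspace H.
rng = np.random.default_rng(0)
basis = rng.standard_normal((6, 4))

# Pad with two zero columns to get a 6x6 matrix A with col(A) = H.
A = np.column_stack([basis, np.zeros((6, 2))])

assert A.shape == (6, 6)
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(basis) == 4
```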

I don't know if what I wrote above is right, so feel free to correct me, especially if the statement is meant to be false. One thing I'm unsure of is that when I take away the vectors that are linearly dependent in ##H##, is the set of remaining vectors equal to ##H##? Because the problem says ##col(A)=H##, and if A only contains the linearly independent vectors of H, then the sentence before this one must be true (?).
 
  • #9
fresh_42 said:
You used ##H## and you used ##col(A)##. So what are they? How is ##H## given? What does ##col## stand for? A matrix itself is simply an array of numbers sorted in a square. So how are these numbers related to a vector space?

The vector space of all two by two matrices is the set ##M(2\times 2)=\left\{\left. \begin{bmatrix}a&b \\ c&d\end{bmatrix}\,\right|\,a,b,c,d \in \mathbb{R}\right\}##. But the ordering in a square is not important here; it is only a way to write down the matrices. We get exactly the same set if we write the elements in an array: ##M(2\times 2)=\left\{\left. (a,b , c,d)\,\right|\,a,b,c,d \in \mathbb{R}\right\}##, or as a column ##M(2\times 2)=\left\{\left. \begin{bmatrix}a\\b \\ c\\d\end{bmatrix}\,\right|\,a,b,c,d \in \mathbb{R}\right\}##. I only thought the question might be easier for you if you saw those matrices written like a vector.
Okay, I think I understand problem 2 now. Thank you for your help! :smile:
 
  • #10
appletree23 said:
So I have been thinking about what you wrote here and maybe I understand it a little bit better (not sure by the way). So let us say that we have the subspace ##H## that is the subspace of ##R^6## (##n=6##)
If ##H## for example is defined by the set of vectors ##({x_1,x_2,x_3,x_4,x_5,x_6})##,
What do you mean when you say "defined by set of vectors"?

the basis of the subspace is the vectors that span ##H## and are linearly independent.
Just to be clear. H has many bases. There is no one basis. It's not correct to refer to THE basis of H the way you're saying it.

Let us say that the basis for example is then the vectors ##({x_1, x_2, x_3, x_4})##. The dimension of ##H## is equal to the numbers of vectors in the basis.

There are exactly n vectors in every basis for ##R^n##. And we have that ##dim(H)\leqslant dim(V)##. So we can make a ##6\times 6## matrix A whose columns span H by setting up a matrix that contains the basis vectors from H, because the number of basis vectors in the example above must be less than or equal to 6. If there are fewer than 6 basis vectors, we can add two zero columns to make it a ##6\times 6## matrix. So from the example above we can make the matrix A consisting of the columns ##[x_1, x_2 , x_3, x_4, 0, 0]##. This can be shown for any value of ##n## we choose. So the statement must be true.
Right. This is what @fresh_42 suggested back in post #2.

I don't know if what I write above is right, so be free to correct me, especially if the statement is meant to be false. One thing I'm unsure of is that when I take away the vectors that are linearly dependent in ##H##, is the set of remaining vectors equal to ##H##? Because in the problem it says ##col(A)=H## and if A only contains the linearly independent vectors of H, then sentence before this one must be true (?).
H contains an infinite number of vectors. If you remove a vector from H, the resulting set is obviously not equal to H. You're probably just being sloppy with your wording, but it's enough that I can't make sense of what you're asking.

A vector doesn't have a property of being linearly independent or linearly dependent. You can only talk about whether a set of vectors is linearly independent or dependent. If this doesn't make sense to you, review the definition of linear independence.
 
  • #11
English is not my native language, so I'm sorry that I'm sloppy with my wording sometimes. I will try to explain what I mean a little better (at least try).

vela said:
Just to be clear. H has many bases. There is no one basis. It's not correct to refer to THE basis of H the way you're saying it.
Yes, I know this and should rather have used the word "a" instead of "the" when I wrote that sentence.

vela said:
H contains an infinite number of vectors. If you remove a vector from H, the resulting set is obviously not equal to H. You're probably just being sloppy with your wording, but it's enough to where it doesn't make sense to me what you're asking.

A vector doesn't have a property of being linearly independent or linearly dependent. You can only talk about whether a set of vectors is linearly independent or dependent. If this doesn't make sense to you, review the definition of linear independence.
So I can try to write the question differently. So you said what I wrote here is correct:
appletree23 said:
There are exactly n vectors in every basis for Rn. And we have that dim(H)⩽dim(V). So we can make a 6×6 matrix A whose columns span H by setting up a matrix that contains the basis vectors from H, because the number of basis vectors in the example above must be less than or equal to 6. If there are fewer than 6 basis vectors, we can add two zero columns to make it a 6×6 matrix. So from the example above we can make the matrix A consisting of the columns [x1,x2,x3,x4,0,0]. This can be shown for any value of n we choose. So the statement must be true.

So you then agree that the statement is true? If the statement is true, then ##H=col(A)##. But the matrix ##A## that I make only consists of the vectors from ##H## that make a basis for ##H##, plus zero vectors. ##A## does not consist of every vector in ##H##, only the ones that make up a basis, so a set of linearly independent vectors. So how can ##H=col(A)## if ##A## only contains a basis and zero vectors, and not every vector in ##H##? I hope you understand what I mean now.
 
  • #12
appletree23 said:
So you then agree that the statement is true? If the statement is true, then H=col(A).
It depends on what the notation ##col(A)## means. I looked in four different linear algebra textbooks I have, and none of them used this notation. If "col(A)" is shorthand for the column space of A, then I agree that H = col(A). Or if col(A) means the span of the column vectors, which is just another way to say the column space, I also agree. On the other hand, if it means only the set of four column vectors ##x_1, x_2, x_3, x_4##, then no, ##H \ne col(A)##.
appletree23 said:
But the matrix A that I make only consists of the vectors from H that make a basis for H, plus zero vectors. A does not consist of every vector in H, only the ones that make up a basis, so a set of linearly independent vectors. So how can H=col(A) if A only contains a basis and zero vectors, and not every vector in H? I hope you understand what I mean now.
Any set of vectors that contains one or more zero vectors cannot be linearly independent.
Let me define a term that it seems you're having trouble with.
##Span(v_1, v_2, v_3)## is the set of all linear combinations of these vectors. I.e., the set ##\{v : v = c_1v_1 + c_2v_2 + c_3v_3\}## for scalars ##c_1, c_2, c_3##.
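The definition can be made concrete with a membership test (the example vectors and numpy code are my own, not from the thread): ##v## lies in ##Span(v_1, v_2, v_3)## exactly when ##v = c_1v_1 + c_2v_2 + c_3v_3## has a solution in the ##c_i##.

```python
import numpy as np

# v1, v2, v3 all lie in the xy-plane, so their span is that plane.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 0.0])
V = np.column_stack([v1, v2, v3])

def in_span(v):
    # Solve V c = v in the least-squares sense; v is in the span
    # exactly when the residual is (numerically) zero.
    c, *_ = np.linalg.lstsq(V, v, rcond=None)
    return np.allclose(V @ c, v)

assert in_span(np.array([2.0, 3.0, 0.0]))      # in the xy-plane
assert not in_span(np.array([0.0, 0.0, 1.0]))  # not in the xy-plane
```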
 
  • #13
appletree23 said:
So you then agree that the statement is true?
Yes, pretty much.

If the statement is true, then ##H=col (A)##. But the matrix ##A## that I make only consists of the vectors from ##H## that make a basis for ##H##, plus zero vectors. ##A## does not consist of every vector in ##H##, only the ones that make up a basis, so a set of linearly independent vectors. So how can ##H=col(A)## if ##A## only contains a basis and zero vectors, and not every vector in ##H##? I hope you understand what I mean now.
Not really. How did you reach the conclusion in the first sentence if you don't know the answer to your second question?
 
  • #14
Mark44 said:
It depends on what the notation ##col(A)## means. I looked in four different linear algebra textbooks I have, and none of them used this notation. If "col(A)" is shorthand for the column space of A, then I agree that H = col(A). Or if col(A) means the span of the column vectors, which is just another way to say the column space, I also agree. On the other hand, if it means only the set of four column vectors ##x_1, x_2, x_3, x_4##, then no, ##H \ne col(A)##.
Any set of vectors that contains one or more zero vectors cannot be linearly independent.
Let me define a term that it seems you're having trouble with.
##Span(v_1, v_2, v_3)## is the set of all linear combinations of these vectors. I.e., the set ##\{v : v = c_1v_1 + c_2v_2 + c_3v_3\}## for scalars ##c_1, c_2, c_3##.

I had not thought about that before: that a set of vectors containing one or more zero vectors cannot be linearly independent. As for the definition of col A, my book says this:

"The column space of an ##m\times n## matrix ##A##, written as ##Col A##, is the set of all linear combinations of the columns of ##A##. If ##A=[a_1 \dots a_n]##, then ##Col A=Span\{a_1, \dots ,a_n\}##."

But as a last question, do you think the conclusion I make below is right? Is it good enough to justify that the statement is true? I will write it in a more general way, but except for that, is it good enough?

appletree23 said:
There are exactly n vectors in every basis for Rn. And we have that dim(H)⩽dim(V). So we can make a 6×6 matrix A whose columns span H by setting up a matrix that contains the basis vectors from H, because the number of basis vectors in the example above must be less than or equal to 6. If there are fewer than 6 basis vectors, we can add two zero columns to make it a 6×6 matrix. So from the example above we can make the matrix A consisting of the columns [x1,x2,x3,x4,0,0]. This can be shown for any value of n we choose. So the statement must be true.
 
  • #15
vela said:
Yes, pretty much. Not really. How did you reach the conclusion in the first sentence if you don't know the answer to your second question?
I reached my conclusion based on a similar problem I found on the internet:

[attached image: the similar problem]

Where the solution is given as:

[attached image: the given solution]

I was only asking because I was curious about the equality between ##col A## and ##H## and what that has to mean.
 
  • #16
appletree23 said:
I reached my conclusion based on a similar problem I found on the internet:

[attached image: the similar problem]
Where the solution is given as:

[attached image: the given solution]
I was only asking because I was curious about the equality between ##col A## and ##H## and what that has to mean.
I assume we are working in the standard basis for ##\mathbb R^n## here. But, that's a minor point.

This isn't so complicated. ##H## is a vector space of some dimension ##m \le n##. ##H## has a basis of ##m## vectors. ##H## is the span of those basis vectors.

Put those vectors into an ##n \times m## matrix. If you want an ##n \times n## matrix, then pad the matrix out with ##n - m## zero vectors - or, as above, with a repetition of one of the basis vectors ##n - m## times.

In any case, the column space of that matrix is the span of the basis vectors of ##H##, which is precisely what ##H## is.

Extending a set of vectors with zero vectors or repetitions of the vectors you already have doesn't affect the span of the set.
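That last point can be verified directly; here is a sketch (the example matrix and numpy code are my own, not from the thread) showing that both padding schemes leave the column space unchanged.

```python
import numpy as np

# B's two columns form a basis of a plane H in R^3.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

B_zero = np.column_stack([B, np.zeros(3)])  # pad with a zero vector
B_rep = np.column_stack([B, B[:, 0]])       # pad by repeating a column

# All three matrices have the same column space, of dimension 2.
r = np.linalg.matrix_rank
assert r(B) == r(B_zero) == r(B_rep) == 2
```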
 

FAQ: Help with linear algebra: vector spaces and subspaces

What is a vector space?

A vector space is a mathematical structure that consists of a set of vectors and two operations, addition and scalar multiplication. These operations must satisfy certain properties, such as closure, commutativity, and associativity, in order for the set to be considered a vector space.

What is a subspace?

A subspace is a subset of a vector space that is itself a vector space under the same operations. Equivalently, it contains the zero vector and is closed under addition and scalar multiplication.

How do I determine if a set is a vector space?

In order for a set to be considered a vector space, it must satisfy all of the vector space axioms for the given addition and scalar multiplication. In practice, if the set is a subset of a known vector space, it suffices to check the three subspace conditions: it contains the zero vector and is closed under addition and scalar multiplication.

How do I find the basis of a vector space?

A basis of a vector space is a set of linearly independent vectors that spans the entire space. Given a spanning set, you can use Gaussian elimination to identify the pivot columns; the vectors corresponding to those pivot columns form a basis.
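The pivot-column idea can be sketched as follows (the function name and example are my own, not from the thread): keep each vector of the spanning set that raises the rank, which selects the same columns Gaussian elimination would mark as pivots.

```python
import numpy as np

def basis_from_spanning_set(vectors):
    # Greedily keep each vector that is independent of those kept so far.
    kept = []
    for v in vectors:
        candidate = kept + [v]
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            kept.append(v)
    return kept

spanning = [np.array([1.0, 0.0, 0.0]),
            np.array([2.0, 0.0, 0.0]),   # dependent on the first vector
            np.array([0.0, 1.0, 0.0])]
basis = basis_from_spanning_set(spanning)
assert len(basis) == 2   # the dependent vector was dropped
```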

What is the difference between a vector space and a subspace?

A vector space is a mathematical structure consisting of a set of vectors and two operations, while a subspace is a subset of a vector space that is itself a vector space under the same operations. In other words, a subspace is a vector space contained within a larger one.
