Relationship between column space of a matrix and rref of matrix

  • #1
srfriggen
Hello,

Does the column space of a matrix A always equal the column space of rref(A)? I.e., are the solution sets to Ax=b, or even Ax=0, the same for A and rref(A)?

When working through some examples of matrices with linearly independent columns, it seemed the span of the columns was preserved by row operations. However, I'm not sure that is the case if the columns are linearly dependent. For example, the columns <1,1> and <2,2> geometrically span a line in 2-space with slope 1, but the rref of that matrix has columns <1,0> and <2,0>, which geometrically span the x-axis.
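Here is a quick check of this example using Python's sympy library (a minimal sketch; the matrix is the one from the example above):

```python
from sympy import Matrix

A = Matrix([[1, 2],
            [1, 2]])              # columns <1,1> and <2,2>
R, pivots = A.rref()              # R = [[1, 2], [0, 0]]

print(A.columnspace())            # [Matrix([[1], [1]])] -> the line y = x
print(R.columnspace())            # [Matrix([[1], [0]])] -> the x-axis
print(A.nullspace() == R.nullspace())  # True: the null space is preserved
```

So the column spaces really do differ here, even though the null spaces agree.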

Perhaps someone can elaborate on the relationship between a matrix and its reduced row echelon form, or point me toward some material that would help me better understand. I have done many searches on Google and YouTube and have come up short.
 
  • #2
The rref of A is of the form GA for some invertible matrix G, so the solution sets to Ax=b and rref(A)x=b will generally be different. The precise fact to note is: if Ax=b, then rref(A)x=(GA)x=G(Ax)=Gb. Note that if b=0, then the previous computation yields rref(A)x=0; and conversely, if rref(A)x=0, then Ax=0 (multiply through by G^(-1)). That is, if b=0, then the solution sets to Ax=0 and rref(A)x=0 are the same. This is simply the statement that A and its rref have the same nullspace (which is the basic idea behind Gaussian elimination).
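A small sympy sketch of this point (the matrix and b are illustrative):

```python
from sympy import Matrix

A = Matrix([[1, 2],
            [1, 2]])
R, _ = A.rref()                         # R = [[1, 2], [0, 0]]

# Same null space: the b = 0 solution sets agree
print(A.nullspace() == R.nullspace())   # True

# But for b != 0 the solution sets differ: x = (1, 0) solves Ax = b...
b = Matrix([1, 1])
x = Matrix([1, 0])
print(A * x == b)                 # True
print(R * x == b)                 # False: Rx = (1, 0), not (1, 1)
```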
 
  • #3
So each row operation can be represented as multiplication by a matrix... or more generally, there exists an invertible matrix G such that GA=rref(A).

I see now why Gauss-Jordan works... when you augment A with the identity matrix, you wind up with G on the right after the row operations! (Now to see if it works with any invertible matrix...)

Then GA=rref(A). I see... So can you represent each individual row operation by a matrix, say R1, R2, ..., Rn? Then RnRn-1...R1 = G? How could you determine what each R should look like?
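The augmentation trick is easy to check in sympy (a sketch with the 2x2 matrix from earlier; note the identity block must be m x m for an m x n matrix A):

```python
from sympy import Matrix, eye

A = Matrix([[1, 2],
            [1, 2]])

# Row reduce the augmented block [A | I]: the row operations that
# reduce A accumulate in the right-hand block
R_aug, _ = A.row_join(eye(2)).rref()

R = R_aug[:, :2]    # left block: equals rref(A)
G = R_aug[:, 2:]    # right block: the product of all the row operations

print(G * A == A.rref()[0])   # True: GA = rref(A)
print(G.det() != 0)           # True: G is invertible
```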
 
  • #4
srfriggen said:
So each row operation can be represented as multiplication by a matrix... or more generally, there exists an invertible matrix G such that GA=rref(A).

I see now why Gauss-Jordan works... when you augment A with the identity matrix, you wind up with G on the right after the row operations! (Now to see if it works with any invertible matrix...)

Then GA=rref(A). I see... So can you represent each individual row operation by a matrix, say R1, R2, ..., Rn? Then RnRn-1...R1 = G? How could you determine what each R should look like?
Each R will be what you get if you do the row operation to the identity matrix. E.g. if you do the row operation that swaps rows 1 and 3, then R will be the identity matrix with rows 1 and 3 swapped.
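For instance, reducing the (illustrative) matrix A = [[0,1],[2,4]] takes three row operations; each one can be applied to the identity first and then composed, as in this sympy sketch:

```python
from sympy import Matrix, Rational

A = Matrix([[0, 1],
            [2, 4]])

# Each R is the identity with one row operation applied to it
R1 = Matrix([[0, 1], [1, 0]])               # swap rows 1 and 2
R2 = Matrix([[Rational(1, 2), 0], [0, 1]])  # scale row 1 by 1/2
R3 = Matrix([[1, -2], [0, 1]])              # row 1 -> row 1 - 2*(row 2)

G = R3 * R2 * R1        # compose in the order the operations are applied
print(G * A == A.rref()[0])   # True: GA = rref(A) (the identity here)
```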
 
  • #5
The rows of an m by n matrix are a finite set of vectors in R^n that span a subspace of R^n called the row space, having some dimension k where k ≤ n. Row reduction changes the rows into some other set of vectors spanning the same space.

Since the row space has dimension k, there is some subset of k of the standard axes in R^n, such that projection of the row space onto the span of those k axes is an isomorphism. Let's take the simplest case where projection on the first k axes is an isomorphism.

That means if we throw away all but the first k entries of each row, we get a vector in R^k, and this is an isomorphism between the row space and R^k. In this case we can say exactly what the row reduced form looks like.

Namely, each of the standard basis vectors of R^k, i.e. those of the form (1,0,...,0), (0,1,0,...,0), and so on, corresponds under projection to exactly one vector in the row space. Those vectors in the row space which correspond to these standard basis vectors of R^k under projection are exactly the rows of the reduced form.

I.e., to find the reduced matrix in this case, take the row space spanned by the rows of the original matrix, and project this row space isomorphically onto the span of the first k axes. Then locate the k standard basis vectors of R^k, go back up to the row space, and choose for each one the corresponding vector in the row space.

Those vectors are a basis for the row space, and they each look like one of the standard basis vectors of R^k, but filled out by some other (n-k) numbers.

In general you take the earliest k axes such that projection on them is an isomorphism from the row space. The solution space, or null space, is just the orthogonal complement of the row space. Once you have this special basis of the row space, it is easy to write down a basis of the null space. Namely, you just start off your solution vector with the first k entries of one of the remaining n-k columns, and finish it off with minus a standard basis vector of R^(n-k).
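A sketch of this recipe in sympy, using a made-up matrix whose pivots happen to land in the first k = 2 columns:

```python
from sympy import Matrix, eye

A = Matrix([[1, 0, 2, 3],
            [0, 1, 4, 5],
            [1, 1, 6, 8]])        # illustrative; rank 2

R, pivots = A.rref()              # pivots == (0, 1), so k = 2
k, n = len(pivots), A.cols
F = R[:k, k:]                     # the "filled out" numbers: [[2, 3], [4, 5]]

# Special solutions: the first k entries of a remaining column,
# finished off with minus a standard basis vector of R^(n-k)
for i in range(n - k):
    x = F[:, i].col_join(-eye(n - k)[:, i])
    print(A * x)                  # the zero vector, each time
```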
 
  • #6
Thank you guys, much appreciated. Definitely gaining a better insight.

On a different topic, I've heard a few people talk about how linear algebra is "man-made", that the operations are constructed and not, for lack of a better word, organic. Is this a philosophical statement comparing different areas of math? Is it saying linear algebra is more of a tool than a discovery?

Not sure if you have ever heard anyone say that, but I was curious as to what those people (professors, YouTube lecturers, bloggers) may have meant by it.
 
  • #7
On second thought, those people may have just been talking about the idea of matrices.
 
  • #8
Does the column space of a matrix A always equal the column space of rref(A)? i.e. are the solution sets to Ax=b (no),

or even Ax=0 (yes)

the same for A and rref(A)?

but the solution sets to Ax = b and rref(A)x = Gb are the same, where Gb is what you get by applying the same row operations to b, i.e. by row reducing the augmented matrix [A | b].
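In sympy terms, that means row reducing the augmented matrix (a sketch with an illustrative b):

```python
from sympy import Matrix

A = Matrix([[1, 2],
            [1, 2]])
b = Matrix([3, 3])

# Reduce [A | b]: the same row operations act on A and b together
aug, _ = A.row_join(b).rref()     # [[1, 2, 3], [0, 0, 0]]

# Ax = b and rref(A)x = Gb then have the same solutions, e.g. x = (3, 0)
x = Matrix([3, 0])
print(A * x == b)                      # True
print(aug[:, :2] * x == aug[:, 2])     # True
```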
 

FAQ: Relationship between column space of a matrix and rref of matrix

1. What is the relationship between the column space of a matrix and the reduced row echelon form (rref) of the matrix?

The column space of a matrix is the set of all linear combinations of the columns of the matrix. The rref of a matrix is the result of applying elementary row operations to the matrix until it is in its simplest form. The relationship between the two is that the pivot columns of the rref identify which columns of the original matrix form a basis for its column space. Note that the columns of the rref itself are generally not in the column space of the original matrix; the rref tells you which columns of the original matrix to pick.
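This is the rule sympy's columnspace() follows; a short check with an illustrative matrix:

```python
from sympy import Matrix

A = Matrix([[1, 2, 0],
            [1, 2, 1]])

R, pivots = A.rref()
print(pivots)                      # (0, 2): the 1st and 3rd columns are pivots

# The pivot columns of A itself form a basis for Col(A)
basis = [A[:, j] for j in pivots]
print(basis == A.columnspace())    # True
```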

2. How does the column space change when the rref of a matrix is calculated?

The column space of a matrix generally does change when row operations are applied, as the example in the thread shows: the columns of A span the line y = x, while the columns of rref(A) span the x-axis. What row operations preserve are the row space, the null space, and the linear dependence relations among the columns. Because the dependence relations are preserved, the pivot positions of rref(A) mark which columns of the original A form a basis for its column space.

3. Can the column space of a matrix be larger than the column space of its rref?

No. The dimension of the column space of A equals the dimension of the column space of rref(A); both equal the number of pivots, i.e. the rank. So while the two column spaces may be different subspaces, they always have the same dimension, and neither can be larger than the other.

4. How does the dimension of the column space relate to the rref of a matrix?

The dimension of the column space is equal to the number of pivot columns in the rref of the matrix. This is because the pivot positions in the rref identify a basis for the column space (taken from the columns of the original matrix), and the number of vectors in a basis is equal to the dimension of the space.
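Equivalently, in sympy the rank equals the number of pivot columns reported by rref() (illustrative matrix):

```python
from sympy import Matrix

A = Matrix([[1, 2],
            [1, 2]])

# dim Col(A) = rank(A) = number of pivot columns in rref(A)
print(A.rank())             # 1
print(len(A.rref()[1]))     # 1
```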

5. Why is it important to understand the relationship between the column space of a matrix and the rref of the matrix?

Understanding the relationship between the column space and the rref of a matrix is important because it helps us to understand the structure and properties of a matrix. It also allows us to solve systems of linear equations, find the rank of a matrix, and determine if a set of vectors is linearly independent or dependent. Additionally, it is a fundamental concept in linear algebra and is used in many applications in science and engineering.
