Evgeny.Makarov
I am looking for a method of finding bases of the sum and intersection of two subspaces of $\Bbb R^n$. My question is primarily about the intersection. Suppose that $L_1$ has basis $\mathcal{A}=(a_1,\dots,a_k)$ and $L_2$ has basis $\mathcal{B}=(b_1,\dots,b_l)$. Then $v\in L_1\cap L_2$ iff there exist $x_1,\dots,x_k$ and $y_1,\dots,y_l$ such that
\[
x_1a_1+\dots+x_ka_k=y_1b_1+\dots+y_lb_l=v
\]
So we form the matrix with the vectors of $\mathcal{A}$ and $\mathcal{B}$ as columns and reduce it to echelon form. The pivot columns (suppose there are $s$ of them) correspond to vectors from $\mathcal{A},\mathcal{B}$ that make up a basis of the sum $L_1+L_2$ (the union $L_1\cup L_2$ is generally not a subspace). The columns without pivots (there are $d=k+l-s$ of them) correspond to free variables of the homogeneous system $x_1a_1+\dots+x_ka_k-y_1b_1-\dots-y_lb_l=0$, i.e., the system with matrix $(\mathcal{A}\mid{-}\mathcal{B})$; negating the $\mathcal{B}$ columns does not change which columns have pivots. So we assign $d$ linearly independent sets of values to the free variables; each determines all of $x_1,\dots,x_k,y_1,\dots,y_l$. For each such solution, $\sum_{i=1}^l y_ib_i$ is a basis vector of $L_1\cap L_2$.
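A minimal sketch of this procedure in Python, using exact rational arithmetic from the standard library; the matrices `A`, `B` and the helper `rref` are hypothetical illustrations, not from the original post:

```python
from fractions import Fraction

def rref(rows):
    """Reduced row echelon form over the rationals.
    Returns (reduced matrix, list of pivot column indices)."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        pr = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pr is None:
            continue                          # no pivot in this column
        m[r], m[pr] = m[pr], m[r]             # move the pivot row up
        m[r] = [x / m[r][c] for x in m[r]]    # scale the pivot to 1
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return m, pivots

# Toy example in R^3: columns of A span L1 (the xy-plane),
# columns of B span L2 (the xz-plane).
A = [[1, 0], [0, 1], [0, 0]]
B = [[1, 0], [0, 0], [0, 1]]
k = len(A[0])

# Basis of the sum L1 + L2: the original columns of (A | B) at pivot positions.
AB = [ra + rb for ra, rb in zip(A, B)]
_, pivots = rref(AB)
sum_basis = [[row[j] for row in AB] for j in pivots]

# Basis of L1 ∩ L2: solve A x = B y, i.e. find the nullspace of (A | -B).
# Each independent nullspace vector (x; y) yields the vector A x = B y.
AmB = [ra + [-e for e in rb] for ra, rb in zip(A, B)]
R, piv = rref(AmB)
free = [c for c in range(len(AmB[0])) if c not in piv]
inter_basis = []
for f in free:
    sol = [Fraction(0)] * len(AmB[0])
    sol[f] = Fraction(1)                      # set one free variable to 1
    for i, c in enumerate(piv):
        sol[c] = -R[i][f]                     # back-substitute pivot variables
    x = sol[:k]
    inter_basis.append([sum(A[i][j] * x[j] for j in range(k))
                        for i in range(len(A))])

print(sum_basis)    # 3 vectors: L1 + L2 = R^3
print(inter_basis)  # 1 vector spanning the x-axis
```

Here the xy- and xz-planes give $L_1+L_2=\Bbb R^3$ and $L_1\cap L_2$ equal to the x-axis, matching the dimension count $d=k+l-s=2+2-3=1$.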
So far so good. I would also like to know how to save myself some calculations when the given vectors $\mathcal{A},\mathcal{B}$ are not necessarily linearly independent. We can still reduce the matrix with these vectors as columns to echelon form. Then the pivot columns from the $\mathcal{A}$ part correspond to a basis of $L_1$. But for the columns without pivots from the $\mathcal{B}$ part, we don't know whether they belong to a basis of $L_2$. They correspond to free variables only when they also correspond to vectors from a basis of $L_2$, which means we would have to find a basis of $L_2$ first. Is this correct?
For example, suppose that $n=k=l=3$ and when we write vectors as columns and reduce the matrix to echelon form we get
\[
\begin{pmatrix}
\cdot & \cdot & \cdot & \cdot & \cdot & \cdot\\
& \cdot & \cdot & \cdot & \cdot & \cdot\\
&&& \cdot & \cdot & \cdot
\end{pmatrix}
\]
where nonzero entries are denoted by $\cdot$. This means that $a_1,a_2$ form a basis of $L_1$ and $a_1,a_2,b_1$ form a basis of $L_1+L_2$. But we don't know whether $b_2$ and $b_3$ belong to a basis of $L_2$ along with $b_1$. If they do, then we set $y_2=1,y_3=0$, which determines $y_1$, and then we set $y_2=0,y_3=1$, which gives another value $y_1'$. In this case, a basis of $L_1\cap L_2$ is $(y_1b_1+b_2,\,y_1'b_1+b_3)$. If, on the other hand, a basis of $L_2$ is $(b_1,b_2)$, then the only free variable is $y_2$. Setting $y_2=1$ determines $y_1$, and a basis of $L_1\cap L_2$ is $(y_1b_1+b_2)$.
So, the question is: do I need to find the basis of $L_2$ before reducing the whole matrix?
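To probe the situation concretely, here is a hypothetical SymPy sketch (the matrices are my own, not from the method above) with deliberately dependent generators, $b_3=b_1+b_2$. It reduces $(\mathcal{A}\mid{-}\mathcal{B})$ with the generators exactly as given: each nullspace vector $(x;y)$ yields $Ax=By\in L_1\cap L_2$, and since dependent generators can make these candidate vectors redundant, one further reduction extracts an independent subset.

```python
from sympy import Matrix

# Hypothetical generators in R^3 (as columns). L1 = xy-plane;
# L2 = xz-plane, given with a dependent third generator b3 = b1 + b2.
A = Matrix([[1, 0], [0, 1], [0, 0]])
B = Matrix([[1, 0, 1], [0, 0, 0], [0, 1, 1]])

# Nullspace of (A | -B): each vector (x; y) gives A*x = B*y in L1 ∩ L2.
N = A.row_join(-B)
k = A.shape[1]
candidates = [A * v[:k, :] for v in N.nullspace()]

# With dependent generators the candidates may be redundant (here both
# equal e1), so one more reduction extracts an independent subset.
C = Matrix.hstack(*candidates)
_, piv = C.rref()
inter_basis = [C.col(j) for j in piv]

print(inter_basis)  # one basis vector: e1, i.e. the x-axis
```

In this toy case the extra, dependent generator produces a second candidate equal to the first, and the final reduction discards it; whether this always avoids computing a basis of $L_2$ first is exactly the question above.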