Why does this hold when we have the zero matrix?

  • MHB
  • Thread starter: mathmari
  • Tags: Matrix, Zero
In summary, the conversation discusses the set $U$ of matrices $X\in \mathbb{R}^{2\times 2}$ satisfying $AX=XB$ for given matrices $A$ and $B$. The non-emptiness and closure properties of $U$ are proven, so $U$ is a vector subspace of $\mathbb{R}^{2\times 2}$. For the triangular matrices considered, the condition $U=\{0\}$ is shown to be equivalent to $A$ and $B$ having no common diagonal entries. The final part of the conversation stresses that $U$ depends on the choice of $A$ and $B$, so $U=\{0\}$ means that only the zero matrix satisfies $AX=XB$ for that particular choice.
  • #1
mathmari
Gold Member
MHB
Hey! :eek:

I want to show that for $A,B\in \mathbb{R}^{2\times 2}$ the set $U=\{X\in \mathbb{R}^{2\times 2}\mid AX=XB\}$ is a vector subspace of $\mathbb{R}^{2\times 2}$.

We have that it is non-empty, since the zero matrix belongs to $U$: $AO=O=OB$.

Let $X_1, X_2\in U$; then $AX_1=X_1B$ and $AX_2=X_2B$.
We have that $A(X_1+X_2)=AX_1+AX_2=X_1B+X_2B=(X_1+X_2)B$, and so $X_1+X_2\in U$.

Let $\lambda \in \mathbb{R}$.
We have that $A(\lambda X_1)=\lambda (AX_1)=\lambda (X_1B)=(\lambda X_1)B$, and so $\lambda X_1\in U$.
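
As a brief editorial aside (not part of the original post): all three checks can also be packaged into one observation. The map $T:\mathbb{R}^{2\times 2}\to \mathbb{R}^{2\times 2}$ given by $T(X)=AX-XB$ is linear, and $U=\ker T$; the kernel of a linear map is always a vector subspace.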

Is everything correct? (Wondering)

Then when $A=\begin{pmatrix}a_1 & 0 \\ a_3 & a_4\end{pmatrix}$ and $B=\begin{pmatrix}b_1 & b_2 \\ 0 & b_4\end{pmatrix}$ I want to show that $$U=\{0\} \iff \{a_1, a_4\}\cap \{b_1, b_4\}=\emptyset$$

We have that $$AX=\begin{pmatrix}a_1 & 0 \\ a_3 & a_4\end{pmatrix}\begin{pmatrix}x_1 & x_2 \\ x_3 & x_4\end{pmatrix}=\begin{pmatrix}a_1x_1 & a_1x_2 \\ a_3x_1+a_4x_3 & a_3x_2+a_4x_4\end{pmatrix}$$
and
$$XB=\begin{pmatrix}x_1 & x_2 \\ x_3 & x_4\end{pmatrix}\begin{pmatrix}b_1 & b_2 \\ 0 & b_4\end{pmatrix}=\begin{pmatrix}x_1b_1 & x_1b_2+x_2b_4 \\ x_3b_1 & x_3b_2+x_4b_4\end{pmatrix}$$

So, $$AX=XB \Rightarrow \begin{pmatrix}a_1x_1 & a_1x_2 \\ a_3x_1+a_4x_3 & a_3x_2+a_4x_4\end{pmatrix}=\begin{pmatrix}x_1b_1 & x_1b_2+x_2b_4 \\ x_3b_1 & x_3b_2+x_4b_4\end{pmatrix} \\ \Rightarrow \left\{\begin{matrix}
a_1x_1=x_1b_1 \\
a_1x_2=x_1b_2+x_2b_4 \\
a_3x_1+a_4x_3=x_3b_1 \\
a_3x_2+a_4x_4=x_3b_2+x_4b_4
\end{matrix}\right.\Rightarrow \left\{\begin{matrix}
(a_1-b_1)x_1=0 \\
(a_1-b_4)x_2=x_1b_2 \\
a_3x_1=x_3(b_1-a_4) \\
a_3x_2=x_3b_2+x_4(b_4-a_4)
\end{matrix}\right.$$
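
As an editorial aside (this reformulation does not appear in the original post), the four conditions form a homogeneous linear system in $(x_1,x_2,x_3,x_4)$ with a lower-triangular coefficient matrix:
$$\begin{pmatrix}a_1-b_1 & 0 & 0 & 0\\ -b_2 & a_1-b_4 & 0 & 0\\ a_3 & 0 & a_4-b_1 & 0\\ 0 & a_3 & -b_2 & a_4-b_4\end{pmatrix}\begin{pmatrix}x_1\\x_2\\x_3\\x_4\end{pmatrix}=\begin{pmatrix}0\\0\\0\\0\end{pmatrix}$$
This makes explicit which differences of diagonal entries control the system in both directions of the proof.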

For the direction $\Leftarrow$ we have that $\{a_1, a_4\}\cap \{b_1, b_4\}=\emptyset$, so we get the following:
Since $a_1\neq b_1$, the first equation $(a_1-b_1)x_1=0$ gives $x_1=0$.
From the second equation we have $(a_1-b_4)x_2=x_1b_2\Rightarrow (a_1-b_4)x_2=0$; since $a_1\neq b_4$, it follows that $x_2=0$.
From the third equation we have that $a_3x_1=x_3(b_1-a_4)\Rightarrow x_3(b_1-a_4)=0$; since $b_1\neq a_4$, it follows that $x_3=0$.
From the last equation we have $a_3x_2=x_3b_2+x_4(b_4-a_4)\Rightarrow 0=x_4(b_4-a_4)$; since $b_4\neq a_4$, it follows that $x_4=0$.

Therefore, the matrix $X$ is the zero matrix, and so $U=\{0\}$, right? (Wondering) How can we show the other direction? (Wondering)
When we have the zero matrix, doesn't it hold for every $A$ and $B$? (Wondering)
 
  • #2
mathmari said:
Hey! :eek:

I want to show that for $A,B\in \mathbb{R}^{2\times 2}$ the set $U=\{X\in \mathbb{R}^{2\times 2}\mid AX=XB\}$ is a vector subspace of $\mathbb{R}^{2\times 2}$.

We have that it is non-empty, since the zero matrix belongs to $U$: $AO=O=OB$.

Let $X_1, X_2\in U$; then $AX_1=X_1B$ and $AX_2=X_2B$.
We have that $A(X_1+X_2)=AX_1+AX_2=X_1B+X_2B=(X_1+X_2)B$, and so $X_1+X_2\in U$.

Let $\lambda \in \mathbb{R}$.
We have that $A(\lambda X_1)=\lambda (AX_1)=\lambda (X_1B)=(\lambda X_1)B$, and so $\lambda X_1\in U$.

Is everything correct? (Wondering)

Looks good to me!
Then when $A=\begin{pmatrix}a_1 & 0 \\ a_3 & a_4\end{pmatrix}$ and $B=\begin{pmatrix}b_1 & b_2 \\ 0 & b_4\end{pmatrix}$ I want to show that $$U=\{0\} \iff \{a_1, a_4\}\cap \{b_1, b_4\}=\emptyset$$

We have that $$AX=\begin{pmatrix}a_1 & 0 \\ a_3 & a_4\end{pmatrix}\begin{pmatrix}x_1 & x_2 \\ x_3 & x_4\end{pmatrix}=\begin{pmatrix}a_1x_1 & a_1x_2 \\ a_3x_1+a_4x_3 & a_3x_2+a_4x_4\end{pmatrix}$$
and
$$XB=\begin{pmatrix}x_1 & x_2 \\ x_3 & x_4\end{pmatrix}\begin{pmatrix}b_1 & b_2 \\ 0 & b_4\end{pmatrix}=\begin{pmatrix}x_1b_1 & x_1b_2+x_2b_4 \\ x_3b_1 & x_3b_2+x_4b_4\end{pmatrix}$$

So, $$AX=XB \Rightarrow \begin{pmatrix}a_1x_1 & a_1x_2 \\ a_3x_1+a_4x_3 & a_3x_2+a_4x_4\end{pmatrix}=\begin{pmatrix}x_1b_1 & x_1b_2+x_2b_4 \\ x_3b_1 & x_3b_2+x_4b_4\end{pmatrix} \\ \Rightarrow \left\{\begin{matrix}
a_1x_1=x_1b_1 \\
a_1x_2=x_1b_2+x_2b_4 \\
a_3x_1+a_4x_3=x_3b_1 \\
a_3x_2+a_4x_4=x_3b_2+x_4b_4
\end{matrix}\right.\Rightarrow \left\{\begin{matrix}
(a_1-b_1)x_1=0 \\
(a_1-b_4)x_2=x_1b_2 \\
a_3x_1=x_3(b_1-a_4) \\
a_3x_2=x_3b_2+x_4(b_4-a_4)
\end{matrix}\right.$$

For the direction $\Leftarrow$ we have that $\{a_1, a_4\}\cap \{b_1, b_4\}=\emptyset$, so we get the following:
Since $a_1\neq b_1$, the first equation $(a_1-b_1)x_1=0$ gives $x_1=0$.
From the second equation we have $(a_1-b_4)x_2=x_1b_2\Rightarrow (a_1-b_4)x_2=0$; since $a_1\neq b_4$, it follows that $x_2=0$.
From the third equation we have that $a_3x_1=x_3(b_1-a_4)\Rightarrow x_3(b_1-a_4)=0$; since $b_1\neq a_4$, it follows that $x_3=0$.
From the last equation we have $a_3x_2=x_3b_2+x_4(b_4-a_4)\Rightarrow 0=x_4(b_4-a_4)$; since $b_4\neq a_4$, it follows that $x_4=0$.

Therefore, the matrix $X$ is the zero matrix, and so $U=\{0\}$, right? (Wondering)

Looks great!

How can we show the other direction? (Wondering)
When we have the zero matrix, doesn't it hold for every $A$ and $B$? (Wondering)

That's not quite the right way of looking at it, so far as I can make out. The thing is, your $U$ is dependent on your choice of $A$ and $B$. That is, you really have this:
$$U(A,B)=\left\{X\in \mathbb{R}^{2\times 2}\mid AX=XB\right\}.$$
Saying that $U=\{0\}$ is really saying that ONLY the zero matrix allows $AX=XB$ for that choice of $A$ and $B$. In other words, for your choice of $A$ and $B$, $AX=XB$ implies $X=0$. Does that help?
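
As an editorial illustration of this point (not something posted in the thread), one can test numerically whether $U(A,B)=\{0\}$ for a concrete choice of $A$ and $B$ by rewriting $AX=XB$ with the standard vec/Kronecker identity $\operatorname{vec}(AX-XB)=(I\otimes A-B^{T}\otimes I)\operatorname{vec}(X)$. The sketch below uses NumPy; the function name is made up for the example.

```python
import numpy as np

def U_is_trivial(A, B, tol=1e-10):
    """Return True when U(A, B) = {X : AX = XB} contains only the zero matrix.

    Via vec(AX - XB) = (I kron A - B^T kron I) vec(X), the space U(A, B)
    is trivial exactly when this n^2 x n^2 matrix has full rank.
    """
    n = A.shape[0]
    M = np.kron(np.eye(n), A) - np.kron(B.T, np.eye(n))
    return np.linalg.matrix_rank(M, tol=tol) == n * n

# Matrices shaped as in the thread (A lower triangular, B upper triangular):
A = np.array([[1.0, 0.0],
              [5.0, 2.0]])           # a1 = 1, a4 = 2
B = np.array([[3.0, 7.0],
              [0.0, 4.0]])           # b1 = 3, b4 = 4: no common diagonal entries
print(U_is_trivial(A, B))            # expected: True

B_shared = np.array([[1.0, 7.0],
                     [0.0, 4.0]])    # b1 = a1 = 1: a shared diagonal entry
print(U_is_trivial(A, B_shared))     # expected: False
```

This is the homogeneous Sylvester equation in disguise: $AX=XB$ has only the trivial solution exactly when $A$ and $B$ share no eigenvalues, and for the triangular matrices in this thread the eigenvalues are precisely the diagonal entries $a_1,a_4$ and $b_1,b_4$.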
 
  • #3
Ackbach said:
That's not quite the right way of looking at it, so far as I can make out. The thing is, your $U$ is dependent on your choice of $A$ and $B$. That is, you really have this:
$$U(A,B)=\left\{X\in \mathbb{R}^{2\times 2}\mid AX=XB\right\}.$$
Saying that $U=\{0\}$ is really saying that ONLY the zero matrix allows $AX=XB$ for that choice of $A$ and $B$. In other words, for your choice of $A$ and $B$, $AX=XB$ implies $X=0$. Does that help?

So, do we want to show that $X=0$ is the only solution that allows $AX=XB$ for the given matrices $A$ and $B$ and with the restriction $\{a_1, a_4\}\cap \{b_1, b_4\}=\emptyset$? (Wondering)
 
  • #4
mathmari said:
So, do we want to show that $X=0$ is the only solution that allows $AX=XB$ for the given matrices $A$ and $B$ and with the restriction $\{a_1, a_4\}\cap \{b_1, b_4\}=\emptyset$? (Wondering)

Actually, I think that's what you've already shown. I think what you want to show now is this:

Given that $A$ and $B$ are of the form above, and that the set $U=\{X\mid AX=XB\}$ is the singleton set $U=\{0\}$, prove that $\{a_1, a_4\} \cap \{b_1, b_4\} = \emptyset$.
 
  • #5
Ackbach said:
Actually, I think that's what you've already shown. I think what you want to show now is this:

Given that $A$ and $B$ are of the form above, and that the set $U=\{X\mid AX=XB\}$ is the singleton set $U=\{0\}$, prove that $\{a_1, a_4\} \cap \{b_1, b_4\} = \emptyset$.

So, for the direction $\Rightarrow$ we do the following:

We have the system \begin{equation*}\left\{\begin{matrix}
(a_1-b_1)x_1=0 \\
b_2x_1+(b_4-a_1)x_2=0 \\
a_3x_1+(a_4-b_1)x_3=0 \\
a_3x_2+(-b_2)x_3+(a_4-b_4)x_4=0
\end{matrix}\right.\end{equation*}

From the first equation we have that if $a_1=b_1$, then there would exist infinitely many values of $x_1$ such that $(a_1-b_1)x_1=0$. But $x_1$ may only take the value $0$, therefore it must be that $a_1\neq b_1$.

From the second equation we have that, since $x_1=0$, if $a_1=b_4$ then there would exist infinitely many values of $x_2$ such that $(b_4-a_1)x_2=0$. But $x_2$ may only take the value $0$, therefore it must be that $a_1\neq b_4$.

From the third equation we have that, since $x_1=x_2=0$, if $a_4=b_1$ then there would exist infinitely many values of $x_3$ such that $(a_4-b_1)x_3=0$. But $x_3$ may only take the value $0$, therefore it must be that $a_4\neq b_1$.

From the last equation we have that, since $x_1=x_2=x_3=0$, if $a_4=b_4$ then there would exist infinitely many values of $x_4$ such that $(a_4-b_4)x_4=0$. But $x_4$ may only take the value $0$, therefore it must be that $a_4\neq b_4$.

Is everything correct? Could I improve something? (Wondering)
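
As an editorial sanity check of this case analysis (not part of the thread), the coefficient matrix of the system can be generated symbolically, and its determinant factors into exactly the four differences used above. A sketch with SymPy:

```python
import sympy as sp

a1, a3, a4, b1, b2, b4 = sp.symbols('a1 a3 a4 b1 b2 b4')
x1, x2, x3, x4 = sp.symbols('x1 x2 x3 x4')

A = sp.Matrix([[a1, 0], [a3, a4]])
B = sp.Matrix([[b1, b2], [0, b4]])
X = sp.Matrix([[x1, x2], [x3, x4]])

# The four entries of AX - XB are the equations of the homogeneous system.
equations = list(A * X - X * B)
M, _ = sp.linear_eq_to_matrix(equations, [x1, x2, x3, x4])

print(M)                   # lower-triangular coefficient matrix
print(sp.factor(M.det()))  # product of (a1-b1), (a1-b4), (a4-b1), (a4-b4)
```

The system has only the trivial solution exactly when this determinant is nonzero, which recovers the condition $\{a_1,a_4\}\cap\{b_1,b_4\}=\emptyset$.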
 
  • #6
I think you've got it! Yeah, there might be some cutesy, ridiculously short way to do it, but I'm not always so much in favor of those sorts of methods as some. If you can do the straightforward method, just do that!
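
For what it's worth, one candidate for such a shorter route (an editorial sketch, not something proposed in the thread): the homogeneous system has a lower-triangular coefficient matrix, so its determinant is the product $(a_1-b_1)(a_1-b_4)(a_4-b_1)(a_4-b_4)$, and the system has only the trivial solution, i.e. $U=\{0\}$, exactly when this product is nonzero, which is precisely $\{a_1,a_4\}\cap\{b_1,b_4\}=\emptyset$. This handles both directions at once.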
 
  • #7
Ackbach said:
I think you've got it! Yeah, there might be some cutesy, ridiculously short way to do it, but I'm not always so much in favor of those sorts of methods as some. If you can do the straightforward method, just do that!

Ok! Thank you very much! (Happy)
 
  • #8
OK, one economy I can think of is that your four paragraphs are all using essentially the same argument. So, after your first paragraph, you could say something like, "Similarly, given that the factors $(b_4-a_1), \; (a_4-b_1), \;$ and $(a_4-b_4)$ all appear in the linear system, the quantities $a_1$ and $b_4$, $a_4$ and $b_1$, and $a_4$ and $b_4$ must be distinct." Whether this shortcut is acceptable to your teacher or not is something you have to gauge.
 

FAQ: Why does this hold when we have the zero matrix?

Why does the zero matrix have a unique property?

The zero matrix is the only matrix whose product with any other matrix of compatible size is always the zero matrix. This is because all of its entries are zero, so every entry of such a product is a sum of terms multiplied by zero.

How does the zero matrix affect the properties of other matrices?

Multiplying a matrix by the zero matrix does not alter the other matrix itself; the product is simply the zero matrix, whatever the dimensions, rank, or invertibility of the other factor. Adding the zero matrix leaves a matrix unchanged, since it is the additive identity.

Can the zero matrix be used in calculations?

Yes. Any product involving the zero matrix is the zero matrix, because every entry of such a product is a sum of terms multiplied by zero, while adding the zero matrix to another matrix leaves that matrix unchanged.

Why is the zero matrix used in linear algebra?

In linear algebra the zero matrix is the additive identity and represents the zero linear map. Writing $U=\{0\}$, as in this thread, expresses that a homogeneous condition such as $AX=XB$ admits only the trivial solution. It also simplifies calculations and appears in the statements and proofs of many theorems.

Does the zero matrix have any real-world applications?

The zero matrix is mainly a building block rather than a stand-alone application, but it appears throughout applied mathematics and science, for example in linear algebra, graph theory, and computer graphics.
