Conditions so that the determinant is zero

In summary: if three linearly independent vectors are all mapped to the zero vector, then $A$ maps every vector of $\mathbb R^3$ to the zero vector, so $A$ is the zero matrix.
  • #1
mathmari
Hey! :eek:

We have $3$ lines with equations $a_{i1}x+a_{i2}y+a_{i3}=0$, $i=1,2,3$. I want to show that $\det ((a_{ij}))=0$ iff the lines are pairwise parallel or they have a common point.

We have that $\det ((a_{ij}))=0$ iff the rows are linearly dependent, so the coefficient vectors of the lines are linearly dependent. Does that mean that some lines are parallel, or not? (Wondering)
 
  • #2
I have the following idea:

We have the lines: $$\left\{\begin{matrix}
a_{11}x+a_{12}y+a_{13}=0 \\
a_{21}x+a_{22}y+a_{23}=0 \\
a_{31}x+a_{32}y+a_{33}=0
\end{matrix}\right. \Rightarrow \left\{\begin{matrix}
a_{11}x+a_{12}y+a_{13}\cdot 1=0 \\
a_{21}x+a_{22}y+a_{23}\cdot 1=0 \\
a_{31}x+a_{32}y+a_{33}\cdot 1=0
\end{matrix}\right.$$

A common intersection point $(x_0,y_0)$ of the lines corresponds to a solution of the form $(x_0,y_0,1)$.

In matrix form we have the following: $$\begin{pmatrix}a_{11}& a_{12}& a_{13} \\ a_{21}& a_{22}& a_{23} \\ a_{31}& a_{32}& a_{33}\end{pmatrix}\begin{pmatrix}x\\ y\\ 1\end{pmatrix}=\begin{pmatrix}0\\ 0\\ 0\end{pmatrix}$$

If $\det ((a_{ij}))=0$ then the system has infinitely many solutions, doesn't it?
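To make this concrete, here is a small numerical sketch in Python (the three example lines are made up; they all pass through the point $(1,2)$):

Code:
import numpy as np

# Three made-up lines a_i1*x + a_i2*y + a_i3 = 0 that all pass through (1, 2):
#   x + y - 3 = 0,   2x - y = 0,   x - 2y + 3 = 0
A = np.array([[1.0,  1.0, -3.0],
              [2.0, -1.0,  0.0],
              [1.0, -2.0,  3.0]])

print(np.linalg.det(A))               # ~0, since the lines are concurrent
print(A @ np.array([1.0, 2.0, 1.0]))  # [0. 0. 0.]: (x0, y0, 1) solves the system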

Is everything correct so far? Does this help us? (Wondering)
 
  • #3
Hey mathmari! (Smile)

mathmari said:
In matrix form we have the following: $$\begin{pmatrix}a_{11}& a_{12}& a_{13} \\ a_{21}& a_{22}& a_{23} \\ a_{31}& a_{32}& a_{33}\end{pmatrix}\begin{pmatrix}x\\ y\\ 1\end{pmatrix}=\begin{pmatrix}0\\ 0\\ 0\end{pmatrix}$$

If $\det ((a_{ij}))=0$ then the system has infinitely many solutions, doesn't it?

Infinitely many solutions, or no solutions at all.
That's the opposite of the situation where $\det A\ne 0$, in which case we have exactly one solution. (Thinking)

mathmari said:
Is everything correct so far? Does this help us?

It means that $(x,y,1)$ has to be in the kernel of $A$ (the space of all vectors that have the zero vector as their image).

If $\det A=0$, we can distinguish 3 cases:
  1. The kernel is a line.
  2. The kernel is a plane.
  3. The kernel is all of $\mathbb R^3$.

Let's start with the case that the kernel is a line, say, $\lambda(p,q,r)$.
Then we have that $(x,y,1) = \lambda(p,q,r)$.
It means we have to distinguish 2 sub cases: either $r\ne 0$ or $r=0$.
Starting with the first sub case, if $r\ne 0$, we can find exactly one solution for $(x,y)$.
In other words, the 3 lines intersect at a common point.
(Thinking)
 
  • #4
I like Serena said:
If $\det A=0$, we can distinguish 3 cases:
  1. The kernel is a line.
  2. The kernel is a plane.
  3. The kernel is all of $\mathbb R^3$.

Why do we have these cases? (Wondering)
 
  • #5
mathmari said:
Why do we have these cases? (Wondering)

If there is a vector $v \ne 0$ such that $Av=0$, then any multiple $\lambda v$ has $A(\lambda v)=\lambda Av=0$.
So the line that is the span of $v$ is in the kernel.
If there is no other vector $w$ linearly independent of $v$ that has $Aw=0$, then that is the complete kernel: a line.

Suppose there is another linearly independent vector $w$ with $Aw=0$.
Then any vector $\lambda v + \mu w$ in the plane that is the span of $v$ and $w$ has $A(\lambda v + \mu w)=\lambda Av + \mu Aw = 0$.
So then that plane is in the kernel.

Now suppose there is a 3rd independent vector that maps to the zero vector.
In that case $A$ maps any vector to the zero vector, meaning $A$ is the zero matrix.
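Put differently, this is rank-nullity: $\dim\ker A=3-\operatorname{rank}A$, so the kernel is a line, a plane, or all of $\mathbb R^3$ according as the rank is $2$, $1$ or $0$. A small numerical sketch (the three matrices are made up, one for each case):

Code:
import numpy as np

def kernel_dim(A):
    # dim ker(A) = 3 - rank(A) by the rank-nullity theorem
    return 3 - np.linalg.matrix_rank(A)

A_line  = np.array([[1., 1., -3.], [2., -1., 0.], [1., -2., 3.]])    # rank 2: kernel is a line
A_plane = np.array([[1., 1., -3.], [2., 2., -6.], [-1., -1., 3.]])   # rank 1: kernel is a plane
A_zero  = np.zeros((3, 3))                                           # rank 0: kernel is all of R^3

print(kernel_dim(A_line), kernel_dim(A_plane), kernel_dim(A_zero))   # 1 2 3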

(Thinking)
 
  • #6
I like Serena said:
If there is a vector $v \ne 0$ such that $Av=0$, then any multiple $\lambda v$ has $A(\lambda v)=\lambda Av=0$.
So the line that is the span of $v$ is in the kernel.

What exactly do you mean by "line that is the span of $v$" ? Is the span of a vector a line? Isn't it a point? (Wondering)
 
  • #7
mathmari said:
What exactly do you mean by "line that is the span of $v$" ? Is the span of a vector a line? Isn't it a point?

The linear span or just span of a set of vectors is the set of all possible linear combinations of that set of vectors.
If we have 1 non-zero vector, the span of that vector is a line. (Nerd)
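For instance, if $v=(1,2,1)$, its span is $\{\lambda(1,2,1):\lambda\in\mathbb R\}$: the line through the origin with direction $(1,2,1)$. Only the span of the zero vector is a single point.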
 
  • #8
I like Serena said:
The linear span or just span of a set of vectors is the set of all possible linear combinations of that set of vectors.
If we have 1 non-zero vector, the span of that vector is a line. (Nerd)

I see! (Nerd)
I like Serena said:
Now suppose there is a 3rd independent vector that maps to the zero vector.
In that case $A$ maps any vector to the zero vector, meaning $A$ is the zero matrix.

Could you explain it further to me why $A$ must be the zero matrix? (Wondering)
 
  • #9
mathmari said:
Could you explain it further to me why $A$ must be the zero matrix? (Wondering)

Any vector in $\mathbb R^3$ can be written as a linear combination of 3 independent vectors.
That includes the 3 standard unit vectors $\mathbf e_i$.
Each of the columns of $A$ corresponds to the image of the corresponding standard unit vector.
That is, for instance, the first column of $A$ is the image of $\mathbf e_1$.
And each of them has to be the zero vector.
Therefore $A$ only has entries that are zero. (Nerd)
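A one-line check of that column fact in Python (any matrix will do; here a made-up one):

Code:
import numpy as np

A  = np.array([[1., 1., -3.], [2., -1., 0.], [1., -2., 3.]])
e1 = np.array([1., 0., 0.])
print(np.allclose(A @ e1, A[:, 0]))   # True: the image of e1 is the first column of A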
 
  • #10
I like Serena said:
Any vector in $\mathbb R^3$ can be written as a linear combination of 3 independent vectors.
That includes the 3 standard unit vectors $\mathbf e_i$.
Each of the columns of $A$ corresponds to the image of the corresponding standard unit vector.
That is, for instance, the first column of $A$ is the image of $\mathbf e_1$.
And each of them has to be the zero vector.
Therefore $A$ only has entries that are zero. (Nerd)
Ah ok!
I like Serena said:
Let's start with the case that the kernel is a line, say, $\lambda(p,q,r)$.
Then we have that $(x,y,1) = \lambda(p,q,r)$.
It means we have to distinguish 2 sub cases: either $r\ne 0$ or $r=0$.
Starting with the first sub case, if $r\ne 0$, we can find exactly one solution for $(x,y)$.
In other words, the 3 lines intersect at a common point.
(Thinking)
If $r=0$ there is no solution, is there?

If $r\neq 0$, we have that $x=\lambda p$ and $y=\lambda q$, right? (Wondering)
 
  • #11
mathmari said:
If $r=0$ there is no solution, is there?

Correct.
However, it means that we still have to verify that the lines are pairwise parallel.
That's because the premise was that the determinant is zero iff the lines either intersect at a common point, or are pairwise parallel.

mathmari said:
If $r\neq 0$, we have that $x=\lambda p$ and $y=\lambda q$, right? (Wondering)

That was true either way.
More specifically, we can now solve for $\lambda$, giving $\lambda =\frac 1r$, and substitute that.
So we find the common point of intersection $(x,y)=(p/r, q/r)$.
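Numerically, with the same made-up matrix as before (its kernel turns out to be spanned by $(p,q,r)=(1,2,1)$):

Code:
import numpy as np

A = np.array([[1.0,  1.0, -3.0],
              [2.0, -1.0,  0.0],
              [1.0, -2.0,  3.0]])

# The right-singular vector for the (near-)zero singular value spans the kernel.
_, s, Vt = np.linalg.svd(A)
p, q, r = Vt[-1]
print(s)             # last singular value ~0: the kernel is a line
print(p / r, q / r)  # r != 0 here, so the common point is (p/r, q/r) = (1.0, 2.0)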
 
  • #12
Ah ok!

In the second case we have that $(x,y,1)=\lambda (p,q,r)+\mu (a,b,c)=(\lambda p+\mu a, \lambda q+\mu b, \lambda r+\mu c)$.
We have that $ \lambda r+\mu c=1 \Rightarrow \lambda =\frac{1-\mu c}{r}$.
Therefore, we get $(x,y,1)=(\frac{(1-\mu c)p}{r} +\mu a, \frac{(1-\mu c)q}{r} +\mu b, 1-\mu c+\mu c)=(\frac{(1-\mu c)p}{r} +\mu a, \frac{(1-\mu c)q}{r} +\mu b, 1)$.
Is this correct? (Wondering)
 
  • #13
I have another idea:

Suppose that the determinant is zero; that means that the rows are linearly dependent. (The elements of the rows are the coefficients of the lines.)

So, we have that for example $(a_{11},a_{12},a_{13})=\kappa\,(a_{21},a_{22},a_{23})+\lambda\,(a_{31},a_{32},a_{33})$.

If one of $\kappa$ or $\lambda$ is equal to zero, for example $\kappa$, then we have that $(a_{11},a_{12},a_{13})=\lambda\,(a_{31},a_{32},a_{33})$. This means that the normal vector of the first line, $(a_{11}, a_{12})$, is a multiple of the normal vector of the third line, so they are parallel, and so the lines are also parallel, right? But this holds only for two lines, not for all three, or not?

Do we have to take also the case that $\kappa, \lambda \neq 0$ ?

Or is this approach wrong? (Wondering)
 
  • #14
mathmari said:
Ah ok!

In the second case we have that $(x,y,1)=\lambda (p,q,r)+\mu (a,b,c)=(\lambda p+\mu a, \lambda q+\mu b, \lambda r+\mu c)$.
We have that $ \lambda r+\mu c=1 \Rightarrow \lambda =\frac{1-\mu c}{r}$.
Therefore, we get $(x,y,1)=(\frac{(1-\mu c)p}{r} +\mu a, \frac{(1-\mu c)q}{r} +\mu b, 1-\mu c+\mu c)=(\frac{(1-\mu c)p}{r} +\mu a, \frac{(1-\mu c)q}{r} +\mu b, 1)$.
Is this correct? (Wondering)

Looks like it.
In particular, we can rewrite the solution in the form $\binom xy = \mathbf a + \mu \mathbf d$, showing that the solution is a line.
That means that all 3 lines coincide, and as such they are pairwise parallel.
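A sketch of that situation with three made-up coincident lines, all multiples of $x+y-3=0$, checking that the solutions do form a line $\binom xy=\mathbf a+\mu\mathbf d$:

Code:
import numpy as np

# Three coincident (hence pairwise parallel) lines: x + y - 3 = 0 scaled by 1, 2 and -1.
A = np.array([[1., 1., -3.], [2., 2., -6.], [-1., -1., 3.]])
print(np.linalg.matrix_rank(A))        # 1, so the kernel is a plane

# The solutions (x, y) form the line a + mu*d with a = (3, 0) and d = (1, -1).
a, d = np.array([3., 0.]), np.array([1., -1.])
for mu in (0.0, 1.0, -2.5):
    x, y = a + mu * d
    print(A @ np.array([x, y, 1.0]))   # [0. 0. 0.] for every mu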

mathmari said:
I have another idea:

Suppose that the determinant is zero; that means that the rows are linearly dependent. (The elements of the rows are the coefficients of the lines.)

So, we have that for example $(a_{11},a_{12},a_{13})=\kappa\,(a_{21},a_{22},a_{23})+\lambda\,(a_{31},a_{32},a_{33})$.

If one of $\kappa$ or $\lambda$ is equal to zero, for example $\kappa$, then we have that $(a_{11},a_{12},a_{13})=\lambda\,(a_{31},a_{32},a_{33})$. This means that the normal vector of the first line, $(a_{11}, a_{12})$, is a multiple of the normal vector of the third line, so they are parallel, and so the lines are also parallel, right? But this holds only for two lines, not for all three, or not?

Doesn't it mean that the 1st and the 3rd line coincide?
And that the 2nd line crosses them in a common point of intersection? (Wondering)

mathmari said:
Do we have to take also the case that $\kappa, \lambda \neq 0$ ?

Or is this approach wrong? (Wondering)

This might work as well. As yet I don't see how though. (Thinking)
 
  • #15
I like Serena said:
Doesn't it mean that the 1st and the 3rd line coincide?
And that the 2nd line crosses them in a common point of intersection? (Wondering)

mathmari said:
Suppose that the determinant is zero; that means that the rows are linearly dependent. (The elements of the rows are the coefficients of the lines.)

So, we have that for example $(a_{11},a_{12},a_{13})=\kappa\,(a_{21},a_{22},a_{23})+\lambda\,(a_{31},a_{32},a_{33})$.
I have done the following:

Two lines are either parallel or have an intersection point.

We suppose that the lines $a_{21}x+a_{22}y+a_{23}=0$ and $a_{31}x+a_{32}y+a_{33}=0$ have an intersection point,
say $(x_0, y_0)$.

We have the following:
\begin{align*}a_{11}x_0+a_{12}y_0+a_{13}&=(\kappa a_{21}+\lambda a_{31})x_0+(\kappa a_{22}+\lambda a_{32})y_0+(\kappa a_{23}+\lambda a_{33}) \\ & =\kappa ( a_{21}x_0+ a_{22}y_0+ a_{23})+\lambda ( a_{31}x_0+ a_{32}y_0+a_{33}) \\ & =0+0 =0\end{align*}
Therefore $(x_0, y_0)$ is an intersection point of all three lines.

Now suppose instead that the lines $a_{21}x+a_{22}y+a_{23}=0$ and $a_{31}x+a_{32}y+a_{33}=0$ are parallel. Then the normal vectors are parallel, so we have that $(a_{21}, a_{22})=\mu (a_{31},a_{32})$.

We have that \begin{equation*}(a_{11}, a_{12})=(\kappa a_{21}+\lambda a_{31},\kappa a_{22}+\lambda a_{32})=(\kappa \mu a_{31}+\lambda a_{31},\kappa \mu a_{32}+\lambda a_{32}) =(\kappa \mu +\lambda)(a_{31}, a_{32})\end{equation*}
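As a numerical sanity check of this parallel case (made-up numbers $\kappa=2$, $\lambda=1$, $\mu=3$, with the third line being $x+2y+5=0$):

Code:
import numpy as np

kappa, lam, mu = 2.0, 1.0, 3.0
row3 = np.array([1.0, 2.0, 5.0])     # line 3: x + 2y + 5 = 0
row2 = np.array([3.0, 6.0, 7.0])     # line 2: normal is mu*(1, 2), different constant term
row1 = kappa * row2 + lam * row3     # line 1 as the assumed linear combination

A = np.array([row1, row2, row3])
print(np.linalg.det(A))                    # ~0: the rows are linearly dependent
print(row1[0]*row3[1] - row1[1]*row3[0])   # 0: the normals (a11, a12) and (a31, a32) are parallel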

So, the three lines are pairwise parallel. Is everything correct? (Wondering)
 

FAQ: Conditions so that the determinant is zero

What is the determinant of a matrix?

The determinant of a matrix is a value that can be calculated from the elements of the matrix. It is a special number that can help determine whether a matrix is invertible and can be used to solve systems of linear equations.

How does the determinant relate to the invertibility of a matrix?

If the determinant of a square matrix is equal to zero, then the matrix is not invertible. This means that there is no unique solution to the system of equations represented by the matrix.

What are the conditions for the determinant of a matrix to be zero?

The determinant of a matrix will be equal to zero if and only if the matrix is singular, meaning it does not have an inverse. This can occur if there is a row or column of zeros, or if the rows or columns are linearly dependent.

How can the determinant be used to solve systems of linear equations?

The determinant of a matrix can be used to determine whether a system of linear equations has a unique solution, no solution, or infinitely many solutions. If the determinant is non-zero, then the system has a unique solution that can be found by using Cramer's rule.
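For example, a minimal Cramer's rule sketch in Python (with a made-up 2x2 system):

Code:
import numpy as np

# Cramer's rule: x_i = det(A_i) / det(A), where A_i is A with column i replaced by b.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([5.0, 10.0])

det_A = np.linalg.det(A)   # nonzero, so the solution is unique
x = []
for i in range(A.shape[1]):
    Ai = A.copy()
    Ai[:, i] = b
    x.append(np.linalg.det(Ai) / det_A)

print(x)                      # approximately [1.0, 3.0]
print(np.linalg.solve(A, b))  # the same solution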

Can the determinant of a matrix be negative or zero?

Yes, the determinant of a matrix can be any real number, including negative or zero. However, if the determinant is zero, then the matrix is not invertible and may have no unique solution to a system of linear equations.
