Determinant is independent of row/column

In summary: the proof that the value of a determinant computed using the Laplace expansion is independent of the row (or column) along which the expansion is performed uses induction and the definition of a determinant. One expands along two different rows, expands each minor once more, and checks that the coefficients of the matching terms are equal, so the two sums agree. It is a technical proof, but it can be understood by following the steps and understanding the notation.
  • #1
Bipolarity
I am curious about the proof of the fact that the value of a determinant computed using the Laplace (or cofactor) expansion is independent of the row (or column) along which the expansion is performed.

Is this a very difficult proof? My textbook omits it entirely. I was curious if someone could provide a link to the proof, as I am interested in reading it. Wikipedia has a proof (http://en.wikipedia.org/wiki/Laplace_expansion), but it was too complicated for me to understand.

Does anyone know a simpler form of the proof i.e. one that is longer but clearer in its statements for a less insightful reader?

BiP
 
  • #2
By the nature of the Laplace expansion, a proof is necessarily going to be ugly and technical.

HINT: it will be much easier to prove this yourself than to follow this proof.

Let me use the same notation as in Wikipedia. So take a matrix B. Let me show that expansion along the first row yields the same result as expansion along the second row; the more general statement is left to you. We prove this by induction. For the 1x1 case, the statement is trivial.
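
Before diving in, here is a minimal Python sketch (my own illustration, not part of the argument) of the procedure under discussion: a recursive determinant that expands along whichever row the caller picks, using 0-based indices.

[code]
def minor(B, i, j):
    # The matrix B with row i and column j removed (0-based indices).
    return [row[:j] + row[j+1:] for r, row in enumerate(B) if r != i]

def det(B, row=0):
    # Determinant of B via Laplace expansion along the chosen row.
    n = len(B)
    if n == 1:
        return B[0][0]
    # entry times cofactor, summed across the chosen row
    return sum((-1) ** (row + j) * B[row][j] * det(minor(B, row, j))
               for j in range(n))
[/code]

The claim being proved is exactly that det(B, 0), det(B, 1), ..., det(B, n-1) all return the same number.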

So assume that B is nxn. Expansion along the first row yields
[tex]b_{1,1}C_{1,1}+...+b_{1,n}C_{1,n}=b_{1,1}M_{1,1}-b_{1,2}M_{1,2}+...+(-1)^{n+1}b_{1,n}M_{1,n}[/tex]

Expansion along the second row yields
[tex]b_{2,1}C_{2,1}+...+b_{2,n}C_{2,n}=-b_{2,1}M_{2,1}+b_{2,2}M_{2,2}+...+(-1)^{n+2}b_{2,n}M_{2,n}[/tex]
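
As a concrete illustration (my own example, not from the original argument): for
[tex]B=\begin{pmatrix}1&2&3\\4&5&6\\7&8&10\end{pmatrix}[/tex]
the first-row expansion gives [tex]1(50-48)-2(40-42)+3(32-35)=2+4-9=-3,[/tex] and the second-row expansion gives [tex]-4(20-24)+5(10-21)-6(8-14)=16-55+36=-3.[/tex]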

We wish to calculate [itex]M_{1,1}[/itex]. By definition this is the determinant of the matrix that results if we remove the first row and the first column from B. By the induction hypothesis, we can calculate this determinant by taking the Laplace expansion along its first row (which consists of the entries [itex]b_{2,2},...,b_{2,n}[/itex] of B). So we can write
[tex]M_{1,1}=b_{2,2}D_{1,2}^{1,2} - b_{2,3}D_{1,2}^{1,3}+...+(-1)^{2+n}b_{2,n}D_{1,2}^{1,n}[/tex]
where [itex]D_{a,b}^{c,d}[/itex] is the determinant of the matrix resulting from B if we remove rows a and b and columns c and d (for example, when n=3, [itex]D_{1,2}^{1,2}=b_{3,3}[/itex]).
In general:
[tex]M_{1,k}=(-1)^{\delta(1,k)} b_{2,1}D_{1,2}^{1,k} +(-1)^{\delta(2,k)} b_{2,2}D_{1,2}^{2,k}+... + (-1)^{\delta(n,k)}b_{2,n}D_{1,2}^{n,k}[/tex]

We use the following conventions: [itex]D_{1,2}^{k,k}=0[/itex], and [itex]\delta(l,k)[/itex] is the number of elements in [itex]\{1,...,l-1\}\setminus\{k\}[/itex]; explicitly, [itex]\delta(l,k)=l-1[/itex] when [itex]k\geq l[/itex] and [itex]\delta(l,k)=l-2[/itex] when [itex]k<l[/itex].
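
In code, the delta function is a one-liner (my own illustration, matching the definition above; e.g. delta(3, 1) == 1 while delta(3, 5) == 2):

[code]
def delta(l, k):
    # number of elements of {1, ..., l-1} after removing k
    return len(set(range(1, l)) - {k})
[/code]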

To calculate [itex]M_{2,k}[/itex], we again take the Laplace expansion along the first row of the corresponding submatrix (which now consists of entries from row 1 of B). We get
[tex]M_{2,k}=(-1)^{\delta(1,k)} b_{1,1}D_{1,2}^{1,k} + (-1)^{\delta(2,k)}b_{1,2}D_{1,2}^{2,k}+...+(-1)^{\delta(n,k)}b_{1,n}D_{1,2}^{n,k}[/tex]

We substitute these values of [itex]M_{1,k}[/itex] and [itex]M_{2,k}[/itex] into the two original sums.

By definition, [itex]D_{1,2}^{j,k}=D_{1,2}^{k,j}[/itex]: removing columns j and k gives the same matrix regardless of the order in which they are listed. We can therefore pair up these terms, and we wish to prove that the paired coefficients are equal.
The coefficient of [itex]D_{1,2}^{j,k}[/itex] in the first sum is:
[tex](-1)^{k+1}b_{1,k}(-1)^{\delta(j,k)}b_{2,j}[/tex]
The coefficient of [itex]D_{1,2}^{k,j}[/itex] in the first sum is:
[tex](-1)^{j+1}b_{1,j}(-1)^{\delta(k,j)}b_{2,k}[/tex]
So together, we have
[tex](-1)^{k+\delta(j,k)+1}b_{1,k}b_{2,j}+ (-1)^{j+\delta(k,j)+1}b_{1,j}b_{2,k}[/tex]

We do the same for the terms in the second sum. The coefficient of [itex]D_{1,2}^{j,k}[/itex] in the second sum is:
[tex](-1)^{k+2}b_{2,k}(-1)^{\delta(j,k)}b_{1,j}[/tex]
The coefficient of [itex]D_{1,2}^{k,j}[/itex] in the second sum is:
[tex](-1)^{j+2}b_{2,j}(-1)^{\delta(k,j)}b_{1,k}[/tex]
So together we have
[tex](-1)^{k+\delta(j,k)+2}b_{1,j}b_{2,k} + (-1)^{j+\delta(k,j)+2}b_{1,k}b_{2,j}[/tex]

In order that both sums be equal, it suffices to match the coefficients of [itex]b_{1,k}b_{2,j}[/itex] in the two sums (the coefficients of [itex]b_{1,j}b_{2,k}[/itex] then match by the same argument with j and k interchanged), i.e. to show that
[tex](-1)^{j+\delta(k,j)+2}=(-1)^{k+\delta(j,k)+1}[/tex]
Assume first that [itex]k<j[/itex]. Then [itex]\delta(k,j)[/itex] is the number of elements in [itex]\{1,...,k-1\}\setminus \{j\}[/itex] and this is k-1. So the left-hand side becomes
[tex](-1)^{j+k+1}[/tex]
If [itex]k<j[/itex], then [itex]\delta(j,k)[/itex] is the number of elements in [itex]\{1,...,j-1\}\setminus \{k\}[/itex] and this is j-2. So the right hand side becomes
[tex](-1)^{k+j-1}[/tex]
Clearly, the left-hand side equals the right-hand side, since the exponents [itex]j+k+1[/itex] and [itex]k+j-1[/itex] differ by 2. The case [itex]j<k[/itex] is similar. In both cases, the coefficients of the paired terms are equal, so the two sums are equal.
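
To close, here is a small numerical check (my own addition; it reuses the det, minor, and delta sketches from earlier in this post). It verifies the sign identity above for all pairs j ≠ k, and confirms that expanding a random matrix along each of its rows gives the same value.

[code]
import random

# Check the sign identity (-1)^(j + delta(k,j) + 2) == (-1)^(k + delta(j,k) + 1)
# for all pairs j != k up to 10.
for j in range(1, 11):
    for k in range(1, 11):
        if j != k:
            assert (-1) ** (j + delta(k, j) + 2) == (-1) ** (k + delta(j, k) + 1)

# Expanding a random 5x5 integer matrix along each of its rows
# should produce the same value.
B = [[random.randint(-9, 9) for _ in range(5)] for _ in range(5)]
values = [det(B, i) for i in range(5)]
print(values)             # five equal numbers
assert len(set(values)) == 1
[/code]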
 

FAQ: Determinant is independent of row/column

What does it mean for the determinant to be independent of row/column?

When the determinant of a matrix is said to be independent of the choice of row or column, it means that the value of the determinant remains the same regardless of which row or column the Laplace (cofactor) expansion is performed along.

Why is it important for the determinant to be independent of row/column?

This property is important because it allows us to expand along whichever row or column is most convenient, for example one containing many zeros, which greatly reduces the amount of computation. It also guarantees that the cofactor-expansion definition of the determinant is well defined, rather than depending on an arbitrary choice of row or column.
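
For example (an illustration added here, not part of the original answer), expanding
[tex]\begin{vmatrix}1&0&2\\3&0&4\\5&6&7\end{vmatrix}[/tex]
along its second column leaves only one nonzero term, [tex](-1)^{3+2}\cdot 6\cdot\begin{vmatrix}1&2\\3&4\end{vmatrix}=-6(4-6)=12,[/tex] whereas a first-row expansion requires several 2×2 determinants and gives the same value, 12.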

What is the mathematical proof that the determinant is independent of row/column?

The standard proof proceeds by induction on the size of the matrix: one expands along two different rows (or columns), rewrites each minor by a further expansion, and checks that the resulting double sums contain exactly the same terms with the same signs, as in the proof above. Alternatively, one can show that every cofactor expansion agrees with the Leibniz (permutation) formula for the determinant.

Can the determinant be independent of both row and column simultaneously?

Yes. The expansion can be performed along any row or along any column of the matrix, and all of these choices (2n of them for an n×n matrix) yield the same value.

Does this property hold for all types of matrices?

No, this property only holds for square matrices, meaning matrices with the same number of rows and columns. For non-square matrices, the determinant is not defined and thus the concept of being independent of row/column does not apply.
