Proof regarding determinant of block matrices

In summary: the claimed identity ##\left | M \right |=\left | A \right |\left | D \right |-\left | B \right |\left | C \right |## fails in general, even for commuting blocks; a counterexample is ##A = D = 0## and ##B = C = I## with ##n## even, for which the formula gives ##-1## while the actual determinant is ##+1##.
  • #1
Adgorn

Homework Statement


Let A,B,C,D be commuting n-square matrices. Consider the 2n-square block matrix ##M= \begin{bmatrix}
A & B \\
C & D \\
\end{bmatrix}##. Prove that ##\left | M \right |=\left | A \right |\left | D \right |-\left | B \right |\left | C \right |##. Show that the result may not be true if the matrices do not commute.

Homework Equations


##det(M)= det(A_1)det(A_2)...det(A_n)## (Where M is an upper (lower) triangular block matrix with ##A_1,A_2,...,A_n## diagonal blocks.)

The Attempt at a Solution


At first I tried using the theorems of determinants together with the properties of (commuting) matrices to derive the desired expression, but had no success. I then tried writing out the determinant in terms of the entries of the square blocks of M and splitting the expression into 4 separate determinants, but could not figure out how.
The closest I managed to get to a solution is proving the above equation in the case C=0.
Any help would be appreciated.
 
  • #2
Thank you, Mr. AI bot thingy, but I said all I really have to say; it's pretty straightforward.
 
  • #3
Adgorn said:
Thank you, Mr. AI bot thingy, but I said all I really have to say; it's pretty straightforward.
Yes, but your attempts could have been a bit more elaborate. Unfortunately, the only solutions I can think of are either induction or the explicit formula via diagonals. Doesn't sound like fun, though.
 
  • #4
fresh_42 said:
Yes, but your attempts could have been a bit more elaborate. Unfortunately, the only solutions I can think of are either induction or the explicit formula via diagonals. Doesn't sound like fun, though.
Unfortunately I cannot explain any further simply because I don't even know how to approach this. I have never worked with commuting matrices before (I'm learning from Schaum's Outlines), let alone in the context of determinants, so I don't really know what to exploit in the given information that the matrices commute.
I am certain that the proof uses induction. I managed to prove something similar with C=0:
##M=\begin{bmatrix}
A & C \\
0 & B
\end{bmatrix}##
Prove ##det(M)=det(A)det(B)##.

If A is r-square and B is s-square, then n=r+s and
##det(M)=\sum_{σ \in S_n}## (sgn ##σ##) ##m_{1σ(1)} m_{2σ(2)}...m_{nσ(n)}##.
If ##i\gt r## and ##j\leq r##, then ##m_{ij}=0##. Therefore we need only consider the permutations with ##σ(\{r+1,r+2,...,r+s\})=\{r+1,r+2,...,r+s\}## and ##σ(\{1,2,...,r\})=\{1,2,...,r\}##.
Let ##σ_1(k)=σ(k)## for ##k\leq r## and let ##σ_2(k)=σ(k+r)-r## for ##k\leq s##; then sgn ##σ## = (sgn ##σ_1##)(sgn ##σ_2##).
Thus, (sgn ##σ##) ##m_{1σ(1)} m_{2σ(2)}...m_{nσ(n)}## ##=## (sgn ##σ_1##) ##a_{1σ_1(1)} a_{2σ_1(2)}...a_{rσ_1(r)}## (sgn ##σ_2##) ##b_{1σ_2(1)} b_{2σ_2(2)}...b_{sσ_2(s)}##,
and summing over all such ##σ## gives ##det(M)=det(A)det(B)##.
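As a quick numeric sanity check of this identity (a minimal numpy sketch, not part of the proof — `np.block` assembles the block matrix):

```python
import numpy as np

# Spot-check det(M) = det(A) det(B) for M = [[A, C], [0, B]],
# with A r-square and B s-square (here r=2, s=3, random entries).
rng = np.random.default_rng(0)
r, s = 2, 3
A = rng.standard_normal((r, r))
B = rng.standard_normal((s, s))
C = rng.standard_normal((r, s))

M = np.block([[A, C],
              [np.zeros((s, r)), B]])

assert np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(B))
```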

So I proved this theorem, which is the closest I got to proving the question of this post. I tried using the raw formula to no success; perhaps there is some hidden property of commuting matrices that will allow me to solve this.
 
  • #5
The commutativity requirement should automatically show up during the proof. I wouldn't start with it. But it may be used early in a proof. So without any brute force methods to apply, I think there could be a tricky decomposition of ##M##. However, it should be a multiplicative decomposition or a basis transformation ##TMT^{-1}##, since we don't really want to calculate something like ##\det (M+N)##. If you already have a proof for triangular matrices, maybe you can find a decomposition into factors like this.
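(As a numeric aside — a small numpy sketch, with a randomly chosen, almost surely invertible ##T## — a basis transformation ##TMT^{-1}## indeed leaves the determinant unchanged, since ##\det(TMT^{-1})=\det(T)\det(M)\det(T)^{-1}=\det(M)##:)

```python
import numpy as np

# Check that det is invariant under a similarity transform T M T^{-1},
# so hunting for such a decomposition cannot change the determinant.
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
T = rng.standard_normal((4, 4)) + 4 * np.eye(4)  # almost surely invertible

lhs = np.linalg.det(T @ M @ np.linalg.inv(T))
assert np.isclose(lhs, np.linalg.det(M))
```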

One can always write any matrix ##M = \frac{1}{2}(M+M^t) + \frac{1}{2}(M-M^t)## as a sum of a symmetric and a skew-symmetric matrix. Don't know whether this helps. Or apply some normal forms you know about.
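That decomposition is easy to check numerically (a minimal numpy sketch of the symmetric/skew-symmetric split just described):

```python
import numpy as np

# M = (M + M^T)/2 + (M - M^T)/2: symmetric part S plus skew-symmetric part K.
rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))

S = (M + M.T) / 2   # symmetric: S == S.T
K = (M - M.T) / 2   # skew-symmetric: K == -K.T

assert np.allclose(S, S.T)
assert np.allclose(K, -K.T)
assert np.allclose(S + K, M)
```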
 
  • #6
Adgorn said:

Homework Statement


Let A,B,C,D be commuting n-square matrices. Consider the 2n-square block matrix ##M= \begin{bmatrix}
A & B \\
C & D \\
\end{bmatrix}##. Prove that ##\left | M \right |=\left | A \right |\left | D \right |-\left | B \right |\left | C \right |##.

This is an old thread, but it has been left open and I thought about this earlier today for some reason.

The assertion in the problem statement is wrong.

A simple and general counterexample: consider the case where ##n## is some even natural number, ##\mathbf A = \mathbf D = \mathbf 0_{n\times n}## and ##\mathbf B = \mathbf C = \mathbf I_{n\times n}##.

The zero matrix and the identity matrix commute with all matrices, yet the above formula says the determinant is ##-1## when in fact it is ##+1##. (In this case ##M## is a permutation matrix that becomes the identity matrix after an even number of pairwise column swaps and hence has determinant ##+1##.)

A really simple example: take ##n = 2##.

##1 =det\Big(\begin{bmatrix}
0 & 0 & 1 &0 \\
0 & 0 & 0 & 1\\
1 & 0 & 0&0 \\
0 & 1 & 0& 0
\end{bmatrix}\Big) = det\Big(
\begin{bmatrix}
\mathbf 0 & \mathbf I \\
\mathbf I & \mathbf 0 \\
\end{bmatrix}\Big)
\neq det\big(\mathbf 0\big)det\big(\mathbf 0\big) - det\big(\mathbf I\big)det\big(\mathbf I\big) = 0\cdot 0 - 1\cdot 1 = -1##

- - - -
note: the specific example I am giving is problem 6.2.5 in Meyer's Matrix Analysis.
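For what it's worth, the counterexample above is easy to verify with a few lines of numpy (`np.block` builds the 2x2 block matrix):

```python
import numpy as np

# A = D = 0, B = C = I with n = 2: M is the 4x4 "swap" permutation matrix.
n = 2
Z = np.zeros((n, n))
I = np.eye(n)
M = np.block([[Z, I],
              [I, Z]])

det_M = np.linalg.det(M)                         # actual determinant: +1
claimed = (np.linalg.det(Z) * np.linalg.det(Z)
           - np.linalg.det(I) * np.linalg.det(I))  # |A||D| - |B||C| = -1

assert np.isclose(det_M, 1.0)
assert np.isclose(claimed, -1.0)
```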
 

FAQ: Proof regarding determinant of block matrices

What is a determinant?

A determinant is a scalar value computed from a square matrix. It represents the factor by which the corresponding linear map scales volumes, and it can be used to determine various properties of the matrix, such as whether it is invertible or singular.

How is the determinant of a block matrix calculated?

There is no single formula that works for every partition. For block triangular matrices (all blocks below, or all blocks above, the diagonal are zero) with square diagonal blocks, the determinant is the product of the determinants of the diagonal blocks. For a general ##2\times 2## block matrix with an invertible block ##D##, the Schur complement gives ##det(M)=det(D)\,det(A-BD^{-1}C)##. The specific approach depends on the size and arrangement of the blocks.

What is the significance of the determinant of a block matrix?

The determinant of a block matrix can provide information about the overall matrix, such as whether it is invertible or singular. It can also be used to solve systems of equations and determine the eigenvalues of the matrix.

Can the determinant of a block matrix be zero?

Yes, the determinant of a block matrix can be zero. This means that the matrix is singular and cannot be inverted. In terms of block matrices, this could indicate that one or more of the submatrices is singular.

Are there any special properties of block matrices that affect the determinant?

Yes, there are certain properties of block matrices that affect the determinant. For example, if the block matrix is block triangular (all blocks below, or all blocks above, the diagonal are zero) with square diagonal blocks, its determinant is simply the product of the determinants of the diagonal blocks. In particular, the determinant of a block diagonal matrix is equal to the product of the determinants of each diagonal submatrix. Note, however, that for commuting blocks the correct ##2\times 2## block identity is ##det(M)=det(AD-BC)##, not ##det(A)det(D)-det(B)det(C)##.
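The block diagonal property is easy to illustrate numerically (a minimal numpy sketch with two random diagonal blocks of different sizes):

```python
import numpy as np

# det of a block diagonal matrix = product of the diagonal blocks' dets.
rng = np.random.default_rng(3)
D1 = rng.standard_normal((2, 2))
D2 = rng.standard_normal((3, 3))

M = np.block([[D1, np.zeros((2, 3))],
              [np.zeros((3, 2)), D2]])

assert np.isclose(np.linalg.det(M), np.linalg.det(D1) * np.linalg.det(D2))
```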
