How Do You Prove (det A)aij = Cij(A) for an Orthogonal Matrix with det(A)=+1?

In summary, the question asks how to show that an orthogonal matrix A with det(A) = +1 satisfies (det A)aij = Cij(A), i.e. that each entry of A equals its own cofactor. The thread clears up a superfluous sign factor in the poster's cofactor expansion and then completes the proof by combining the orthogonality relation A^{-1} = A^T with the adjugate formula for the inverse.
  • #1
ognik
Hi, the question (from math methods for physicists) is: If A is orthogonal and det(A)=+1, show that (det A)aij = Cij(A).

I know that if det(A)=+1, then we are looking at a rotation.
(Side question - I have seen that det(A) =-1 can be a reflection, but is 'mostly not reflections'; what does det(A)=-1 most often indicate then? A link to something simple that covers this would be nice :-) - I couldn't find anything)

But my main issue is to prove the above. I want to do this using indexing, so I tried:
By defn: $\det A = \sum a_{ij}(-1)^{i+j}C_{ij}(A)$
Then $(\det A)\,a_{ij} = a_{ij} \sum a_{ij}(-1)^{i+j}C_{ij}(A)$ ...

What struck me immediately is that I don't know how to treat an indexed element that sits outside the summation. I am reasonably sure that aij is a single element here, so I can't include it in the summation. I can almost see an argument that by multiplying by the aij-th element I am 'selecting' only that element's cofactor out of the summation - but that seems too flimsy to me; I would appreciate a better understanding.

Another side question is that I have the definition of a cofactor as $C_{ij}(A)=(-1)^{i+j}M_{ij}(A)$, where $M_{ij}$ is the minor...
It seems to me that $(-1)^{i+j}$ shouldn't be in both $\sum a_{ij}(-1)^{i+j}C_{ij}(A)$ AND $C_{ij}(A)=(-1)^{i+j}M_{ij}(A)$?

Thanks
 
  • #2
ognik said:
If A is orthogonal and det(A)=+1, show that (det A)aij = Cij(A).

Hi ognik!

It seems to me we shouldn't worry much about which summation applies to find $\det A$.
That's because we already know that $\det A = 1$.

Anyway, to answer your side questions first.
You are quite right that you have a superfluous $(-1)^{i+j}$ in your summation for $\det A$.
And if you have an indexed element outside of the summation, it should use different indices.

So it should be:
$$(\det A)\,a_{ij} = a_{ij} \sum_{k,l}a_{kl}(-1)^{k+l}M_{kl}(A) = a_{ij} \sum_{k,l}a_{kl}C_{kl}(A)$$

To get back to your problem, perhaps you can use that an orthogonal matrix has the property that $A^{-1}=A^T$.
And that an inverse is given by $A^{-1}=\frac{1}{\det A}\big(C(A)\big)^T$.
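Written out entrywise, that last relation reads
$$(A^{-1})_{ij}=\frac{1}{\det A}\,C_{ji}(A)$$
(note the swapped indices coming from the transpose), which is probably the form you will want to compare against the target $(\det A)\,a_{ij}=C_{ij}(A)$.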
 
  • #3
Hi and thanks. I wasn't sure I could use that inverse relationship as it hasn't been covered in the text yet, but I can't see any other way. Having said that, I think I need some more understanding, as you will see:
If $A^{-1} = \frac{1}{|A|}\,C^{T}$, then $|A|\,A^{-1} = C^{T}$.
Then $|A|\,(A^{-1})_{ij} = (C^{T})_{ij}$, but $(C^{T})_{ij} = C_{ji}$.
So for what I am trying to prove, somehow $(A^{-1})_{ij}$ must equal $A_{ji}$,
but I don't know why that would be true?
 
  • #4
...sorry, of course I do! For an orthogonal matrix
$$A^{-1}=A^{T} \;\therefore\; (A^{-1})_{ij}=(A^{T})_{ij}=A_{ji}$$
so $|A|\,A_{ji} = C_{ji}$, which is exactly $(\det A)\,a_{ij} = C_{ij}(A)$ with the indices relabelled.
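A quick numerical sanity check of the result, in case it is useful (this is just numpy applied to an arbitrary rotation matrix; it is not part of the textbook problem):

```python
import numpy as np

# Build an arbitrary proper rotation (orthogonal, det = +1):
# a rotation about z followed by a rotation about x.
def rot_z(t):
    return np.array([[np.cos(t), -np.sin(t), 0.0],
                     [np.sin(t),  np.cos(t), 0.0],
                     [0.0,        0.0,       1.0]])

def rot_x(t):
    return np.array([[1.0, 0.0,        0.0],
                     [0.0, np.cos(t), -np.sin(t)],
                     [0.0, np.sin(t),  np.cos(t)]])

A = rot_x(0.7) @ rot_z(1.3)

# Cofactor matrix: C_ij = (-1)^(i+j) * M_ij, where the minor M_ij is the
# determinant of A with row i and column j deleted.
n = A.shape[0]
C = np.zeros_like(A)
for i in range(n):
    for j in range(n):
        minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
        C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)

print(np.isclose(np.linalg.det(A), 1.0))      # True: proper rotation
print(np.allclose(np.linalg.det(A) * A, C))   # True: (det A) a_ij = C_ij(A)
```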
Thanks for your help (BTW, how do I add a thanks?)
 
  • #5
Good! :)

Every post has a thanks button at the bottom right.
 
  • #6
I also found that after I posted, one of those days ...
Would appreciate it if you had a look at http://mathhelpboards.com/advanced-applied-mathematics-16/confirm-equation-numerov-method-14831.html ... thanks again :-)
 

FAQ: How Do You Prove (det A)aij = Cij(A) for an Orthogonal Matrix with det(A)=+1?

1. What is the orthogonal determinant identity?

As used in this thread, the "orthogonal determinant identity" is the statement that an orthogonal matrix A with det(A) = +1 satisfies (det A)aij = Cij(A), i.e. each entry of the matrix equals its own cofactor. It follows directly from the orthogonality relation A^{-1} = A^T together with the adjugate (cofactor) formula for the inverse, A^{-1} = (1/det A) C(A)^T; it should not be confused with the Cauchy-Binet formula, which is a different result about determinants.
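A compact version of the derivation from the thread above, for reference:
$$A^{-1}=A^{T}\quad\text{and}\quad A^{-1}=\frac{1}{\det A}\,C(A)^{T}
\;\Longrightarrow\; A^{T}=\frac{1}{\det A}\,C(A)^{T}
\;\Longrightarrow\; (\det A)\,A=C(A)
\;\Longrightarrow\; (\det A)\,a_{ij}=C_{ij}(A).$$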

2. How is the orthogonal determinant identity used in mathematics?

It is used in linear algebra whenever orthogonal (in particular, rotation) matrices appear: it gives the cofactors of such a matrix for free, since for a proper rotation Cij(A) = aij, and it is closely tied to the fact that the inverse of a rotation is simply its transpose.

3. What is the significance of the orthogonal determinant identity?

It is a useful shortcut: once a matrix is known to be a proper rotation, its adjugate and inverse can be written down entry by entry without computing any minors. Rotation matrices, and hence this identity, are common in physics and engineering applications.

4. Is the orthogonal determinant identity limited to specific types of matrices?

Yes. The identity requires A to be orthogonal; it does not hold for a general square matrix. For an orthogonal matrix the relation (det A)aij = Cij(A) holds whether det(A) is +1 or -1: with det(A) = +1 it reduces to aij = Cij(A), while with det(A) = -1 it gives aij = -Cij(A).
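A concrete example with det(A) = -1 (a reflection in the xy-plane):
$$A=\begin{pmatrix}1&0&0\\0&1&0\\0&0&-1\end{pmatrix},\qquad \det A=-1,\qquad
C(A)=\begin{pmatrix}-1&0&0\\0&-1&0\\0&0&1\end{pmatrix}=(\det A)\,A.$$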

5. Are there any other identities related to the orthogonal determinant identity?

Yes. The closest is the adjugate formula A·adj(A) = det(A)·I, of which this identity is the special case for orthogonal matrices. The Cauchy-Binet formula and Sylvester's determinant identity are other standard determinant results from linear algebra, though they concern products of matrices rather than orthogonal matrices specifically.
