Is the Hermitian Conjugation Identity Correct?

AI Thread Summary
The discussion centers on the Hermitian conjugation identity for vector operators, specifically the equation ##(\hat A \times \hat B)^* = -\hat B^* \times \hat A^*##. Participants analyze the antisymmetry of the Levi-Civita symbol, the handling of free versus dummy indices, and the conditions under which the operators can be interchanged. The conclusion emphasizes that the problem statement does not say the operators are Hermitian, so a proof cannot rely on that assumption. Overall, the thread highlights the nuances of operator algebra and the importance of clarity in mathematical notation.
Dyatlov

Homework Statement


##(\hat A \times \hat B)^*=-\hat B^* \times \hat A^*##
Note that ##*## denotes the dagger (Hermitian conjugation) symbol.

Homework Equations


##(\hat A \times \hat B)_i=-(\hat B \times \hat A)_i+ \epsilon_{ijk} [\hat A_j,\hat B_k]##

The Attempt at a Solution


Using the ##\hat R## and ##\hat P## operators as an example:
##(\hat R \times \hat P)^*_i=-(\hat P \times \hat R)^*_i+ \epsilon_{ijk} [Y,P_z]##
##(\hat R \times \hat P)^*_i=-(\hat P \times \hat R)^*_i##
##(\hat R \times \hat P)^*=-\hat P^* \times \hat R^*##
 
Dyatlov said:

Homework Statement


##(\hat A \times \hat B)^*=-\hat B^* \times \hat A^*##
Note that ##*## signifies the dagger symbol.
Start instead from a component-wise definition:
$$(A \times B)_i ~=~ \epsilon_{ijk} A_j B_k $$ (where the usual summation convention applies to repeated indices).
 
Since ##\epsilon_{ijk}## is antisymmetric then we have
##\epsilon_{ijk}A_jB_k=A_jB_k-A_kB_j##
##A_jB_k-A_kB_j=-(B_jA_k-B_kA_j)##
##(A \times B)_i=-(B \times A)_i##
Since A and B are Hermitian, the same equality holds for their self-adjoint counterparts.
 
Dyatlov said:
Since ##\epsilon_{ijk}## is antisymmetric then we have
##\epsilon_{ijk}A_jB_k=A_jB_k-A_kB_j##
That equation does not make sense. On the LHS, ##i## is a free index, but ##j,k## are dummy summation indices. However, on your RHS both ##j## and ##k## are free indices, and there's no ##i## at all. Both left and right hand sides of such an equation must have exactly the same free indices.

The LHS uses a version of the summation convention. It is short for $$\sum_{j,k} \epsilon_{ijk}A_jB_k $$
##A_jB_k-A_kB_j=-(B_jA_k-B_kA_j)##
This is wrong if ##B_j## and ##A_k## don't commute (which is presumably the case here, since the problem statement didn't specify commutativity). So you can't blithely interchange ##A## and ##B## like that.
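A quick way to see this concretely: with random matrices standing in for the components of ##A## and ##B## (my own sanity-check sketch, not part of the assignment), the naive swap ##(A \times B)_i = -(B \times A)_i## fails:

```python
import numpy as np

# Sketch: build (X x Y)_i = eps_ijk X_j Y_k from three matrices per "vector
# operator" and check that (A x B)_i = -(B x A)_i fails when the components
# do not commute. Random matrices stand in for generic non-commuting operators.
rng = np.random.default_rng(0)
dim = 4
rand_op = lambda: rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
A = [rand_op() for _ in range(3)]
B = [rand_op() for _ in range(3)]

# Levi-Civita symbol eps[i, j, k]
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[i, k, j] = 1.0, -1.0

def cross(X, Y):
    """(X x Y)_i = eps_ijk X_j Y_k, summed over j and k."""
    return [sum(eps[i, j, k] * (X[j] @ Y[k]) for j in range(3) for k in range(3))
            for i in range(3)]

AxB, BxA = cross(A, B), cross(B, A)
# For generic non-commuting components this residual is NOT small:
print(max(np.linalg.norm(AxB[i] + BxA[i]) for i in range(3)))
```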

##(A \times B)_i=-(B \times A)_i##
Since A and B are Hermitian
Your original problem statement doesn't say that ##A,B## are Hermitian.

the same equality holds for their self-adjoint counterparts.
But that doesn't solve the problem as stated.

Start with this: $$\left( \sum_{j,k} \epsilon_{ijk}A_jB_k \right)^\dagger ~=~ \dots\,? \dots $$ Hint: for arbitrary operators ##X,Y##, what is ##(XY)^\dagger## ?
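For what it's worth, here is a finite-dimensional sanity check of that hint, with random matrices standing in for ##X## and ##Y## (just a sketch):

```python
import numpy as np

# Sketch: check (XY)^dagger = Y^dagger X^dagger for arbitrary (random) matrices.
rng = np.random.default_rng(1)
X = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
Y = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))

lhs = (X @ Y).conj().T           # (XY)^dagger
rhs = Y.conj().T @ X.conj().T    # Y^dagger X^dagger
print(np.allclose(lhs, rhs))     # True
```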
 
Thanks for the replies.
The title mentions that I am solving the identity for Hermitian operators.
I know that ##\epsilon_{ijk}A_jB_k## is a sum over j and k, with ##j,k=1,2,3##.
##(\epsilon_{ijk}A_jB_k)^\dagger=\epsilon_{ijk}B^\dagger_kA^\dagger_j=\epsilon_{ikj}B^\dagger_jA^\dagger_k=-\epsilon_{ijk}B^\dagger_jA^\dagger_k=-(B^\dagger \times A^\dagger)_i##
(relabelling the dummy indices ##j \leftrightarrow k## and then using the antisymmetry of ##\epsilon_{ijk}##).
Therefore:
##(A \times B)^\dagger_i=-(B^\dagger \times A^\dagger)_i##
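A quick numerical sanity check of this result (a rough sketch with random matrices standing in for the components -- no Hermiticity or commutativity assumed):

```python
import numpy as np

# Sketch: finite-dimensional check of (A x B)_i^dagger = -(B^dagger x A^dagger)_i,
# with random matrices standing in for the operator components.
rng = np.random.default_rng(2)
dim = 4
rand_op = lambda: rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
A = [rand_op() for _ in range(3)]
B = [rand_op() for _ in range(3)]

# Levi-Civita symbol eps[i, j, k]
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[i, k, j] = 1.0, -1.0

def cross(X, Y):
    # (X x Y)_i = eps_ijk X_j Y_k
    return [sum(eps[i, j, k] * (X[j] @ Y[k]) for j in range(3) for k in range(3))
            for i in range(3)]

dagger = lambda M: M.conj().T
AxB = cross(A, B)
Bd_x_Ad = cross([dagger(b) for b in B], [dagger(a) for a in A])

print(all(np.allclose(dagger(AxB[i]), -Bd_x_Ad[i]) for i in range(3)))  # True
```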
 
Dyatlov said:
The title mentions that I am solving the identity for Hermitian operators.
No it doesn't -- it mentions "Hermitian conjugation", which is an operation that can be performed on any operator.

Anyway, I take it you're now happy with the solution.
 
Bad wording I guess then.
Thanks for the help, anyway!
 