How can the dual tensors derivation be achieved using rotation matrices?

In summary, the dual-tensor identity (p. 192) is obtained by starting from the determinant relation for ##R \in SO(N)## and contracting with factors of ##R## to peel them off one at a time.
  • #1
euphoricrhino
Hello,
I'm reading Group Theory in a Nutshell for Physicists by A. Zee. When he introduces dual tensors (p. 192), he makes a claim with only a light hint, and I have had great trouble deriving it. Any help would be appreciated.

Let ##R \in SO(N)## be an ##N##-dimensional rotation, then the following is true
$$
\epsilon^{ijk\cdots n}R^{ip}R^{jq}=\epsilon^{pqr\cdots s}R^{kr}\cdots R^{ns}
$$
(where ##\epsilon## is the antisymmetric symbol and the repeated-index summation convention is used).
The hint was to use the ##N\times N## matrix determinant identity
$$
\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots R^{ns}=\epsilon^{pqr\cdots s} \mbox{det}R=\epsilon^{pqr\cdots s} \quad(\mbox{since }R\mbox{ is special})
$$
and multiply it "by a bunch of ##R^T##s carrying appropriate indices".

I have tried to understand the claim for ##N=3##, where I believe it reduces to the cross-product relation, but I couldn't see how that follows by involving ##R^T##, nor how it extends to ##N## dimensions.
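For reference, here is a quick numerical sanity check of the ##N=3## claim (a sketch of my own using numpy; the random-rotation-via-QR trick and everything else in it are my additions, not from the book):

```python
import numpy as np

# Levi-Civita symbol for N = 3, from the parities of the six permutations.
eps = np.zeros((3, 3, 3))
for (i, j, k), sign in [((0, 1, 2), 1), ((1, 2, 0), 1), ((2, 0, 1), 1),
                        ((0, 2, 1), -1), ((2, 1, 0), -1), ((1, 0, 2), -1)]:
    eps[i, j, k] = sign

# Random special orthogonal matrix: orthogonalize a Gaussian matrix via QR,
# then flip the overall sign if needed (for odd N this flips the determinant).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R = Q if np.linalg.det(Q) > 0 else -Q

# Claimed identity: eps^{ijk} R^{ip} R^{jq} = eps^{pqr} R^{kr}  (free p, q, k).
lhs = np.einsum('ijk,ip,jq->pqk', eps, R, R)
rhs = np.einsum('pqr,kr->pqk', eps, R)
print(np.allclose(lhs, rhs))  # True
```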

Thanks for the help!
 
  • #2
I don't have Zee's book, but does he discuss

##\epsilon^{ijk\cdots n}R^{ip}R^{jq}\cdots R^{ns} = \det(R)\epsilon^{pqr\cdots s}##

which I think follows from the definition of the determinant. Now, for ##SO(N)## one has ##\det(R) = 1##. From this, one applies the group relation ##R^{ij}R^{ik} = \delta^{jk}## to both sides, once for each ##R## to be moved, and you have it. Hope this helps.
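For what it's worth, both relations can be checked numerically (a minimal sketch of my own, assuming numpy; not from the book):

```python
import numpy as np

# Levi-Civita symbol for N = 3.
eps = np.zeros((3, 3, 3))
for (i, j, k), sign in [((0, 1, 2), 1), ((1, 2, 0), 1), ((2, 0, 1), 1),
                        ((0, 2, 1), -1), ((2, 1, 0), -1), ((1, 0, 2), -1)]:
    eps[i, j, k] = sign

# Random R in SO(3) via QR orthogonalization.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R = Q if np.linalg.det(Q) > 0 else -Q

# Determinant identity: eps^{ijk} R^{ip} R^{jq} R^{kr} = det(R) eps^{pqr}.
lhs = np.einsum('ijk,ip,jq,kr->pqr', eps, R, R, R)
print(np.allclose(lhs, np.linalg.det(R) * eps))  # True

# Group relation: R^{ij} R^{ik} = delta^{jk}, i.e. the columns are orthonormal.
print(np.allclose(np.einsum('ij,ik->jk', R, R), np.eye(3)))  # True
```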
 
  • #3
Thanks for the reply!

However, I must be missing something really obvious: I don't see how to "apply the group relation ##R^{ij}R^{ik}=\delta^{jk}## to both sides" of the determinant equality.

The LHS of the determinant relation is a sum of ##N!## terms, each of which is a product whose factors share no index. Could you kindly elaborate for the ##N=3## case?

From determinant equality
$$
\epsilon^{ijk}R^{ip}R^{jq}R^{kr}=\epsilon^{pqr}
$$
where ##(pqr)## is a given permutation, how does one derive (for any given ##p,q,k##)
$$
\epsilon^{ijk}R^{ip}R^{jq}=\epsilon^{pqr}R^{kr}
$$

Thank you very much!
 
  • #4
In matrix form

##R^TR = RR^T = I##

so one also has the relation

##R^{ip}R^{kp} = \delta^{ik}##
 
  • #5
I finally figured it out. It's actually quite simple, but all the symbols were distracting.

The determinant relation
$$
\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots R^{ns}=\epsilon^{pqr\cdots s}
$$
can be viewed as an inner product relation
$$
v^nR^{ns}=\epsilon^{pqr\cdots s}
$$
where ##v^n## is defined by
$$
v^n=\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots
$$
Since the columns ##R^{\cdot s}## form an orthonormal basis of the ##N##-dimensional space, the inner-product relation above gives the decomposition of the vector ##v## in this basis, i.e.
$$
v=\epsilon^{pqr\cdots s}R^{\cdot s}
$$
Taking the ##n##-th component of this yields
$$
\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots=v^n=\epsilon^{pqr\cdots s}R^{ns}
$$

Now we just repeat the same argument on the remaining ##R##s on the left until only ##R^{ip}## and ##R^{jq}## remain.
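To illustrate that peeling step numerically for ##N=3## (a sketch of my own in numpy, not from the book): contracting the determinant identity with one factor of ##R## strips the corresponding ##R## off the left-hand side.

```python
import numpy as np

# Levi-Civita symbol for N = 3.
eps = np.zeros((3, 3, 3))
for (i, j, k), sign in [((0, 1, 2), 1), ((1, 2, 0), 1), ((2, 0, 1), 1),
                        ((0, 2, 1), -1), ((2, 1, 0), -1), ((1, 0, 2), -1)]:
    eps[i, j, k] = sign

# Random R in SO(3).
rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R = Q if np.linalg.det(Q) > 0 else -Q

# Determinant identity: D^{pqr} = eps^{ijk} R^{ip} R^{jq} R^{kr} = eps^{pqr}.
D = np.einsum('ijk,ip,jq,kr->pqr', eps, R, R, R)
print(np.allclose(D, eps))  # True, since det R = +1

# Contract with R^{kr} over r: since R^{mr} R^{kr} = delta^{mk}, this peels
# the last R off the left-hand side, leaving eps^{ijk} R^{ip} R^{jq}.
peeled = np.einsum('pqr,kr->pqk', D, R)
direct = np.einsum('ijk,ip,jq->pqk', eps, R, R)
print(np.allclose(peeled, direct))  # True
```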
 
  • #6
I have been struggling with this question for days and came across this thread. I don't understand euphoricrhino's final solution, so I worked out my own derivation, which I post here in the hope that it helps anyone who Googles their way here in the future.

The key obstacle is that Zee doesn't explicitly specify which indices are summed over. That's Einstein's fault anyway; his summation convention hides it all. :smile:

What we have and want to prove are:

$$\begin{aligned}
\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots R^{ns} &= \epsilon^{pqr\cdots s} \\
\epsilon^{ijk\cdots n}R^{ip}R^{jq} &= \epsilon^{pqr\cdots s}R^{kr}\cdots R^{ns}
\end{aligned}$$

Let's make the summation operation explicit:

$$\begin{aligned}
\sum_{ijk\cdots n}\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots R^{ns} &= \epsilon^{pqr\cdots s} \\
\sum_{ij}\epsilon^{ijk\cdots n}R^{ip}R^{jq} &= \sum_{r\cdots s}\epsilon^{pqr\cdots s}R^{kr}\cdots R^{ns}
\end{aligned}$$

(You may stop here and finish the derivation on your own if you like.)

We introduce fixed indices ##k^{\prime},\cdots,n^{\prime}## and multiply both sides by the same factors, like this:

$$\sum_{ijk\cdots n}\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots R^{ns}\cdot R^{k^{\prime}r}\cdots R^{n^{\prime}s}=\epsilon^{pqr\cdots s}\cdot R^{k^{\prime}r}\cdots R^{n^{\prime}s}$$

Summing both sides over the indices ##r,\cdots,s##, the left-hand side becomes:

$$\begin{aligned}
& \sum_{r\cdots s}\sum_{ijk\cdots n}\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots R^{ns}\cdot R^{k^{\prime}r}\cdots R^{n^{\prime}s} \\
=& \sum_{ij}R^{ip}R^{jq}\sum_{k\cdots}\sum_{r\cdots}R^{kr}\cdots R^{k^{\prime}r}\cdots\sum_{n}\epsilon^{ijk\cdots n}\sum_{s}R^{ns}R^{n^{\prime}s} \\
=& \sum_{ij}R^{ip}R^{jq}\sum_{k\cdots}\sum_{r\cdots}R^{kr}\cdots R^{k^{\prime}r}\cdots\sum_{n}\delta^{nn^{\prime}}\epsilon^{ijk\cdots n} \\
=& \sum_{ij}R^{ip}R^{jq}\sum_{k\cdots}\sum_{r\cdots}\epsilon^{ijk\cdots n^{\prime}}R^{kr}\cdots R^{k^{\prime}r}\cdots \\
=& \cdots \\
=& \sum_{ij}\epsilon^{ijk^{\prime}\cdots n^{\prime}}R^{ip}R^{jq}
\end{aligned}$$

Renaming ##k^{\prime},\cdots,n^{\prime}## back to ##k,\cdots,n## and dropping the explicit summations, we finally obtain the desired equation:

$$\begin{aligned}
\sum_{ij}\epsilon^{ijk^{\prime}\cdots n^{\prime}}R^{ip}R^{jq} &= \sum_{r\cdots s}\epsilon^{pqr\cdots s}R^{k^{\prime}r}\cdots R^{n^{\prime}s} \\
\sum_{ij}\epsilon^{ijk\cdots n}R^{ip}R^{jq} &= \sum_{r\cdots s}\epsilon^{pqr\cdots s}R^{kr}\cdots R^{ns} \\
\epsilon^{ijk\cdots n}R^{ip}R^{jq} &= \epsilon^{pqr\cdots s}R^{kr}\cdots R^{ns}
\end{aligned}$$
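As a final check, here is the identity verified numerically beyond three dimensions, for ##N=4## (my own sketch; the `levi_civita` helper and the numpy setup are assumptions, not anything from the thread):

```python
from itertools import permutations

import numpy as np

def levi_civita(n):
    """N-index Levi-Civita symbol: +-1 on permutations of (0..n-1), else 0."""
    eps = np.zeros((n,) * n)
    for perm in permutations(range(n)):
        # Parity from the number of inversions in the permutation.
        inversions = sum(perm[a] > perm[b]
                         for a in range(n) for b in range(a + 1, n))
        eps[perm] = (-1) ** inversions
    return eps

N = 4
eps = levi_civita(N)

# Random R in SO(N): QR-orthogonalize, then flip one column if det = -1.
rng = np.random.default_rng(3)
R, _ = np.linalg.qr(rng.standard_normal((N, N)))
if np.linalg.det(R) < 0:
    R[:, 0] *= -1

# eps^{ijkl} R^{ip} R^{jq} = eps^{pqrs} R^{kr} R^{ls}  (free p, q, k, l).
lhs = np.einsum('ijkl,ip,jq->pqkl', eps, R, R)
rhs = np.einsum('pqrs,kr,ls->pqkl', eps, R, R)
print(np.allclose(lhs, rhs))  # True
```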
 

FAQ: How can the dual tensors derivation be achieved using rotation matrices?

What is a dual tensor?

In this context, the dual of an antisymmetric tensor is the tensor obtained by contracting it with the totally antisymmetric symbol ##\epsilon^{ijk\cdots n}##. In three dimensions, for example, the dual of an antisymmetric two-index tensor is a vector.

What are rotation matrices?

Rotation matrices are orthogonal matrices with determinant ##+1##; the ##N\times N## ones form the group ##SO(N)##. Acting on a vector, they rotate it about the origin while preserving lengths and angles.

How can the dual tensors derivation be achieved using rotation matrices?

One starts from the determinant identity ##\epsilon^{ijk\cdots n}R^{ip}R^{jq}\cdots R^{ns}=\det(R)\,\epsilon^{pqr\cdots s}##, uses ##\det R=1## for ##R\in SO(N)##, and then contracts both sides with factors of ##R##, using the orthogonality relation ##R^{kr}R^{k^{\prime}r}=\delta^{kk^{\prime}}## to move each ##R## from one side to the other, as in the derivations above.

What are the benefits of using rotation matrices in the derivation of dual tensors?

Working directly with rotation matrices reduces the claim to two familiar facts, the determinant expansion and orthogonality, so the derivation becomes a sequence of routine index contractions rather than a case-by-case combinatorial argument.

Are there any limitations to using rotation matrices in the derivation of dual tensors?

Not really: the derivation works in any dimension ##N##, since ##SO(N)## is defined for all ##N##. The main practical limitation is bookkeeping, as the number of indices on ##\epsilon## grows with ##N##.
