Homework Statement
Imagine we have a tensor ##X^{\mu\nu}## and a vector ##V^{\mu}##, with components
##
X^{\mu\nu}=\left( \begin{array}{cccc}
2 & 0 & 1 & -1 \\
-1 & 0 & 3 & 2 \\
-1 & 1 & 0 & 0 \\
-2 & 1 & 1 & -2 \end{array} \right), \qquad V^{\mu} = (-1,2,0,-2).
##
Find the components of:
(a) ##{X^{\mu}}_{\nu}##
(b) ##{X_{\mu}}^{\nu}##
(c) ##X^{(\mu\nu)}##
(d) ##X_{[\mu\nu]}##
(e) ##{X^{\lambda}}_{\lambda}##
(f) ##V^{\mu}V_{\mu}##
(g) ##V_{\mu}X^{\mu\nu}##
Homework Equations
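The metric and index conventions I'm using (mostly-plus signature):
##
\eta_{\mu\nu}=\mathrm{diag}(-1,1,1,1), \qquad {X^{\mu}}_{\nu}=X^{\mu\rho}\eta_{\rho\nu}, \qquad {X_{\mu}}^{\nu}=\eta_{\mu\rho}X^{\rho\nu}, \qquad V_{\mu}=\eta_{\mu\nu}V^{\nu},
##
##
X^{(\mu\nu)}=\frac{1}{2}(X^{\mu\nu}+X^{\nu\mu}), \qquad X_{[\mu\nu]}=\frac{1}{2}(X_{\mu\nu}-X_{\nu\mu}).
##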
The Attempt at a Solution
(a) ##{X^{\mu}}_{\nu}=X^{\mu\rho}\eta_{\rho\nu}=\left( \begin{array}{cccc}
2 & 0 & 1 & -1 \\
-1 & 0 & 3 & 2 \\
-1 & 1 & 0 & 0 \\
-2 & 1 & 1 & -2 \end{array} \right)
\left( \begin{array}{cccc}
-1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1 \end{array} \right)=\left( \begin{array}{cccc}
-2 & 0 & 1 & -1 \\
1 & 0 & 3 & 2 \\
1 & 1 & 0 & 0 \\
2 & 1 & 1 & -2 \end{array} \right)
##,
where the summation is over the second index of ##X^{\mu\rho}## and the first index of ##\eta_{\rho\nu}##, so the contraction is just the ordinary matrix product ##X\eta## (rows of the left matrix times columns of the right matrix).
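As a sanity check: with ##\eta=\mathrm{diag}(-1,1,1,1)##, lowering the second index should simply flip the sign of the first column of ##X^{\mu\nu}##, e.g. ##{X^{1}}_{0}=X^{1\rho}\eta_{\rho 0}=-X^{10}=1##, which agrees with the matrix above.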
(b) ##{X_{\mu}}^{\nu}=\eta_{\mu\rho}X^{\rho\nu}=
\left( \begin{array}{cccc}
-1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1 \end{array} \right)
\left( \begin{array}{cccc}
2 & 0 & 1 & -1 \\
-1 & 0 & 3 & 2 \\
-1 & 1 & 0 & 0 \\
-2 & 1 & 1 & -2 \end{array} \right)
=\left( \begin{array}{cccc}
-2 & 0 & -1 & 1 \\
-1 & 0 & 3 & 2 \\
-1 & 1 & 0 & 0 \\
-2 & 1 & 1 & -2 \end{array} \right)
##,
where the summation is over the second index of ##\eta_{\mu\rho}## and the first index of ##X^{\rho\nu}##, so the contraction is just the ordinary matrix product ##\eta X##.
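As a sanity check: lowering the first index should simply flip the sign of the first row of ##X^{\mu\nu}##, e.g. ##{X_{0}}^{2}=\eta_{0\rho}X^{\rho 2}=-X^{02}=-1##, which agrees with the first row above.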
(c) ##X^{(\mu\nu)}=\frac{1}{2}(X^{\mu\nu}+X^{\nu\mu})=\frac{1}{2}\Bigg[\left( \begin{array}{cccc}
2 & 0 & 1 & -1 \\
-1 & 0 & 3 & 2 \\
-1 & 1 & 0 & 0 \\
-2 & 1 & 1 & -2 \end{array} \right)+\left( \begin{array}{cccc}
2 & -1 & -1 & -2 \\
0 & 0 & 1 & 1 \\
1 & 3 & 0 & 1 \\
-1 & 2 & 0 & -2 \end{array} \right)
\Bigg]=\left( \begin{array}{cccc}
2 & -0.5 & 0 & -1.5 \\
-0.5 & 0 & 2 & 1.5 \\
0 & 2 & 0 & 0.5 \\
-1.5 & 1.5 & 0.5 & -2 \end{array} \right)
##
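As a check, this matrix is symmetric, and e.g. ##X^{(01)}=\frac{1}{2}(X^{01}+X^{10})=\frac{1}{2}(0-1)=-0.5##, which matches the entry above.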
(d) ##X_{[\mu\nu]}=\frac{1}{2}(X_{\mu\nu}-X_{\nu\mu})=\frac{1}{2}(\eta_{\mu\rho}X^{\rho\sigma}\eta_{\sigma\nu}-\eta_{\nu\sigma}X^{\sigma\rho}\eta_{\rho\mu})##
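My tentative attempt at the numbers (I'm not confident about this part): lowering both indices should flip the sign of the first row and of the first column of ##X^{\mu\nu}##, so the 00 entry is flipped twice and stays the same,
##
X_{\mu\nu}=\eta_{\mu\rho}X^{\rho\sigma}\eta_{\sigma\nu}=\left( \begin{array}{cccc}
2 & 0 & -1 & 1 \\
1 & 0 & 3 & 2 \\
1 & 1 & 0 & 0 \\
2 & 1 & 1 & -2 \end{array} \right),
##
and antisymmetrizing this gives
##
X_{[\mu\nu]}=\frac{1}{2}(X_{\mu\nu}-X_{\nu\mu})=\left( \begin{array}{cccc}
0 & -0.5 & -1 & -0.5 \\
0.5 & 0 & 1 & 0.5 \\
1 & -1 & 0 & -0.5 \\
0.5 & -0.5 & 0.5 & 0 \end{array} \right).
##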
Are my answers to (a), (b) and (c) correct?
With part (d), I'm not sure whether I should use the original matrix for ##X^{\rho\sigma}## or its transpose. Does it make a difference anyway?
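As a quick numerical cross-check of (a)-(d), here is a short NumPy sketch of the same computations (the variable names are mine, and I'm just treating ##\mu,\nu=0,1,2,3## as ordinary array indices):

import numpy as np

eta = np.diag([-1, 1, 1, 1])              # mostly-plus Minkowski metric
X = np.array([[ 2, 0, 1, -1],
              [-1, 0, 3,  2],
              [-1, 1, 0,  0],
              [-2, 1, 1, -2]])

print(X @ eta)                            # (a) lower the second index
print(eta @ X)                            # (b) lower the first index
print((X + X.T) / 2)                      # (c) symmetric part of X^{mu nu}
X_low = eta @ X @ eta                     # X_{mu nu}: both indices lowered
print((X_low - X_low.T) / 2)              # (d) antisymmetric part of X_{mu nu}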