Tensors: raising and lowering indices

  • #1
Irid
From Carroll's textbook:

1. The problem statement
Imagine we have a tensor [tex]X^{\mu \nu}[/tex] with components

[tex]X^{\mu \nu} = \begin{pmatrix}
2 & 0 & 1 & -1\\
-1 & 0 & 3 & 2\\
-1 & 1 & 0 & 0\\
-2 & 1 & 1 & -2
\end{pmatrix}[/tex]

Find the components of: (a) [tex]{X^\mu}_\nu[/tex]; (b) [tex]{X_\mu}^\nu[/tex].

2. The attempt at a solution
I don't really understand what I'm doing here, just following examples from the textbook. First, temporarily rename

[tex]X^{\mu \nu} \rightarrow X^{\mu \sigma}[/tex]

Then, lower an index:

[tex]{X^\mu}_\nu = \eta_{\nu \sigma} X^{\mu \sigma}[/tex]

where [tex]\eta_{\nu \sigma}[/tex] is the metric. Multiplying the matrices:

[tex] {X^\mu}_\nu = \begin{pmatrix}
-1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{pmatrix}\begin{pmatrix}
2 & 0 & 1 & -1\\
-1 & 0 & 3 & 2\\
-1 & 1 & 0 & 0\\
-2 & 1 & 1 & -2
\end{pmatrix}=\begin{pmatrix}
-2 & 0 & 1 & -1\\
1 & 0 & 3 & 2\\
1 & 1 & 0 & 0\\
2 & 1 & 1 & -2
\end{pmatrix}[/tex]

Then, for part (b), rename

[tex]X^{\mu \nu} \rightarrow X^{\sigma \nu}[/tex]

and lower another index:

[tex]{X_\mu}^\nu = \eta_{\mu \sigma} X^{\sigma \nu}[/tex]

However, in matrix notation it's the same procedure, so the answer for (b) is

[tex]{X_\mu}^\nu = \begin{pmatrix}
-2 & 0 & 1 & -1\\
1 & 0 & 3 & 2\\
1 & 1 & 0 & 0\\
2 & 1 & 1 & -2
\end{pmatrix}[/tex]

3. Questions
If the matrices are the same, what is the difference between these two tensors: (a) [tex]{X^\mu}_\nu[/tex]; (b) [tex]{X_\mu}^\nu[/tex]?
As I understand it, I had a (2,0) tensor to begin with and multiplied it by the (0,2) metric to obtain two (1,1) tensors. What is that good for? And what is the difference in the rank of the tensors, if the matrices look similar and only a few components change?
 
  • #2
Irid said:
[tex] {X^\mu}_\nu = \begin{pmatrix}
-1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{pmatrix}\begin{pmatrix}
2 & 0 & 1 & -1\\
-1 & 0 & 3 & 2\\
-1 & 1 & 0 & 0\\
-2 & 1 & 1 & -2
\end{pmatrix}=\begin{pmatrix}
-2 & 0 & 1 & -1\\
1 & 0 & 3 & 2\\
1 & 1 & 0 & 0\\
2 & 1 & 1 & -2
\end{pmatrix}[/tex]
This is the correct multiplication to perform. However, you have not performed it correctly.

Raising and lowering of indices is useful for balancing equations. For an equation to hold, the indices that are not contracted must agree on both sides of the equation. Sometimes it will be necessary to swap an upper and lower index in order to get this to happen. In special relativity, the metric [tex]\eta[/tex] used to raise and lower is somewhat boring. However, in general relativity a more interesting metric, usually denoted [tex]g[/tex], is used. [tex]\eta[/tex] is a special case of this metric [tex]g[/tex]. When you start working with it, the practice you get by using the simpler [tex]\eta[/tex] will come in handy.
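To make the balancing concrete: a schematic equation like [tex]A^\mu = B^{\mu \nu} C_\nu[/tex] balances, since [tex]\nu[/tex] is contracted away and the free index [tex]\mu[/tex] sits upstairs on both sides. By contrast, [tex]A^\mu = B_{\mu \nu} C^\nu[/tex] does not balance until the first index on [tex]B[/tex] is raised.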
 
  • #3
Okay, you're just making a small mistake when you're turning index notation into matrix notation.

You're right that [tex]X^{\mu}{}_{\nu}=\eta_{\nu \sigma}X^{\mu \sigma}[/tex]

Now since we're writing this in terms of indices, it doesn't matter which order you write [tex]\eta_{\nu \sigma}[/tex] and [tex]X^{\mu \sigma}[/tex] in (as long as you write in the indices, of course).

So you can write [tex]X^{\mu}{}_{\nu}=\eta_{\nu \sigma}X^{\mu \sigma}=X^{\mu \sigma}\eta_{\nu \sigma}=X^{\mu \sigma}\eta_{\sigma \nu}[/tex]
(where the last step follows because [tex]\eta_{\nu \sigma}[/tex] is symmetric)

And this is matrix multiplication in the order [tex]X^{\mu}{}_{\nu}=[X][\eta][/tex]
since you sum over columns in [tex][X][/tex] and rows in [tex][\eta][/tex] (it may help to write out the product explicitly to see this).
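Written out, with the first index labeling rows and the second labeling columns:

[tex]\left( [X][\eta] \right)_{\mu \nu} = \sum_{\sigma} [X]_{\mu \sigma} [\eta]_{\sigma \nu} = X^{\mu \sigma} \eta_{\sigma \nu} = X^{\mu}{}_{\nu}[/tex]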

Now, the other one you did correctly, since [tex]X_{\mu}{}^{\nu}=\eta_{\mu \sigma}X^{\sigma \nu}=[\eta][X][/tex]

So, the matrices should not be the same (as you probably expected). The difference is in what basis the components are with respect to. [tex]X^{\mu}{}_{\nu}[/tex] are the components of a (1,1) tensor in the [tex]\hat{e}\otimes\hat{\theta}[/tex] basis (to use Carroll's notation - pg. 21), and [tex]X_{\mu}{}^{\nu}[/tex] are the components of the same (1,1) tensor in the [tex]\hat{\theta}\otimes\hat{e}[/tex] basis. Note these are the same tensor expressed in different bases. You can also make a (0,2) tensor by lowering both indices of [tex]X^{\mu \sigma}[/tex]. The (0,2), (1,1), and original (2,0) tensors are all different objects labeled by the same letter - it's the index placement that differentiates them.

edit: hah, it took me too long to type this - you beat me jimmy!
 
  • #4
OK, so I now see that (a):

[tex]{X^\mu}_\nu = \eta_{\nu \sigma} X^{\mu \sigma} = [X][\eta] = \begin{pmatrix}
2 & 0 & 1 & -1\\
-1 & 0 & 3 & 2\\
-1 & 1 & 0 & 0\\
-2 & 1 & 1 & -2
\end{pmatrix}\begin{pmatrix}
-1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{pmatrix}=\begin{pmatrix}
-2 & 0 & 1 & -1\\
1 & 0 & 3 & 2\\
1 & 1 & 0 & 0\\
2 & 1 & 1 & -2
\end{pmatrix}
[/tex]

and (b):

[tex]{X_\mu}^\nu = \eta_{\mu \sigma} X^{\sigma \nu} = [\eta][X] = \begin{pmatrix}
-1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{pmatrix}\begin{pmatrix}
2 & 0 & 1 & -1\\
-1 & 0 & 3 & 2\\
-1 & 1 & 0 & 0\\
-2 & 1 & 1 & -2
\end{pmatrix}=\begin{pmatrix}
-2 & 0 & -1 & 1\\
-1 & 0 & 3 & 2\\
-1 & 1 & 0 & 0\\
-2 & 1 & 1 & -2
\end{pmatrix}
[/tex]
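As a quick numerical check of both results (a sketch using numpy, where einsum sums over the repeated index label and [tex]\eta = \mathrm{diag}(-1,1,1,1)[/tex]):

[code]
import numpy as np

# Minkowski metric, signature (-,+,+,+)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

# Components of the (2,0) tensor X^{mu nu}
X = np.array([[ 2.0, 0.0, 1.0, -1.0],
              [-1.0, 0.0, 3.0,  2.0],
              [-1.0, 1.0, 0.0,  0.0],
              [-2.0, 1.0, 1.0, -2.0]])

# (a) X^mu_nu = eta_{nu sigma} X^{mu sigma}: negates the first column
X_a = np.einsum('ns,ms->mn', eta, X)
assert np.allclose(X_a, X @ eta)  # same as the matrix product [X][eta]

# (b) X_mu^nu = eta_{mu sigma} X^{sigma nu}: negates the first row
X_b = np.einsum('ms,sn->mn', eta, X)
assert np.allclose(X_b, eta @ X)  # same as the matrix product [eta][X]
[/code]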

Thus, the order of the matrices in the multiplication is determined by the position of the index being raised (or lowered). Suppose I wanted to raise an index instead; then

[tex]{X^\mu}_\nu = \eta^{\mu \sigma} X_{\sigma \nu} = [\eta] [X]\, ? [/tex]

Also, why is the metric such a special tensor that it's used to lower and raise these indices?
 
  • #5
Irid said:
Also, why is the metric such a special tensor that it's used to lower and raise these indices?
Given [tex]A^{\mu}[/tex], by definition [tex]A_0 = -A^0[/tex] and [tex]A_i = A^i[/tex] (in the [tex](-,+,+,+)[/tex] signature used here). [tex]\eta[/tex] is the matrix that accomplishes this transformation.
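Written out: since [tex]\eta = \mathrm{diag}(-1,1,1,1)[/tex] is diagonal, each component simply picks up the corresponding diagonal entry:

[tex]A_\mu = \eta_{\mu \nu} A^\nu \quad \Rightarrow \quad A_0 = \eta_{00} A^0 = -A^0, \qquad A_i = A^i \quad (i = 1,2,3)[/tex]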
 
  • #6
Starts getting clearer...

OK, what if the index being raised is in the middle, such as

[tex]{{X_{\mu}}^\nu}_\rho = \eta^{\nu \sigma} X_{\mu \sigma \rho} = [\eta][X] \quad \text{OR} \quad [X][\eta][/tex]

or is this operation simply not allowed?
 
  • #7
Another question: how do I symmetrize [tex]X^{(\mu \nu)}[/tex], given [tex]X^{\mu \nu}[/tex]? I tried to lower both indices and then raise them back in a different order, but eventually the metric and inverse metric just multiply out and I obtain [tex]X^{\mu \nu} = X^{\nu \mu}[/tex]. I don't know how to obtain [tex]X^{\nu \mu}[/tex].
 
  • #8
Irid said:
Starts getting clearer...

OK, what if the index being raised is in the middle, such as

[tex]{{X_{\mu}}^\nu}_\rho = \eta^{\nu \sigma} X_{\mu \sigma \rho} = [\eta][X] \quad \text{OR} \quad [X][\eta][/tex]

or is this operation simply not allowed?

It's allowed. You just can't write it as a single matrix multiplication, since [tex]X[/tex] isn't a matrix here. But if you know all of the components of [tex]X[/tex] and [tex]\eta[/tex], you wouldn't have any trouble computing any component of the raised-index tensor, would you?
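For instance, a sketch in numpy (the rank-3 components here are made-up illustrative numbers; einsum contracts over the middle slot directly):

[code]
import numpy as np

# Inverse Minkowski metric eta^{mu nu}; numerically the same matrix as eta_{mu nu}
eta_inv = np.diag([-1.0, 1.0, 1.0, 1.0])

# Made-up components for a (0,3) tensor X_{mu sigma rho}
rng = np.random.default_rng(0)
X3 = rng.integers(-2, 3, size=(4, 4, 4)).astype(float)

# X_mu^nu_rho = eta^{nu sigma} X_{mu sigma rho}: contract over the middle index
X_mid = np.einsum('ns,msr->mnr', eta_inv, X3)

# Componentwise this is just X_mid[m, n, r] = sum_s eta_inv[n, s] * X3[m, s, r];
# no single two-dimensional matrix product reproduces it.
[/code]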
 
  • #9
Irid said:
Another question: how do I symmetrize [tex]X^{(\mu \nu)}[/tex], given [tex]X^{\mu \nu}[/tex]? I tried to lower both indices and then raise them back in a different order, but eventually the metric and inverse metric just multiply out and I obtain [tex]X^{\mu \nu} = X^{\nu \mu}[/tex]. I don't know how to obtain [tex]X^{\nu \mu}[/tex].

It's [itex](X^{\mu \nu} + X^{\nu \mu})/2[/itex]. There's nothing to 'compute' until you have more information about X. E.g. [itex]X^{(01)}=(X^{01} + X^{10})/2[/itex].
 
  • #10
Dick said:
It's [itex](X^{\mu \nu} + X^{\nu \mu})/2[/itex]. There's nothing to 'compute' until you have more information about X. E.g. [itex]X^{(01)}=(X^{01} + X^{10})/2[/itex].

Well, I know the matrix representation of X. Isn't that enough?
 
  • #11
Irid said:
Well, I know the matrix representation of X. Isn't that enough?

Then just add the matrix of X and the transpose of the matrix of X and divide by 2.
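A minimal numpy check, using the matrix from the original post:

[code]
import numpy as np

X = np.array([[ 2.0, 0.0, 1.0, -1.0],
              [-1.0, 0.0, 3.0,  2.0],
              [-1.0, 1.0, 0.0,  0.0],
              [-2.0, 1.0, 1.0, -2.0]])

# Symmetric part: X^{(mu nu)} = (X^{mu nu} + X^{nu mu}) / 2
X_sym = (X + X.T) / 2

# e.g. X^{(01)} = (X^{01} + X^{10}) / 2 = (0 + (-1)) / 2 = -0.5
assert np.isclose(X_sym[0, 1], -0.5)
[/code]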
 
  • #12
Yeah, I suspected that

[tex]\left[ X^{\nu \mu} \right] = \left[ X^{\mu \nu} \right]^T[/tex]

but you can never be sure... OK, enough help to solve the problem, I hope I'll understand more as I learn about the subject further. I'm sure I'll post some more newbie questions shortly :))
 

FAQ: Tensors: raising and lowering indices

1. What are tensors in general?

Tensors are mathematical objects that describe geometric and physical quantities. They have components that depend on the chosen basis of a vector space and transform in a specific way under coordinate transformations.

2. What does it mean to raise an index on a tensor?

Raising an index on a tensor means contracting that index with the inverse metric tensor. This converts a covariant (lower) index into a contravariant (upper) one, producing a tensor of a different type with correspondingly different transformation properties.

3. Why is it necessary to raise or lower indices on tensors?

Raising and lowering indices allows tensors to be combined conveniently in calculations. In particular, it makes it possible to match the free indices on both sides of an equation, which is necessary for the equation to hold in every coordinate system.

4. How is the metric tensor used in raising and lowering indices?

The metric tensor supplies the coefficients needed in these operations. Raising a (lower) index means contracting it with the inverse metric tensor, while lowering an (upper) index means contracting it with the metric tensor itself.
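For example, for a vector and a covector:

[tex]V^\mu = g^{\mu \nu} V_\nu, \qquad \omega_\mu = g_{\mu \nu}\, \omega^\nu[/tex]

where the inverse metric [tex]g^{\mu \nu}[/tex] is defined by [tex]g^{\mu \lambda} g_{\lambda \nu} = \delta^\mu{}_\nu[/tex].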

5. Can tensors with more than two indices be raised or lowered?

Yes, tensors with more than two indices can also be raised or lowered by following the same process as tensors with two indices. The metric tensor will provide the necessary coefficients for the transformation of each index.
