Raising/Lowering Indices w/ Metric Tensor

In summary: the notation and order used in operations involving tensors can be confusing, but it is ultimately just a matter of convention. When raising or lowering indices through the metric tensor, the order of the factors may appear to make a difference, but from the point of view of the final tensor we get after plugging in the vectors, the order is irrelevant.
  • #1
cianfa72
TL;DR Summary
Rules for raising or lowering indices through the metric tensor
I'm still confused about the notation used for operations involving tensors.
Consider the following simple example:
$$\eta^{\mu \sigma} A_{\mu \nu} = A_{\mu \nu} \eta^{\mu \sigma}$$
Using the rules for raising an index through the (inverse) metric tensor ##\eta^{\mu \sigma}## we get ##A^{\sigma}{}_{\nu}##. However, if we work out the contraction explicitly, employing the contraction operator ##C_{\alpha}^{\mu}(\cdot)##, we get:

$$C_{\alpha}^{\mu} (A_{\alpha \nu} \eta^{\mu \sigma} e^{\alpha} \otimes e^{\nu} \otimes e_{\mu} \otimes e_{\sigma}) = A_{\mu \nu} \eta^{\mu \sigma} e^{\mu} (e_{\mu}) e^{\nu} \otimes e_{\sigma} = A_{\mu \nu} \eta^{\mu \sigma} e^{\nu} \otimes e_{\sigma}$$
The latter is a tensor, say ##T = T_{\nu} {}^{\sigma} e^{\nu} \otimes e_{\sigma}##.

Is it the same as ##A^{\sigma}{}_{\nu} e_{\sigma} \otimes e^{\nu}##?
 
  • #2
You made a mistake in your work;
\begin{align*}C_{\alpha}^{\mu} (A_{\alpha \nu} \eta^{\mu \sigma} e^{\alpha} \otimes e^{\nu} \otimes e_{\mu} \otimes e_{\sigma}) &= A_{\alpha \nu} \eta^{\mu \sigma} e^{\alpha} (e_{\mu}) e^{\nu} \otimes e_{\sigma} \\ &= A_{\mu \nu} \eta^{\mu \sigma} e^{\nu} \otimes e_{\sigma} \\ &={A^{\sigma}}_{\nu} e^{\nu} \otimes e_{\sigma}\end{align*}
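For concreteness, here is a minimal numerical sketch of this raising operation, assuming the Minkowski metric ##\eta = \mathrm{diag}(-1, 1, 1, 1)## and arbitrary components for ##A## (both assumptions are only for illustration):

```python
import numpy as np

eta_inv = np.diag([-1.0, 1.0, 1.0, 1.0])  # eta^{mu sigma} (assumed Minkowski, its own inverse)
A = np.random.rand(4, 4)                  # arbitrary components A_{mu nu}

# A^{sigma}_{nu} = eta^{mu sigma} A_{mu nu}: contract over the first index of A
A_raised = np.einsum('ms,mn->sn', eta_inv, A)

# Writing the factors in the other order gives the same components
assert np.allclose(A_raised, np.einsum('mn,ms->sn', A, eta_inv))
```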
 
  • #3
ergospherical said:
You made a mistake in your work;
\begin{align*}C_{\alpha}^{\mu} (A_{\alpha \nu} \eta^{\mu \sigma} e^{\alpha} \otimes e^{\nu} \otimes e_{\mu} \otimes e_{\sigma}) &= A_{\alpha \nu} \eta^{\mu \sigma} e^{\alpha} (e_{\mu}) e^{\nu} \otimes e_{\sigma} \\ &= A_{\mu \nu} \eta^{\mu \sigma} e^{\nu} \otimes e_{\sigma} \\ &={A^{\sigma}}_{\nu} e^{\nu} \otimes e_{\sigma}\end{align*}
Oops, yes: starting from the RHS of the first line, summing over ##\alpha## gives the second line, and then summing over ##\mu## gives the result.

Maybe I'm missing the point: in your result the index ##\nu## in ##A^{\sigma}{}_{\nu}## actually refers to the first factor in the tensor product ##e^{\nu} \otimes e_{\sigma}##, not to the second one. In the expression ##T_{\nu}{}^{\sigma} e^{\nu} \otimes e_{\sigma}##, on the other hand, it is the first index ##\nu## that refers to the first factor.

Is the following correct?

$${A^{\sigma}}_{\nu} e^{\nu} \otimes e_{\sigma} = T_{\nu}{}^{\sigma} e^{\nu} \otimes e_{\sigma}$$
 
  • #4
cianfa72 said:
Is the following correct?
$${A^{\sigma}}_{\nu} e^{\nu} \otimes e_{\sigma} = T_{\nu}{}^{\sigma} e^{\nu} \otimes e_{\sigma}$$
Well, yes, but only because that’s how you defined ##{T_{\nu}}^{\sigma}##…

You are worrying too much. It’s conventional to maintain the same horizontal ordering of the component indices and the tensor arguments (so that it’s easy to tell which slot is which), but you can do whatever you want.
 
  • #5
Sorry, it seems to me that we actually get two different answers when we reverse the order of the 'index raising' operation, namely:
$$C_{\alpha}^{\mu} (A_{\alpha \nu} \eta^{\mu \sigma} e^{\alpha} \otimes e^{\nu} \otimes e_{\mu} \otimes e_{\sigma}) = {A^{\sigma}}_{\nu} e^{\nu} \otimes e_{\sigma}$$
then if we reverse the order we get:
$$C_{\alpha}^{\mu} (\eta^{\mu \sigma} A_{\alpha \nu} e_{\mu} \otimes e_{\sigma} \otimes e^{\alpha} \otimes e^{\nu}) = {A^{\sigma}}_{\nu} e_{\sigma} \otimes e^{\nu}$$

These are really two different tensors; where is the mistake?
 
  • #6
It’s nothing more significant than the ordering of the vector and co-vector arguments.
 
  • #7
Suppose ##n = 2##; then in the two cases we get:
$$A^{1}{}_{1} e^{1} \otimes e_{1} + A^{1}{}_{2} e^{2} \otimes e_{1} + A^{2}{}_{1} e^{1} \otimes e_{2} + A^{2}{}_{2} e^{2} \otimes e_{2}$$ $$A^{1}{}_{1} e_{1} \otimes e^{1} + A^{1}{}_{2} e_{1} \otimes e^{2} + A^{2}{}_{1} e_{2} \otimes e^{1} + A^{2}{}_{2} e_{2} \otimes e^{2}$$
So the difference is really just the order of the slots into which we plug the vector and the co-vector.
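A quick numerical check of this point (again assuming ##\eta = \mathrm{diag}(-1, 1, 1, 1)## and random components, purely for illustration): the two expansions contain the same numbers, only attached to swapped slots.

```python
import numpy as np

eta_inv = np.diag([-1.0, 1.0, 1.0, 1.0])
A = np.random.rand(4, 4)

# Components on e^{nu} (x) e_{sigma}: array indexed as [nu, sigma]
T1 = np.einsum('mn,ms->ns', A, eta_inv)
# Components on e_{sigma} (x) e^{nu}: array indexed as [sigma, nu]
T2 = np.einsum('ms,mn->sn', eta_inv, A)

assert np.allclose(T1, T2.T)  # same numbers, swapped slots
```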

Does the same thing hold for cases like the following?

##\eta_{\mu \alpha} A^{\alpha \nu} \eta_{\sigma \nu} \Rightarrow A_{\mu \sigma} e^{\mu} \otimes e^{\sigma}##

##\eta_{\sigma \nu} A^{\alpha \nu} \eta_{\mu \alpha} \Rightarrow A_{\mu \sigma} e^{\sigma} \otimes e^{\mu}##
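A numerical sketch of the double lowering, under the same illustrative assumptions (Minkowski ##\eta##, random ##A^{\alpha \nu}##):

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])  # eta_{mu nu}
A_up = np.random.rand(4, 4)           # components A^{alpha nu}

# eta_{mu alpha} A^{alpha nu} eta_{sigma nu}  ->  A_{mu sigma}
first = np.einsum('ma,an,sn->ms', eta, A_up, eta)
# Reversing the order of the factors gives the same components;
# the only remaining freedom is which slot (e^{mu} or e^{sigma}) comes first.
second = np.einsum('sn,an,ma->ms', eta, A_up, eta)

assert np.allclose(first, second)
```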
 
  • #8
Yeah, chill, it’s just like the difference between ##f(x,y) = x^2 y## and ##g(x,y) = y^2 x##, whereby ##f(x,y) = g(y,x)##.
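A trivial sketch of the analogy (purely illustrative):

```python
def f(x, y):
    return x**2 * y

def g(x, y):
    return y**2 * x

# Same map, arguments fed in the opposite order
assert f(3.0, 5.0) == g(5.0, 3.0)
```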
 
  • #9
ergospherical said:
Yeah, chill, it’s just like the difference between ##f(x,y) = x^2 y## and ##g(x,y) = y^2 x##, whereby ##f(x,y) = g(y,x)##.
Yes, the point confusing me is that when we formally raise and/or lower tensor indices through the metric tensor we need to take this into account (in other words, we need to keep track of the order of the slots -- in the above example, the order of the slots 'waiting' for vectors and co-vectors to be plugged in).

Hence, from the point of view of the tensor we get from the raising/lowering operations through the metric tensor, the order does make a difference in how the slots are arranged.
 

FAQ: Raising/Lowering Indices w/ Metric Tensor

What is the purpose of raising and lowering indices with a metric tensor?

The metric tensor is a mathematical tool used in the study of geometry and physics, particularly in general relativity. Raising and lowering indices with the metric tensor converts between the covariant and contravariant components of vectors and tensors, two descriptions of the same geometric object that are both needed when working in curved spacetime.

How do you raise an index with a metric tensor?

To raise an index, you contract the tensor with the inverse metric tensor: for example, ##A^{\mu} = \eta^{\mu \nu} A_{\nu}##. This converts a covariant index (a lower index) into a contravariant index (an upper index). Lowering an index works the same way with the metric tensor itself: ##A_{\mu} = \eta_{\mu \nu} A^{\nu}##.
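For instance, a minimal sketch assuming the Minkowski metric ##\eta = \mathrm{diag}(-1, 1, 1, 1)## and made-up covariant components (both assumptions are for illustration only):

```python
import numpy as np

eta_inv = np.diag([-1.0, 1.0, 1.0, 1.0])  # eta^{mu nu}
w_low = np.array([1.0, 2.0, 3.0, 4.0])    # covariant components w_{nu}

w_up = eta_inv @ w_low                    # w^{mu} = eta^{mu nu} w_{nu}
print(w_up)                               # [-1.  2.  3.  4.]
```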

What is the difference between raising and lowering indices with a metric tensor?

Raising and lowering indices with the metric tensor are essentially inverse operations. Raising an index converts a covariant (lower) index into a contravariant (upper) one, while lowering an index converts a contravariant index back into a covariant one. Keeping the two straight is important for maintaining the correct mathematical relationships between vectors and tensors in curved spacetime.
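As a one-line check that the two operations undo each other, using the fact that the metric and its inverse satisfy ##\eta_{\mu \nu} \eta^{\nu \rho} = \delta_{\mu}{}^{\rho}##:

$$\eta_{\mu \nu} \left( \eta^{\nu \rho} A_{\rho} \right) = \delta_{\mu}{}^{\rho} A_{\rho} = A_{\mu}.$$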

Can you raise or lower indices with any type of tensor?

No. Raising and lowering indices is defined by the metric tensor and its inverse; it is the metric that provides the canonical identification between covariant and contravariant components. Contracting with some other tensor produces a new tensor, but it does not give this identification.

How is the metric tensor related to the curvature of spacetime?

The metric tensor defines the geometry of spacetime in general relativity. It describes how distances and time intervals are measured and how that measurement is affected by the presence of matter and energy. The curvature of spacetime is computed from the metric tensor and its derivatives, and the metric components are also what determine the spacetime interval between nearby events.
