Ricci notations and visualisation

  • #1
chartery
I'm having trouble with notations and visualisations regarding Ricci curvature.

For Riemann tensor there is variously:

##R^{\rho}{}_{\sigma\mu\nu}X^{\mu}Y^{\nu}V^{\sigma}\partial_{\rho}##

##[\nabla _{X},\nabla _{Y}]V##

##R(XY)V\mapsto Z##

##\left\langle R(XY)V,Z \right\rangle## i.e. covariant ##R(XYVZ)##

which I think I can grasp, mostly.

So my pictorial visualisation is of Z as the result of transporting vector V around the loop given by vectors X, Y.
It even makes some sense then to regard it (Riem) as a sectional (Gaussian) curvature of the 2D subplane defined by X, Y.
But I run into sand with Ricci tensor being an average of these subplanes.

1) Does ##R^{\rho}{}_{\sigma\rho\nu}X^{\rho}Y^{\nu}V^{\sigma}\partial_{\rho}## become ##R(XY)V\mapsto X## and ##\left\langle R(XY)V,X \right\rangle## ?

2) Does ##(R^{\rho}{}_{\sigma\rho\nu}X^{\rho}Y^{\nu}V^{\sigma}\partial_{\rho} =)\ R_{\sigma\nu}Y^{\nu}V^{\sigma}## become ##Ric(YV)## or ##Ric(XX)## or something else ?

3) Gaussian Curvature seems to be given as ##K(yv)## proportional to ##\left\langle R(yv)v,y \right\rangle## which has both duplicated, rather than just the ##x##, as in 1) above ?
(I have used ##yv## instead of usual ##uv## to try and keep track of the contraction.)

4) Is there a way to picture how contracting the Z and X vectors (leaving the V and Y) would lead to averaging of XY subplanes?
 
  • #2
chartery said:
For Riemann tensor there is variously:
It might help if you gave some references for where you are getting these various definitions from.

chartery said:
##R^{\rho}{}_{\sigma\mu\nu}X^{\mu}Y^{\nu}V^{\sigma}\partial_{\rho}##
This is the Riemann tensor contracted with three vectors, with the remaining upper index summed against the coordinate basis vectors ##\partial_\rho##, i.e., the output vector expanded in that basis. Contract with a 1-form instead (note, btw, that it does not have to be a coordinate basis 1-form) and you get a scalar.

chartery said:
##[\nabla _{X},\nabla _{Y}]V##
This by itself is not the Riemann tensor. It's not even the curvature operator that leads to the Riemann tensor, because it's missing a term. The curvature operator ##\mathscr{R}## is given by

$$
\mathscr{R}(X, Y) = [\nabla_X, \nabla_Y] - \nabla_{[X, Y]}
$$

chartery said:
##R(XY)V\mapsto Z##
If this is expressing that the curvature operator maps a vector to another vector, it's basically correct. But it's still the curvature operator, which is not quite the same thing as the Riemann tensor.

chartery said:
##\left\langle R(XY)V,Z \right\rangle## i.e. covariant ##R(XYVZ)##
I'm not sure what this means. Again, some references would help.

chartery said:
So my pictorial visualisation is of Z as the result of transporting vector V around the loop given by vectors X, Y.
That is what the curvature operator represents, yes.

chartery said:
I run into sand with Ricci tensor being an average of these subplanes.
The Ricci tensor is a contraction of the Riemann tensor on its first two indexes. That means it is a sum of multiple components of the Riemann tensor. So you would have to think of it in curvature operator terms as a sum of multiple expressions:

$$
\text{Ricci}(A, B) = \Sigma_{X, Z} \mathscr{R}(X, B)Z
$$

where the vector ##A## is a sort of "average" of the results of the curvature operator expressions that are being summed.

I'm not sure there is any easy correspondence between this and "average of subplanes".
 
  • #3
@PeterDonis, many thanks for reply and apologies for the lax terminology. References are unavailable, as the end result is my attempted distillate from a blizzard of websites.

I realise ##R^{\rho}{}_{\sigma\mu\nu}X^{\mu}Y^{\nu}V^{\sigma}\partial_{\rho}## is a contraction, but wanted to keep clear (at least for me) the relationships between vectors and indices.

And I forgot to specify torsion-free in ##[\nabla _{X},\nabla _{Y}]V##.
I assumed that the operator and tensor were effectively different representations of the same 'operation'. Is that a bad idea, or wrong?

The presumption was that ##R(XYVZ)## meant ##R_{\rho\sigma\mu\nu}## and so ##\left\langle R(XY)V,Z \right\rangle## meant some kind of metric inner product dropping the contravariant index. But of course it makes no sense thinking that any inner product would give a fourth-rank tensor. If it keeps cropping up, I'll come back with a reference.

I think ##\text{Ricci}(A, B) = \Sigma_{X, Z} \mathscr{R}(X, B)Z## will be helpful when (if!) I can get some sort of geometric picture in my head. I will work on it.
Taking both sides as operators, the Riemann with three vectors (contracted on two) makes sense, but I don't understand why the Ricci would have two vectors rather than one?
 
  • #4
chartery said:
I realise ##R^{\rho}{}_{\sigma\mu\nu}X^{\mu}Y^{\nu}V^{\sigma}\partial_{\rho}## is a contraction, but wanted to keep clear (at least for me) the relationships between vectors and indices.
Ok. Once again, though, note that the 1-form that contracts with the first index on the Riemann tensor does not need to be a coordinate basis 1-form ##dx^\rho##. Any 1-form ##\omega_\rho## will do.

chartery said:
And I forgot to specify torsion-free in ##[\nabla _{X},\nabla _{Y}]V##.
That wouldn't matter. The extra term I described is still there even with a torsion free connection, as is standard in GR. See, for example, Misner, Thorne, & Wheeler, Section 11.3.

chartery said:
I assumed that the operator and tensor were effectively different representations of the same 'operation'. Is that a bad idea, or wrong?
They're related operations, but they're not quite the same operation.

chartery said:
I don't understand why the Ricci would have two vectors rather than one?
Because the Riemann tensor takes 4 vectors (or a 1-form and 3 vectors, depending on whether you lower the first index or not), so if you contract it, the result, the Ricci tensor, takes 2 vectors (since contraction reduces the number of indexes by two).

chartery said:
the Riemann with three vectors (contracted on two)
No, it's the Riemann tensor with indexes (MTW calls them "slots") for one 1-form and three vectors, contracted on the 1-form and the second of the three vectors. Leaving two vectors.
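The slot counting here (four slots on the Riemann tensor; contracting the 1-form slot with the second of the three vector slots leaves the two slots of the Ricci tensor) can be checked concretely. Below is a minimal sympy sketch using the unit 2-sphere as a stand-in example; the sphere, the variable names, and the coordinate formulas for the Christoffel symbols and the Riemann tensor are assumptions of the sketch, not anything cited in the thread:

```python
import sympy as sp

# Unit 2-sphere: coordinates (theta, phi), metric diag(1, sin^2 theta)
theta, phi = sp.symbols('theta phi', positive=True)
x = [theta, phi]
dim = 2
g = sp.diag(1, sp.sin(theta)**2)
ginv = g.inv()

# Christoffel symbols: Gamma^r_{m n} = (1/2) g^{r s} (d_m g_{s n} + d_n g_{s m} - d_s g_{m n})
Gamma = [[[sp.simplify(sp.Rational(1, 2) * sum(
            ginv[r, s] * (sp.diff(g[s, n], x[m]) + sp.diff(g[s, m], x[n])
                          - sp.diff(g[m, n], x[s]))
            for s in range(dim)))
          for n in range(dim)] for m in range(dim)] for r in range(dim)]

# Riemann tensor: R^r_{s m n} = d_m Gamma^r_{n s} - d_n Gamma^r_{m s}
#                             + Gamma^r_{m l} Gamma^l_{n s} - Gamma^r_{n l} Gamma^l_{m s}
Riem = [[[[sp.simplify(
            sp.diff(Gamma[r][n][s], x[m]) - sp.diff(Gamma[r][m][s], x[n])
            + sum(Gamma[r][m][l] * Gamma[l][n][s]
                  - Gamma[r][n][l] * Gamma[l][m][s] for l in range(dim)))
          for n in range(dim)] for m in range(dim)] for s in range(dim)] for r in range(dim)]

# Ricci tensor: contract the first and third slots, R_{s n} = R^r_{s r n}
Ric = sp.Matrix(dim, dim, lambda s, n: sp.simplify(sum(Riem[r][s][r][n] for r in range(dim))))
print(Ric)  # Matrix([[1, 0], [0, sin(theta)**2]])
```

For the unit 2-sphere this gives ##\text{Ric} = \text{diag}(1, \sin^2\theta)##, i.e., the metric itself, as expected for a space of constant curvature; the two free slots of the Ricci tensor are exactly what survives the contraction.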
 
  • #5
@PeterDonis Thanks again.

Torsion-free: yes, my misapprehension, I got misled by Eq 3.112 in Carroll's book
$$\left[ \nabla_{\mu}, \nabla_{\nu} \right]V^{\rho}=R^{\rho}{}_{\sigma\mu\nu}V^{\sigma}-T_{\mu\nu}{}^{\lambda}\nabla_{\lambda}V^{\rho}$$

I'm still stuck on ##\Sigma_{X, Z} \mathscr{R}(X, B)Z##

This is a map from three vector fields to a fourth. Does that mean the contraction effectively changes it to a map from one vector field to another? And these two then become the 'slots' of the Ricci?

And then my ##R(XY)V\mapsto Z## would translate as your ## \mathscr{R}(X, B)Z\mapsto A## with the ##A## implied on the right hand side (only by being in the Ricci on the left) in ##\text{Ricci}(A, B) = \Sigma_{X, Z} \mathscr{R}(X, B)Z## ?
Or am I on the wrong track?
 
  • #6
chartery said:
I got misled by Eq 3.112 in Carroll's book
There was a PF thread about this a while back; yes, the way he does it, at least in that section, doesn't really reflect the extra ##\nabla_{[X, Y]}## term properly. In the notation of Eq. 3.112, that extra term is part of the ##R^{\rho}\text{ }_{\sigma\mu\nu}V^{\sigma}## term. But his discussion doesn't really make clear why that should be the case. MTW's discussion seems to me to be much clearer.

chartery said:
I'm still stuck on ##\Sigma_{X, Z} \mathscr{R}(X, B)Z##

This is a map from three vector fields to a fourth.
The usual way of viewing the curvature operator is that ##\mathscr{R}(X, Y)## is a map from vectors to vectors; it maps a vector to what it gets changed to when you parallel transport it around the closed loop parallelogram defined by ##X## and ##Y##. In index notation this would be a (1, 1) tensor obtained by contracting the Riemann tensor with the vectors ##X## and ##Y##, i.e., ##R^\rho{}_{\sigma \mu \nu} X^\mu Y^\nu##.

Note that there is no sum here; this is just a single curvature operator. But we can use the above to try to unpack the sum expression for the Ricci tensor. First, though, I think I wrote that expression wrong. The Ricci tensor is formed by contracting the Riemann tensor on its first and third slots. That means we have to insert the same vector into each of the slots, and sum over all possible such vectors. (The first slot actually takes a 1-form, so we have to lower the index of the vector we choose to get its corresponding 1-form.)

The third slot of the Riemann tensor corresponds to the first of the two vectors in the argument of the curvature operator, i.e., the vector ##X##. Inserting the corresponding 1-form in the first slot means we have to contract the vector that the curvature operator outputs with that 1-form. So the proper sum for a particular component of the Ricci tensor would actually look like this:

$$
R_{\sigma \nu} = \Sigma_X \langle \mathscr{R}(X, e_\sigma) e_\nu, X \rangle
$$

where the angle brackets denote contraction and ##e_\sigma## and ##e_\nu## are basis vectors. In other words, to compute the full Ricci tensor, we need to compute ten of these sums, corresponding to the ten independent components. (Actually I think you only have to sum over the basis vectors in place of ##X##, not all possible vectors ##X##.)

I think the above is correct; however, trying to unpack the Ricci tensor in terms of curvature operators is not something that appears to be covered by the textbooks I have available.
 
  • #7
PeterDonis said:
So the proper sum for a particular component of the Ricci tensor would actually look like this:

$$
R_{\sigma \nu} = \Sigma_X \langle \mathscr{R}(X, e_\sigma) e_\nu, X \rangle
$$
If you wanted to express this in operator language, it would be

$$
\text{Ricci}(A, B) = \Sigma_\alpha \langle \mathscr{R}(e_\alpha, A) B, e_\alpha \rangle
$$

i.e., ##\text{Ricci}## takes two vectors and outputs a number. In terms of the curvature operator, ##\text{Ricci}## takes vectors ##A## and ##B## and outputs the sum of the scalars obtained by parallel transporting the vector ##B## around the parallelogram defined by each of the basis vectors and the vector ##A##, and contracting each of the resulting vectors with the basis vector that was used (or more precisely its corresponding basis 1-form).
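This operator-language sum can be sanity-checked numerically. The sketch below is an illustration only: the unit 2-sphere evaluated at ##\theta = \pi/3##, the test vectors ##A, B##, and the helper names `curv_op`/`ricci_op` are all assumptions of the example, and the angle bracket is implemented as the dual-basis pairing (picking out component ##\alpha## of the output vector):

```python
import numpy as np

# Riemann tensor R^rho_{sigma mu nu} of the unit 2-sphere at theta = pi/3;
# coordinate order (theta, phi) -> indices (0, 1)
th = np.pi / 3
s2 = np.sin(th) ** 2
R = np.zeros((2, 2, 2, 2))
R[0, 1, 0, 1] = s2     # R^theta_{phi theta phi}
R[0, 1, 1, 0] = -s2    # antisymmetry in the last two slots
R[1, 0, 1, 0] = 1.0    # R^phi_{theta phi theta}
R[1, 0, 0, 1] = -1.0

def curv_op(X, Y):
    """Matrix of the curvature operator V -> R(X, Y)V, i.e. R^rho_{sigma mu nu} X^mu Y^nu."""
    return np.einsum('rsmn,m,n->rs', R, X, Y)

# Ricci by direct contraction on the first and third slots: R_{sigma nu} = R^rho_{sigma rho nu}
ric = np.einsum('rsrn->sn', R)

# Ricci(A, B) = sum_alpha < R(e_alpha, A) B, e_alpha >, with < , > the dual-basis
# pairing (component alpha of the vector that the curvature operator outputs)
e = np.eye(2)
def ricci_op(A, B):
    return sum((curv_op(e[a], A) @ B)[a] for a in range(2))

A, B = np.array([2.0, 3.0]), np.array([1.0, -1.0])
# matches the tensor contraction R_{sigma nu} A^sigma B^nu (Ricci is symmetric here)
assert np.isclose(ricci_op(A, B), A @ ric @ B)
```

The basis-vector sum and the index contraction agree because summing ##e_\alpha## into the third slot and pairing the output with the corresponding basis 1-form is exactly the component sum ##\Sigma_\alpha R^\alpha{}_{\cdot\,\alpha\,\cdot}##, up to the symmetry of the Ricci tensor.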
 
  • #8
PeterDonis said:
In index notation this would be a (1, 1) tensor obtained by contracting the Riemann tensor with the vectors ##X## and ##Y##, i.e., ##R^\rho{}_{\sigma \mu \nu} X^\mu Y^\nu##.

Note that there is no sum here; this is just a single curvature operator.
Contracted but no sum?

These look exactly like what I think I need to understand to relate tensors and operators, and 'pictorialise':
##R_{\sigma \nu} = \Sigma_X \langle \mathscr{R}(X, e_\sigma) e_\nu, X \rangle##
##\text{Ricci}(A, B) = \Sigma_\alpha \langle \mathscr{R}(e_\alpha, A) B, e_\alpha \rangle##
Might take a little time :-)
 
  • #9
chartery said:
Contracted but no sum?
This is one of those cases where ordinary language is clumsy. I meant there is no sum over different possible vectors, as there is in the case of the Ricci tensor (where we have to sum over the possible basis vectors that can occupy the two slots being contracted). There is just a contraction of the Riemann tensor with two specific vectors. Of course that contraction involves a sum over the matching components of the tensor and vectors.
 
  • #10
@PeterDonis, thanks very much again for your help and patience
 
  • #11
chartery said:
@PeterDonis, thanks very much again for your help and patience
You're welcome! :smile:
 

FAQ: Ricci notations and visualisation

What is Ricci notation?

Ricci notation is a symbolic representation used in tensor calculus and differential geometry, named after the Italian mathematician Gregorio Ricci-Curbastro. It is a shorthand way to express complex tensor equations and operations, making it easier to manipulate and understand the relationships between different tensors.

How does Ricci notation differ from Einstein notation?

Ricci notation and Einstein notation are closely related, as both are used in tensor calculus. The primary difference is that Einstein notation implicitly sums over repeated indices, known as the Einstein summation convention, while Ricci notation does not necessarily imply summation and can be more explicit about the operations being performed. In practice, Ricci notation can include additional symbols and conventions to clarify tensor operations.

Why is Ricci notation important in physics and mathematics?

Ricci notation is important because it provides a concise and standardized way to describe tensors and their interactions, which are fundamental in fields like general relativity, continuum mechanics, and differential geometry. By simplifying complex equations, Ricci notation helps scientists and mathematicians communicate ideas more effectively and perform calculations more efficiently.

What are the common symbols used in Ricci notation?

Common symbols in Ricci notation include indices (usually lowercase Latin or Greek letters) to denote tensor components, the Kronecker delta (δ), the Levi-Civita symbol (ε), and various operators like the covariant derivative (∇). These symbols help to specify the operations being performed on tensors, such as contraction, permutation, and differentiation.

How can Ricci notation be visualized?

Visualizing Ricci notation can be challenging due to its abstract nature, but it often involves understanding the geometric and algebraic properties of tensors. Diagrams, such as tensor networks or Penrose graphical notation, can help by representing tensors as nodes and their indices as edges, making the relationships between different tensors more intuitive. Additionally, software tools and symbolic computation programs can assist in visualizing and manipulating tensor equations written in Ricci notation.
