Having trouble understanding Tensor Contraction

  • #1
paperplane
I'm having trouble understanding tensor contraction. So for example, for something like ##A^{\mu\nu}B_{\nu\mu}##, would this equal some scalar?
 
  • #2
Yes.
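A quick numerical check of this (a sketch using NumPy's einsum; the component values below are made up, and upper vs. lower index placement is ignored since it does not change the bookkeeping):

```python
import numpy as np

# Made-up components for A^{mu nu} and B_{nu mu} in 3 dimensions.
A = np.arange(9.0).reshape(3, 3)
B = np.arange(9.0, 18.0).reshape(3, 3)

# Sum over both repeated index pairs: A^{mu nu} B_{nu mu}.
s = np.einsum('mn,nm->', A, B)

# The fully contracted result is a single number, equal to tr(AB).
print(s)
assert np.isclose(s, np.trace(A @ B))
```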
 
  • #3
Since this topic was just started: I also have problems understanding tensor contraction. Not the process, but the motivation behind it. If you have a tensor of rank 2, let's say in 3 dimensions, then this tensor contains 9 elements. If you contract it to a scalar, you are left with one element. Why is this legitimate? What happens to all the information that the original tensor contained? You don't need it, you don't want it, you just throw it out?
 
  • #4
Rick16 said:
Since this topic was just started: I also have problems understanding tensor contraction. Not the process, but the motivation behind it. If you have a tensor of rank 2, let's say in 3 dimensions, then this tensor contains 9 elements. If you contract it to a scalar, you are left with one element. Why is this legitimate? What happens to all the information that the original tensor contained? You don't need it, you don't want it, you just throw it out?
The simplest example of tensor contraction is the inner product of a vector with itself, giving its squared length:
$$a^2 = a^{\mu}a_{\mu}$$
That's occasionally useful.
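For concreteness (a standard expansion, assuming a Euclidean metric so that ##a_\mu = a^\mu##):
$$a^{\mu}a_{\mu} = (a^1)^2 + (a^2)^2 + (a^3)^2 = |\vec{a}\,|^2.$$
The directional information of the vector is discarded, but the rotation-invariant part survives.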
 
  • #5
PeroK said:
The simplest example of tensor contraction is the inner product of a vector with itself, giving its squared length:
$$a^2 = a^{\mu}a_{\mu}$$
That's occasionally useful.
Thank you! This answer is enormously helpful.
 
  • #6
This is my pragmatic take on tensors and contractions: they are a way to ensure that you have an object that transforms in a known way. One special case is an object that does not transform at all, i.e. it is invariant.

For instance, if I know that ##T^{\mu \nu}## is a rank-2 contravariant tensor, and that ##a_\mu## is a rank-1 covariant tensor, then I know that ##T^{\mu \nu}a_\mu## will transform as a rank-1 contravariant tensor, and that ##T^{\mu \nu}a_\mu a_\nu## will be invariant (a scalar).
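A numerical illustration of that invariance (a sketch, not from the thread: made-up random components and an arbitrary invertible change-of-basis matrix ##\Lambda##):

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.normal(size=(3, 3))    # contravariant components T^{mu nu}
a = rng.normal(size=3)         # covariant components a_mu
L = rng.normal(size=(3, 3))    # change-of-basis matrix Lambda (invertible with probability 1)
Li = np.linalg.inv(L)

# Transform the components:
#   T'^{mu nu} = L^mu_a L^nu_b T^{ab},   a'_mu = (L^{-1})^a_mu a_a
T2 = np.einsum('ma,nb,ab->mn', L, L, T)
a2 = np.einsum('am,a->m', Li, a)

# The full contraction gives the same number in both bases.
s1 = np.einsum('mn,m,n->', T, a, a)
s2 = np.einsum('mn,m,n->', T2, a2, a2)
assert np.isclose(s1, s2)
print(s1, s2)
```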
 
  • #7
At this point it's important to be pedantic. ##T^{\mu \nu}## is not a tensor but the contravariant components of a 2nd-rank tensor. If ##v_{\alpha}## are rank-1 covariant tensor components, then ##T^{\mu \nu} v_{\nu}## are contravariant rank-1 tensor components (aka contravariant vector components).

Tensors are multilinear functions, which map ##m## vectors and ##n## dual vectors to the real numbers (taking the case of real vector spaces, as needed in classical mechanics and relativity). They are completely independent of the choice of any basis and the corresponding dual basis. They are invariant under basis transformations, and the rules for how to transform the corresponding components follow from this invariance.
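A standard concrete instance of this basis-free viewpoint (added here for illustration): a rank-(1,1) tensor ##T## is a bilinear map taking one dual vector ##\omega## and one vector ##v## to a number, and its components are recovered by feeding in basis and dual-basis elements:
$$T^{\mu}{}_{\nu} = T(e^{\mu}, e_{\nu}), \qquad T(\omega, v) = T^{\mu}{}_{\nu}\,\omega_{\mu} v^{\nu}.$$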
 
  • #8
vanhees71 said:
At this point it's important to be pedantic. ##T^{\mu \nu}## is not a tensor but the contravariant components of a 2nd-rank tensor.
True that :)
 
  • #9
Rick16 said:
Why is this legitimate?
Is a dot product legitimate?
 
  • #10
Vanadium 50 said:
Is a dot product legitimate?
Actually, comment #4 already answered my question. I did not think of the dot product between two vectors as a tensor contraction. When I take the dot product of a vector with itself, I lose all the directional information of the vector, but I obtain new information. Before #4 I only saw the loss of information as a result of a tensor contraction, and therefore I wondered about the legitimacy of the process. But since new information is produced in the process, I now see the use of it (and the legitimacy).
 
  • #11
I want to come back to my question about the meaning behind tensor contractions. I just finished Susskind's The Theoretical Minimum, volume 4. On page 325 he writes: "I don't know any particular physical significance or geometric significance to the Ricci tensor or the curvature scalar." This would then mean that we also don't know what information gets lost when the Riemann tensor is contracted to the Ricci tensor and further down to the Ricci scalar. I actually had this contraction in the back of my mind when I asked my question in #3. We don't know what information gets lost, but we contract anyway and hope for the best, i.e. we hope that the information we need the resulting tensor to contain does not get kicked out in the process? Is this what happens here? And then we contract even further until we are left with a single number. I find this particularly intriguing. So the Ricci scalar is just some number that says something about the curvature? Is this all that I need to know about it? Is this all that I can know about it?
 
  • #12
Rick16 said:
So the Ricci scalar is just some number that says something about the curvature? Is this all that I need to know about it? Is this all that I can know about it?
Certainly not. For example (see the formulas after this list):
  • In 2D: the full Riemann tensor is expressible entirely in terms of the Ricci scalar.
  • In 3D: the full Riemann tensor is expressible entirely in terms of the Ricci tensor and scalar.
  • In any dimension: "When the scalar curvature is positive at a point, the volume of a small geodesic ball about the point has smaller volume than a ball of the same radius in Euclidean space. On the other hand, when the scalar curvature is negative at a point, the volume of a small ball is larger than it would be in Euclidean space." (https://en.wikipedia.org/wiki/Scalar_curvature); and: "Geometrically, the Ricci curvature is the mathematical object that controls the growth rate of the volume of metric balls in a manifold." (https://mathworld.wolfram.com/RicciCurvatureTensor.html); and: "The Ricci tensor can be characterized by measurement of how a shape is deformed as one moves along geodesics in the space. In general relativity, which involves the pseudo-Riemannian setting, this is reflected by the presence of the Ricci tensor in the Raychaudhuri equation." (https://en.wikipedia.org/wiki/Ricci_curvature)
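To make the 2D bullet and the volume statements concrete, here are the standard formulas (stated for reference; ##R## denotes the Ricci scalar, ##g_{ab}## the metric, ##n## the dimension, and ##B_\varepsilon(p)## a small geodesic ball of radius ##\varepsilon## about ##p##):
$$R_{abcd} = \frac{R}{2}\left(g_{ac}g_{bd} - g_{ad}g_{bc}\right) \quad \text{(valid in 2D)},$$
$$\frac{\operatorname{Vol} B_{\varepsilon}(p)}{\operatorname{Vol} B_{\varepsilon}^{\text{Euclidean}}} = 1 - \frac{R(p)}{6(n+2)}\,\varepsilon^{2} + O(\varepsilon^{4}).$$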
 
  • #13
Tensor contraction: if we have the tensor space ##E^p_q## over ##E## for ##p>0## and ##q>0## (##p## counts upper indices, ##q## lower ones), then for every ##1 \le i \le p## and ##1 \le j \le q## there is a linear projection ##C^i_j : E^p_q \to E^{p-1}_{q-1}## such that to every simple tensor ##z = v_1 \otimes \dots \otimes v_p \otimes v'^1 \otimes \dots \otimes v'^q## it attaches
$$C^i_j(z) := \langle v_i, v'^j \rangle\, v_1 \otimes \dots \otimes v_{i-1} \otimes v_{i+1} \otimes \dots \otimes v_p \otimes v'^1 \otimes \dots \otimes v'^{j-1} \otimes v'^{j+1} \otimes \dots \otimes v'^q.$$
On the next question: as someone explained earlier, the curvature (Riemann-Christoffel) tensor is expressible in terms of the Levi-Civita connection, and the term "covariant derivative" appears for the first time in the work of G. Ricci.
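A minimal numerical sketch of one such projection (assuming NumPy, and treating axis 0 as the upper slot being paired with the lower slot on axis 2):

```python
import numpy as np

# A (2,1) tensor: axes 0 and 1 are "upper" slots, axis 2 is a "lower" slot.
rng = np.random.default_rng(0)
T = rng.normal(size=(3, 3, 3))

# The projection C^1_1 pairs the first upper slot with the lower slot;
# in components this is a trace over those two axes.
C = np.trace(T, axis1=0, axis2=2)

print(T.shape, '->', C.shape)  # (3, 3, 3) -> (3,)
```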
 
  • #14
arturwojciechowicz said:
Tensor contraction: if we have the tensor space ##E^p_q## over ##E## for ##p>0## and ##q>0## (##p## counts upper indices, ##q## lower ones), then for every ##1 \le i \le p## and ##1 \le j \le q## there is a linear projection ##C^i_j : E^p_q \to E^{p-1}_{q-1}## ...
See the guide for using Latex for mathematics:

https://www.physicsforums.com/help/latexhelp/
 
  • #15
That's an interesting problem: first tensor contraction, next the Ricci tensor, and most interesting, the connection. I'll try to be better prepared.
 

FAQ: Having trouble understanding Tensor Contraction

What is tensor contraction?

Tensor contraction is an operation that reduces the order of a tensor by summing over one or more pairs of its indices. It is analogous to taking the trace of a matrix but can be applied to higher-order tensors. For example, if you have a rank-3 tensor \(T_{ijk}\), contracting it over indices \(i\) and \(j\) would result in a rank-1 tensor \(T_k\).
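For instance, this contraction can be carried out with NumPy's einsum, whose repeated-index notation performs exactly this sum (a minimal sketch; the values are arbitrary):

```python
import numpy as np

T = np.arange(27.0).reshape(3, 3, 3)   # a rank-3 tensor T_ijk

# Contract over i and j: T_k = sum_i T_iik
Tk = np.einsum('iik->k', T)
print(Tk.shape)  # (3,)
```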

How is tensor contraction different from matrix multiplication?

While both tensor contraction and matrix multiplication involve summing over indices, tensor contraction is a more general operation that can be applied to tensors of any rank. Matrix multiplication is a specific case of tensor contraction where two rank-2 tensors (matrices) are contracted over one index. Tensor contraction can involve multiple indices and higher-dimensional tensors.
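Concretely, here is the matrix product written as an explicit contraction over the shared index (a sketch assuming NumPy):

```python
import numpy as np

A = np.arange(6.0).reshape(2, 3)
B = np.arange(12.0).reshape(3, 4)

# Matrix multiplication is a contraction over the shared index j.
C = np.einsum('ij,jk->ik', A, B)
assert np.allclose(C, A @ B)
```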

Why is tensor contraction important in physics and machine learning?

Tensor contraction is crucial in fields like physics and machine learning because it simplifies complex tensor expressions and reduces computational complexity. In physics, it is often used in the context of general relativity and quantum mechanics. In machine learning, tensor contraction is a key operation in neural network computations, especially in models like tensor networks and deep learning frameworks.

Can you provide an example of tensor contraction?

Sure! Consider a rank-3 tensor \(A_{ijk}\) and a rank-2 tensor \(B_{jl}\). Contracting over the index \(j\) results in a new rank-3 tensor \(C_{ikl}\) given by:
\[ C_{ikl} = \sum_{j} A_{ijk} B_{jl} \]
This operation sums over the shared index \(j\), reducing the combined rank from \(3 + 2 = 5\) to \(3\).
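The same contraction in code (a minimal sketch assuming NumPy; the array shapes are arbitrary):

```python
import numpy as np

A = np.arange(24.0).reshape(2, 3, 4)   # A_ijk
B = np.arange(15.0).reshape(3, 5)      # B_jl

# C_ikl = sum_j A_ijk B_jl
C = np.einsum('ijk,jl->ikl', A, B)
print(C.shape)  # (2, 4, 5)
```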

What tools or libraries can help with tensor contraction?

Several libraries can help with tensor contraction, especially in Python. Popular ones include NumPy, which provides basic tensor operations, and more specialized libraries like TensorFlow and PyTorch, which offer advanced tensor manipulation capabilities. These libraries often have built-in functions for tensor contraction, such as NumPy's einsum function.
