Need some clarifications on tensor calculus please

  • #1
pitaly
TL;DR Summary
Need some clarifications on tensor calculus
I've started reading up on tensors. Since this lies well outside my usual area, I need some clarifications on some tensor calculus issues.

Let ##A## be a tensor of order ##j > 1##. Suppose that the tensor is cubical, i.e., every mode is of the same size. So for example, if ##A## is of order 3 then ##A \in R^{n \times n \times n}##.

Further assume that ##A## is symmetric (sometimes called supersymmetric), i.e., it is invariant under permutation of indices. For example, if the order is 3 then ##A_{ijk} = A_{ikj} = A_{jik} = A_{jki} = A_{kij} = A_{kij}##.

We say that a tensor is positive semidefinite if ##x \cdot A \cdot x \geq 0## for all vectors ##x \neq 0##. So, for example, if ##A## is of order 2, then ##x \cdot A \cdot x## is a standard quadratic form and positive semidefiniteness reduces to the usual notion for symmetric matrices.
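
For concreteness, here is a minimal numerical sketch of what I mean by ##x \cdot A \cdot x## for an order-3 tensor (this uses numpy's einsum, and the symmetrization over index permutations is just one convenient way to build a symmetric example):

```python
import itertools
import numpy as np

n = 4
rng = np.random.default_rng(0)

# Build a symmetric ("supersymmetric") order-3 cubical tensor by
# averaging a random tensor over all permutations of its indices.
T = rng.standard_normal((n, n, n))
A = sum(np.transpose(T, p) for p in itertools.permutations(range(3))) / 6

x = rng.standard_normal(n)

# x . A . x for order 3: contract every mode of A with x, giving the scalar A x^3.
val = np.einsum('ijk,i,j,k->', A, x, x, x)
print(val)
```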

Now, to my two questions:

1) Is there any clean notation for tensor differentiation? If ##A## is of order 2 then the gradient of ##x \cdot A \cdot x## with respect to ##x## is ##2 x \cdot A##. How do I write the gradient of ##x \cdot A \cdot x## with respect to ##x## when ##A## is a tensor of any order ##j>2##?

2) Consider a tensor ##A## of order 2 and the quadratic form ##(x-y) \cdot A \cdot (x-y)## for any vectors ##x,y##. If ##A## is positive semidefinite then ##x \cdot A \cdot x - y \cdot A \cdot y \geq 2y \cdot A \cdot (x-y) ##. Does there exist any analogous inequality for tensors of any order ##j>2##, where the tensor ##A## is positive semidefinite?
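
(For reference, the order-2 inequality just comes from expanding the quadratic form and using the symmetry of ##A##, i.e. ##x \cdot A \cdot y = y \cdot A \cdot x##:
$$
0 \le (x-y)\cdot A\cdot(x-y) = x\cdot A\cdot x - 2\, y\cdot A\cdot x + y\cdot A\cdot y ,
$$
which rearranges to ##x \cdot A \cdot x - y \cdot A \cdot y \geq 2\, y \cdot A \cdot x - 2\, y \cdot A \cdot y = 2\, y \cdot A \cdot (x-y)##.)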

Any help on this is highly appreciated!
 
  • #3
pitaly said:
Summary:: Need some clarifications on tensor calculus

I've started reading up on tensors. Since this lies well outside my usual area, I need some clarifications on some tensor calculus issues.

Let ##A## be a tensor of order ##j > 1##. Suppose that the tensor is cubical, i.e., every mode is of the same size. So for example, if ##A## is of order 3 then ##A \in R^{n \times n \times n}##.

Further assume that ##A## is symmetric (sometimes called supersymmetric), i.e., it is invariant under permutation of indices. For example, if the order is 3 then ##A_{ijk} = A_{ikj} = A_{jik} = A_{jki} = A_{kij} = A_{kij}##.

We say that a tensor is positive semidefinite if ##x \cdot A \cdot x \geq 0## for all vectors ##x \neq 0##. So, for example, if ##A## is of order 2, then ##x \cdot A \cdot x## is a standard quadratic form and positive semidefiniteness reduces to the usual notion for symmetric matrices.
What is ##x \cdot u \otimes v \otimes w \cdot x\;##?
pitaly said:
Now, to my two questions:

1) Is there any clean notation for tensor differentiation? If ##A## is of order 2 then the gradient of ##x \cdot A \cdot x## with respect to ##x## is ##2 x \cdot A##. How do I write the gradient of ##x \cdot A \cdot x## with respect to ##x## when ##A## is a tensor of any order ##j>2##?
The first thing that comes to mind is the boundary operator in cohomology theory.
pitaly said:
2) Consider a tensor ##A## of order 2 and the quadratic form ##(x-y) \cdot A \cdot (x-y)## for any vectors ##x,y##. If ##A## is positive semidefinite then ##x \cdot A \cdot x - y \cdot A \cdot y \geq 2y \cdot A \cdot (x-y) ##. Does there exist any analogous inequality for tensors of any order ##j>2##, where the tensor ##A## is positive semidefinite?

Any help on this is highly appreciated!
Good question. I had the same (see above).

Let's see what you have actually done. You fed a tensor with two vectors and computed a scalar. Hence we need ##u^*,v^*\in V^*## to get ##(u^*\otimes v^*)(x,y)= u^*(x)v^*(y)\in \mathbb{R}.## We can certainly fill up this equation with arbitrarily many ##w^{(k)}\in V## to get
$$
(u^*\otimes v^*\otimes w^{(1)}\otimes \ldots\otimes w^{(m)})(x,y)= u^*(x)v^*(y) \cdot w^{(1)}\otimes \ldots\otimes w^{(m)}
$$
but then we lose the possibility to compare it with the scalar ##0.##
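
As a small numerical illustration of the first identity (a sketch that identifies ##V^*## with ##\mathbb{R}^n## via the dot product; that identification is an assumption made only for the example):

```python
import numpy as np

n = 3
rng = np.random.default_rng(1)
u, v, x, y = (rng.standard_normal(n) for _ in range(4))

# (u* ⊗ v*)(x, y) = u*(x) v*(y), with u*(x) realized as the dot product u . x
lhs = np.einsum('i,j,i,j->', u, v, x, y)   # feed the order-2 tensor u⊗v with (x, y)
rhs = np.dot(u, x) * np.dot(v, y)
print(np.isclose(lhs, rhs))   # True
```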

I think you are approaching this from the wrong side. The question you should ask is what you want to achieve, not how to generalize something.

Since you asked in a mathematics forum, you may want to have a read:
https://www.physicsforums.com/insights/what-is-a-tensor/
https://www.physicsforums.com/insights/pantheon-derivatives-part-iii/
https://www.physicsforums.com/insights/pantheon-derivatives-part-iv/
 
  • #4
pitaly said:
Summary:: Need some clarifications on tensor calculus

1) Is there any clean notation for tensor differentiation? If ##A## is of order 2 then the gradient of ##x \cdot A \cdot x## with respect to ##x## is ##2 x \cdot A##. How do I write the gradient of ##x \cdot A \cdot x## with respect to ##x## when ##A## is a tensor of any order ##j>2##?
Not an expert, but I know my way around some basic tensor math. Normally I would write the tensor product in Einstein notation, i.e., ##A_{ijk}x^ix^j##. Then I would write the derivative as a tensor-like expression: ##\partial_l = \frac{\partial}{\partial x^l}##. And then we can apply it:
$$\partial_l A_{ijk}x^ix^j = A_{ijk}\partial_l(x^ix^j)$$ Knowing the Leibniz rule and that ##\frac{\partial x^i}{\partial x^j}=\delta^i_j##, we can expand out the derivative: $$ \begin{aligned} \partial_l A_{ijk}x^ix^j & = A_{ijk}((\partial_l x^i)x^j + x^i\partial_l(x^j)) \\ & = A_{ijk}(\delta_l^ix^j+x^i\delta_l^j) \\ & = A_{ljk}x^j+A_{ilk}x^i\end{aligned}$$ In the case that ##A## is symmetric, we can freely swap the first two indices to get ##A_{jlk}x^j+A_{ilk}x^i##, which is really just two sums of the same form, so we can rename the dummy index in the first term:
$$A_{ilk}x^i+A_{ilk}x^i=2A_{ilk}x^i$$ I guess it looks kind of hand-wavy, but it's really just a compact way of writing sums over indices. Add as many extra indices after ##k## as you like and the same identity holds for order ##j > 3##.
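
A quick numerical sanity check of that identity (just a sketch: it uses a finite-difference gradient, and the tensor is made symmetric in its first two indices, which is all the identity needs):

```python
import numpy as np

n, eps = 4, 1e-6
rng = np.random.default_rng(2)

# Order-3 tensor, symmetric in its first two indices.
T = rng.standard_normal((n, n, n))
A = 0.5 * (T + np.transpose(T, (1, 0, 2)))

x = rng.standard_normal(n)
f = lambda x: np.einsum('ijk,i,j->k', A, x, x)   # A_{ijk} x^i x^j, free index k

# Finite-difference gradient d f_k / d x_l, stored as grad[l, k].
grad_fd = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps) for e in np.eye(n)])

# Closed form from the post: 2 A_{ilk} x^i, rows indexed by l, columns by k.
grad_cf = 2 * np.einsum('ilk,i->lk', A, x)

print(np.allclose(grad_fd, grad_cf, atol=1e-5))   # True
```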
 
  • #5
pitaly said:
Further assume that ##A## is symmetric (sometimes called supersymmetric), i.e., it is invariant under permutation of indices. For example, if the order is 3 then ##A_{ijk} = A_{ikj} = A_{jik} = A_{jki} = A_{kij} = A_{kij}##.
I've never seen the expression "supersymmetric" for this, and it seems very confusing; supersymmetry is something completely different. But maybe it's common in engineering texts.
 
  • #6
pitaly said:
Summary:: Need some clarifications on tensor calculus

For example, if the order is 3 then ##A_{ijk} = A_{ikj} = A_{jik} = A_{jki} = A_{kij} = A_{kij}##.
I think the last ##A## term should be ##A_{kji}##.
 

FAQ: Need some clarifications on tensor calculus please

What is tensor calculus?

Tensor calculus is a branch of mathematics that deals with the analysis of tensors, which are mathematical objects that describe multilinear relationships between vectors, scalars, and other tensors. It is commonly used in physics and engineering to describe the behavior of physical systems.

Why is tensor calculus important?

Tensor calculus is important because it provides a powerful mathematical framework for describing and analyzing complex physical systems. It allows scientists and engineers to model and understand the behavior of systems that involve multiple variables and dimensions.

How is tensor calculus used in science?

Tensor calculus is used in many areas of science, including physics, engineering, and computer science. It is commonly used to describe the behavior of physical systems, such as fluid dynamics, electromagnetism, and general relativity. It is also used in machine learning and data analysis to analyze and interpret large datasets.

What are some applications of tensor calculus?

Tensor calculus has a wide range of applications in various fields. In physics, it is used to model and analyze the behavior of physical systems. In engineering, it is used in fields such as structural analysis, fluid mechanics, and control theory. In computer science, it is used in machine learning, computer vision, and data analysis.

Is tensor calculus difficult to learn?

Tensor calculus can be challenging to learn, as it involves complex mathematical concepts and notation. However, with proper instruction and practice, it can be mastered. It is important to have a strong foundation in linear algebra and multivariable calculus before attempting to learn tensor calculus.
