Understanding Tensor Gradients in R3

  • #1
EsmeeDijk

Homework Statement


We have the following orthogonal tensor in R3:
[itex] t_{ij}(x^2) = a(x^2)\, x_i x_j + b(x^2)\, \delta_{ij}\, x^2 + c(x^2)\, \epsilon_{ijk}\, x_k [/itex]
Calculate the following quantities and simplify your expression as much as possible:
[itex] \nabla_j t_{ij}(x) [/itex]
and
[itex] \epsilon_{ijk} \nabla_i t_{jk}(x) = 0 [/itex]

Homework Equations


The equations given in my book are:
[itex] (\nabla f)_i = \Lambda_{ji} \frac{\partial f}{\partial \tilde{x}_j} [/itex]
[itex] \nabla_i = \Lambda_i^j \tilde{\nabla}_j [/itex]

The Attempt at a Solution


My problem is that the equations I have all assume the tensor is given as a matrix, which I don't believe is the case here. Also, in the book the derivation leading up to these equations uses a vector x expanded in terms of its components x_i and basis vectors e_i, which doesn't seem to apply here either. Only the first term, the one with a, depends on x_i or x_j, but I can't imagine that the rest of the function simply drops out.
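A quick symbolic check can show where each term goes. Below is a sketch using SymPy; the component ordering, the placeholder functions a, b, c, and the constant-coefficient sanity check are my own assumptions, not from the book. It illustrates that the symmetric a- and b-terms drop out of the ##\epsilon_{ijk}\nabla_i t_{jk}## contraction, so only the c-term can contribute there.

```python
import sympy as sp

# Coordinates; r2 plays the role of x^2 = x_k x_k
x1, x2, x3 = sp.symbols('x1 x2 x3', real=True)
X = (x1, x2, x3)
r2 = x1**2 + x2**2 + x3**2

# Unspecified scalar coefficient functions of x^2
a, b = sp.Function('a'), sp.Function('b')

def t_sym(i, j):
    # The symmetric (a and b) part of t_ij = a(x^2) x_i x_j + b(x^2) delta_ij x^2
    return a(r2)*X[i]*X[j] + b(r2)*r2*sp.KroneckerDelta(i, j)

# eps_ijk nabla_i applied to the symmetric part: the contraction of an
# antisymmetric symbol with a symmetric tensor vanishes identically.
sym_part = sum(sp.LeviCivita(i, j, k)*sp.diff(t_sym(j, k), X[i])
               for i in range(3) for j in range(3) for k in range(3))
print(sp.simplify(sym_part))  # 0: only the c-term can survive

# Divergence nabla_j t_ij with constant coefficients a = b = c = 1,
# as a sanity check of the index bookkeeping: expect 6 x_i.
t1 = lambda i, j: (X[i]*X[j] + r2*sp.KroneckerDelta(i, j)
                   + sum(sp.LeviCivita(i, j, k)*X[k] for k in range(3)))
div1 = [sp.expand(sum(sp.diff(t1(i, j), X[j]) for j in range(3)))
        for i in range(3)]
print(div1)  # [6*x1, 6*x2, 6*x3]
```

With general a, b, c the same loops give the full expressions; the constant-coefficient case just makes the cancellation pattern easy to read off.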
 
  • #2
Your notation is unfamiliar.
What do ##x,x^2## and ##x_i## represent in your formula?
Does ##t_{ij}(x^2)## represent the ##i,j## component of the tensor, in some assumed (but unstated) basis, calculated in terms of a parameter ##x^2##? Or does it represent the application of an order-2 tensor to a vector denoted by the symbol ##x^2##?
 

FAQ: Understanding Tensor Gradients in R3

What is the definition of the gradient of a tensor?

The gradient of a tensor field measures the rate of change of the field in each coordinate direction. For a scalar field it is a vector pointing in the direction of greatest increase; for a tensor field of rank n, the gradient is a tensor field of rank n + 1, since each component acquires one additional derivative index.

How is the gradient of a tensor calculated?

The gradient of a tensor is calculated by taking the partial derivative of each component of the tensor with respect to each coordinate. For a rank-2 tensor ##t_{jk}## this produces the rank-3 array ##\partial t_{jk}/\partial x_i##; contracting these partial derivatives then gives derived quantities such as the divergence ##\nabla_j t_{ij}##.
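As a concrete illustration (a generic chain-rule step, not taken from the thread's textbook): for any scalar coefficient ##f(x^2)## with ##x^2 = x_k x_k##,

```latex
\nabla_j f(x^2)
  = \frac{\partial f(x^2)}{\partial x_j}
  = f'(x^2)\,\frac{\partial (x_k x_k)}{\partial x_j}
  = 2\, x_j\, f'(x^2)
```

which is exactly the step needed for each of the coefficients a, b, c in a tensor such as ##t_{ij}## above.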

What is the physical significance of the gradient of a tensor?

The gradient of a tensor is important in physics because it describes how a physical quantity changes along different directions in space. It enters the calculation of divergences and fluxes of fields, and appears throughout fluid dynamics, electromagnetism, and continuum mechanics.

Can the gradient of a tensor be negative?

Yes, individual components of the gradient can be negative. A negative component indicates that the field is decreasing along that coordinate direction, as when temperature or pressure drops in a particular direction.

How is the gradient of a tensor represented mathematically?

The gradient operator is denoted by the symbol ∇ (nabla). For a scalar field f one writes ∇f; in index notation, the gradient of a tensor field T is written ##\nabla_i T_{jk} = \partial T_{jk}/\partial x_i##, where i is the new derivative index and j, k are the original tensor indices.
