Do Tensor Fields Commute When Multiplied?

In summary, the conversation discusses the commutativity of tensor products, specifically whether the product of two tensor fields is generally non-commutative. The discussion also touches on the difference between tensor products and matrix products, and how the latter is noncommutative in a more fundamental way. The conversation ends by highlighting the importance of accounting for non-commutativity when dealing with tensors as matrices, or with a mixture of matrices and tensors.
  • #1
Kreizhn

Homework Statement


The SR&GR guys aren't being very helpful, so maybe I can get this resolved quickly here.

I want to know if the product of two tensor fields is generally non-commutative. That is, if I have two tensor fields [itex] A_{ij}, B_k^\ell [/itex], do these representations commute?

The Attempt at a Solution



I feel quite conflicted about this subject, and I think it's because I don't fully understand what the representations mean. On one hand, I want to say that for fixed i, j these simply represent scalar elements and so certainly commute. On the other hand, taken as whole tensors (for example matrices), they would not commute: if A and B were matrices, then [itex] AB \neq BA[/itex] in general. Yet given the component representations [itex] (A)_{ij} = a_{ij}, (B)_{ij} = b_{ij} [/itex], certainly [itex] a_{ij}b_{k\ell} = b_{k\ell} a_{ij} [/itex] - the only "non-commutativity" comes in the ordering of the indices. Can anybody shed some light on this situation?
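The two readings above can be checked directly in code. Here is a minimal NumPy sketch (NumPy and the particular matrices are my own illustration, not from the thread) contrasting fixed-index components, which are plain scalars and commute, with the matrix product, which does not:

```python
import numpy as np

# Two arbitrary matrices standing in for tensor components.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# For fixed indices the components are ordinary scalars, so they commute:
assert A[0, 1] * B[1, 0] == B[1, 0] * A[0, 1]

# But the matrix (contracted) product depends on the order of the factors:
assert not np.allclose(A @ B, B @ A)
```

Both statements are true at once, which is exactly the tension in the question: commutativity of the scalar components does not transfer to the product of the arrays as whole objects.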
 
  • #2
If [tex]A[/tex] and [tex]B[/tex] are two [tex](0, 2)[/tex] tensors with components [tex]a_{ij}[/tex] and [tex]b_{ij}[/tex] respectively, then the [tex](0, 4)[/tex] tensor with components [tex]c_{ijkl} = a_{ij} b_{kl}[/tex] is the tensor product [tex]A \otimes B[/tex]. You are correct to observe that this tensor differs from [tex]B \otimes A[/tex], which has components [tex]c'_{ijkl} = b_{ij} a_{kl}[/tex], only in the ordering of the indices -- but since the order of the indices for tensors is very important, you can't think of [tex]\otimes[/tex] as a commutative product.
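A quick way to see this is to build both tensor products explicitly. In this NumPy sketch (the dimension 3 and the random entries are made up for illustration), [tex]A \otimes B[/tex] and [tex]B \otimes A[/tex] hold the same numbers, just with the index slots permuted:

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((3, 3))  # components a_ij of a (0,2) tensor A
b = rng.standard_normal((3, 3))  # components b_kl of a (0,2) tensor B

c = np.einsum('ij,kl->ijkl', a, b)   # (A ⊗ B)_{ijkl} = a_ij b_kl
cp = np.einsum('ij,kl->ijkl', b, a)  # (B ⊗ A)_{ijkl} = b_ij a_kl

# As (0,4) tensors the two products are not equal...
assert not np.allclose(c, cp)

# ...but permuting the index slots of B ⊗ A recovers A ⊗ B exactly:
assert np.allclose(c, np.transpose(cp, (2, 3, 0, 1)))
```

The transpose maps cp[k, l, i, j] to position (i, j, k, l), which is precisely the "differs only in the ordering of the indices" statement above.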

When you mention matrices, you are talking about a different product, which is noncommutative in a more fundamental way. In their role as linear transformations, you can think of matrices as [tex](1, 1)[/tex] tensors. If [tex]A[/tex] and [tex]B[/tex] are matrices expressed this way, with components [tex]a^i_j[/tex] and [tex]b^i_j[/tex], then the matrix product [tex]AB[/tex] (composition of linear transformations) has components [tex]c^i_j = \sum_k a^i_k b^k_j[/tex], or simply [tex]a^i_k b^k_j[/tex] in the Einstein summation convention. In tensor language, the matrix product (composition) is actually reflected as a contraction (which is also how the product of two [tex](1, 1)[/tex] tensors can be another [tex](1, 1)[/tex] tensor and not a [tex](2, 2)[/tex] tensor, as the tensor product would be).
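This contraction can be verified numerically. A NumPy sketch (array sizes chosen arbitrarily): forming the full [tex](2, 2)[/tex] tensor product of two [tex](1, 1)[/tex] tensors and then contracting the inner pair of indices reproduces the ordinary matrix product:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))  # components a^i_j of a (1,1) tensor
B = rng.standard_normal((4, 4))  # components b^k_l of a (1,1) tensor

# The tensor product is a (2,2)-type object with components a^i_j b^k_l.
full = np.einsum('ij,kl->ijkl', A, B)

# Contracting j with k gives c^i_l = a^i_k b^k_l -- the matrix product.
contracted = np.einsum('ijjl->il', full)

assert np.allclose(contracted, A @ B)
# Equivalently, the contraction can be done in one einsum call:
assert np.allclose(np.einsum('ik,kj->ij', A, B), A @ B)
```

So the noncommutativity of matrix multiplication lives in the contraction step, not in the scalar components themselves.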
 
  • #3
ystael said:
If [tex]A[/tex] and [tex]B[/tex] are two [tex](0, 2)[/tex] tensors with components [tex]a_{ij}[/tex] and [tex]b_{ij}[/tex] respectively, then the [tex](0, 4)[/tex] tensor with components [tex]c_{ijkl} = a_{ij} b_{kl}[/tex] is the tensor product [tex]A \otimes B[/tex]. [...] In tensor language, the matrix product (composition) is actually reflected as a contraction [...]

Some prefer to treat (1,1) tensors as matrices, while others reserve the matrix picture for (0,2) and (2,0) tensors, which of course sounds reasonable. The reason is that the 4-by-4 matrices representing mixed tensors (on a 4d spacetime) are not widely recognized as such among physicists; in their language you often find statements like "the metric tensor is a second-rank square matrix," and if that is the case, then identifying mixed tensors with matrices of the same kind does seem odd. Besides, if we represent [tex]v^i[/tex] (i=0,...,3) as a 1-by-4 matrix (i.e. a row vector) and a mixed tensor as a 4-by-4 matrix, then from the transformation formula

[tex]v^i=\frac{\partial x^i}{\partial \bar{x}^j}\bar{v}^j[/tex]

one would face the product [tex](4\times 4)(1\times 4)[/tex], which is not even defined, whereas if the transformation formula is written as

[tex]v^i=\bar{v}^j\frac{\partial x^i}{\partial \bar{x}^j}[/tex],

everything works out, since [tex](1\times 4)(4\times 4) = (1\times 4)[/tex]. The same situation arises when one wants to lower an upper index, or vice versa, using the metric matrix [tex]g_{ij}[/tex], i.e.

[tex]v_i=g_{ij}v^j[/tex],

then, taking the preceding path, one again runs into the non-conforming product [tex](4\times 4)(1\times 4)[/tex] -- in effect assigning a 4-by-4 matrix to the vector [tex]v_i[/tex]!
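A shape check makes this conformability argument concrete. In the NumPy sketch below (the 4-by-4 array is just a made-up stand-in for the Jacobian [tex]\partial x^i/\partial \bar{x}^j[/tex]), the row-vector-on-the-left ordering conforms, while the opposite ordering is undefined:

```python
import numpy as np

J = np.arange(16.0).reshape(4, 4)          # stand-in for the 4x4 Jacobian
v_bar = np.array([[1.0, 2.0, 3.0, 4.0]])   # row vector, shape (1, 4)

# Row vector on the left conforms: (1x4)(4x4) -> (1x4)
v = v_bar @ J
assert v.shape == (1, 4)

# The opposite order, (4x4)(1x4), is not defined as a matrix product:
try:
    J @ v_bar
except ValueError:
    print("J @ v_bar does not conform: shapes (4,4) and (1,4)")
```

Representing [tex]v^i[/tex] as a 4-by-1 column vector instead would make the usual [tex]Jv[/tex] ordering the conforming one; the point is only that the matrix picture forces an ordering choice that the index notation never needed.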

So, to answer the OP's question simply about why non-commutativity does not show up in the componential representation of matrices/tensors: you can freely swap two numbers under the usual operation of multiplication, but you can't do the same to an arrangement of numbers governed by a different law of multiplication which is, deep down, not commutative. So when we deal with (rank-2) tensors as matrices, or with a mixture of matrices and tensors, like [tex]Ag_{ab}C[/tex] where A and C are 4-by-4 matrices and [tex]g_{ab}[/tex] is the second-rank metric tensor, then non-commutativity is to be strongly considered in our calculations.
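For instance, a small numeric check of an expression like [tex]Ag_{ab}C[/tex], treating all three factors as plain 4-by-4 matrices (with made-up random entries, and the Minkowski metric as a stand-in for [tex]g_{ab}[/tex]), shows that the order of the factors matters:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
C = rng.standard_normal((4, 4))
g = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric as a 4x4 matrix

# In a mixed matrix/tensor expression, reordering the factors
# changes the result:
assert not np.allclose(A @ g @ C, C @ g @ A)
```

The individual entries of A, g, and C are scalars and commute freely; the noncommutativity enters only through the contractions implied by writing the factors side by side.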

And I assume you know that for second-rank tensors, the tensor rank and the matrix rank are the same. So these things won't work if we are given something like a (0,3) tensor.

AB
 

FAQ: Do Tensor Fields Commute When Multiplied?

What is Tensor Field Commutativity?

Tensor Field Commutativity refers to the property of a tensor field, which is a mathematical object that assigns a tensor to each point in a space, to have its operations commute with each other. In simpler terms, this means that the order in which we perform operations on the tensor field does not affect the final result.

Why is Tensor Field Commutativity important?

Tensor Field Commutativity is important in many areas of science and engineering, particularly in fields such as physics, computer graphics, and data analysis. It allows for simplifications and transformations of complex tensor fields, making it easier to analyze and understand the underlying data.

How is Tensor Field Commutativity related to linear algebra?

Tensor Field Commutativity is closely related to the concept of bilinearity in linear algebra. The tensor product satisfies additivity and homogeneity in each factor separately, but bilinearity does not imply commutativity: the matrix product, which arises as a contraction of the tensor product of two (1,1) tensors, is a standard example of a bilinear yet noncommutative operation.

Can Tensor Field Commutativity be violated?

Yes — in fact it fails in general. The tensor product A ⊗ B differs from B ⊗ A in the ordering of its index slots, and the matrix product of two (1,1) tensors is noncommutative in a more fundamental way. Only for fixed indices, where the components are ordinary scalars, does the order of multiplication not affect the result.

How is Tensor Field Commutativity used in practical applications?

Tensor Field Commutativity has many practical applications, such as in image processing, signal analysis, and machine learning. It allows for efficient and accurate calculations of complex data sets, making it an essential tool in various scientific and engineering fields.
