How Does Tensor Differentiation Simplify in Multiferroics Homework?

In summary: For any ##k##, we have $$\frac \partial {\partial x_k}(\sum_j b_{ij}x_j) = b_{ik}$$ The next trick is to be able to do that calculation using the compact notation.
  • #1
Nickpga
Summary: Help explaining notation with derivatives.

Mentor note: Thread moved from technical section, so no homework template is included
Sorry. I did not realize there was a dedicated homework problem section. Should I leave this post here?

Basically, it's the following (homework) problem. I haven't dealt with tensors before; well, not explicitly with tensors, I suppose.

Given that ##b_{ij}## are constants, show that
$$(b_{ij}x_j)_{,k} = b_{ik}$$

What I know is that I will take a partial derivative:
$$b_{ij}\frac{\partial x_j}{\partial x^k} = b_{ik}$$
How does the derivative simplify to the right-hand side?

For a little more context: I am taking an intro to multiferroics course (not my choice; my university requires it), and I am posting it here because Google search results about tensors led me to questions posted in this sub-forum. Thanks for your time.
 
  • #2
You wrote
[tex](b_{ij}x_j)_{,k}=b_{ik}[/tex]
But considering the balance of the dummy index ##j##, it should be
[tex](b_{ij}x^j)_{,k}=b_{ik}[/tex]
Which is the correct equation in the problem?
 
  • #3
The first one. Thanks.
 
  • #4
Nickpga said:
The first one. Thanks.
What you want to calculate is: $$\frac \partial {\partial x_k}(\sum_j b_{ij}x_j)$$ We might as well be explicit about this for the sake of clarity.
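In other words, spelling out the compact notation (a brief aside, in case the shorthand itself is the sticking point): the comma denotes a partial derivative and the repeated index ##j## an implicit sum, so $$(b_{ij}x_j)_{,k} \equiv \frac \partial {\partial x_k}\Big(\sum_j b_{ij}x_j\Big)$$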

Does it look clearer what to do now?
 
  • #5
Not really. Do you have any textbook recommendations? I have a lot of catching up to do.
 
  • #6
Nickpga said:
Not really. Do you have any textbook recommendations? I have a lot of catching up to do.
Have you ever taken a calculus course?
 
  • #7
Yes. I have taken lots of math all the way to multi-variable calculus. I am just terrible at it.
I am taking my last two courses to complete my BSEE...
 
  • #8
Nickpga said:
Yes. I have taken lots of math all the way to multi-variable calculus. I am just terrible at it.
So, you have two issues:

1) The basic calculus of (partial) differentiation.

2) Getting accustomed to the hyper-concise (my term) convention used in your course, with the summation convention and derivatives denoted by commas.

What if we simplify the problem further and take ##k = 1##: $$\frac \partial {\partial x_1}(\sum_j b_{ij}x_j)$$
 
  • #9
OK. I see now that you just rewrote the left-hand side.

Oh, OK: so for ##k = j## (taking the partial), you get 1, but when ##k \neq j## you get zero. Orthogonal?
So ##j## is, in effect, replaced by ##k##.

That explains why ##x_j## goes away (its derivative equals one), and then ##k## takes its place, because the terms with ##j \neq k## went to zero.

Does that sound correct?
 
  • #10
Then
Nickpga said:
$$b_{ij}\frac{\partial x_j}{\partial x^k} = b_{ik}$$
Do you mean
[tex]b_{ij}\frac{\partial x_j}{\partial x^k}[/tex]
with the index ##k## the only one written upstairs?
 
  • #11
Well, I do not know what the difference is between up and down. I am basically learning tensors (specifically) for the first time. It is weird how everyone in the course already knew them really well.
 
  • #12
You introduced ##x^k## yourself. I am just asking how you or the problem writes the indices, up or down. All downstairs? OK, that makes sense. Some up and some down? That is also OK if they follow the balance rule. So which is it?
 
  • #13
OK. I figure it's all down. Nothing in the homework is actually up.
 
  • #14
Nickpga said:
OK. I see now that you just rewrote the left-hand side.

Oh, OK: so for ##k = j## (taking the partial), you get 1, but when ##k \neq j## you get zero. Orthogonal?
So ##j## is, in effect, replaced by ##k##.

That explains why ##x_j## goes away (its derivative equals one), and then ##k## takes its place, because the terms with ##j \neq k## went to zero.

Does that sound correct?
Sort of. It's really when ##j = k##, as ##j## is the index that is varying and being summed over. If we assume we have three dimensions, then we can expand the whole thing:
$$\frac \partial {\partial x_1}(\sum_j b_{ij}x_j) = \frac \partial {\partial x_1}(b_{i1}x_1 +b_{i2}x_2+b_{i3}x_3) = b_{i1}$$ For the reasons you gave: ##x_1, x_2, x_3## are assumed to be independent variables.

Although we did this for ##k = 1##, we can see that for any ##k## we have:
$$\frac \partial {\partial x_k}(\sum_j b_{ij}x_j) = b_{ik}$$

The next trick is to be able to do that calculation using the compact notation ...
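If it helps, here is a quick numerical check of that identity (a sketch in Python/numpy; not part of the original problem, and the names are mine): the Jacobian of the linear map ##f_i(x) = \sum_j b_{ij}x_j## is the constant matrix ##b## itself.

[code]
# Numerical sanity check (sketch): for f_i(x) = sum_j b_ij x_j,
# the partial derivative (b_ij x_j)_,k should equal b_ik.
import numpy as np

rng = np.random.default_rng(0)
b = rng.standard_normal((3, 3))   # constant coefficients b_ij
x = rng.standard_normal(3)        # an arbitrary point x_j

def f(x):
    # f_i = b_ij x_j (matrix-vector product performs the sum over j)
    return b @ x

# Approximate the Jacobian df_i/dx_k with central finite differences.
h = 1e-6
jac = np.empty((3, 3))
for k in range(3):
    e = np.zeros(3)
    e[k] = h
    jac[:, k] = (f(x + e) - f(x - e)) / (2 * h)

# The Jacobian of a linear map is the matrix itself: (b_ij x_j)_,k = b_ik.
print("max |jac - b| =", np.abs(jac - b).max())   # ~1e-10, i.e. zero up to rounding
[/code]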
 
  • #15
anuttarasammyak said:
You introduced ##x^k## yourself. I am just asking how you or the problem writes the indices, up or down. All downstairs? OK, that makes sense. Some up and some down? That is also OK if they follow the balance rule. So which is it?
Not all tensor analysis uses upstairs indices.
 
  • #16
PeroK said:
Sort of. It's really when ##j = k##, as ##j## is the index that is varying and being summed over. If we assume we have three dimensions, then we can expand the whole thing:
$$\frac \partial {\partial x_1}(\sum_j b_{ij}x_j) = \frac \partial {\partial x_1}(b_{i1}x_1 +b_{i2}x_2+b_{i3}x_3) = b_{i1}$$ For the reasons you gave: ##x_1, x_2, x_3## are assumed to be independent variables.

Although we did this for ##k = 1##, we can see that for any ##k## we have:
$$\frac \partial {\partial x_k}(\sum_j b_{ij}x_j) = b_{ik}$$

The next trick is to be able to do that calculation using the compact notation ...
Alright. I suppose I could just write it like this and then "simplify" it into the hyper-concise form my course uses. Wouldn't this be a much better approach than doing it the "tricky" way?
 
  • #17
I see. So
[tex](b_{ij}x_j)_{,k}=b_{ij}x_{j,k}= b_{ij}\delta_{jk}=b_{ik}[/tex]
with the convention that repeated indices are summed over. ##\delta_{jk}## is the Kronecker delta: 1 for ##j = k##, 0 for ##j \neq k##.
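As a small numerical illustration of that last contraction (a sketch; not from the problem, and the names are mine), summing ##b_{ij}\delta_{jk}## over ##j## is just multiplying by the identity matrix:

[code]
# Contracting b_ij with the Kronecker delta over the repeated index j
# reproduces b with j relabelled to k.
import numpy as np

b = np.arange(9.0).reshape(3, 3)          # any constant b_ij
delta = np.eye(3)                         # Kronecker delta: 1 if j == k else 0
b_ik = np.einsum('ij,jk->ik', b, delta)   # sum over the repeated index j
assert np.allclose(b_ik, b)               # contraction with delta only relabels j -> k
[/code]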
 
  • #18
anuttarasammyak said:
I see. So
[tex](b_{ij}x_j)_{,k}=b_{ij}x_{j,k}= b_{ij}\delta_{jk}=b_{ik}[/tex]
Alright, that makes sense, though only because the previous problem dealt with the Kronecker delta as well. Thanks!
 
  • #19
Nickpga said:
Alright. I suppose I could just write it like this and then "simplify" it into the hyper-concise form my course uses. Wouldn't this be a much better approach than doing it the "tricky" way?
Personally, I think it makes your course tougher, as the human brain takes time to get used to new notation like this, i.e. omitting the summation and partial derivative symbols.

Half the battle, perhaps, is to rationalise your difficulties into these two categories: do you understand mathematically what you are doing, and can you interpret and work with the new notation?
 
  • #20
PeroK said:
Personally, I think it makes your course tougher, as the human brain takes time to get used to new notation like this, i.e. omitting the summation and partial derivative symbols.

Half the battle, perhaps, is to rationalise your difficulties into these two categories: do you understand mathematically what you are doing, and can you interpret and work with the new notation?
Thanks for your time and help!
I will probably have to work in regular notation and then see how to condense it into the new notation.
 
  • #21
Just some remarks.
1) ##b_{ij}x_j## is not a tensor; at least, this expression does not keep its shape under general changes of variables.
2) The operation ##\partial/\partial x_i## takes tensors to non-tensors.
3) If only linear changes ##x_i = c_{ij}x'_j## are considered, then everything is OK (see the sketch below).
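To make remark 3 concrete (a brief sketch, assuming the linear change ##x_j = c_{jl}x'_l## with constant coefficients): by the chain rule,
$$\frac{\partial}{\partial x'_k}\left(b_{ij}x_j\right) = b_{ij}\frac{\partial x_j}{\partial x'_k} = b_{ij}c_{jk}$$
which is again a constant array, so the same calculation goes through in the primed coordinates.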
 
  • #22
Nickpga said:
Not really. Do you have any textbook recommendations? I have a lot of catching up to do.
Try Schaum's Tensor Calculus for an overview.
 

FAQ: How Does Tensor Differentiation Simplify in Multiferroics Homework?

What is tensor differentiation?

Tensor differentiation is a mathematical concept used in differential geometry to calculate the rate of change of a tensor field with respect to a given set of coordinates. It extends traditional calculus, which deals with functions of real numbers, to tensor fields.

What are tensors?

Tensors are mathematical objects whose components transform in a definite way under changes of coordinate system. They can be represented as multidimensional arrays of components, with scalars, vectors, and matrices as the simplest special cases. Tensors are commonly used in physics, engineering, and data analysis.

How is tensor differentiation different from traditional differentiation?

Traditional differentiation deals with functions of real numbers, while tensor differentiation deals with tensor fields. The variables in tensor differentiation are multidimensional and can have multiple components, whereas in traditional differentiation the variables are one-dimensional. In curved spaces or curvilinear coordinates, tensor differentiation must also account for how the basis vectors themselves change from point to point, which ordinary partial differentiation ignores.

What are some applications of tensor differentiation?

Tensor differentiation has a wide range of applications in fields such as physics, engineering, and machine learning. It is used to study the curvature of space-time in general relativity, to model the behavior of materials under stress in engineering, and to optimize neural networks in machine learning.

What are some common techniques used in tensor differentiation?

Some common techniques used in tensor differentiation include the chain rule, product rule, and quotient rule. Others include the use of index notation, which simplifies the calculation of derivatives of tensors, and the use of the metric tensor to raise and lower indices and to relate components in different coordinate systems.
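For example, index notation maps directly onto numerical contractions; a minimal sketch in Python/numpy (the names here are illustrative only):

[code]
# y_i = a_ij v_j in index notation: the repeated index j is summed over.
import numpy as np

a = np.random.rand(3, 3)
v = np.random.rand(3)
y = np.einsum('ij,j->i', a, v)   # explicit index-notation contraction
assert np.allclose(y, a @ v)     # same as the ordinary matrix-vector product
[/code]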
