Why is ##\partial_\sigma(\delta x^\sigma)## not equal to zero?

In summary, the conversation concerns the derivation of Noether's theorem for fields and the confusion surrounding the equation ##J = 1 + \partial_\sigma(\delta x^\sigma)## for the Jacobian. The discussion covers the expansion of the determinant via the Levi-Civita symbol and clarifies how ##\delta x^\sigma## behaves under differentiation. A resource for studying Noether's theorem for fields is also recommended.
  • #1
GR191511
I'm studying Noether's theorem. In the derivation I came across the equation ##J = 1 + \partial_\sigma(\delta x^\sigma)##, where ##J## is the Jacobian. Since ##\delta## has the property of commuting with differentiation, why is ##\partial_\sigma(\delta x^\sigma)## not equal to ##\delta(\partial_\sigma x^\sigma) = \delta(1+1+1+1) = \delta(4) = 0##?
 
  • #3
vanhees71 said:
Which book are you studying from? I cannot read your equations. Is it about Noether's theorem for fields? Then maybe Sec. 3.1 in

https://itp.uni-frankfurt.de/~hees/pf-faq/srt.pdf

is of some use.
https://physics.stackexchange.com/questions/534699/noethers-theorem-derivation-for-fields
The first answer there says the Jacobian is ##J = 1 + \partial_\sigma(\delta x^\sigma)##. But I think ##1 + \partial_\sigma(\delta x^\sigma) = 1 + \delta(\partial_\sigma x^\sigma) = 1 + \delta(1+1+1+1) = 1 + \delta(4) = 1 + 0 = 1##... I don't know what I did wrong. Thank you.
 
  • #4
Could you please use LaTeX? I really can't read your formulae :-(. The derivation of Noether's theorem for fields, given in the stackexchange article is also in my above quoted FAQ article.

To get the determinant of a matrix ##\hat{A}=\hat{1} + \delta \hat{\omega}## to first order in ##\delta## just use the definition of the determinant using the Levi-Civita symbol (Einstein summation convention applies)
$$\mathrm{det} \hat{A} = \epsilon_{j_1 j_2 \cdots j_n} A_{1j_1} A_{2 j_2} \cdots A_{n j_n}.$$
It's also clear that all products occurring in this sum are of order ##\mathcal{O}(\delta^2)## or higher except the product of the diagonal elements, i.e., (the summation convention doesn't apply in the next formula)
$$\mathrm{det} \hat{A} =\prod_{j} A_{jj} + \mathcal{O}(\delta^2) = 1 + \sum_{j} \delta \omega_{jj} + \mathcal{O}(\delta^2) = 1 + \mathrm{Tr} \delta \hat{\omega} + \mathcal{O}(\delta^2).$$
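Applied to the case at hand, this gives the formula in question. For the infinitesimal coordinate transformation ##x'^\mu = x^\mu + \delta x^\mu(x)## the Jacobian matrix has entries ##A^{\mu}{}_{\nu} = \partial x'^{\mu}/\partial x^{\nu} = \delta^{\mu}_{\nu} + \partial_{\nu}(\delta x^{\mu})##, so the trace formula above yields (spelled out here as a sketch; the posts leave this step implicit)
$$J = \det\left(\delta^{\mu}_{\nu} + \partial_{\nu}(\delta x^{\mu})\right) = 1 + \partial_{\sigma}(\delta x^{\sigma}) + \mathcal{O}(\delta^2).$$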
 
  • #5
vanhees71 said:
Could you please use LaTeX? I really can't read your formulae :-(. The derivation of Noether's theorem for fields, given in the stackexchange article is also in my above quoted FAQ article.

To get the determinant of a matrix ##\hat{A}=\hat{1} + \delta \hat{\omega}## to first order in ##\delta## just use the definition of the determinant using the Levi-Civita symbol (Einstein summation convention applies)
$$\mathrm{det} \hat{A} = \epsilon_{j_1 j_2 \cdots j_n} A_{1j_1} A_{2 j_2} \cdots A_{n j_n}.$$
It's also clear that all products occurring in this sum are of order ##\mathcal{O}(\delta^2)## or higher except the product of the diagonal elements, i.e., (the summation convention doesn't apply in the next formula)
$$\mathrm{det} \hat{A} =\prod_{j} A_{jj} + \mathcal{O}(\delta^2) = 1 + \sum_{j} \delta \omega_{jj} + \mathcal{O}(\delta^2) = 1 + \mathrm{Tr} \delta \hat{\omega} + \mathcal{O}(\delta^2).$$
Thank you very much! I saw ##J = 1 + \partial_\sigma(\delta x^\sigma)## in that thread (the first answer). But ##\delta## has the property of commuting with differentiation, so I think it should continue as ##1 + \delta(\partial_\sigma x^\sigma) = 1 + \delta(1+1+1+1) = 1 + \delta(4) = 1 + 0 = 1##... What did I do wrong?
 
  • #6
No, ##\delta x^{\sigma}## is a given function of ##x##, defining the transformation of the space-time coordinates, which is part of a symmetry transformation (e.g., Poincaré transformations) of a field theory. The rule that ##\delta## commutes with ##\partial_\sigma## applies to variations of fields at a fixed argument, not to the coordinate shift ##\delta x^\sigma## itself, whose divergence is generally nonzero. See Sect. 3.1 in

https://itp.uni-frankfurt.de/~hees/pf-faq/srt.pdf
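
A concrete check (an illustrative example, not from the thread): for an infinitesimal dilatation ##\delta x^\sigma = \epsilon\, x^\sigma## in four dimensions,
$$\partial_\sigma(\delta x^\sigma) = \epsilon\, \partial_\sigma x^\sigma = 4\epsilon \neq 0,$$
whereas for a constant translation ##\delta x^\sigma = \epsilon^\sigma## the divergence does vanish.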
 
  • #7
vanhees71 said:
No, ##\delta x^{\sigma}## is a given function of ##x##, defining the transformation of the space-time coordinates, which is part of a symmetry transformation (e.g., Poincaré transformations) of a field theory. See Sect. 3.1 in

https://itp.uni-frankfurt.de/~hees/pf-faq/srt.pdf
I get it! I appreciate your help very much.
 

FAQ: Why is ##\partial_\sigma(\delta x^\sigma)## not equal to zero?

Why is ##\partial_\sigma(\delta x^\sigma)## not equal to zero?

Because ##\delta x^\sigma## is a prescribed function of ##x## describing the infinitesimal coordinate transformation ##x'^\sigma = x^\sigma + \delta x^\sigma(x)##, not a variation applied to the coordinate ##x^\sigma##. The identity ##\delta(\partial_\sigma f) = \partial_\sigma(\delta f)## holds for variations of fields at a fixed argument; ##\partial_\sigma(\delta x^\sigma)## is simply the divergence of the displacement field ##\delta x^\sigma(x)##, which is in general nonzero.

What does ##\partial_\sigma(\delta x^\sigma)## represent?

It is the divergence of the infinitesimal displacement field ##\delta x^\sigma(x)##. To first order it measures how the four-volume element changes locally under the transformation, which is exactly why the Jacobian takes the form ##J = 1 + \partial_\sigma(\delta x^\sigma)##.
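In formula form (restating the relation discussed in the thread):
$$\mathrm{d}^4x' = J\,\mathrm{d}^4x = \left(1 + \partial_\sigma(\delta x^\sigma) + \mathcal{O}(\delta^2)\right)\mathrm{d}^4x.$$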

Can ##\partial_\sigma(\delta x^\sigma)## ever be equal to zero?

Yes, for volume-preserving transformations. Constant translations ##\delta x^\sigma = \epsilon^\sigma## have vanishing divergence, and so do Lorentz transformations, for which ##\delta x^\sigma = \delta\omega^{\sigma}{}_{\nu}\, x^\nu## with ##\delta\omega_{\sigma\nu}## antisymmetric.
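The Lorentz case can be checked directly (a standard computation, included here for completeness):
$$\partial_\sigma(\delta x^\sigma) = \partial_\sigma\left(\delta\omega^{\sigma}{}_{\nu}\, x^{\nu}\right) = \delta\omega^{\sigma}{}_{\nu}\,\delta^{\nu}_{\sigma} = \delta\omega^{\sigma}{}_{\sigma} = \eta^{\sigma\nu}\,\delta\omega_{\nu\sigma} = 0,$$
since a symmetric tensor contracted with an antisymmetric one vanishes.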

How is ##\partial_\sigma(\delta x^\sigma)## related to the concept of a limit?

The relation ##J = 1 + \partial_\sigma(\delta x^\sigma)## is a first-order Taylor expansion of the Jacobian determinant in the infinitesimal parameter: terms of order ##\mathcal{O}(\delta^2)## are dropped, which becomes exact in the limit of an infinitesimal transformation.

Is ##\partial_\sigma(\delta x^\sigma)## the same as the derivative of a function?

Not quite. It is not a single partial derivative but a divergence: each component ##\delta x^\sigma## is differentiated with respect to the matching coordinate ##x^\sigma##, and the repeated index ##\sigma## is summed over by the Einstein convention.
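Written out without the summation convention (just to make the notation explicit):
$$\partial_\sigma(\delta x^\sigma) = \frac{\partial(\delta x^0)}{\partial x^0} + \frac{\partial(\delta x^1)}{\partial x^1} + \frac{\partial(\delta x^2)}{\partial x^2} + \frac{\partial(\delta x^3)}{\partial x^3}.$$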
