Is the concept of entropy addressed in the Lagrangian formalism?

In summary, the Lagrangian formalism as introduced here describes an idealized mechanical world with no dissipation of mechanical energy into thermal energy. It does not explicitly include entropy, because entropy is a statistical quantity: it measures how many possible microscopic configurations of a system are compatible with the same macroscopically observable properties. The equations of motion of Lagrangian, Hamiltonian, and Newtonian mechanics follow a single microscopic state and its time evolution, whereas entropy refers to the whole ensemble of microscopic states compatible with a given macroscopic state. For a single, fully specified state, the entropy is therefore always zero.
  • #1
Jimster41
I'm just working my way through Susskind's "Theoretical Minimum". The Lagrangian formalism is novel territory for me, so this may be a dumb question. It's kind of multiple choice, or fill in a real answer:

Why is there no term for the Entropy of a system in the Lagrangian?
Is it because time is an independent background variable?
Or is Entropy included in potential energy?
Or is it because the Lagrangian is approximate (like Newton) or something?
Is it implied by "the principle of least action", where L is minimized in the action?
Or does it get added later (just keep reading)?
Or am I missing it altogether?
 
  • #2
Because in the formulation involved, there is no dissipation of mechanical energy (to thermal energy).
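To make that concrete, here is a minimal numerical sketch (my own illustration, not from the thread), assuming a unit-mass harmonic oscillator: the Euler-Lagrange dynamics of L = T - V conserve the mechanical energy, and dissipation only appears if a velocity-dependent drag force is added by hand, outside the standard Lagrangian.

```python
# Unit-mass harmonic oscillator with L = T - V = 0.5*v**2 - 0.5*k*x**2.
# The Euler-Lagrange equation gives a = -k*x; a linear drag term -c*v is
# NOT derivable from this L and is added by hand to model dissipation.
def final_energy(k=1.0, c=0.0, x=1.0, v=0.0, dt=1e-3, steps=20000):
    for _ in range(steps):
        a = -k * x - c * v              # acceleration (drag added outside the Lagrangian)
        v += a * dt                     # semi-implicit Euler step
        x += v * dt
    return 0.5 * v**2 + 0.5 * k * x**2  # mechanical energy T + V at the end

print("no drag  :", final_energy(c=0.0))  # stays close to the initial value 0.5
print("with drag:", final_energy(c=0.5))  # decays toward zero (energy dissipated)
```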

Chet
 
  • #3
Okay, thanks. I found this in trying to understand your use of terms. Hadn't seen it laid out so definitively across the "Types of Energy".

http://www.energyeducation.tx.gov/energy/section_1/topics/forms_of_energy/index.html

Thermal Energy: The Energy of Moving or Vibrating Molecules
Mechanical Energy: The Energy of a substance or system because of its motion

But, uh... Isn't every "thing" that is moving made of "moving molecules"?

So if I had to pick an answer from my list: would I be correct in saying that the Lagrangian (at least as Susskind introduces it in Lecture 6, which is about mechanics) is an approximation of "how things really work", restricted to an ideal mechanical world, and that at the end of the day any "complete" Lagrangian has to account for entropy's non-conservative effect (which is the only non-conserved... thing?) - in other words, in a real system there is always some dissipation of mechanical energy to thermal energy, and therefore entropy.

I'm open to just holding onto the fact that this seems a bit confusing because it's decomposing reality step-wise. I just hate that feeling that I'm missing something other people find totally clear.
 
  • #4
I think you've pretty much captured the essence of it. As far as applying the Lagrangian formalism to a macroscopic system, my background is limited, so I don't know whether inclusion of dissipative effects is easily done.

Chet
 
  • #5
Thank you sir.

Later: Digging around some more, trying to get a sense of how meaningful or meaningless this question might be, and whether there might be a pool of answers waiting down some strenuous path of complicated stuff I can at least hope to reach someday, I did find this.

http://en.wikipedia.org/wiki/GENERIC_formalism
 
  • #6
I could be wrong, but isn't entropy a statistical artifact that is a function of the possible configurations your system could be in while having the same macroscopically observable properties?

If so, there's no real reason for a Lagrangian describing the specific microscopic dynamics to explicitly include the entropy.
 
  • #7
jfizzix said:
I could be wrong, but isn't entropy a statistical artifact that is a function of the possible configurations your system could be in while having the same macroscopically observable properties?

If so, there's no real reason for a Lagrangian describing the specific microscopic dynamics to explicitly include the entropy.

Do you mean "macroscopic dynamics" in the second paragraph?

If so, I think I get you, and that's consistent with where I got to, and I hadn't thought of it that way.

What was bugging me is that, as I understand it, no isolated system's trajectory through phase space can be perfectly reversible over time (no physically real one, anyway). Configuration space has a probabilistic slope, with an energy equivalence, so there has to be some "entropic" energy penalty tacked onto at least one of the Lagrangian steps in that trajectory, eventually.

I'd be lying if I said this was all totally clear (which is why I was asking for clarification). I'm just trying to make sure I'm not too confused to make it to the next chapter.
 
  • #8
The equations of motion of Lagrangian, Hamiltonian and Newtonian mechanics describe the actual dynamics of a system. Systems of our everyday experience have a very large number of degrees of freedom (a typical order of magnitude is Avogadro's number, [itex]6.022 \times 10^{23}[/itex], the number of particles in a mole). This makes it impossible to use the equations of motion directly in general; doing so is only possible in special cases. For rigid bodies, for example, it is sufficient to consider 6 degrees of freedom because all the distances between the constituents are approximated as fixed. Furthermore, you can often approximate the motion of a body by the motion of its center of mass.

But for gases, for example, such approximations can't be made and the equations of motion can't be used directly. People like Boltzmann and Gibbs used statistical reasoning instead which led to statistical mechanics and the derivation of thermodynamics from fundamental mechanics.

For macroscopic observables like temperature, many possible microscopic states correspond to the same value. Such observables are not related to the actual microscopic state of the system but to the ensemble of possible microscopic states compatible with a given macroscopic state. Entropy, for example, is a measure of how many such microscopic states are compatible with the macroscopic state. This is expressed in Boltzmann's formula [itex]S = k_B \ln W[/itex], where [itex]W[/itex] is the number of compatible microstates.
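As a toy illustration (a sketch of my own, assuming a system of N independent two-state spins), Boltzmann's formula amounts to simply counting the microstates compatible with a macrostate:

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

# N independent two-state spins; take the macrostate to be the number n of "up" spins.
# The multiplicity W(n) = C(N, n) counts the microstates compatible with that macrostate,
# and Boltzmann's formula gives S = k_B * ln(W).
N = 100
for n_up in (0, 10, 50):
    W = comb(N, n_up)
    S = k_B * log(W)
    print(f"n_up={n_up:3d}  W={W:.3e}  S={S:.3e} J/K")

# A single, fully specified microstate has W = 1, hence S = k_B * ln(1) = 0:
# following one exact trajectory, the entropy stays zero, as described above.
```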

If we use the equations of motion directly, we are only looking at a single microscopic state and its time evolution. So the entropy is simply zero all the time.
 
  • #9
I see; jfizzix was saying it correctly. I believe I understand. I wouldn't have gotten confused if Susskind had qualified the introduction in that way, but that would have been awkward, since he hasn't even introduced entropy yet. Part of the problem is that I'm coming at it with some (a teeny tiny bit of) knowledge and trying to get things back into the correct order...

Very helpful guys, thanks.
 
  • #10
Yes, I think it is a very good idea to give an overview of the general ideas first, instead of simply building things bottom-up. I know all too well the feeling of not knowing how to relate new things to things I already know.
 

FAQ: Is the concept of entropy addressed in the Lagrangian formalism?

What is entropy and how is it related to the Lagrangian?

Entropy is a measure of the disorder or randomness in a system; more precisely, it measures how many microscopic states are compatible with a given macroscopic state. The Lagrangian, by contrast, is a function of the kinetic and potential energies that describes the dynamics of one specific microscopic state. The standard Lagrangian therefore contains no entropy term: entropy only appears once you reason statistically over the ensemble of possible microstates.

How does entropy impact the behavior of a system?

Entropy indicates the direction in which a system naturally evolves. In an isolated system the entropy never decreases and tends to increase toward equilibrium; this is the second law of thermodynamics. In open systems, such as living organisms, entropy can decrease locally as energy flows into the system, at the cost of a larger increase in the entropy of the surroundings.

Can entropy be reversed?

In an isolated system, the increase of entropy cannot be undone. In open systems, however, energy inputs can decrease entropy locally. For example, a plant uses energy from the sun to decrease its own entropy while increasing the entropy of its surroundings. This is possible because the combined system, the plant plus its surroundings, is not isolated.

How is the Lagrangian used in physics?

The Lagrangian is a fundamental tool in classical mechanics used to describe the dynamics of a system. It is a function of the system's kinetic and potential energies (conservative forces enter through the potential energy; non-conservative forces have to be added separately as generalized forces). The Euler-Lagrange equations derived from it predict how the system evolves in time.
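As a short symbolic sketch (assuming SymPy and a simple harmonic oscillator; not part of the original answer), the Euler-Lagrange equation turns L = T - V into an equation of motion:

```python
import sympy as sp

t, m, k = sp.symbols('t m k', positive=True)
x = sp.Function('x')(t)
xdot = sp.diff(x, t)

# Harmonic oscillator: L = T - V = (1/2) m xdot^2 - (1/2) k x^2
L = sp.Rational(1, 2) * m * xdot**2 - sp.Rational(1, 2) * k * x**2

# Euler-Lagrange equation: d/dt (dL/d(xdot)) - dL/dx = 0
eom = sp.diff(sp.diff(L, xdot), t) - sp.diff(L, x)
print(sp.simplify(eom))            # m*x'' + k*x, i.e. the familiar m*a = -k*x

# Solving it recovers the usual oscillatory motion x(t) = C1*sin(...) + C2*cos(...)
print(sp.dsolve(sp.Eq(eom, 0), x))
```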

How does the Lagrangian relate to other physical principles, such as conservation of energy?

The Lagrangian is closely tied to conservation of energy: when it has no explicit time dependence, the energy of the system is conserved. The equations of motion themselves follow from the principle of least (more precisely, stationary) action, which states that the physical path a system takes between two configurations is one that makes the action, the time integral of the Lagrangian, stationary.
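Here is a small numerical sketch of the least-action idea (my own illustration, assuming a free particle of unit mass moving between fixed endpoints): the straight-line classical path has a smaller action than any wiggled path with the same endpoints.

```python
import numpy as np

def action(x, dt, m=1.0):
    """Discretized action S = sum((1/2) m v^2) * dt for a free particle (V = 0)."""
    v = np.diff(x) / dt
    return np.sum(0.5 * m * v**2) * dt

# Paths from x=0 at t=0 to x=1 at t=1 on a uniform time grid.
n = 200
t = np.linspace(0.0, 1.0, n)
dt = t[1] - t[0]
classical = t.copy()                      # straight line: the classical solution
print("classical path action:", action(classical, dt))

rng = np.random.default_rng(0)
for _ in range(3):
    # sin(3*pi*t) vanishes at t=0 and t=1, so the endpoints stay fixed
    perturbed = classical + 0.05 * rng.standard_normal() * np.sin(3 * np.pi * t)
    print("perturbed path action:", action(perturbed, dt))  # always >= classical
```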
