Problem regarding understanding entropy

In summary: an increase in entropy can be thought of as an increase in the number of microscopic ways (microstates) in which the system can be configured.
  • #1
mohamed_a
I was reading about the thermodynamics postulates when I came across the differential fundamental equation:
[tex]dU = T\,dS - p\,dV + \mu\,dn[/tex]

I understand that the second term is just pressure and the last term is chemical energy, but the problem is that I don't understand what entropy is for and how it contributes to a system's energy.
For example, if I have a container of gas and decide to increase the entropy, I could just pump out some gas: the entropy increases because fewer molecules have more space to move in. But isn't this equivalent to a change in the number of molecules (the chemical term) together with a change in pressure?
Another example: if I exclude the last two terms and try to imagine how I could increase the energy of a system without changing the pressure or the chemical composition, I am left only with temperature. But if I change the temperature, I increase the pressure and the energy of the chemical constituents, so why would I need a third term (entropy) if I could describe the system with two only?

I think entropy is a measure of the inverse of the intermolecular energy (hydrogen bonds, etc.), pressure is a measure of the extramolecular energy (generated by breaking the intermolecular constraints), and chemical energy is the enthalpy of the compounds (intramolecular energy). Am I right in my analysis? (i.e. that the kinetic energy of compounds and elements isn't counted in the third term but rather is included in the entropy)

And if this is correct, why isn't nuclear energy (##E=mc^2##) added to the system as a fourth term (intramolecular, intranuclear energy)?
 
  • #2
Say gas molecules are contained in a rigid cylinder, so dV = 0, with no inlet or outlet, so dn = 0. Then the equation reduces to
[tex]dU=\frac{\partial U}{\partial S}dS=TdS[/tex]
We know that high-temperature gas has high internal energy and low-temperature gas has low internal energy. Your question then reduces to what temperature is and how it relates to energy and entropy, i.e. to the first law of thermodynamics.
During the processes we usually consider there is no change in the mass energy mc^2, so its contribution to dU is zero. If we were dealing with nuclear reactions in a reactor, we would have to include that in our thermodynamics.
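As a concrete check (a sketch for n moles of a monatomic ideal gas, not part of the original post): at constant volume, dQ = dU = nC_V dT, so
[tex]dS=\frac{dQ}{T}=\frac{nC_V}{T}\,dT,\qquad T\,dS=nC_V\,dT=\tfrac{3}{2}nR\,dT=dU[/tex]
and the TdS term is exactly the familiar energy change on heating at constant volume.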
 
  • #3
anuttarasammyak said:
Say gas molecules are contained in a rigid cylinder, so dV = 0, with no inlet or outlet, so dn = 0. Then the equation reduces to
[tex]dU=\frac{\partial U}{\partial S}dS=TdS[/tex]
We know that high-temperature gas has high internal energy and low-temperature gas has low internal energy. Your question then reduces to what temperature is and how it relates to energy and entropy, i.e. to the first law of thermodynamics.
During the processes we usually consider there is no change in the mass energy mc^2, so its contribution to dU is zero. If we were dealing with nuclear reactions in a reactor, we would have to include that in our thermodynamics.
In your approach, why would I need temperature when I could just use the pressure difference as a scale of internal energy? You said there is no change in volume and no change in chemical constituents, but there is a change of pressure. If I define temperature to be the average kinetic energy, then simply increasing the temperature would increase the kinetic energy and therefore increase the force acting on a unit area by a similar amount. What is the motivation for defining a state function, entropy?

NB: just to be clearer, how can the number of possible microstates a system can exist in contribute to its energy? How could I increase the number of microstates a system can exist in and expect its energy to change?
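One way to make the microstate/energy/temperature connection concrete is a standard toy model (a sketch, not from this thread): an Einstein solid of N oscillators sharing q energy quanta. The multiplicity Ω(N, q) counts microstates, S = k_B ln Ω, and 1/T = dS/dU. Units here are chosen so k_B = 1 and one quantum carries energy eps = 1.

```python
from math import comb, log

kB = 1.0   # Boltzmann's constant, set to 1 for this sketch
eps = 1.0  # energy per quantum, set to 1 for this sketch

def multiplicity(N, q):
    # Number of microstates: ways to distribute q indistinguishable
    # quanta among N oscillators ("stars and bars" count).
    return comb(q + N - 1, q)

def entropy(N, q):
    # Boltzmann's formula: S = k_B ln(Omega)
    return kB * log(multiplicity(N, q))

N = 50
for q in (10, 50, 100):
    # 1/T = dS/dU, estimated by adding one quantum (dU = eps)
    dS = entropy(N, q + 1) - entropy(N, q)
    T = eps / dS
    print(f"q={q}: Omega={float(multiplicity(N, q)):.3e}, T={T:.2f}")
```

Adding energy increases the number of accessible microstates, and the *rate* at which ln Ω grows with energy is what defines temperature; that is why entropy, not pressure, is the natural partner of temperature in dU = TdS.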
 
  • #4
Why don't you revisit the first law of thermodynamics
[tex]dU=dQ-pdV=TdS-pdV[/tex]
in your text? Entropy and temperature are introduced there. This law holds in general for any system whose total energy is distributed over many small parts.
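A standard worked example of that relation (a sketch, assuming a reversible isothermal expansion of an ideal gas; not part of the original post): along an isotherm dU = 0, so TdS = pdV, and with p = nRT/V,
[tex]\Delta S=\int_{V_1}^{V_2}\frac{p}{T}\,dV=\int_{V_1}^{V_2}\frac{nR}{V}\,dV=nR\ln\frac{V_2}{V_1}[/tex]
so the entropy increase directly measures the heat absorbed per unit temperature during the expansion.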
 

FAQ: Problem regarding understanding entropy

What is entropy and why is it important in science?

Entropy is a measure of the disorder or randomness in a system. It is important in science because it helps us understand the direction and extent of chemical and physical processes, and plays a crucial role in thermodynamics, information theory, and statistical mechanics.
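The statistical definition behind this (standard textbook material, not from the FAQ itself) is Boltzmann's formula, where Ω is the number of microstates compatible with the observed macrostate:
[tex]S=k_B\ln\Omega[/tex]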

How does entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that the disorder or randomness in such a system tends to increase, and energy naturally flows from hotter to colder objects. Entropy is a quantitative way of measuring this tendency toward disorder.
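A minimal numerical illustration of this (a sketch with assumed values, not from the FAQ): two bodies of equal heat capacity C at different temperatures are placed in thermal contact. The hot body loses entropy, the cold body gains more, and the total change is positive.

```python
from math import log

C = 1.0                       # heat capacity in J/K (assumed value)
T_hot, T_cold = 400.0, 300.0  # initial temperatures in kelvin (assumed)
T_final = (T_hot + T_cold) / 2  # equal capacities -> arithmetic mean

def delta_S(C, T_initial, T_final):
    # Entropy change of a body of constant heat capacity C:
    # dS = C dT / T integrated from T_initial to T_final.
    return C * log(T_final / T_initial)

dS_hot = delta_S(C, T_hot, T_final)    # negative: the hot body cools
dS_cold = delta_S(C, T_cold, T_final)  # positive: the cold body warms
print(dS_hot + dS_cold)                # net entropy change is positive
```

The positive total is exactly the second law at work: heat flowing from hot to cold raises the entropy of the cold body more than it lowers the entropy of the hot one.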

Can entropy be reversed or decreased?

In an isolated system, the total entropy cannot decrease. However, in an open system, where energy and matter can be exchanged with the surroundings, local decreases in entropy are possible. This is seen in living organisms, which maintain low internal entropy by constantly exchanging energy and matter with their environment.

How does the concept of entropy apply to information theory?

In information theory, entropy is a measure of the uncertainty or randomness in a message or data. It is used to quantify the amount of information contained in a message, and the more uncertain or random the message is, the higher the entropy. This concept is important in fields such as data compression and cryptography.
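Shannon entropy can be computed directly from a probability distribution (a short sketch, not from the FAQ; the hypothetical helper `shannon_entropy` is defined here for illustration):

```python
from math import log2

def shannon_entropy(probs):
    # H = -sum(p * log2(p)), in bits; terms with p == 0 contribute nothing
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin: less than 1 bit
print(shannon_entropy([0.25] * 4))  # uniform over 4 symbols: 2.0 bits
```

The biased coin carries less information per toss than the fair one, which is the sense in which lower uncertainty means lower entropy.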

Can entropy be used to predict the future?

Entropy is not a predictive tool, but it can give us information about the direction and likelihood of certain processes. For example, in thermodynamics, the increase in entropy over time tells us that energy will naturally flow from hotter to colder objects. However, predicting the exact outcome of a process based on entropy alone is not possible.
