Is low entropy found in something very hot?

In summary, increasing entropy means spreading energy out, not "decreasing order." Low entropy can be found in something very hot: per joule of energy it stores, a very hot body carries comparatively little entropy, even though it is not "ordered" in the everyday sense. The colloquial usage of the word "order" does not align with the scientific definition of entropy, which is why the concept is best understood and explained with the math.
  • #1
rolnor
TL;DR Summary
Is low entropy found in something very hot?
If we have a kilogram of something at 100 million degrees Celsius and can controllably use this heat somehow, we can sustain life, grow crops, and drive steam engines, and with these we could build a whole city like New York; we can create a lot of mass with very low entropy, things that are very "ordered". But what we started with was not ordered, it was just very hot? I feel that the term "order" is wrong; increasing entropy really means spreading energy out, not "decreasing order". What do you think about this? Is it semantics? Is our popular word "order" something other than the physics "order"? Also, the fact that entropy is increasing since the big bang, an "infinitely" hot point, a point where it is very hard to understand that there could be any order, the Planck temperature, how do you think about this? It's really a sign that the big bang model is simply wrong, no? I have heard one idea that there is "order" in the gravity at this point in time, but that does not sound very much better, does it?
 
  • #2
rolnor said:
and can controllably use this heat somehow
How?

You can use it in a heat engine, but standard thermodynamics and standard entropy already accounts for all that. So if you are questioning standard thermodynamics and standard entropy, which is what you appear to be doing, you must have some other way of using that heat in mind. What? (Note that all of the specific activities you list can be powered by heat engines, so again standard thermodynamics and standard entropy accounts for them just fine.)

rolnor said:
we can create a lot of mass with very low entropy
Yes, but in the process, if you are using heat engines, you will also be creating a huge amount of entropy, which will appear in the exhaust of the heat engines. The amount of entropy created will be more than enough to balance out the entropy decrease you created by taking raw materials and building useful infrastructure out of them. That is what standard thermodynamics and standard entropy says.
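As a rough numerical sketch of that bookkeeping (an illustration only; the reservoir temperatures, heat drawn, and efficiency below are made-up values, not figures from the thread):

```python
# Entropy bookkeeping for a heat engine run between the very hot source and
# the environment. All numbers below are illustrative.

T_hot = 1.0e8      # K: the very hot kilogram acting as the hot reservoir
T_cold = 300.0     # K: the environment that receives the exhaust heat
Q_hot = 1.0e9      # J drawn from the hot source

eta_carnot = 1.0 - T_cold / T_hot      # maximum possible efficiency
eta_real = 0.4 * eta_carnot            # a real engine does worse (made-up factor)

W = eta_real * Q_hot                   # useful work, available for building things
Q_exhaust = Q_hot - W                  # heat dumped into the environment

dS_source = -Q_hot / T_hot             # entropy removed from the hot source: about -10 J/K
dS_exhaust = Q_exhaust / T_cold        # entropy added to the environment: about +2e6 J/K

print(f"work out:           {W:.2e} J")
print(f"net entropy change: {dS_source + dS_exhaust:+.2e} J/K  (always >= 0)")
```

The entropy dumped into the cold exhaust vastly exceeds the small entropy withdrawn from the hot source, so any local entropy decrease in the infrastructure built with the work is more than paid for.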

rolnor said:
I feel that the term "order" is wrong
I would strongly recommend finding a specific, valid reference before making such a sweeping statement. What does an actual textbook or peer-reviewed paper on thermodynamics say? I strongly suspect that you will find it does not make any simplistic statement about "order" of the sort you appear to be imagining.

rolnor said:
entropy is increasing since the big bang
That is because of the huge expansion the universe has undergone since then, plus the huge amount of gravitational clumping that has occurred in the matter in the universe. The simplistic equating of "hot" with "high entropy" and "cold" with "low entropy" that you appear to be using obviously does not work here. But of course standard thermodynamics does not claim that it should.

rolnor said:
the big bang, an "infinitly" hot point
No, that is not what the big bang is.

rolnor said:
It's really a sign that the big bang model is simply wrong, no?
Definitely not. You will not get anywhere with this attitude.

rolnor said:
I have heard one idea that there is "order" in the gravity at this point in time
Where? Please give a specific reference.

rolnor said:
that does not sound very much better, does it?
On the contrary, a proper understanding of how thermodynamics and entropy work in the presence of gravity (which means in the context of General Relativity and curved spacetime) is essential to a proper understanding of our best current scientific models, and a huge amount of research spanning decades has been done on it.
 
  • Like
Likes rolnor and vanhees71
  • #3
Moderator's note: Thread level changed to "I".
 
  • #4
rolnor said:
But what we started with was not ordered, it was just very hot? I feel that the term "order" is wrong; increasing entropy really means spreading energy out, not "decreasing order".
“Entropy is disorder” is a kinda OK explanation for people who aren’t going to do the math, but like all math-free explanations it is a poor foundation for building any deeper understanding. The mathematical definition of entropy does not depend on disorder and does give us a clear relationship between temperature, entropy, and what you are calling “spreading energy out”.
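For reference, the two standard definitions behind that statement, as they appear in essentially any thermodynamics or statistical mechanics text, neither of which mentions "disorder":
$$\mathrm{d}S = \frac{\delta Q_{\text{rev}}}{T}, \qquad S = -k_B \sum_i p_i \ln p_i .$$
The first ties entropy directly to how much heat is transferred and at what temperature (energy delivered at high ##T## carries little entropy, while the same energy delivered at low ##T## carries a lot), and the second reduces to ##S = k_B \ln \Omega## when all ##\Omega## accessible microstates are equally likely.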
 
  • Like
Likes rolnor
  • #5
rolnor said:
TL;DR Summary: Is low entropy found in something very hot?

If we have a kilogram of something at 100 million degrees Celsius and can controllably use this heat somehow, we can sustain life, grow crops, and drive steam engines, and with these we could build a whole city like New York; we can create a lot of mass with very low entropy, things that are very "ordered".
What you intuitively think of as being "ordered" is actually organized. Organization and order are different things. People in New York are organized, but usually not ordered. They can become ordered, for instance, when 30,000 of them sit still in a stadium. But in most other situations of city life, people are not ordered, they are just organized. The measure of disorder is called entropy. The measure of organization is called complexity. Usually, entropy tends to grow (things become less ordered), but at the same time complexity can also grow (things become more organized). See also the Appendix in my https://arxiv.org/abs/1703.08341 .
 
  • Like
Likes Drakkith and weirdoguy
  • #6
rolnor said:
TL;DR Summary: Is low entropy found in something very hot?

I feel that the term "order" is wrong, increasing entropy means spreading energy out really, not "decreasing order"
Yes. Something very hot has low entropy compared to the same amount of energy at a lower temperature. The colloquial usage of the word “order” doesn’t have much relation to the scientific definition of entropy. Spreading out does seem to capture it better, IMO. But at some point any English description will fail. That is why we use math.
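A quick worked comparison of that point (with illustrative numbers, not taken from the post): drawing the same ##Q = 10^6\ \mathrm{J}## of heat out of a large reservoir changes its entropy by roughly ##Q/T##, so
$$|\Delta S|_{\,T = 10^{8}\ \mathrm{K}} \approx \frac{10^{6}\ \mathrm{J}}{10^{8}\ \mathrm{K}} = 0.01\ \mathrm{J/K}, \qquad |\Delta S|_{\,T = 300\ \mathrm{K}} \approx \frac{10^{6}\ \mathrm{J}}{300\ \mathrm{K}} \approx 3300\ \mathrm{J/K}.$$
The same amount of energy held at ##10^{8}\ \mathrm{K}## is accompanied by far less entropy than it would be at room temperature, which is what makes the very hot lump a useful, low-entropy energy source.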
 
  • #7
rolnor said:
increasing entropy means spreading energy out
Correct. When energy spreads out, entropy increases.
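A minimal numerical sketch of that statement (the heat and temperatures below are made-up values): let a fixed amount of heat flow from a hot block into a cold block and tally the entropy changes.

```python
# Heat Q flows spontaneously from a hot block to a cold block. Both blocks are
# assumed large enough that their temperatures barely change, so the entropy
# change of each is approximately +/- Q/T. All numbers are illustrative.

Q = 1000.0       # J of heat that "spreads out" from hot to cold
T_hot = 500.0    # K
T_cold = 300.0   # K

dS_hot = -Q / T_hot        # hot block loses entropy:  -2.00 J/K
dS_cold = +Q / T_cold      # cold block gains entropy: +3.33 J/K
dS_total = dS_hot + dS_cold

print(f"hot block:  {dS_hot:+.2f} J/K")
print(f"cold block: {dS_cold:+.2f} J/K")
print(f"total:      {dS_total:+.2f} J/K  (positive whenever heat flows hot -> cold)")
```

The total is positive for any T_hot greater than T_cold, which is the precise sense in which "energy spreading out" and "entropy increasing" go together.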
 
  • Like
Likes rolnor
  • #8
Mister T said:
Correct. When energy spreads out, entropy increases.
Thanks! As an amateur physicist I am doing some thinking that is not wrong, and this thinking throws popular science overboard; that is a good start to the day!
 
  • #9
Nugatory said:
“Entropy is disorder” is a kinda OK explanation for people who aren’t going to do the math, but like all math-free explanations it is a poor foundation for building any deeper understanding. The mathematical definition of entropy does not depend on disorder and does give us a clear relationship between temperature, entropy, and what you are calling “spreading energy out”.
That's great. All my adult life I have disliked this term "disorder", and it appears in undergraduate textbooks and everywhere in popular science. They discuss things like "things smashing" and "things unsmashing", like a pair of binoculars that you step on and break, but this has nothing to do with entropy increasing. The term "order" is not the correct physics term, and increasing entropy has nothing to do with smashing binoculars.
 
  • #10
PeterDonis said:
How?

You can use it in a heat engine, but standard thermodynamics and standard entropy already accounts for all that. So if you are questioning standard thermodynamics and standard entropy, which is what you appear to be doing, you must have some other way of using that heat in mind. What? (Note that all of the specific activities you list can be powered by heat engines, so again standard thermodynamics and standard entropy accounts for them just fine.)

Yes, but in the process, if you are using heat engines, you will also be creating a huge amount of entropy, which will appear in the exhaust of the heat engines. The amount of entropy created will be more than enough to balance out the entropy decrease you created by taking raw materials and building useful infrastructure out of them. That is what standard thermodynamics and standard entropy says.
You miss my point altogether. I know that entropy increases when you use the heat from the kg of hot mass, but from the start it was just a kg of material that was very hot, and such a kg of mass is not "ordered" by the standard meaning of the word. I am not proposing a perpetuum mobile here at all (or a system with 100% conversion of heat to some other form of energy); I know that entropy increases (heat spreads out, as I call it) when you run a steam engine. If you check the answers later in the thread you will see that I am right: the term "ordered" is not correct for describing low entropy.
 
  • #11
I am not attacking physics textbooks; I am attacking the popular-science word "ordered". You also find this in undergraduate books and all over the web. It is a very common word for describing low entropy, and that was my whole point: it is wrong.

 
  • #12
PeterDonis said:
No, that is not what the big bang is.
You don't know what the big bang is; there is debate over this, and scientists are not in consensus? And you see I put "infinitely" in quotes, so I use the word loosely. And neither you nor anyone else can say I am wrong; nobody knows what happened at the start of the universe, and there are many theories.
 
  • Skeptical
Likes weirdoguy
  • #13
rolnor said:
You don't know what the big bang is; there is debate over this, and scientists are not in consensus? And you see I put "infinitely" in quotes, so I use the word loosely. And neither you nor anyone else can say I am wrong; nobody knows what happened at the start of the universe, and there are many theories.
And textbooks or "peer reviewed papers" dont contain "the truth" they contain the latest theories at best.
 
  • #14
Nugatory said:
“Entropy is disorder” is a kinda OK explanation for people who aren’t going to do the math, but like all math-free explanations it is a poor foundation for building any deeper understanding. The mathematical definition of entropy does not depend on disorder and does give us a clear relationship between temperature, entropy, and what you are calling “spreading energy out”.
Thanks. I think if you want to make a pair of binoculars (something "ordered", "low entropy", according to popular science) here on Earth, you need mass in the system at a temperature of, say, 1600 K; you need to melt sand and forge metal. But we can imagine that on some other planet there is a metal alloy that melts at 20 K and some sort of sand that melts at 20 K. Then you can make binoculars at temperatures close to absolute zero, have very high entropy (the energy being spread out) in the system, and still be able to make "ordered" things, things with "low entropy". So the term "ordered" to describe low entropy is very confusing and absolutely incorrect.
 
  • #15
rolnor said:
You don't know what the big bang is; there is debate over this, and scientists are not in consensus?

Of course we know what the big bang is, because it's the name of one of the features of the mathematical model that is in the best agreement with observational data out of all the models we have. And it says nothing about the beginning of the Universe.
 
  • #16
rolnor said:
And textbooks or "peer reviewed papers" dont contain "the truth"

Of course, but they are way closer to the truth than some random thoughts of people who are not physicists :smile: There are a lot of textbooks that say things that are not true. But that is not the point.
 
  • #17
weirdoguy said:
Of course, but they are way closer to the truth than some random thoughts of people who are not physicists :smile: There are a lot of textbooks that say things that are not true. But that is not the point.
"Random thoughts from people that are not physisists" And still I was right about the term "ordered" to describe low entropy is incorrect. So, not so random after all. Here is from wikipedia, they use randomnes, disorder to describe entropy:
 
  • #18
rolnor said:
"Random thoughts from people that are not physisists" And still I was right about the term "ordered" to describe low entropy is incorrect. So, not so random after all. Here is from wikipedia, they use randomnes, disorder to describe entropy:
View attachment 329586
And physisists dont agree on what happened at the big bang, I could ask you to show me that paper, how claims they "know" that? I am an amateur, not stupid.
 
  • #19
rolnor said:
"Random thoughts from people that are not physisists

I was not talking about you, just in general justifying why PF has this "textbook or peer-reviewed source" rule.
 
  • #20
weirdoguy said:
I was not talking about you, just in general justifying why PF has this "textbook or peer-reviewed source" rule.
Hm... Is that really true? Really?
 
  • #21
rolnor said:
Hm... Is that really true? Really?
And the Wikipedia text, is that also "random thoughts"? I understand that Wikipedia is not a high-quality source.
 
  • #22
Yes, I know what I had in mind when I wrote that :smile: And all of us are just random people from the internet. I think we can end this off-topic discussion here.

rolnor said:
And the Wikipedia text, is that also "random thoughts"?

Not in general, because writers are supposed to write textbook-style material, not what they themselves think about the topics they are writing about. But the more technical the topic, the fewer people know enough about it, so some of the articles are not good sources to learn from.
 
  • Like
Likes berkeman and Nugatory
  • #23
rolnor said:
Hm... Is that really true? Really?
Really true that PF has that rule? Yes.
Really true that counterproductive threads like this one are the reason why? Yes.
 
  • Like
Likes phinds, berkeman, rolnor and 1 other person
  • #24
Entropy is a pretty difficult subject, and it also depends a bit on your taste which definition of it you prefer. For me entropy was an enigma until I learnt about the information-theoretical approach in a special lecture on statistical mechanics.

In this approach entropy is a measure of the missing information, given a probability (density) distribution, relative to having "complete knowledge" about the system. More intuitively, you can say entropy is a "measure of surprise" for a given probability distribution.

It can also be used as a means to make guesses for probability distributions, given some (incomplete) information about the system, with the least prejudice beyond this information. According to this approach, the best guess is the probability distribution that is compatible with the given information and maximizes the entropy.

As it turns out, in physics, this missing-information measure, introduced by Shannon in the context of signal propagation through noisy channels, is precisely given by the von Neumann entropy of the statistical operator in quantum theory or the corresponding classical limit a la Boltzmann:
$$S=-k \mathrm{Tr} (\hat{\rho} \ln \hat{\rho}).$$
In the microcanonical ensemble, where the energy of the gas is fixed, it reduces to ##S = k \ln \Omega##, where ##\Omega## is the degeneracy of the energy eigenspace for the given energy, and from that you can show that this information-theoretical entropy definition coincides with the entropy definition given in phenomenological thermodynamics.
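As a quick numerical illustration of this definition in its classical, discrete form (a minimal sketch; the distributions below are invented for illustration), the uniform distribution over ##\Omega## equally likely microstates reproduces ##S = k \ln \Omega##, and any more "informative" distribution has a smaller entropy:

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant in J/K

def gibbs_entropy(p):
    """S = -k_B * sum_i p_i ln p_i for a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                     # convention: 0 * ln 0 = 0
    return -K_B * np.sum(p * np.log(p))

omega = 8                                 # number of accessible microstates
uniform = np.full(omega, 1.0 / omega)     # complete ignorance within the energy shell
peaked = np.array([0.93] + [0.01] * 7)    # a distribution reflecting more knowledge

print(gibbs_entropy(uniform) / K_B)   # ln(8) ~ 2.08, i.e. S = k_B ln(Omega)
print(gibbs_entropy(peaked) / K_B)    # ~ 0.39: less missing information, lower entropy
```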
 
  • Like
Likes rolnor
  • #25
rolnor said:
You don't know what the big bang is; there is debate over this, and scientists are not in consensus?
Really? Please give references showing the different points of view on the big bang.
 
  • #26
@rolnor the problem here is that you have two completely unrelated topics in this thread. The first is your opinion that the word “order”, used by many pop-sci sources, is not a good way to describe entropy. The second is your claim that there is some entropy-related flaw in the Big Bang model.

The two topics are completely unrelated. The Big Bang model is based on the mathematical formulas of thermodynamics and general relativity. The words used in pop-sci books are irrelevant. Scientists use the equations and experimental observations to determine the validity of the model, not pop-sci descriptions. So regardless of how bad the pop-sci terminology may be, it has nothing to do with the professional science.

You will find a lot of support here for your first topic, including from myself. Your second topic is not a consequence of the first, and does not correctly reflect the state of science. Currently, the main disagreements about the Big Bang focus on the correct values for a couple of different parameters in the model, not disagreement about the model itself.
 
Last edited:
  • Like
Likes berkeman, vanhees71, rolnor and 2 others
  • #27
Indeed, one should separate the two topics into different threads. The general notion of entropy in statistical physics is pretty clear. The information-theoretical approach is more and more becoming the foundational understanding of its meaning, given the amazing experiments with quantum dots etc. concerning the (quantum version of) Maxwell's demon and the clear demonstration of the correctness of the information-theoretical foundation of entropy. The historical formulations (phenomenological thermodynamics à la Clausius, Planck et al., Boltzmann's H-theorem for the off-equilibrium case, etc.) can also be derived from this foundation.

The other issue is that of (quantum) statistical mechanics and gravitation. Already in the Newtonian realm the kinetic approach to entropy is not so simple, because of the long-range nature of the gravitational interaction. In contrast to the electromagnetic case, where the issue is resolved by screening and by separating the long-ranged parts into an electromagnetic mean field (the Vlasov-Boltzmann equation), this does not work for the gravitational interaction, and that is of course important for understanding "structure formation". As I said, I'd separate this topic from the more general one about entropy.
 

FAQ: Is low entropy found in something very hot?

Is low entropy typically found in very hot systems?

It depends on the comparison. Heating a given body always increases its entropy, because its particles gain energy and can occupy more microstates. However, for a fixed amount of energy, storing that energy in a small, very hot system involves far less entropy than spreading the same energy through a large, cool system. In that sense, something very hot can indeed be a low-entropy energy source.

Can a very hot system ever have low entropy?

Yes, in the relative sense that matters for thermodynamics: per joule of energy it stores, a very hot body carries comparatively little entropy, which is exactly what makes it useful for driving heat engines. Its entropy is still larger than that of the same body when cold; the relevant comparison is entropy per unit of available energy, not entropy in absolute terms.

How does temperature affect entropy?

Adding heat to a system increases its entropy: a reversible transfer of heat δQ at temperature T changes the entropy by δQ/T. As temperature increases, the kinetic energy of the particles increases and more microstates become accessible, so heating a given system raises its entropy. The factor of 1/T also means that a given amount of heat adds less entropy to a hot body than to a cold one.

What are examples of low entropy systems?

Examples of low entropy systems include crystalline solids at very low temperatures, where atoms are arranged in a highly ordered structure, and certain quantum states like Bose-Einstein condensates, where particles occupy the same ground state.

Is it possible to decrease entropy by increasing temperature?

Not for the body being heated: adding heat at temperature T always raises its entropy. A subsystem's entropy can be reduced, for example by refrigeration or by laser cooling of atoms, but only by exporting entropy somewhere else (into exhaust heat or scattered light), so the total entropy of everything involved still increases.
