What exactly happens when entropy reaches its maximum value in the universe?

In summary, the original poster is trying to learn about entropy and thermodynamics. He asks what a maximum-entropy universe would look like, why there is no conservation law for entropy, and whether white holes and black holes might be involved.
  • #1
Archosaur
Let me preface by saying this: I do not claim to know anything at all about entropy. It's a new thing to me and I'm trying to add the laws of thermodynamics to my belt of knowledge. Forgive me if I am completely off target.

This is what I understand entropy to be. I hear it defined as the amount of disorder in a system, and as a measure of the inability of a system's energy to do work. I also know that a system's entropy increases with every transfer of energy and that it can't decrease. Order can't be restored. I guess I understand this, too. If it weren't true, a perpetual motion device wouldn't actually be impossible (right?)

Anyway. I'm having a hard time wrapping my head around the fact that the universe, as an isolated system, will become less and less ordered until total entropy reaches a "maximum value".
What exactly would our universe look like with maximum entropy? Sounds to me like it would consist of all particles in the universe equally distributed throughout the universe and having an enormous, constant temperature. Is that right? Also, would that be the end of the universe? I don't see how anything could come after that, and that's too depressing an end for me. Everything reduced to its parts. Nothing interesting left. Just particles whizzing around...

The thing I'm really stuck on is that there is no "law of conservation of entropy". The laws of physics always seemed so symmetrical and self-sufficient to me.

Energy is conserved.
Momentum is conserved.
Charge is conserved.
Matter is conserved.
Blah blah blah...

Now I'm hearing that, in an isolated system, a value can increase overall, but can't decrease. I don't like that!

I don't know nearly enough to start speculating, but I had a thought. I hope I don't make a complete fool of myself...

I've also heard that the theoretical "white holes" can't exist because they violate the second law of thermodynamics by decreasing entropy. Well, what if white hole/black hole pairs stitch together all universes, and an isolated system only appears to accumulate entropy, when in reality it's being decreased at the same rate at another point in the universe by a white hole, which is not "creating" order but "stealing" it from another universe via its sister black hole? On the flip side, a black hole's entropy increases by more than enough to compensate for the entropy of any object it swallows. Maybe the excess entropy is actually being drained from a white hole in another universe. Maybe entropy can be described as a current that flows throughout all universes, and in reality, across all universes, entropy is conserved.

I realize that there is probably about a .03% chance of what I just said having any validity. Can someone give me some perspective?
 
  • #2
I think these are very reasonable, well-founded questions, and your intuition is good. A few comments:

Analogies for entropy such as "disorder" are problematic. Your disorder might be another person's order. An apparent increase in order on one scale (jumbled ice cubes melting into uniform liquid water) can actually correspond to an increase in entropy. "Dispersal" has become a popular term, but when two counterspinning wheels come into contact and slow down, for example, nothing is really dispersed, yet entropy increases considerably. So the best way to handle entropy, I think, is to use its actual definition,

[tex] S=-k\sum_i p_i \ln p_i[/tex]

where [itex]p_i[/itex] is the probability of being in microstate [itex]i[/itex], and to learn more about microstates (single arrangements of particles/atoms, each of which is compatible with bulk descriptions like pressure and temperature; for each bulk macrostate description, many compatible microstates usually exist). More calculations regarding thermodynamic entropy can be found at http://john.maloney.org/Papers/Determining%20entropy%20(5-2-07).pdf and may help in getting comfortable with the equation side.
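To make the definition concrete, here is a minimal Python sketch (with k set to 1, i.e., entropy measured in natural units; the function name is made up for illustration):

[code]
import math

def gibbs_entropy(probs, k=1.0):
    """Gibbs/Shannon entropy S = -k * sum_i p_i ln p_i.

    k = 1 means natural units; multiply by Boltzmann's constant
    (1.380649e-23 J/K) for thermodynamic units.
    """
    # Terms with p_i = 0 contribute nothing (p ln p -> 0 as p -> 0);
    # max() just tidies up a possible floating-point -0.0.
    return max(0.0, -k * sum(p * math.log(p) for p in probs if p > 0))

# Four microstates, all equally likely: maximum uncertainty.
print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))  # ln 4 ~ 1.386

# Same four microstates, one nearly certain: much lower entropy.
print(gibbs_entropy([0.97, 0.01, 0.01, 0.01]))  # ~ 0.168

# One microstate certain: S = 0, as "ordered" as it gets.
print(gibbs_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0
[/code]

Note that the uniform distribution gives the largest entropy; that is the equation-side version of the maximum-entropy picture below.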

Your view of a system at maximum entropy (particles evenly scattered and in thermal equilibrium) is right on.

As for conservation laws: parameters like force and voltage aren't conserved either, so conservation isn't universal. On the other hand, those parameters aren't extensive, as energy and charge and... entropy are. John Baez says that entropy is "paraconserved": it is conserved in the absence of gradients, but otherwise it increases. It is indeed an odd variable, and this asymmetry is connected to, though it neither explains nor is explained by, our classification of energy transfer into heat and non-heat (i.e., work and matter).
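To see that "paraconserved" behavior in action, here's a toy discretized sketch (the numbers and the constant heat capacities are hypothetical): total entropy rises while a temperature gradient exists, then stays put once the blocks reach the same temperature.

[code]
# Two blocks, hot and cold, exchange a small amount of heat dQ per step.
C, dQ = 1.0, 0.5                  # heat capacity (J/K), heat moved per step (J)
T_hot, T_cold = 400.0, 200.0      # starting temperatures (K)
S_total = 0.0

while T_hot - T_cold > 1e-9:
    # Entropy bookkeeping for a small transfer: the cold block gains
    # dQ/T_cold, the hot block loses dQ/T_hot. The sum is positive
    # whenever T_hot > T_cold, and zero once the gradient is gone.
    S_total += dQ / T_cold - dQ / T_hot
    T_hot -= dQ / C
    T_cold += dQ / C

print(f"Final temperatures: {T_hot:.1f} K and {T_cold:.1f} K")
print(f"Entropy produced:   {S_total:.4f} J/K")
# Exact (continuum) result: 2 * C * ln(300 / sqrt(400 * 200)) ~ 0.1178 J/K
[/code]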
 
  • #3
Thank you very much! I read up on microstates, and wow, does that help a lot. And thanks for the equation and the links!

The melting ice cubes example was helpful, too. I really hadn't been thinking about it the right way. (thanks wikipedia)
 
  • #4
In my opinion, entropy is one of those things that is easy to think about on the small scale and hard to think about on the large scale. Unfortunately, it is almost always applied on large scales!

Its definition is indeed made in terms of probability, but when you deal with lots of things it's easier to think about their "disorder" than their probability. The combinatorics of it all gets quite overwhelming.

Here's an example you might find helpful: http://gravityandlevity.wordpress.com/2009/04/01/entropy-and-gambling/
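In the same spirit, here's a small Python sketch of a toy model (N particles, each independently in the left or right half of a box) showing how quickly the combinatorics becomes overwhelming:

[code]
from math import comb, log

# A "microstate" says where each particle is; a "macrostate" only says
# how many are on the left. W(n) = C(N, n) counts the microstates
# behind each macrostate.
for N in (10, 100, 1000):
    total = 2 ** N                 # all microstates, each equally likely
    even_split = comb(N, N // 2)   # microstates with a 50/50 macrostate
    print(f"N = {N}:")
    print(f"  P(all on the left)   = {1 / total:.3e}")
    print(f"  P(exact 50/50 split) = {even_split / total:.3e}")
    # Boltzmann entropy of the macrostate, S = k ln W, in units of k:
    print(f"  S(50/50)/k = ln C(N, N//2) = {log(even_split):.1f}")
[/code]

Even at N = 1000, a speck by thermodynamic standards, "all particles on the left" is so improbable you would never see it; at N ~ 10^23, that lopsidedness is what the second law describes.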
 
  • #5
Mapes's words are very illustrative. I also have trouble with words like "disorder" because they have a connotation for laypeople that doesn't quite jibe with a physicist's precise meaning.

If you're into computer programming, then it might be helpful to think about entropy as the number of accessible states that a program can fall into. More robust software systems have fewer states, and less entropy.
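As a toy illustration of that analogy (my own construction, not a standard software metric), you can score a system by the bits needed to pin down which of its reachable states it's in:

[code]
from math import log2

# Hypothetical systems, scored by log2(number of reachable states).
systems = {
    "latch with an enforced invariant": 2,     # only 2 legal states
    "unchecked 8-bit status byte": 256,        # any value reachable
    "two uncoupled 32-bit counters": 2 ** 64,  # state counts multiply
}
for name, n_states in systems.items():
    print(f"{name}: {log2(n_states):.0f} bits of 'entropy'")
[/code]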

By the way, if it troubles you to think that entropy can only increase... perhaps that is the definition of the forward flow of time.
 

FAQ: What exactly happens when entropy reaches its maximum value in the universe?

What is entropy and why is it important?

Entropy is a measure of the disorder or randomness in a system. It is important because it helps us understand how energy flows and how systems tend towards equilibrium.

How is entropy related to the laws of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. Energy tends to spread out and become more dispersed, leading to an increase in disorder or entropy.

Can entropy be reversed or decreased?

In an isolated system, total entropy can never decrease. Entropy can be decreased locally by doing work or exporting heat, as a refrigerator does, but this always produces at least as large an increase in entropy elsewhere, so the total still rises.
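A quick worked example with made-up numbers: suppose a refrigerator removes 300 J of heat from its interior at 250 K and dumps 420 J (the heat plus 120 J of compressor work) into a 300 K room. Then

[tex]\Delta S_\text{inside} = -\frac{300\ \text{J}}{250\ \text{K}} = -1.2\ \text{J/K}, \qquad \Delta S_\text{room} = +\frac{420\ \text{J}}{300\ \text{K}} = +1.4\ \text{J/K}[/tex]

The interior loses 1.2 J/K of entropy, but the room gains 1.4 J/K, so the total entropy still rises by 0.2 J/K.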

How does entropy affect living organisms?

Living organisms constantly take in energy and convert it to useful forms, such as movement or growth. This process generates waste heat and increases the overall entropy of the universe.

What are some practical applications of entropy?

Entropy has many practical applications, such as in thermodynamics, information theory, and chemistry. It is also used in fields such as environmental science, economics, and engineering to understand and predict changes in systems over time.
