How do living beings evolve in a universe with increasing entropy?

  • #1
Deepak K Kapur
Our universe is considered a closed system, and the second law of thermodynamics says that the entropy of a closed system is bound to increase.

Then how could living beings evolve, when they are extremely ordered systems?
 
  • #2
The Earth is not a closed system. It gets energy from the sun.
 
  • #3
Deepak K Kapur said:
Our universe is considered a closed system, and the second law of thermodynamics says that the entropy of a closed system is bound to increase.

Then how could living beings evolve, when they are extremely ordered systems?

Within the closed system, there can be pockets where entropy decreases, as long as that is balanced out by pockets where entropy increases. It is just that the TOTAL overall entropy increases.

Zz.
 
  • #4
ZapperZ said:
Within the closed system, there can be pockets where entropy decreases, as long as that is balanced out by pockets where entropy increases. It is just that the TOTAL overall entropy increases.

Zz.

Is there any experimental proof for this statement?
 
  • #5
Deepak K Kapur said:
Is there any experimental proof for this statement?

Us. We exist.
 
  • #6
Deepak K Kapur said:
Is there any experimental proof for this statement?

Drop an ice cube into a cup of warm water. Compute!

Zz.
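For concreteness, here is a back-of-the-envelope version of that computation in Python (a sketch only; the masses and temperatures are assumed for illustration, not from the thread). Note that the warm water is a "pocket" whose entropy decreases, while the total entropy still increases:

```python
import math

# Entropy bookkeeping for an ice cube melting in warm water.
# All masses and temperatures are illustrative assumptions.
m_ice = 0.020      # kg of ice, initially at 0 C (273.15 K)
m_water = 0.200    # kg of warm water, initially at 40 C (313.15 K)
L_f = 334e3        # J/kg, latent heat of fusion of water
c = 4186.0         # J/(kg K), specific heat of liquid water

T_melt = 273.15    # K
T_warm = 313.15    # K

# Final temperature from the energy balance:
#   m_ice*L_f + m_ice*c*(T_f - T_melt) = m_water*c*(T_warm - T_f)
T_f = (m_water * c * T_warm + m_ice * c * T_melt - m_ice * L_f) \
      / ((m_ice + m_water) * c)

# Entropy changes, integrating dS = dQ/T along reversible paths
dS_melt = m_ice * L_f / T_melt                     # ice melting (increase)
dS_meltwater = m_ice * c * math.log(T_f / T_melt)  # meltwater warming (increase)
dS_warm = m_water * c * math.log(T_f / T_warm)     # warm water cooling (DECREASE)

print(f"T_final  = {T_f:.1f} K")
print(f"dS_warm  = {dS_warm:+.2f} J/K  (a local entropy decrease)")
print(f"dS_total = {dS_melt + dS_meltwater + dS_warm:+.2f} J/K  (net increase)")
```

With these numbers the water loses about 30 J/K of entropy, but the melting and warming ice gains about 33 J/K, so the total comes out positive, as the second law requires.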
 
  • #7
ZapperZ said:
Drop an ice cube into a cup of warm water. Compute!

Zz.

OK. Fine.

But, I have a problem regarding this.

When the ice cube melts completely, an equilibrium will be reached,
i.e. a state of highest entropy.

Isn't equilibrium itself a kind of 'ordered state' (a state where there is perfect balance)?

Why call such a 'balanced' state a disordered one?
 
  • #8
Deepak K Kapur said:
OK. Fine.

But, I have a problem regarding this.

When the ice cube melts completely, an equilibrium will be reached,
i.e. a state of highest entropy.

Isn't equilibrium itself a kind of 'ordered state' (a state where there is perfect balance)?

Why call such a 'balanced' state a disordered one?

You have a strange way of defining "equilibrium". In fact, I find much of the stuff you've "acquired" along the way in many of your posts to be rather strange.

"Equilibrium" simply means, in this case, ##d(\text{something})/dt = 0##.

You really need to look at the PHYSICS (not the pedestrian) definition of entropy. How about starting with the material they have on the entropy site:

http://entropysite.oxy.edu/

Zz.
 
  • #10
Deepak K Kapur said:
Is there any experimental proof for this statement?
Heat pumps work.
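To unpack that one-liner: a heat pump lowers the entropy of the cold reservoir, but only at the cost of work input that raises the hot reservoir's entropy by more. A minimal numerical sketch (the temperatures and energies are made-up illustrative values):

```python
# Why a working heat pump doesn't violate the 2nd law: it decreases entropy
# in the cold reservoir, but the hot side gains even more.
T_cold, T_hot = 263.0, 293.0   # K: outdoor and indoor temperatures (assumed)
Q_cold = 1000.0                # J extracted from the cold outdoors
W = 150.0                      # J of electrical work driving the pump
Q_hot = Q_cold + W             # J delivered indoors (energy conservation)

dS_cold = -Q_cold / T_cold     # entropy DECREASES outdoors
dS_hot = Q_hot / T_hot         # entropy increases indoors
print(f"dS_cold = {dS_cold:+.3f} J/K, dS_hot = {dS_hot:+.3f} J/K")
print(f"dS_total = {dS_cold + dS_hot:+.3f} J/K")  # >= 0, as required
```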
 
  • #11
ZapperZ said:
In fact, I find much of the stuff you've "acquired" along the way

Zz.

I think 99.999999% of people on this planet only acquire stuff along the way...

It's only a minuscule minority that ever says something new...
 
  • #12
Deepak K Kapur said:
Isn't equilibrium itself a kind of 'ordered state' (a state where there is perfect balance)?
I think this may be a big source of your conceptual unease. You have this exactly backwards. Thermal equilibrium is not an ordered state.

Rigorously, you should stick with the standard and unambiguous notion of entropy, and not your colloquial concept of order. However, if you do insist on thinking in colloquial terms then you at least need to think carefully about your concept.

Colloquially, order is when the books are in the bookcase, the clean laundry is in the drawers, and the dirty laundry is in the hamper, while disorder is when they are all on the floor. Order has boundaries and separations; disorder is uniform. Equilibrium is disorder.

Saying that equilibrium is ordered is wrong both rigorously and colloquially.
 
  • #13
Dale said:
I think this may be a big source of your conceptual unease. You have this exactly backwards. Thermal equilibrium is not an ordered state.

Rigorously, you should stick with the standard and unambiguous notion of entropy, and not your colloquial concept of order. However, if you do insist on thinking in colloquial terms then you at least need to think carefully about your concept.

Colloquially, order is when the books are in the bookcase, the clean laundry is in the drawers, and the dirty laundry is in the hamper, while disorder is when they are all on the floor. Order has boundaries and separations; disorder is uniform. Equilibrium is disorder.

Saying that equilibrium is ordered is wrong both rigorously and colloquially.

If icing is to be spread on a cake and we do it in lumps (boundaries), it is disordered...

But, if we spread the icing evenly (uniformly), it is 'ordered'...
 
  • #14
Deepak K Kapur said:
If icing is to be spread on a cake and we do it in lumps (boundaries), it is disordered
You are confusing "ugly" with "disordered". Just because you have an aesthetic preference for smooth rather than lumps doesn't mean it is more ordered. If you start with big lumps with sharp boundaries and randomly perturb it (e.g. heat or vibration) then you can get smooth icing and the boundaries will reduce. If you start with smooth icing and randomly perturb it then you will not suddenly get big lumps with sharp boundaries. Despite your dislike for such lumpy icing, it is in fact more ordered.

It is clear that your intuitive concept of "disordered" is simply wrong. This is one of the reasons why we develop rigorous quantitative definitions. Please stick with the technical concept of entropy. Your intuition for this concept will improve over time, but right now you need to use the rigorous definition as you work to correct some faulty assumptions.

Don't worry and don't give up. This sort of thing happens all the time, and it can be overcome by consistently relying on the rigorous definition until the intuition builds later.
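Dale's perturbation argument can be checked numerically. Here is a toy sketch (my own construction, not from the thread): start with one sharp lump of icing and repeatedly let randomly chosen neighbouring patches mix.

```python
import numpy as np

# Toy version of the icing argument: random local mixing smooths a lumpy
# profile; it never spontaneously rebuilds sharp lumps.
rng = np.random.default_rng(0)
icing = np.zeros(100)
icing[40:60] = 1.0                 # one big lump with sharp boundaries

for _ in range(100_000):
    i = rng.integers(0, 99)        # pick a random adjacent pair of patches
    avg = (icing[i] + icing[i + 1]) / 2
    icing[i] = icing[i + 1] = avg  # a random perturbation mixes them

# The spread only ever shrinks under this dynamics: the lump flattens out,
# and no sequence of these random moves recreates a sharp boundary.
print(f"std before: 0.4000, std after: {icing.std():.4f}")
```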
 
  • #15
Dale said:
Don't worry and don't give up.

So, I ask further...

To my mind, if we come to know everything about all the particles at equilibrium (just suppose), we wouldn't call it disorder AT ALL. Then the entropy at equilibrium would be zero.

So, it seems our 'inability' is translated as disorder...

I hope there is some sense in my pedestrian views.
 
  • #16
Deepak K Kapur said:
To my mind, if we come to know everything about all the particles at equilibrium (just suppose), we wouldn't call it disorder AT ALL. Then the entropy at equilibrium would be zero.
What does the definition of entropy say? Apply the actual rigorous definition to the situation.
 
  • #17
Deepak, why do you come here? When you're told you're wrong, you just dig in harder. I'm afraid that doesn't make you right. It also isn't the behavior of a student, it's the behavior of a crackpot. Finally and most importantly, the process of learning is exchanging wrong ideas for right ones. If you don't give up the wrong ideas, you're not learning, and just wasting your time - and everybody else's.

Redefining entropy makes discussion impossible. As I wrote a week ago:

Humpty Dumpty smiled contemptuously. 'Of course you don't — till I tell you. I meant "there's a nice knock-down argument for you!"'

'But "glory" doesn't mean "a nice knock-down argument",' Alice objected.

'When I use a word,' Humpty Dumpty said, in rather a scornful tone, 'it means just what I choose it to mean — neither more nor less.'

'The question is,' said Alice, 'whether you can make words mean so many different things.'

'The question is,' said Humpty Dumpty, 'which is to be master — that's all.'

Alice was too much puzzled to say anything; so after a minute Humpty Dumpty began again. 'They've a temper, some of them — particularly verbs: they're the proudest — adjectives you can do anything with, but not verbs — however, I can manage the whole lot of them! Impenetrability! That's what I say!'

And here we are, Humpty-Dumpting away again. If you don't use standard definitions, we cannot communicate.
 
  • #18
Deepak K Kapur said:
So, I ask further...

To my mind, if we come to know everything about all the particles at equilibrium (just suppose), we wouldn't call it disorder AT ALL. Then the entropy at equilibrium would be zero.

So, it seems our 'inability' is translated as disorder...

I hope there is some sense in my pedestrian views.

On one hand, there is the idea of entropy as expressing our lack of knowledge of the system. On the other hand, we have the thermodynamic idea of entropy, which features in the laws of thermodynamics. They're related, but maybe a little different.

If we had a system of particles in equilibrium, the thermodynamic entropy is a definite, nonzero calculated value. But thinking of entropy as our lack of knowledge, if we could see all the particles' positions, the system would have zero entropy.

As another example, if you have spheres in a box, at low density they are very randomly arranged, but if you decrease the size of the box, they order into a lattice pattern. So the knowledge entropy decreases. But there is no energy involved, so it's not so clear how to interpret this in terms of the thermodynamic definition of entropy.
 
  • #19
BruceW said:
On one hand, there is the idea of entropy as expressing our lack of knowledge of the system. On the other hand, we have the thermodynamic idea of entropy, which features in the laws of thermodynamics. They're related, but maybe a little different.

No, the concept of STATISTICS is the expression of our lack of knowledge of every individual particle in the system, not just entropy. So the whole fields of thermodynamics and statistical mechanics are included in that. This is not the definition of entropy.

If we had a system of particles in equilibrium, the thermodynamic entropy is a definite, nonzero calculated value. But thinking of entropy as our lack of knowledge, if we could see all the particles' positions, the system would have zero entropy.

Where did you get that?

Again, as with the wrong idea exhibited by the OP, entropy is NOT disorder, or even a lack of knowledge.

http://news.fnal.gov/2013/06/entropy-is-not-disorder/
http://entropysite.oxy.edu/entropy_isnot_disorder.html
http://www2.ucdsb.on.ca/tiss/stretton/CHEM2/entropy_new_1.htm
http://home.iitk.ac.in/~osegu/Land_PhysLettA.pdf

Zz.
 
  • #20
This wiki article https://en.wikipedia.org/wiki/Entropy_(information_theory) is the kind of entropy I'm thinking about when I say the amount of disorder, or lack of knowledge. They actually use the word 'surprisal' but it is maybe a bit intimidating for beginners, which is why people say disorder instead (I would guess).

So yes, this information definition of entropy would be one way to express our lack of information and there are many other statistics that you could choose.
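For concreteness, here is a minimal sketch of that information-theoretic entropy (the helper function name and the probability distributions are mine, for illustration): complete knowledge of an outcome gives zero entropy, maximal uncertainty gives maximal entropy.

```python
import math

# Shannon entropy H = -sum_i p_i * log2(p_i), the "lack of knowledge" /
# surprisal notion discussed above.
def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # 1.0 bit : maximal uncertainty
print(shannon_entropy([0.99, 0.01]))  # ~0.08 bits: outcome nearly certain
print(shannon_entropy([1.0]))         # 0.0 bits : complete knowledge
```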

Thermodynamics and statistical mechanics are different disciplines. They should agree on all concepts where they overlap, but you get so much weird stuff happening in statistical mechanics that I feel it's best to specify if someone is talking about statistical mechanics or thermodynamics at the outset, for clarity.
 
  • #21
BruceW said:
This wiki article https://en.wikipedia.org/wiki/Entropy_(information_theory) is the kind of entropy I'm thinking about when I say the amount of disorder, or lack of knowledge. They actually use the word 'surprisal' but it is maybe a bit intimidating for beginners, which is why people say disorder instead (I would guess).

So yes, this information definition of entropy would be one way to express our lack of information and there are many other statistics that you could choose.
You are taking a very loose interpretation of the article, which does not clarify anything for the OP.
BruceW said:
Thermodynamics and statistical mechanics are different disciplines. They should agree on all concepts where they overlap, but you get so much weird stuff happening in statistical mechanics that I feel it's best to specify if someone is talking about statistical mechanics or thermodynamics at the outset, for clarity.
Thermodynamics and statistical mechanics are the respective macroscopic and microscopic theories for the same physical process. They are not different theories but are instead quite complementary. Entropy in particular, on the macroscopic scale, is defined such that its differential change equals the differential heat added to a system divided by its temperature
$$\frac{\delta Q}{T}=\delta S$$
Notice that there is nothing in this equation that directly implies disorder or "lack of information".
On the microscopic scale, entropy is defined as proportional to the natural logarithm of the number of microstates ##W## at equilibrium
$$S=k\ln(W)$$
Again, this is not implying "disorder". It says that entropy is related to the number of ways one can arrange the constituent particles of a system while still maintaining the same macroscopic configuration.
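To make the microstate counting concrete, here is a toy sketch (a standard textbook-style model; the numbers are illustrative, not from the thread): ##N## distinguishable particles in a box, where the macrostate is how many sit in the left half.

```python
import math

# Counting microstates for N distinguishable particles split between the
# two halves of a box.
k_B = 1.380649e-23  # J/K, Boltzmann constant

N = 100
for n_left in (0, 25, 50):
    W = math.comb(N, n_left)   # microstates with n_left particles on the left
    S = k_B * math.log(W)      # Boltzmann entropy S = k ln(W)
    print(f"n_left = {n_left:3d}: W = {W:.3e}, S = {S:.3e} J/K")

# "All on one side" has exactly 1 microstate; the even split has ~1e29.
# The equilibrium macrostate is simply the one with the most microstates.
```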
 
  • #22
hm, as an example, the free energy of the 1D Ising model is $$f(\beta ,h)=-\lim _{L\to \infty }{\frac {1}{\beta L}}\ln(Z(\beta ))=-{\frac {1}{\beta }}\ln \left(e^{\beta J}\cosh \beta h+{\sqrt {e^{2\beta J}(\sinh \beta h)^{2}+e^{-2\beta J}}}\right)$$ I would prefer to say this macroscopic quantity is a result from statistical mechanics rather than thermodynamics, but I guess it just depends on the preference. Probably you could list it in either section of a journal.
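As a numerical sanity check of that closed form (a sketch; the parameter values are arbitrary): the free energy per site is ##-\frac{1}{\beta}\ln\lambda_{\max}##, where ##\lambda_{\max}## is the largest eigenvalue of the 1D Ising transfer matrix, so the two can be compared directly.

```python
import numpy as np

# Compare the closed-form 1D Ising free energy with the transfer-matrix
# eigenvalue it comes from. Parameter values are arbitrary illustrations.
beta, J, h = 1.0, 1.0, 0.3

T = np.array([[np.exp(beta * (J + h)), np.exp(-beta * J)],
              [np.exp(-beta * J),      np.exp(beta * (J - h))]])
lam_max = np.linalg.eigvalsh(T).max()   # largest eigenvalue (T is symmetric)
f_numeric = -np.log(lam_max) / beta

f_closed = -(1 / beta) * np.log(
    np.exp(beta * J) * np.cosh(beta * h)
    + np.sqrt(np.exp(2 * beta * J) * np.sinh(beta * h) ** 2
              + np.exp(-2 * beta * J)))

print(f_numeric, f_closed)  # the two values agree
```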

Also, maybe these articles are a bit better than the other one I linked to https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory https://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics) I think our main difference is that I would also like to interpret entropy as being related to information, but you would prefer not to.
 
  • #23
The Wikipedia disambiguation page on entropy shows 16 scientific definitions (presumably all correct). IMO that makes entropy particularly hard to discuss. People talk past each other with differing definitions in their heads.
 
  • #24
BruceW said:
I think our main difference is that I would also like to interpret entropy as being related to information, but you would prefer not to.
anorlunda said:
The Wikipedia disambiguation page on entropy shows 16 scientific definitions (presumably all correct). IMO that makes entropy particularly hard to discuss. People talk past each other with differing definitions in their heads.
I think our goal here should be to help the OP to understand the canonical definition of entropy rather than discussing higher level concepts such as information theory. In order to do this, we should stick to the definition of entropy as stated in introductory statistical mechanics books.

@Vanadium 50 has already stated that we should be using standard definitions here.
 
  • #25
Thanks everyone for their answers...

A thing comes to my mind here...

'Everything is debatable'...
Maybe this line is also debatable☺️☺️
 
  • #26
Deepak K Kapur said:
To my mind, if we come to know everything about all the particles at equilibrium (just suppose), we wouldn't call it disorder AT ALL. Then the entropy at equilibrium would be zero.

If you knew the positions and velocities of each of the particles, you'd have a definition of one single microstate. But there are lots of other microstates that would also be states of equilibrium. Now look at a nonequilibrium state, and you find that there are, again, lots of different microstates that can form that nonequilibrium state. It turns out, though, that there are more microstates for the equilibrium state than there are for the nonequilibrium state, thus the equilibrium state is more likely, usually far more likely. That is the reason things tend towards equilibrium.

The same scientific methods that produced the 2nd Law also produced the theory of evolution. So when you argue that one of those ideas is false because the other one is true, you accept the validity of one of them based on science but reject the validity of the other, and you also base that on science! Thus you are saying that science is wrong because science is right. :wideeyed:
 
  • #27
Deepak K Kapur said:
Thanks everyone for their answers...

A thing comes to my mind here...

'Everything is debatable'...
Maybe this line is also debatable☺️☺️
So are you going to apply the rigorous definition of entropy to your last question? You cannot expect to learn without some personal effort.
 
  • #28
Mister T said:
The same scientific methods that produced the 2nd Law also produced the theory of evolution. So when you argue that one of those ideas is false because the other one is true, you accept the validity of one of them based on science but reject the validity of the other, and you also base that on science! Thus you are saying that science is wrong because science is right. :wideeyed:

Exactly. Once upon a time, I used to engage in debates with creationists who used "entropy says things should get more disordered" as an argument against evolution. Over time, I came to the view that this is not only wrong but, in a way, the opposite of the truth. The tendency toward increasing entropy is what drives evolution and other living processes, in a similar way that the tendency for water to run downhill can drive waterwheels to generate power.
 
  • #29
Dale said:
So are you going to apply the rigorous definition of entropy to your last question? You cannot expect to learn without some personal effort.

You may feel annoyed, but...

Sometimes, rigorous tends to mean 'accepted'.
 
  • #30
Deepak K Kapur said:
You may feel annoyed, but...

Sometimes, rigorous tends to mean 'accepted'.

I don't think that rigorous ever means that. Rigorous means careful, painstaking: avoiding hand-waving and appeals to intuition. A rigorous derivation or proof is one where the assumptions and rules are clear, and the steps are laid out in detail so that there is pretty much no room for doubt about the conclusion (as long as you also accept the assumptions and rules).
 
  • #31
Deepak K Kapur said:
OK. Fine.

But, I have a problem regarding this.

When the ice cube melts completely, an equilibrium will be reached,
i.e. a state of highest entropy.

Isn't equilibrium itself a kind of 'ordered state' (a state where there is perfect balance)?

Why call such a 'balanced' state a disordered one?
When the ice cube has melted, at the microscopic level there is more imbalance: different molecules have different speeds, that kind of imbalance.

There are some microstates where the molecules all have the same speed, but not many. So it is very, very likely that the melted ice never visits those microstates during the lifetime of the universe.
 
  • #32
Deepak K Kapur said:
Sometimes, rigorous tends to mean 'accepted'.

No it doesn't. And we're back in Humpty-Dumptyland.
 
  • #33
Deepak K Kapur said:
Sometimes, rigorous tends to mean 'accepted'.

So what? Do it in such a way that it's rigorous, but you don't accept it.

For example, researchers were able to present a rigorous argument that the 2nd Law is based on probabilities and the existence of atoms, but it was not accepted by many physicists. The reason was that they didn't accept the premises upon which the rigorous argument was based.

The rest of the story is not relevant to this discussion, but I'll add it in for completeness. It was subsequently established that the premises are valid and that the conclusion is valid. It then makes it harder to not accept the 2nd Law as being valid, but there are still plenty of people who don't accept it despite its rigor. Just look at the number of attempted patent submissions for devices that violate the 2nd Law.
 
  • #34
Deepak K Kapur said:
You may feel annoyed, but...

Sometimes, rigorous tends to mean 'accepted'.
Do you have any intention of putting in some personal effort to actually learn this material?
 
  • #35
Deepak K Kapur said:
You may feel annoyed, but...

Sometimes, rigorous tends to mean 'accepted'.

The more precise your question, the better the answer (in science, at least). Ultimately, the idea of scientific endeavour is to build a theory that has practical implications. To do this, precise (rigorous) definitions are needed.

I'm still not 100% sure what your concerns were. Taking the lumpy cake to a more physics-y example, suppose you have a box with lots of gas atoms inside. Call the fraction of atoms on the left-hand side of the box ##f##. We expect ##f=1/2##: since all the atoms are bouncing around, each is equally likely to end up on either side. If we measure ##f## again, we again expect ##f=1/2##. Your intuition seems to be that perhaps next time ##f=0.7## or ##f=0.2## or some other random fraction. This is like your lumpy cake: you are not expecting the cake to be homogeneous. But the important point is the number of microstates corresponding to each macrostate. In our example of the box of atoms, you can think of ##f## as indicating the macrostate. There are so many microstates corresponding to ##f## being very close to ##1/2## that ##f## will come out very close to ##1/2## every time.

p.s. Also, what Dale said.
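A quick simulation of that point (a sketch; the atom counts and trial numbers are made-up parameters): sample the fraction ##f## for boxes of increasing size and watch the spread collapse toward ##1/2##.

```python
import random

# Sample the fraction f of atoms on the left side of the box, where each
# atom independently lands on either side with probability 1/2.
random.seed(0)

trials = 200
for N in (10, 1_000, 100_000):
    fs = [sum(random.random() < 0.5 for _ in range(N)) / N
          for _ in range(trials)]
    print(f"N = {N:6d}: f in [{min(fs):.3f}, {max(fs):.3f}] over {trials} trials")

# The spread of f shrinks like 1/sqrt(N); for a macroscopic box (N ~ 1e23),
# every measurement gives f indistinguishable from 1/2.
```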
 
