What is the real explanation for entropy and its role in chemical reactions?

  • Thread starter Senex01
  • Start date
  • Tags
    Entropy
  • #1
Senex01
This must have been posted on here before, but I can't find any reference to it.

I've had to learn a little biochemistry related to my work. This led me to realize that I knew little chemistry, and set me off to learn some. Chemical reactions are significantly dependent, according to what I read, on entropy. This got me looking at entropy again, which I remember vaguely from school.

Some books try to explain entropy in terms of order/disorder. This seems a bit of a poor explanation to me. For example, they show a box with two gases in it separated by a barrier: the barrier is removed, and the gases mix. Thus order -> disorder, and the entropy increases. This still seems to beg the question of what "order" is. Also it helps not one whit when trying to explain what entropy means when you are trying to define Gibbs free energy.

An attempt I came across to tighten up the gas-dispersion explanation of entropy stated that it was a movement from a low-probability state to a high-probability state. Of course, in this example this is nonsense, because any particular arrangement of gas particles is just as probable as any other: we only get a "higher probability" because we have grouped a larger number of particular arrangements into a single category, so a category with a large number of arrangements ("dispersed") has a higher probability than a category with a smaller number of arrangements ("compacted"). We can arbitrarily create any kind of categorization we like to describe the arrangement of gas particles. Thus entropy and chemical reactions would depend on what categories we choose to define. This seems very improbable, or at least a very confusing explanation.
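
As a rough counting sketch (the particle number N = 100 is just an assumed figure), suppose each of N particles can sit in either half of the box. Every specific arrangement is equally likely, but the categories are wildly different in size:

Code:
from math import comb

N = 100                 # number of gas particles (assumed for illustration)
total = 2 ** N          # total number of equally likely arrangements

for k in (0, 25, 50):   # k = particles found in the left half
    ways = comb(N, k)   # arrangements belonging to this category
    print(f"k={k:3d}: {ways:.3e} arrangements, probability {ways / total:.3e}")

# k=0 contains exactly one arrangement, while k=50 contains about 1.01e29.
# However the categories are drawn, for large N the evenly mixed ones
# contain almost all of the arrangements.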

Then there was the attempt to explain it in terms of information. We have good information on where the gas particles are at the start, the instant the barrier is removed. As the two gases mingle, we have less information on the locations of the gas particles of one kind relative to the gas particles of the other kind. This is really just the same as the order/probability explanation given before, but in different terminology. Still the same problem arises: does a chemical or physical reaction depend on how much knowledge we have about a system? Surely the chemical and physical events occur even if we don't know about them. And even if we had perfect knowledge, the reaction would still happen. We in fact have pretty good knowledge of the reaction that occurs when a chromosome replicates itself: we know the structure of the chromosome, and the intermediate molecules that read it and reconstruct its copy. Our near-perfect knowledge has no effect on the reaction.

So we'll drop this order/probability/knowledge analogy, unless someone explains it better to me.
 
  • #2
Entropy is the amount of information that you need in order to specify the exact microstate a system is in.

If a chemical reaction is happening in a test tube and you describe what is going on by looking at your test tube and taking some measurements, then what is relevant is the number of microstates compatible with that imprecise macroscopic data.
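
A minimal numerical sketch of that idea (the particle numbers are assumed): let the microstates be the left/right positions of N particles, and let the only macroscopic datum be that k of them are on the left. Then the compatible microstates number C(N, k), the entropy is k_B ln C(N, k), and the missing information is log2 C(N, k) bits:

Code:
from math import comb, log, log2

k_B = 1.380649e-23                  # Boltzmann constant, J/K

def entropy_and_missing_bits(N, k):
    omega = comb(N, k)              # microstates compatible with the measurement
    return k_B * log(omega), log2(omega)

# assumed macrostate: 1000 particles, 500 of them measured to be on the left
S, bits = entropy_and_missing_bits(1000, 500)
print(f"S = {S:.3e} J/K, missing information = {bits:.1f} bits")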
 
  • #3
I've been reading Keeler and Wothers' Why Chemical Reactions Happen.

Then they bounce immediately into explaining entropy mathematically, in relation to heat. Of course, if we describe heat as roughly a measurement of the kinetic energy, or alternatively, the momentum of the movement of particles, there is a connection with the order/disorder explanations, but it is a bit tenuous.

So entropy S, itself, nobody seems to want to define in precise terms. However, they are happy to define the change in entropy dS, in terms of heat Q and temperature T.

dS = Q / T

Fine. So entropy is measured in Joules and is a measure of the heat absorbed. Then they, and people posting in Wikipedia and elsewhere happily substitute S for Q/T and vice versa.

How is this related to the specific heat capacity, which I remember studying in school? Specific heat capacity of a substance is the heat energy in Joules required to raise a certain mass of the substance by a certain number of degrees. (Kilogram per degree, when I did it, but you could use moles or Fahrenheit or whatever, I suppose.) The specific heat "c" for a body of mass "m" is given by

Q = mc * dT

So therefore

mc = Q / dT

and specific heat is of course measured in joules as well. If you take the mass as a single unit (one kilo or one mole) and also take the change in temperature as a fixed unit, then specific heat capacity would equal entropy. What is the actual difference? Is there one?

Also, we were given tables at school that implied that specific heat capacity was constant at different temperatures, although it would appear perfectly reasonable to me if it were to change as the temperature changes, and even to change for different substances in different ways.

The equation we had for entropy assumes a constant temperature during heat loss or gain, which is not particularly plausible, except for infinitesimal changes. The specific heat capacity equation assumes that the temperature changes as the substance gains or loses heat. This seems far more reasonable. However, it makes entropy seem an even more confusing and ill-defined concept.
 
  • #4
Count Iblis said:
Entropy is the amount of information that you need in order to specify the exact microstate a system is in.

If a chemical reaction is happening in a test tube and you describe what is going on by looking at your test tube and taking some measurements, then what is relevant is the number of microstates compatible with that imprecise macroscopic data.

That would mean that if I make more precise measurements - say a kind of Hamiltonian vector of all the particles (I remember Hamiltonian vectors from Penrose's Emperor's New Mind) - then the chemical or physical status of the reaction would be different than if I just took a general measurement of its temperature, volume and density? I must be misunderstanding you.
 
  • #5
Senex01 said:
We can arbitrarily create any kind of categorization we like to describe the arrangement of gas particles. Thus entropy and chemical reactions would depend on what categories we choose to define. This seems very improbable, or at least a very confusing explanation.
You raise a good point. What they don't usually tell you is that the categories are determined by a certain set of thermodynamic properties.

Basically, any system can be in one of a large number of possible microstates. A microstate is just a particular arrangement of the particles in the system. Now, each microstate has a certain pressure P, temperature T, and number of particles N, and you can group the microstates by the values of those three variables, so that all the microstates in each group have the same values of P, T, and N. These groups, or macrostates, are the "categories" you're thinking of. By measuring the pressure, temperature, and number of particles (well in practice usually you'd measure something else, like volume, and then calculate number of particles), you can tell which macrostate a system is in, and the number of microstates in that macrostate determines the state's entropy.

Why do those particular variables, and no others, determine the groups? Well, actually you can use certain other variables, like volume in place of number of particles as I mentioned. But you still need 3 variables which are related to P, T, and N. I think the reasoning there is that the 3 variables represent the 3 ways in which a system can interact with its environment: it can transfer heat, it can do work, or it can gain or lose particles. And it stands to reason that from the point of view of the environment, the state of a system is fully defined by how it can interact - beyond that, what's really going on inside the system doesn't matter.
 
  • #6
Count Iblis said:
Entropy is the amount of information that you need in order to specify the exact microstate a system is in.

If a chemical reaction is happening in a test tube and you describe what is going on by looking at your test tube and taking some measurements, then what is relevant is the number of microstates compatible with that imprecise macroscopic data.

Let me put it another way. I have test-tube A and test-tube B, which contain substances X and Y. I measure them in whatever way, and discover that the particles are more randomly distributed in B than in A: in A, the X particles tend to be higher up the tube, say. Therefore, the mixture in test-tube B has higher entropy than that in test-tube A. Fine.

Then, by using some super measurement, I discover that there is a precise mathematical distribution, perhaps some kind of Fibonacci sequence, that determines the exact position and movement of the particles in test-tube B: I wasn't aware of this before. So then, suddenly, test-tube A now has the higher entropy. Again, the problem is that we are making the chemical reaction dependent on my knowledge of the substances involved.

Which brings me to the same point: unless you have a precise technical definition of "microstate", to define the exact microstate of a system I have to describe all the positions/momenta of the particles in it. This is exactly the same quantity of information, no matter what state the system is in.
 
  • #7
diazona said:
Now, each microstate has a certain pressure P, temperature T, and number of particles N...

That's false. A microstate does not have a temperature at all.
 
  • #8
diazona said:
Basically, any system can be in one of a large number of possible microstates. A microstate is just a particular arrangement of the particles in the system. Now, each microstate has a certain pressure P, temperature T, and number of particles N, and you can group the microstates by the values of those three variables, so that all the microstates in each group have the same values of P, T, and N. These groups, or macrostates, are the "categories" you're thinking of.

Thanks, that and the rest of your post make sense to me. I can see how that connects with the 2nd law of thermodynamics, both when stated with and without reference to entropy.

Of course, the pressure, temperature and number of particles of the system won't necessarily change in the spreading gas example, or in the other favourite example people use, the melting ice cube, but I'll just put that down to misleading examples, and follow yours because it makes sense. So entropy is a relationship between a macrostate and a theoretical (but I presume practically incalculable) number of microstates.

How does this relate to heat transfer and Gibbs energy then? Or to the mathematical definition of entropy as a quantity of heat absorbed at a certain temperature?
 
  • #9
Count Iblis said:
That's false. A microstate does not have a temperature at all.

So what actually is a "microstate" then?
 
  • #10
Senex01 said:
That would mean that if I make more precise measurements - say a kind of Hamiltonian vector of all the particles (I remember Hamiltonian vectors from Penrose's Emperor's New Mind) - then the chemical or physical status of the reaction would be different than if I just took a general measurement of its temperature, volume and density? I must be misunderstanding you.


If you know the exact state of the system, and if it is an isolated system, you could simply predict the state it will be in some time later.


Let me put it another way. I have test-tube A and test-tube B, which contain substances X and Y. I measure them in whatever way, and discover that the particles are more randomly distributed in B than in A: in A, the X particles tend to be higher up the tube, say. Therefore, the mixture in test-tube B has higher entropy than that in test-tube A. Fine.

Then, by using some super measurement, I discover that there is a precise mathematical distribution, perhaps some kind of Fibonacci sequence, that determines the exact position and movement of the particles in test-tube B: I wasn't aware of this before. So then, suddenly, test-tube A now has the higher entropy. Again, the problem is that we are making the chemical reaction dependent on my knowledge of the substances involved.

Which brings me to the same point: unless you have a precise technical definition of "microstate", to define the exact microstate of a system I have to describe all the positions/momenta of the particles in it. This is exactly the same quantity of information, no matter what state the system is in.

A microstate of a closed system is the exact quantum state of the system. If you have one particle in a box, then it has certain energy levels it can be in, just like an electron in a hydrogen atom. If you specify the energy of the particle in the box with some small uncertainty, then it can be in any of the energy levels that fall within that uncertainty. The amount of information you need to specify exactly which one of those energy levels the particle really is in is proportional to the logarithm of the number of such levels (the logarithm is proportional to the number of digits of this huge number).

Now for an isolated system, all possible accessible microstates are equally likely. This leads to the conclusion that the entropy can only increase.
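
To put rough numbers on the particle-in-a-box picture (the box length, particle mass and energy window below are assumed values): the 1-D levels are E_n = n^2 h^2 / (8 m L^2), and specifying the energy only to within a small uncertainty leaves a whole band of levels open:

Code:
from math import sqrt, log

h = 6.62607015e-34        # Planck constant, J s
m = 6.6465e-27            # mass of a helium atom, kg
L = 1e-6                  # box length in metres (assumed)

def n_at(E):
    # quantum number whose level energy equals E (not rounded to an integer)
    return sqrt(8 * m * L**2 * E) / h

E  = 6.2e-21              # roughly k_B * 300 K, an assumed thermal energy
dE = 1e-3 * E             # assume the energy is known only to 0.1 %

omega = int(n_at(E + dE)) - int(n_at(E))   # levels lying inside the window
print(f"levels compatible with the measurement: {omega}")
print(f"missing information ~ ln(omega) = {log(omega):.2f}")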
 
  • #12
Count Iblis said:
Now for an isolated system, all possible accessible microstates are equally likely. This leads to the conclusion that the entropy can only increase.

Don't those two sentences contradict each other? Unless you have a definition of "accessible"? Does "accessible" just mean "that we can predict"?

If I know something about the status of a system at time t, and it is changing randomly, then at time t+t' I know less about it, unless I make new measurements. That is obviously true, but it can't be what you mean. It would imply that if someone had taken measurements before me, then the entropy of the system would be, for them, higher than the entropy of a system for me.

So if we were measuring two different substances, and then the two substances reacted, the entropy of the two systems could be different for each of us, which would imply that we would each observe a different reaction. I am sure this is not what you mean.
 
  • #14
Count Iblis said:
To see how the definition S = k Log(Omega) yields the relation dS = dq/T, see http://en.wikipedia.org/wiki/Second_law_of_thermodynamics#Proof_of_the_Second_Law

What I'm looking for is a definition of entropy that does not use the term "entropy" in trying to define itself, which was why I was happier with the S = Q/T definition, even if it didn't make much sense on reflection. (As T could not remain constant under such conditions.)
 
  • #15
Senex01 said:
That is interesting, but it assumes the definition of "entropy" in defining "entropy", so I'm not so happy with the definition.


Well, S = k Log(Omega) is the definition of entropy for a system in thermal equilibrium. You can derive the relation dS = dq/T from that definition, not the other way around.
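
In outline (a paraphrase of the standard argument, in the thread's own notation): write Omega(E) for the number of microstates accessible at energy E with the volume held fixed, so that S = k Log(Omega(E)), and define the temperature by 1/T = dS/dE at fixed volume. For a reversible change at fixed volume no work is done, so dE = dq, and therefore

dS = (dS/dE) dE = dq/T

which is the relation the thermodynamics books start from.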
 
  • #16
Senex01 said:
Don't those two sentences contradict each other? Unless you have a definition of "accessible"? Does "accessible" just mean "that we can predict"?

If I know something about the status of a system at time t, and it is changing randomly, then at time t+t' I know less about it, unless I make new measurements. That is obviously true, but it can't be what you mean. It would imply that if someone had taken measurements before me, then the entropy of the system would be, for them, higher than the entropy of a system for me.

So if we were measuring two different substances, and then the two substances reacted, the entropy of the two systems could be different for each of us, which would imply that we would each observe a different reaction. I am sure this is not what you mean.

Accessible means that the state is a possible state the system can be in given what we know about the system. If the energy of a state does not fall within the known limits of the total energy of the isolated system then, due to conservation of energy, that state is not an accessible state.

The fundamental postulate of statistical physics is that all accessible states are a priori equally likely. If a system is in equilibrium, then it can be found in all these accessible states with equal chance. If a system is not in equilibrium, then that means that it is more likely to be in some accessible states than in others.

E.g. suppose you have a gas that was constrained to move in one part of an enclosure and you remove that constraint. Immediately after the constraint has been removed the gas has not had the time to move into the other part. So, not all of the states are equally likely. To say that they are a priori equally likely means that if you are given a system in which the gas can move freely as in this example and you are not told anything about its history, then all the states are equally likely. But the conditional probability distribution over all the accessible states, given that a constraint has just been removed, is not a uniform distribution.
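
To attach a number to that example (one mole of gas is assumed): once the constraint is removed, each molecule can be in either half of the enclosure, so the count of accessible states grows by a factor of 2^N and the entropy by N k ln 2:

Code:
from math import log

k_B = 1.380649e-23       # Boltzmann constant, J/K
N_A = 6.02214076e23      # molecules per mole

N = N_A                  # one mole of gas (assumed)
delta_S = N * k_B * log(2)          # entropy increase when the volume doubles
print(f"dS = {delta_S:.2f} J/K per mole")                 # about 5.76 J/K
print(f"chance of finding the gas back in one half ~ 2^-{N:.2e}")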
 
  • #17
You are well named, Count Iblis: the charm and devilry of your explanations do you credit.

I think we can avoid going into the explanation of eigenstates and the adiabatic theorem of quantum mechanics. The maths is entertaining. I notice, though, that that definition depends directly on the total energy of the system, which brings us back to the definition that entropy is just a measurement of energy gain.

That definition is really equivalent to the definition that dia... whatever his/her name was ... gave: that it depends on the relationship between the microstates and the macrostate. The equation defines the macrostate in terms of the total energy in the system - presumably this would be the theoretical energy required to raise the system from absolute zero to its current temperature, although there could be other interesting complications - and dia...whatever defined it in terms of ... was it pressure, temperature, number of particles, ... which would ultimately amount to the same thing.

Entropy was originally created to describe the mechanics of steam engines, and I think we can more or less keep it at that level.
 
  • #18
I'm still very charmed.

The a priori equal likelihood of states is the assumption I'm making too.

If the entropy depends on our knowledge of the history of the system, then two observers who had different knowledge could observe different reactions.

Saying that it depends on knowledge is not the same as saying that it depends on the energy level of the system, and on energy transfer between one system (or one part of a system) and another, which is what your fancy equation said.
 
  • #19
So if I have a chart that says, inter alia, regarding the values of absolute entropy at 298 K

He (g): 126 J K⁻¹ mol⁻¹

This means that another system brought into contact with He at 298 K will gain 126 J per K of difference in temperature per mol of He. Is that correct?
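
As an aside, that tabulated 126 J K⁻¹ mol⁻¹ can in fact be reproduced from the microstate-counting definition of entropy, using the Sackur-Tetrode formula for an ideal monatomic gas (a standard result; the sketch below just plugs in the constants, assuming 298.15 K and 1 bar):

Code:
from math import pi, sqrt, log

h   = 6.62607015e-34      # Planck constant, J s
k_B = 1.380649e-23        # Boltzmann constant, J/K
N_A = 6.02214076e23       # Avogadro constant, 1/mol
R   = N_A * k_B           # gas constant, ~8.314 J/(K mol)

T = 298.15                # temperature, K
P = 1.0e5                 # standard pressure, Pa
m = 4.0026e-3 / N_A       # mass of one He atom, kg

lam = h / sqrt(2 * pi * m * k_B * T)     # thermal de Broglie wavelength
v_per_particle = k_B * T / P             # V/N for an ideal gas
S_molar = R * (log(v_per_particle / lam**3) + 2.5)   # Sackur-Tetrode
print(f"S(He, 298 K) ~ {S_molar:.1f} J/(K mol)")     # comes out near 126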
 
  • #20
But then, this site

http://www2.ucdsb.on.ca/tiss/stretton/Database/Specific_Heat_Capacity_Table.html

says that the specific heat capacity of He gas at 26 °C (= 299 K?) is 5.3 J per gram... per degree K or C...

And according to http://www.ausetute.com.au/moledefs.html, a mole of He is 4.003 g,

so

if a (cooler) substance is brought into contact with He gas at 299 K it will gain 5.3 x 4.003 = 21.2159 J per degree difference in temperature per mole.

I'm doing something wrong: what is the relationship between specific heat and entropy then? If any?
 
  • #21
I mean, entropy is measured in joules per kelvin per mole,

and specific heat capacity is measured in joules per kelvin per mole, or else in joules per kelvin per kilo.

... and they both express the heat gained or lost by a system. So what is the difference between them, and why do charts give different values for them?
 
  • #22
Count Iblis said:
A microstate of a closed system is the exact quantum state of the system. If you have one particle in a box, then it has certain energy levels it can be in, just like an electron in a hydrogen atom. If you specify the energy of the particle in the box with some small uncertainty, then it can be in any of the energy levels that fall within that uncertainty.
...

Now for an isolated system, all possible accessible microstates are equally likely. This leads to the conclusion that the entropy can only increase.

Look, I'm sorry to be rude, but you didn't learn about quantum mechanics and energy levels of hydrogen, and the eigenstate functions and the other stuff mentioned in the Wikipedia section you cited, before you learned what entropy was. Otherwise you would never have learned any more physics at all.

I have a vague idea of quantum mechanics, like most literate people, but I'm just asking about entropy, and the basics of the laws of thermodynamics: either you're just showing off, or you're being incredibly perverse in bringing the mathematical basis of quantum mechanics into the problem.

I really have to assume that you don't actually know what entropy is yourself but have just learned some equations without understanding what they mean, if you can't explain them without using concepts which beg the question. I can do the maths required by the problems, but they are completely meaningless unless they have some relationship to reality. At the moment, I can't see any relationship to the reality of heat transfer.

What is the difference between entropy and specific heat?
 
  • #23
What I mean is "you weren't told to learn about eigenstates and the adiabatic theorem of quantum mechanics before you learned what entropy was, otherwise you wouldn't have learned any more physics at all..." Why are you being so perverse?
 
  • #24
Even worse, once we get onto Gibbs energy, which is defined in the literature, both in Wikipedia and in Keeler and Wothers, as

dG = dH - TdS

G is the Gibbs free energy, the amount in joules of useful heat available for work, as I understand it. Here dG is the gain in free energy.

H is the enthalpy, and dH the energy gain of the system, which under conditions of constant pressure is equal to dQ, the heat absorbed.

T is the absolute temperature of the system.

dS is the change in entropy of the system.

From the definition of entropy which we got from the mathematics, dS = dQ/T.

So, substituting, we get

dG = dH - TdQ/T
dG = dH - dQ

from dQ = dH under constant pressure, which is the norm for chemical reactions

dG = 0

which is also complete nonsense.

What's going wrong here??
 
  • #25
Sorry, but there is no way you can understand what entropy is before you have studied quantum mechanics. Historically, the concept of entropy was developed phenomenologically, i.e. without a deep understanding in terms of the fundamentals. This is what we call "Classical Thermodynamics", and it is not taught at university anymore. The only correct way to learn thermodynamics is by first familiarizing yourself with the basics of quantum mechanics and then learning the basics of statistical physics.



The relation between entropy and specific heat is given by the relation:

dS = dq/T = c dT/T
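
To put numbers on that (the heat capacity value and temperature range are assumed): take one mole of He at constant pressure, with c_p of about 20.8 J/(K mol), heated from 298 K to 398 K. The heat absorbed and the entropy change come from different integrals and have different values and units:

Code:
from math import log

c_p = 20.8             # J/(K mol), molar heat capacity of He at constant pressure
T1, T2 = 298.0, 398.0  # assumed temperature range, K

Q  = c_p * (T2 - T1)        # heat absorbed  = integral of c_p dT,   ~ 2080 J
dS = c_p * log(T2 / T1)     # entropy change = integral of c_p dT/T, ~ 6.0 J/K
print(f"Q  = {Q:.0f} J per mole")
print(f"dS = {dS:.1f} J/K per mole")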
 
  • #26
Senex01 said:
dG = 0

which is also complete nonsense.

What's going wrong here??

In your derivation you were implicitly assuming thermal equilibrium. G tends to decrease for a system that is not in equilibrium and is thus a minimum in equilibrium. So, if you assume equilibrium from the start you can't find anything other than dG = 0.

See the detailed discussion about the analogous case of the Helmholtz free energy here:

http://en.wikipedia.org/wiki/Helmholtz_free_energy
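
Spelling out where the substitution breaks down (a paraphrase of the standard argument, in the thread's notation): dS = dQ/T only holds for a reversible change. For a spontaneous (irreversible) change at constant temperature and pressure the Clausius inequality gives dS > dQ/T, while dH = dQ still holds, so

dG = dH - TdS < dQ - T(dQ/T) = 0

In other words, dG is negative for a spontaneous reaction and zero only at equilibrium, which is exactly the equilibrium assumption hidden in the dG = 0 derivation above.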
 
  • #27
Count Iblis said:
Sorry, but there is no way you can understand what entropy is before you have studied quantum mechanics.

All right, I'm sorry to be difficult, thanks for being patient. Does this mean that they do not teach the laws of thermodynamics in high school any longer?
 
  • #28
Count Iblis said:
The relation between entropy and specific heat is given by the relation:

dS = dq/T = c dT/T

That's very helpful.

Dividing by T seems a strange thing to do, but if it produces the ratio you are looking for...
 
  • #29
Count Iblis said:
In your derivation you were implicitly assuming thermal equilibrium. G tends to decrease for a system that is not in equilibrium and is thus a minimum in equilibrium. So, if you assume equilibrium from the start you can't find anything other than dG = 0.

See the detailed discussion about the analogous case of the Helmholtz free energy here:

http://en.wikipedia.org/wiki/Helmholtz_free_energy

I thought that was the equation for dG, but I suppose it's all the same.

I was assuming not just thermal equilibrium, but that H and TS both referred to the same system at the same moment of time. This obviously can't be the case if G is either positive or negative. Either H or TS has to refer to systems disassociated from each other in either space or time.
 
  • #30
Keeler and Wothers don't explain G = H - TS, but from their examples it is clear they mean:

H = the energy produced or consumed by the chemical reaction we are arbitrarily considering

TS = the temperature of the system * the entropy of the system.

Since TS = TQ/T, I don't see why they just don't use Q, but since S seems to be available from tables, I suppose it has a purpose.
 
  • #31
Since dG = dH - TdS

and

dS = dQ/T = c dT/T

then

dG = dH - c*dT

Doesn't it? Actually, that makes real sense.
 
  • #32
In thermodynamics there are 2 sorts of quantities. Heat and work are "path dependent" and require a knowledge of the history of the system. Energy, entropy are "state variables" and are properties of the system at that point in time - no knowledge of the history of the system needed. In the formula dS=dQ/T the left hand side is a state variable, the right hand side is path dependent, so the right hand side must be specified along a reversible path for the equality to hold.
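
A small worked example of that distinction (one mole of ideal gas at 300 K is assumed): let the gas double its volume either by a reversible isothermal expansion or by a free expansion into a vacuum. The entropy change is the same either way, because entropy is a state variable, but the heat absorbed is not, and dS = dQ/T only works along the reversible path:

Code:
from math import log

R, T = 8.314, 300.0      # gas constant J/(K mol), temperature K (assumed)

dS = R * log(2)          # state function: same for both routes, ~5.76 J/K
Q_reversible = T * dS    # reversible isothermal expansion: Q = T*dS, ~1729 J
Q_free = 0.0             # free expansion into vacuum: no heat flows at all

print(f"dS             = {dS:.2f} J/K")
print(f"Q (reversible) = {Q_reversible:.0f} J")
print(f"Q (free)       = {Q_free:.0f} J")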
 
  • #33
As far as I see it though - going by the information/microstate definition of entropy, we can only say "higher probability" = "lower entropy" (by definition) if we add the corollary

"lower entropy" = "higher probability" of having other states at the same energy level
 
  • #34
Senex01 said:
As far as I see it though - going by the information/microstate definition of entropy, we can only say "higher probability" = "lower entropy" (by definition) if we add the corollary

"lower entropy" = "higher probability" of having other states at the same energy level

Yes - this is called the "microcanonical ensemble" if you'd like to read up on it. (I'm not sure about the information thing though).
 
  • #35
atyy said:
In thermodynamics there are 2 sorts of quantities. Heat and work are "path dependent" and require a knowledge of the history of the system. Energy, entropy are "state variables" and are properties of the system at that point in time - no knowledge of the history of the system needed. In the formula dS=dQ/T the left hand side is a state variable, the right hand side is path dependent, so the right hand side must be specified along a reversible path for the equality to hold.

I think I need "heat" explained to me as well then. I thought heat was the total kinetic energy of the particles of the system, and temperature a somewhat arbitrary measurement of the kinetic energy or momentum (I'm not sure) that the particles transfer, directly or via radiation, to other particles. Therefore two systems will be "at the same temperature" when they transfer equal amounts of heat energy to each other. But two systems could have quite different ratios between internal kinetic energy and the energy they transfer to external systems.

I mean, that is not supposed to be a statement of fact, just what I thought I understood.
 